Every job comes with occupational hazards; in a perfect world, companies and employees work together to reduce those risks and foster success. In December 1970, the Occupational Safety and Health Act (OSHA) was enacted by Congress, primarily “to assure safe and healthful working conditions for working men and women.”
Even as once-standard benefits like employer-paid healthcare slowly go extinct, companies still have a legal responsibility to provide safe workplaces (even under Trump … for now). And though full-time employment opportunities increased in 2017, an ever-shrinking number of companies take pride in being a great place to work, going above and beyond what the law requires.
Except, of course, in Silicon Valley. The tech industry has long had a reputation for providing some of the best workplace perks in the country.
Everyone knows by now that tech workers in Silicon Valley get lavish perks such as round-the-clock free food and unlimited vacation days. But as competition to recruit and retain the world’s best software engineers has increased, so has the quality of the benefits.
But not all tech workers are created equal.
Engineers are racing to build algorithms and artificial intelligence that can quickly and accurately identify harmful content on the internet, but that technology isn't ready yet, and tech companies such as Facebook and Google face mounting pressure to make the internet safer for everyone right now.
So in the meantime, tens of thousands of real people, not robots, do the dirty work of sifting through everything from hate speech to violent pornography. Known as content moderators, these folks have what is quite possibly the worst job in tech.
Though they do one of the most grueling jobs at some of the largest companies in the world, most content moderators, the Wall Street Journal reports, are contractors, and thus don't receive even a fraction of the sweet perks one might expect as a full-time staffer.
Facebook decided years ago to rely on contract workers to enforce its policies. Executives considered the work to be relatively low-skilled compared with, say, the work performed by Facebook engineers, who typically hold computer-science degrees and earn six-figure salaries, plus stock options and benefits.
Their work may be viewed as unskilled and unworthy of over-the-top office luxuries, but spending 40 hours a week with the worst of the internet takes a huge toll on a person.
Former content moderators recall having to view images of war victims who had been gutted or drowned and child soldiers engaged in killings. One former Facebook moderator reviewed a video of a cat being thrown into a microwave.
A former content moderator at Google says the content moderators he worked with were hit hardest by images of child sexual abuse. “The worst part is knowing some of this happened to real people,” he says.
Talk about a workplace hazard. At Facebook alone, over a million user reports come in each day, so employees are expected to make decisions in seconds, then move on to the next report, no matter what they’ve seen, or how poorly equipped they might be to cope.
Sarah Katz says she reviewed as many as 8,000 posts a day, with little training on how to handle the distress, though she had to sign a waiver warning her about what she would encounter.
As this field grows—Facebook and Google are both rapidly expanding their content moderation teams—so does the need to address the psychological dangers to employees doing this work, rather than demanding they sign more paperwork.
Some companies are trying to do more, which isn't hard if you were doing nothing to begin with. But by outsourcing most of this work, the behemoths are able to shift the responsibility to staffing agencies.
Facebook requires that its content moderators be offered counseling through PRO Unlimited, which actually employs many of those workers. They can have as many as three face-to-face counseling sessions a year.
You read that right: as many as three whole counseling sessions a year. How generous.
Content moderators aren't the only ones who face daily trauma in the workplace. 911 dispatchers have been fighting for years to be reclassified as first responders, rather than clerical staff, in order to access much-needed support and resources for the "indirect trauma" they face every single day. In 1995, the term "vicarious trauma" was coined to describe the psychological difficulties faced by social workers and law enforcement professionals who work closely with victims of violence and abuse.
Yet the phenomenon, also known as “the cost of caring,” can’t be found in any OSHA handbook, because OSHA doesn’t actually consider it a workplace hazard.
Meanwhile, at Microsoft, two content moderators, Harry Soto and Greg Blauert, are fighting back.
Rebecca Roe, a lawyer for Blauert, says tech companies should be held accountable for the well-being of content moderators. Contractors are especially at risk because they have little job security and are less likely to seek help, she says.