“We’re a constantly bombarded first line of defense against potential trauma.” Ex-Facebook content analyst talks about stress, insomnia, and work-related paranoia
A former employee of Accenture, a company that provides content moderation services to Facebook, wrote a detailed letter about the problems their colleagues face. They spend eight hours a day moderating the most disturbing content, and some are driven to self-medicate their stress with drugs and alcohol. Their bosses respond to complaints with advice to take a break and breathe, and if that doesn’t work, to quit.
The letter was posted on Twitter. AIN.UA selected the key points.
Insomnia, paranoia, and mental disorders
The author of the letter worked at Accenture as a Facebook content moderator for 2.5 years and saw a great deal of disturbing material during that time.
“This is a job that’s impossible not to take home with you. It is common to experience fatigue, weight changes, and intrusive thoughts, and I’ve heard stories from people on child exploitation queues who started seeing every adult with a child as a potential predator.
I know people who went on psychiatric medication for the first time while working as a content analyst and others who self-medicate with alcohol and drugs,” the letter says.
Are you upset? Just stop being upset!
Employee health support programs are ineffective. Special wellness coaches work with the analysts to provide emotional support. The author notes that the coaches know their job well, but getting an appointment is not easy because of remote work and management attitudes.
Most managers believe that coping with psychological problems is very easy – you just need to stop getting upset!
“If you see something that upsets you, you should be able to deal with it by stepping away and doing a breathing exercise; if the problem is really bad, you can talk to a wellness coach.”
However, as the author points out, the real problem is not the lingering impact of individual images but their sheer volume: Facebook moderators deal with thousands of such pictures every day.
“Content analysts are paid to look at the worst of humanity for eight hours a day. We’re the tonsils of the internet, a constantly bombarded first line of defense against potential trauma to the userbase.”
Employees who can no longer bear the stress are advised to quit. However, the author notes that many are not ready to take that step: losing a job during a pandemic means losing the ability to feed their families. Finding another job is difficult, since all moderators sign NDAs with vague non-disclosure terms. People do not know for certain what they can and cannot disclose about their previous job during an interview, which makes getting hired very difficult.
The author notes that they have not been able to review the NDA since first signing it.
How to deal with it
The author gives some recommendations on how to make life easier for Facebook moderators.
- Give managers training that impresses upon them the fact that mental health can’t be dealt with mechanically.
- Give analysts more wellness time and write this into contracts.
- Allow content analysts to claim therapy as an expense or give vouchers covering the cost of therapy.
- Relax the NDA.
- The most important one: Reduce the amount of time that people spend in safety queues. All workers should have the option to move to another queue every six months. Further, they shouldn’t spend all of their work time in these queues. They should be trained on something else so that they only spend about half of their work hours on safety.
The system is bad not only for content analysts but for Facebook itself
“Those who spend the most time in the queues have the least input as to policy. Analysts are able to raise issues to QAs who can then raise them to Facebook FTEs. It can take months for issues to be addressed, if they are addressed at all.”
The author of the letter says that it would be easier if the analysts could communicate directly with those responsible for designing policy. It would be ideal if there were regularly scheduled meetings between analysts and policy designers in addition to avenues for communication in case of emergencies.
A dream job turned nightmare
In 2019, The Verge talked to 12 former Facebook content moderators to learn how much they were paid for their work. It turned out that the conditions, to put it mildly, were not very good.
The median salary at Facebook is $240,000 a year, while moderators in Arizona make only $28,800. They work in cramped offices with very few bathroom stalls, and their working hours are strictly regulated: two 15-minute breaks and one 30-minute lunch break. They also get nine minutes a day of “wellness time,” which they are supposed to use if they feel traumatized by content.
Moderators often see terrible things: racist jokes, videos of murders, or sex with animals. Coping with the stress is further complicated by the fact that they cannot share any of this with anyone, even their closest family members. Curiously, the current employees whom journalists managed to talk to said that they do not see disturbing content often; in most cases, the posts they review are harmless.
In 2018, a former Facebook moderator in California sued the company, saying her job as a contractor had left her with post-traumatic stress disorder (PTSD). There are many cases like this; it’s just that not all of them reach the courts. Former employees complain of paranoia, suicidal thoughts, and panic attacks.