Facebook moderators in Barcelona fall ill from the atrocious content they review

Among all the images, there is one about which La Vanguardia must warn: even in writing, the description is very harsh.

Oliver Thansan
Thursday, 5 October 2023, 10:21

It is the video of a drunk man who murders his son, a month-old baby, on camera. He sticks a knife into the baby's chest, rips it open and tears out his heart. He bites into the heart. “The baby's screams. Blood, a lot of blood…” X. takes a breath. These are inconceivable images that, once seen, cannot be erased.

X. was 19 years old when, recently arrived from Brazil, he started working in 2018 at CCC Barcelona Digital Services, in the Glòries tower. The company, acquired in 2020 by the Canadian group Telus International, is subcontracted by Meta to moderate content on Facebook and Instagram. It employs 2,030 people in Barcelona; at least 20% are on sick leave, in large part due to traumas similar to those of X. They are the guardians of the social networks.

To X. it seemed like a leap up from the waiter and call-center jobs he had held until then. “They didn't ask for training or experience. They paid almost two thousand euros, the offices were impressive and it was like getting a foot in the door at Facebook,” he recalls. They told him he would have to study the policies of Mark Zuckerberg's company and, based on them, make quick decisions about whether a photo, video or text should be removed. He was not informed, he claims, of the volume or harshness of the content to which he would be exposed.

The dream soon turned into a nightmare. After two months, owing to his good results, he was assigned to the “high priority group”, the team in charge of the most savage content. “Brutal murders, dismemberments, torture, rape, live suicides, children being abused. The worst of humanity,” he summarizes. A few weeks were enough for him to see that he did not want to be there. He asked to go back to soft content; he was refused. The nightmares, the insomnia, the violent daydreams began. He needed the money, and he held on.

Until the first panic attack came. “It was a live suicide. Facebook has a strict policy on when to notify the police; it can't be just because someone says they are going to kill themselves, there must be a gun, a rope, an open window in sight. It often happens that the person takes out a hidden gun and kills himself without you being able to do anything. That day was like that, but worse, because the guy used a very big weapon. It was all blood and brains. I fell to the floor, I couldn't breathe. I asked to see the company psychologist; she saw me an hour later. Only I spoke. When I finished, she told me that I did a very important job for society and had to be strong. And that time was up and I had to return to my position.”

Five years later, he is still under psychiatric treatment, takes five different medications, and counts a day as good if he has no more than two panic attacks. “In my head now there is only death. Death, blood, pain, fear. Scared all the time. Afraid of going out on the street, afraid of staying home, afraid of remembering, afraid of sleeping,” he says.

While he fights to heal, he is immersed in another battle, a judicial one. In 2020, already out of the company, he filed a complaint with the Labor Inspectorate, which in November 2022 resulted in a fine of 40,985 euros for Telus for failing to assess the worker's psychosocial risks; the company has appealed.

At the same time, another judicial front opens today for Telus, and for Meta as the party ultimately responsible: a worker plans to file a criminal complaint for damages. The lawsuit accuses both companies of crimes against workers' rights, injury through serious recklessness and offenses against moral integrity. Other employees currently litigating against Telus on the labor front will probably join this avenue.

“We have decided to go down the criminal route because the non-compliance is flagrant. The company is fully aware of what is happening, that it is generating masses of sick young people, and it neither takes responsibility nor does anything to remedy it,” says Francesc Feliu, of the Feliu Fins firm, the plaintiff's lawyer.

There is no precedent in Spanish courts. In Ireland, where Meta has its European headquarters, 35 moderators have filed complaints with the High Court; some are employees of CCC in Barcelona, confirms lawyer Diane Treanor, of the Coleman Legal firm. In the US, where class actions exist, Facebook in 2020 compensated more than 11,000 moderators with 52 million dollars. In Kenya, another 184 have sued Meta and two subcontracted companies.

Accused of promoting misinformation, violent attacks and even genocide with its algorithms, designed to prioritize engagement (and economic profit) over safety, Meta has invested billions and hired 40,000 people, including 15,000 moderators employed through third-party companies. “We have more than 20 centers in various countries, where these teams can review content in more than 50 languages,” it says on its website.

Meta reports every quarter on the volume of content it removes. In the second quarter of 2023, it removed 7.2 million pieces of content for child sexual abuse, another 6.4 million for suicide and self-harm on Facebook, 6.2 million for violence on Instagram and 17.5 million for hate speech on Facebook. The list goes on.

The path taken by X.'s complaint can serve as a precedent for hundreds of people. In addition to the reviewers who have passed through Telus, a company called Majorel does the same work for TikTok, the Chinese social network, from the 22@ district of Barcelona.

La Vanguardia has interviewed nine other CCC/Telus moderators who report mental disorders. Their medical reports record post-traumatic stress, anxiety, insomnia, depression or suicide attempts. Some are still at the company, on sick leave; others have left. Some are considering legal action; others have enough to deal with already.

The company is not going through a good time. Telus has carried out a temporary layoff (ERTE, in March) and a collective layoff (ERE, in August), which it has justified by the drop in demand from Meta amid the technology-sector crisis. It does not link them to the mental health epidemic in its workforce. However, in a letter to its workers in May announcing that it was stopping payment of the disability supplement, it admitted that the absenteeism rate was 19.8%. According to union sources, there have been spikes of 25%. That is more than double the average for the contact center sector, which is already high. In the letter, the company suggests that many of the sick leaves are fraudulent, since “as a result” of the ERTE they shot up “in an exorbitant manner.”

Telus has hired Q-ready, an “absenteeism management” company from the Quirón group. “We have engaged an external provider to support our team members who are on medical leave and determine what additional support they need. They are also working with us to develop a reintegration plan for these individuals, with their health and safety as a priority,” Roger Clancy, vice president of operations at Telus, stated by email.

The current workforce in Barcelona is 2,030 people, about 400 fewer than at the beginning of the year, after the ERE and a policy of encouraging voluntary departures. Moderators are divided into markets, depending on the language or cultural area they cover. Portuguese (covering Portugal, Brazil and some African countries) and Latin American Spanish are the largest, with about 800 moderators each. Other markets are French, Italian, Nordic, Hebrew and Spanish-Catalan. Salaries range between 24,000 and 30,000 euros gross; moderators earn less in the Portuguese and Latin American markets, more in the Nordic one. There are meal vouchers, a transportation allowance and private medical insurance.

There are, of course, those who are happy with the job. P., a Mexican, confesses that he is “perhaps less sensitive to certain content, perhaps because of the reality of my country, where there are murders every day.” He also points out that he has not been at Telus from the beginning, only for just over a year, and that with time, and the work of the reviewers, “the worst content has moved to other platforms, such as OnlyFans or Telegram.” He reviews Instagram.

Not everyone is exposed to the same volume of graphic content. It varies depending on the “queues” to which they are assigned and, above all, the market. In Latin America there is a lot of “terrorism”: videos posted by criminal groups, featuring torture and savage executions, meant to intimidate. It is no coincidence that most of the traumatized moderators interviewed come from these markets.

“Those who come to me are the ones who can't stand it,” confides a psychiatrist who has treated six moderators. “In the first months they are happy with the salary and the work seems easy to them. I had a French patient who was delighted until they made him moderate content from French-speaking African countries.”

“It is easy for them to develop nightmares, intrusive images, sleep or concentration disorders, depression or anxiety with avoidance of the workplace,” says Benedikt Amann, director of the psychiatry research unit at the Forum Center of Hospital del Mar. “Although there are people who go into that job with an ethic of filtering out evil and improving the world, the company should explain in advance exactly what it will consist of, and should warn that anyone with a history of vulnerability runs a high and real risk of suffering post-traumatic stress.”

There are differences between those who joined the company at the beginning and the most recent intakes. While the former complain that they were not properly informed of the nature of the work, those hired in the last two years knew what they were getting into, but underestimated the effect it would have on them. This is the case of a 30-year-old Brazilian woman. As a child she suffered depression, linked to bullying. The company never asked about her history when hiring her, although she herself admits that, so many years later, she didn't think she was at risk either. They put her in the “live suicide queue.” It coincided with her separation from her partner. She fell into depression. She made two suicide attempts.

It is a key element in the lawsuits in Ireland, says lawyer Treanor: “The company does not screen or check whether someone has a history of self-harm or suicide attempts before exposing them to this type of content, when any psychologist will tell you that it is a trigger.”

Lawyer and activist Cori Crider, who had previously defended Guantánamo detainees, admits that when she came across the first stories of sick moderators a few years ago, she found it hard to accept that someone could develop post-traumatic stress simply from viewing violent images on a computer. “It is a job that can have serious neurological effects. Like a pesticide factory: not everyone develops cancer, but it considerably increases the odds, and we all understand that protective measures must be taken,” argues Crider, co-founder of Foxglove, a London-based NGO that campaigns for fair technology.

Companies know this, Crider points out. Accenture, which also moderates content for Facebook, has its employees sign an informed consent form that says: “I understand that the content I will have to review will be disturbing. It may have an impact on my mental health and could even cause post-traumatic stress.”

“The work of moderators is necessary and important, and unfortunately I don't think we will ever be able to hand it over entirely to artificial intelligence. What we demand is that technology companies make it safer work; it is not enough to tell employees to protect themselves. It is their responsibility. That means properly selecting workers. Training them. Giving them psychological care and the breaks they need. And also a salary commensurate with the risks,” Crider argues.

The lawyer points out, for example, that in the United Kingdom, police officers who track pedophilia on the internet “have an army of psychiatrists and strict limits on the amount of content they can see, precisely because the effect it causes is well known.”

On its website, Meta says that its moderators “learn how to access resilience and well-being resources, and how to contact a professional if they need additional help.” The terms are similar to those used by the Telus executive: “We have designed a resilience and well-being program that includes a comprehensive benefits program for access to physical and mental health services,” says Roger Clancy.

The moderators interviewed consider the psychological care on offer to be wholly insufficient. “We have 40 minutes a week of what they call wellness. It means you can talk to someone. But if you tell them that you are unwell, what they tell you is to look for another job,” says one moderator. “The company is used to people who can't take it going on sick leave and then leaving. Without making any claim.”

Cori Crider applauds the fact that some moderators individually sue the company and obtain compensation. But she is convinced that change will not come from there. “I understand that people want justice. But our priority is to change this job. That requires committed and organized unions, and also more regulation. Governments must understand the problem and regulate, and perhaps that also means preventing technology companies from being so large and powerful. Fines and compensation are not going to make Facebook change its policy. The same day the Federal Trade Commission fined Facebook 5 billion dollars, its shares rose on the stock market.”

Do you want to share any confidential information with the A Fondo team? Write to us at afondo@lavanguardia.es