Thumbs down. The dark side of Facebook

Rapes, beheadings, self-mutilation by depressed young people: Facebook's drive to grow is so strong that the employees who view such images burn out.

By Pieter Beens

“For every like, I will not eat for five hours. #sad #sadgirl #fear #anorexia #depression #eatingdisorder #dead #leaveme”

A desperate young woman shares the message above. It is one of thousands that pass through the renovated Siemens factory complex in Spandau every day, where content moderators work in shifts to filter an incessant stream of reported posts and keep Facebook fun for its users. The content: child pornography, suicide videos, failed challenges, beheadings. The salary: 8.90 euros per hour. The average length of employment: a few months. Intense images and poor employee support are a recipe for post-traumatic stress disorder.

Who are these people who sacrifice their mental health for a meager wage?

They are fathers and mothers, desperate job seekers and discarded workers, writes Sjarrel de Charon in The Flipside of Facebook, which was released last year. Driven by the promise of financial security, or by curiosity, content moderators sign on with Arvato Bertelsmann, Accenture or Cognizant, the three outsourcing companies Facebook uses to clean up the social medium for next to nothing. Quitting is often not an option: no job means no bread. To make ends meet each month, employees combine their work on the Facebook cleaning crew with various other (cleaning) jobs. Job security is absent: Facebook has already moved part of the work to Manila, where cheap labor does the dirty work for even lower wages.

In the former Siemens factory, they see images they will never be able to forget. De Charon himself sought refuge in a combination of drink and medication, ended up with a psychologist and a psychiatrist within two months, and experienced panic attacks in the subway after his departure.

Ambiguous policy

“Stop it. You're so worthless. Nobody wants you anymore.”

Purging Facebook seems a hopeless task. Swearing, death threats and shocking images are the order of the day. After a crash course in Facebook policy, content moderators perform the thankless task of passing judgment: does the message concern a person from a protected or an unprotected category? Is there discrimination based on race, origin or other characteristics, or is this a typical case of hate speech? Each category has its subcategories, and Facebook's logic is baffling. The Dutch-language part of Facebook also has a number of unique exceptions to the official policy. In Dutch, 'cancer' is such a common swear word that it is impossible to act on every message containing it; the term is therefore banned in almost every country, but not in the Netherlands.

According to De Charon, the Dutch part of Facebook is in any case worse than anywhere else. While his Southern European colleagues could regularly chat or play games, his queue of tickets overflowed daily with intolerance, hatred and atrocities. Because of the enormous backlog, timely intervention was often impossible. Facebook's ambiguous policy was especially brutal for victims of pornography and bullying.


An excerpt from the ever-changing policy for content moderators at Facebook.


That exception rule may take some pressure off, but it still leaves plenty of work to be done. In the Netherlands alone, the list of reported messages grows by around 8,000 items a day. Facebook turns out to be a sewer through which the dirtiest water flows. The average user sees little of it: artificial intelligence automatically filters out the worst excesses, and users flag shocking messages.

The content moderators, however, see the worst. Behind the scenes, they must ensure that the medium stays fun, because the more fun the social medium is, the longer users linger. That, after all, is best for viewing and click figures, and therefore for advertising income. For Facebook, only the annual figures count.

False world

“Even feeling pain is better than going mad.”

Facebook uses an opaque system for evaluating messages. Messages criticizing race, ethnicity, country of origin, religious belief, sexual orientation, gender (identity) or serious illness are removed. A message about these protected categories combined with content from an unprotected category, however, is allowed: "Christianity is a backward religion" may stay, but "All Christians are backward" may not.
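To make that rule concrete, here is a minimal sketch in Python of how such a binary protected/unprotected decision might look. The category set, the function name and the "attacks people versus attacks a concept" flag are all assumptions made for illustration; Facebook's actual classifier is not public.

```python
# Illustrative sketch only: the category set and the matching logic are
# invented for this example; Facebook's real rules are not public.

# Protected categories: attacks on *people* over these traits are removed.
PROTECTED = {"race", "ethnicity", "national origin", "religion",
             "sexual orientation", "gender identity", "serious illness"}

def should_remove(category: str, attacks_people: bool) -> bool:
    """Remove a message that attacks people over a protected trait;
    allow the same attack when it is aimed at a concept or institution."""
    return attacks_people and category.lower() in PROTECTED

# "Christianity is a backward religion" attacks a concept: allowed.
print(should_remove("religion", attacks_people=False))  # False
# "All Christians are backward" attacks people: removed.
print(should_remove("religion", attacks_people=True))   # True
```

Even this toy version exposes the weakness the book describes: the entire verdict hinges on a binary flag, whether the target counts as people or as a concept, and that is precisely the distinction that is ambiguous in most real messages.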

It is precisely this opacity that creates a double standard, and the biweekly policy updates do little to make the policy more transparent. Some nudity is allowed, some definitely not. The rules are also rigidly literal: a photo with a Hitler salute must be removed, but if the raised arm deviates from the criteria Facebook has established, the message may stay.

It just goes to show that division and discord cannot be captured in an algorithm. Complex issues are difficult to answer with binary rules. When, for example, does a laughing emoji signal hatred or malicious glee?

Facebook's attempts to block certain hashtags are useless: users are resourceful and simply switch to other hashtags to keep their posts visible.

Through its policy, Facebook creates a make-believe world in which a group of policy officers determines what can and cannot be tolerated. The poignant reality - full of hatred, cruelty and unfathomable grief - is hidden from view and ends up on screens in Berlin, Manila, Lisbon and Dublin.

Alcohol and diazepam

The combination of shocking images, a continuous influx of new messages and ever-changing policies, in which current trends seem to dominate, takes its toll. Even though moderators see each message for less than ten seconds, some remain on their minds forever. Talking about it with friends or acquaintances is not an option: strict nondisclosure agreements forbid moderators to speak out. Mobile devices, pen and paper are also prohibited for fear of leaking trade secrets. Even the Dyson hand dryers in the Spandau toilets are meant to prevent employees from putting anything on paper, however hastily.

It drives some employees to the company psychologist, but in Spandau there is only one for 600 employees. So content moderators tend to resort to booze or diazepam, or a combination of the two. Sometimes the work even proves fatal. In 2018, a moderator in North America died of a cardiac arrest brought on by the stress; a colleague in Manila attempted suicide.

The cat-and-mouse game between Facebook and its users, the technology giant's inability to keep gruesome content at bay, and the lack of employee support combined with a meager salary all feed continuous dissatisfaction among those who have to keep Facebook clean. They feel like playthings of political and business decisions and have become increasingly vocal, despite their voluminous confidentiality contracts, and they are increasingly backed by their managers. Still, none of this has been enough to win recognition for the problem. Facebook itself is now banking on artificial intelligence as the successor to the moderators, but its own employees have little confidence in that solution, says a developer: "2 billion messages are received every day. Even if you evaluate 99 percent of those messages well, many errors remain." He has a point: one percent of two billion still amounts to 20 million misjudged messages a day.

Photo: © Wikimedia Commons

