As social media becomes ubiquitous in our lives, a new kind of problem confronts us. The ease with which images and videos can be put up on social media sites has led to users uploading content that is often disturbing or offensive. The internet unwittingly becomes a platform for the display of the worst kind of human depravity.
Despite the many algorithms developed to ensure a positive experience on the internet, the task of policing the web still falls to people. One of those tasks is filtering out obscenity, abuse, and violence. This job is still entrusted to humans, and is carried out largely by workers in countries like India and the Philippines, as Andy Greenberg writes in Wired.
What that means is that behind companies like Google and Facebook, there is an army of invisible labourers performing the tedious task of combing through obscene and violent images and videos and manually removing them from these sites.
In a recently released documentary titled The Moderators, directed by Ciaran Cassidy and Adrian Chen, we are given a glimpse inside a Bengaluru-based company that moderates content for dating sites in the US, Europe, and India.
The 20-minute documentary follows from Chen’s 2014 investigative piece for Wired, titled “The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed”. The article, according to the New Republic, is a deep dive into the hidden brigades of workers, mostly based in South and Southeast Asia, who manually remove offensive material from our social media feeds.
The documentary takes us through a training session in progress in a sterile office, where young men and women starting their first jobs listen with incredulity as a trainer takes them through a series of images that require removal. They are taught to judge and recognise what constitutes obscenity, from nudity to child pornography to violent videos of self-harm and beheadings.
The documentary gives us some mind-boggling numbers: the number of content moderators scrubbing the world’s social media sites and mobile apps, for instance, is over 1,50,000. That is more than twice the headcount of Google and nearly 9 times that of Facebook.
The Moderators puts a human face to the process that we often think is automated.
To know more about how Facebook moderates its content read: AI Isn’t Smart Enough (Yet) to Spot Horrific Facebook Videos.
For the newbie moderators, the content they encounter is shocking. But they are trained to look past their cultural context to judge the images.
“Our sense of judgement can be clouded by the community or religion we belong to. For instance, in India dating is a taboo. But in a dating website people will post pictures wearing bikinis. But the moderator in India might come from a conservative background and might not find it appropriate,” one of the trainers says in the documentary.
The moderators are expected to distance themselves from the content and approach it clinically.
One of the moderators states that they have to maintain a speed of 2,000 photos per hour, of which around 20 percent are vulgar. A single employee of this firm goes through 5,000 profiles in a day.
When your job comprises looking at disturbing and violent images all day long, it is bound to leave a psychological impact. The trauma these moderators endure is highlighted in Chen’s article, where he details his interviews with some of them. One of the moderators, a young woman, says, “I get really affected by bestiality with children... I have to stop. I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee.”
The psychological toll of the job has, however, not deterred technology companies from outsourcing the work to poorer countries, where moderators are often paid much less than their American counterparts.
Chen’s interview with Jane Stevenson, head of the occupational health and welfare department for Britain’s National Crime Squad, the UK equivalent of the FBI, reveals the challenges facing content moderators. Stevenson has advised social media companies that their content moderators will have to deal with the repercussions of exposure to disturbing imagery in much the same way as law-enforcement officers who investigate child pornography and terrorism.
“From the moment you see the first image, you will change for good,” Stevenson says, in Chen’s article.
But she says that many technology companies are yet to grasp the seriousness of the problem.
For a country like India, which exercises tight control over material it considers obscene, it is strange to picture a group of people sifting through images deemed too harmful for anyone to consume.
However, the odd nature of the job does not appear to have deterred one young moderator, who says nonchalantly in the documentary, “It’s our job. You don’t have to take it personally.”
Watch the documentary here: The Moderators