A fresh-faced resident of the Manila slums who supports his mother and family. A devout Catholic committed to stopping the spread of sin. A young woman whose mother warned that dedicated study was the only way to avoid becoming a trash scavenger. A father determined not to let what he sees online affect his outlook on life.
A man who killed himself because he let it affect him too much.
These are the ‘cleaners,’ the low-wage workers tasked with policing the images and content that everyone can view on social media. They work under the radar, wading through 25,000 images a day, taking mere seconds to decide what is and isn’t appropriate for platforms such as Facebook, YouTube and Twitter. Only three errors are permitted per month, and the content they view can include anything from beheadings and assaults to political cartoons. They are generally trained for just three to five days and bring to the job their own experiences and belief systems – subsequently influencing the lives and opinions of billions worldwide.
The existence and impact of these workers – most of whom are based in the Philippines – are investigated in new documentary The Cleaners, which is hoping for an Oscar nod. The film offers a dark and alarming look into the underbelly of the Internet – and it wasn’t easy to make, directors Hans Block and Moritz Riesewieck tell DailyMail.com.
‘Getting in contact with the workers was one of the biggest challenges of the whole filmmaking process,’ Block says via email. ‘It’s a very secretive and hidden industry. The companies do everything they can to keep the work secret. The companies use code words to hide which companies they are working for.’
Facebook, for example, is referred to as the ‘Honeybadger Project,’ the directors were told by the ‘cleaners’, as content moderators are known.
‘The workers must keep their job secret. Otherwise they are sued,’ Block continues. ‘There are private security firms that pressure workers not to talk to strangers. They control their social media accounts and there are all kinds of reprisals. It took us quite a while to get in touch with the workers. We collected all the information to get a bigger picture about what is going on there.
‘But when we eventually did it (in collaboration with local students) we were surprised how proud many of them were to do this job. They told us: Without us social media would be a complete mess. On the other hand many of them silently suffer from the impact the work has on their mental health. After a very long period of research, in which we built up a relationship of trust with the workers, we worked together with our protagonists of the film to find out how to make a film about this work.’
The process of securing the participation of such workers – men and women with firsthand knowledge of how the system works – was one that had to be carefully orchestrated. Riesewieck tells DailyMail.com how colleagues were secretly ‘photographed while talking to possible subjects and then the photo was sent to all the employees in order to threaten them in case they were willing to talk to us.
‘The offices are protected by airport scanners at the doors and surveilled by private security firms who sometimes even control the private social media behavior of the workers – just to make sure they are not in touch with journalists. So we really needed to find creative ways how to make a film about this completely hidden work.
‘We eventually did by finding an abandoned office of this very type of work in a business district of Manila with all the typical work stations still inside where we could invite our subjects in order to learn more about their work. This protected space made it possible for them to share experiences about their traumatizing work and their feelings, in which way they are affected and so on.’
The content moderators are affected in vastly different ways; some are proud and see their role as guardians, while others suffer deeply, such as the formerly innocent devout Catholic who finds it hard to get images of male genitalia out of her head. The film also includes the story of one moderator specializing in images of self-harm; he repeatedly asked to be transferred but was denied, and he ended up taking his own life.
The pitfalls of the job are, clearly, manifold.
But The Cleaners also explores the wider societal implications of their work. Sanitizing content means that images of war and violence – such as bombings in Syria – are often designated as terrorist-related, deleted and kept from online audiences, preventing them from being fully informed.
Abdulwahab Tahhan, who works with the NGO Airwars, laments in the film how summarily deleting such content hinders the efforts of his organization – and prevents global citizens from receiving vital information.
‘Without our work, the armies or the governments would have a straight pass without anybody challenging them,’ he says in The Cleaners. ‘There would be, I think, more civilians getting killed; there would be less closure; we would not be able to know what’s going on. These videos are needed. These videos are part of the war. They provide information; they provide evidence for the future.
‘But the problem is, these videos are quite often classified as ISIS videos. It’s been seen that this is very graphic, so they get deleted. Now the censorship that’s taking place on YouTube, it’s affecting a lot of organizations.’
The cultural, educational and political backgrounds of the content moderators inherently affect what’s deleted and what’s allowed to remain; without context, it’s extremely difficult for such workers to make informed decisions about images and footage.
Without comment or judgement, a few examples are woven into the film. The devout Catholic content moderator deletes any images of nudity, period. A moderator deleted the award-winning Vietnam War photo depicting a naked young girl running from a napalm attack – because it showed an unclothed minor. Another moderator deleted an image he believed to be a young child killed in a flood accident – when, in reality, it was an image of a toddler victim of the Syrian conflict, likely a drowned refugee.
Artist Khaled Barakeh had posted that photo and others on Facebook to highlight the atrocities in his native country – only for the post to be deleted after three days.
‘There was no explanation,’ he says in the film. ‘I tried, in every possible way, to be in touch, but it was impossible … Sometimes I try to think about it: Who decides what is the policy of Facebook? I would love to meet these people, know who they are – because by logic, their background, who they are, it affects how they think. That child didn’t die out of a normal situation.’
Such harsh realities should be shared on social media, he says, because ‘we should keep disturbing the world’.
The documentary hammers home the point that major companies, and not just the content moderators, should be facing major ethical and procedural questions.
‘A decision about what constitutes terror content is a really context-based decision,’ says Nicole Wong, a former policy maker at Google and Twitter – and former deputy US chief technology officer under President Obama – who was interviewed by the filmmakers.
‘It may show up on one platform as a true threat, and it may show up on another platform as news or as satire or criticism commenting on a piece of content … it’s hard to know whether something should be removed or not.’
The Cleaners interviews former executives from major companies such as Facebook and Google – but it proved virtually impossible to get current employees to talk, the filmmakers tell DailyMail.com.
‘From the beginning, we wanted to involve Silicon Valley executives in our film to discuss the issue of global networks and how to handle the moderation process of the platforms,’ Block tells DailyMail.com. ‘It would be so easy to blame or insult the social media for all their mistakes. But we never did. Rather, we wanted to encourage an open dialogue to think about the problem of the global network.
‘But getting in contact with the executives of the big social media companies wasn’t any easier, as there is this code of silence in Silicon Valley. None of the employees of Facebook or Google talks about what’s happening inside. We contacted several dozen people and never got an answer. We even sent the final cut of our film to all the major companies and asked for a public statement. There was no response either.
‘When we tried to promote the film in Germany and France via social media, we got the message after a short time that the trailer violated the community guidelines and was deleted. Sad but true! A lack of transparency is one of the main characteristics of the companies. That was very annoying and upsetting.
‘Eventually we got in touch with some of the former representatives of Facebook and Google/YouTube/Twitter. Many of the former representatives regret that they helped to build technology with so many terrible effects.’
The film also investigates the power struggle with individual governments, touching on censorship in countries such as Turkey and problems with the spread of information in places like Myanmar – where most people get all of their news from platforms such as Facebook and are arguably less able to distinguish fact from fiction. That reality feeds major problems such as marginalization and genocide, with The Cleaners pointing out that Facebook has been used to promote hatred of the ethnic minority Rohingya people in Myanmar.
But the documentary makers themselves also had to struggle with what they should and shouldn’t show in the film.
Riesewieck explains how they ‘decided only to show a tiny part of the disturbing content itself and, whenever possible, rather to show the reactions of the content moderators to the content. It is clearly not our aim to shock the audience.
‘But in cases like with the war in Syria we still need to ask ourselves what gives us the right not to be disturbed by the human victims of this terrible war in which so many countries worldwide are involved. We want to raise awareness that the so-called “healthy environment” on social media comes at a huge price: a very perfidious mental abuse of young low-wage workers in the developing world and the loss of a lot of ambivalent, challenging, eye-opening material.’
The filmmakers made a psychologist available to the content moderators they interviewed, and they hope that their final product prompts thought and discussion among social media users, platforms and governments alike. Their film focuses on an aspect of the Internet that is rarely discussed or investigated – or thought about by the billions using social media every day for information and communication.
‘What was really surprising was how many grey areas remain despite the guidelines the content moderators have to follow,’ Riesewieck tells DailyMail.com. ‘Content moderators told us they don’t dare to forward every second case in which they feel unsure about which guideline to apply or how to recognize the context of a picture or video, and therefore just use their “gut feelings”.
‘According to several different content moderators in Manila, there is a golden rule everybody has to learn in their three-to-five-day training for the job: “Don’t overthink!” Just think about the consequences for activists, civil rights organizations, minorities and others if their content is deleted from Facebook, YouTube or Twitter just because a young low-wage worker didn’t know about the context and purpose of the post and intuitively got it wrong.’
His co-director says: ‘Most of the billions of people in the world who use social media don’t have any clue who does the clean-up work for them. Most of us take it for granted we don’t have to see videos of beheadings, rape or torture in our newsfeeds. We don’t know there are thousands of young workers in the developing world who sacrifice themselves in order to keep us “safe” and provide us with a “healthy environment” when we go online.
‘At the same time, hundreds of thousands of newsworthy pictures and videos disappear every single day without our notice. A lot of individuals, as well as whole groups of our societies, are secretly silenced, with huge effects on our democracy.
‘We all should question the power social media has on all of us and become much more aware of the impact it has on today’s societies and all our minds. Instead of just passively consuming the net and allowing a few companies to dictate the rules for our digital public we should claim a much more participative, diverse net – in the hands of the public.’
‘We should become aware of how social media has hacked our desire to be loved by making us seek likes with every post we make. And by constantly asking us what we think about this and that, people have grown accustomed to the idea of constantly having opinions. All these triggers and nudges make us produce more and more content in order to keep up the level of positive feedback. The first step to limiting the addiction to these technologies and dealing with them in a more mindful and meaningful way is becoming aware of those effects on our minds and behavior.’