
Have you ever wondered what happens when an offensive or disturbing video posted on social media is suddenly taken down? It doesn’t happen often, so it’s easy to overlook the people who have to judge what happens to this content: the content moderators. But the reality of what goes on behind the scenes is almost as disturbing as the images themselves.

Facebook employs 15,000 people to sift through these videos and images daily. They are expected to look at around 25,000 images per 8-10 hour shift, deciding whether to ignore or delete each one that appears in front of them. Some of them are only 18 years old and bound by contract, which makes it hard to quit the job once they have been employed.

So where are these people?

There’s a facility in Arizona, in the United States, where employees of Cognizant spend their days looking through thousands of images and videos. The content arrives in no particular order and can be anything from terrorist propaganda, to child pornography, to suicide live streams.

The Verge article “The Trauma Floor” investigated just how damaging this job can be. The journalist Casey Newton followed new starters at the company and found that their training consisted of being shown traumatic videos and deciding whether each one needed to be deleted or was appropriate to keep up. The first video showed a man being stabbed to death; the next showed a drone firing shots and killing people on the ground. One trainee ran out of the room and had a panic attack. She received a hug from a senior colleague and was reassured that she could do the job, and that once fully trained she would have more control over the videos.

By more control, they meant that employees can pause a video or watch it with the volume down. They still have to watch the content, though, and pausing too often eats into their viewing time, something staff are pulled up on if team leaders see they aren’t getting through as much as they should.


The trainee explained that she felt like she had no choice: a fresh graduate trying to get a job, and this one paying more than any retail job she had applied for. She makes $15 an hour, $4 more than the minimum wage in Arizona.

The employees are given two 15-minute breaks and a lunch break, as well as a nine-minute “wellness” break each day, when they are encouraged to rest their eyes from the screens. However, even this time is micromanaged; it was reported that Muslim employees were stopped from using it to pray.

It has also been reported that staff are struggling to cope with the job so badly that they resort to taking drugs, smoking weed on their breaks. Employees have even been caught having sex in the toilets, in what they call “trauma bonding.”

It’s not just America where this work happens. In the Philippines the issue is even bigger, as the country is one of the largest and fastest-growing hubs for this kind of work. The training and management are very similar, and employees feel trapped by strict contracts and the low living wage in the Philippines. They feel it is the only option they have.

The BBC Four film “The Internet’s Dirtiest Secrets: The Cleaners” revealed how one employee there witnessed someone taking their own life on a Facebook live stream. He said he thought the man was joking and let the stream continue, because the rules say a stream cannot be ended unless someone actually commits suicide; otherwise the employee could be blamed for cutting the content short for no reason.

The aftercare employees receive ends as soon as they quit or are let go from the company. Another employee spoke about how she feels like a totally different person since starting the job and that she sees the world differently, saying that we are “living in a dangerous time.”

It seems like something from an alternate reality, maybe an episode of Black Mirror, but sadly it isn’t. There are real people behind the screens, viewing damaging images that are leaving them with PTSD symptoms.

It was also reported that some of the counsellors at the company believed that exposure to these types of images could even be a positive, describing it as “post-traumatic growth”, even as one ex-employee felt the need to sleep with a gun by his bed as a result of constant exposure to footage of terrorist attacks.

Last year, Facebook promised to make this kind of work more transparent to new employees, to be more honest about the content they are expected to see, and to offer aftercare support once employees no longer work for the company.

But these people are putting their mental health at risk to protect us from the horrors of the dark side of social media, raising questions about how far this will go. How many people will these companies continue to employ? With a platform like Facebook constantly growing with new members, how can they cope with monitoring over two billion people’s content, 24/7?