While Facebook may seem to be doing the right thing by hiring more moderators to monitor its live-streams, it needs to make sure those moderators are treated right. (Courtesy photo)

In the age of live-streamed murders and rapes, who moderates the moderators?

The “Facebook Live Killer” is dead, but he lives on. Although Steve Stephens did not actually live-stream his cold-blooded murder in Cleveland on Facebook Live (he uploaded the video to Facebook after the fact), the nickname, and his legacy, stuck. The imagery of a murderer sharing his horrific act in real time with his friends and family captured the national imagination, as Stephens evaded a national manhunt for two days before taking his own life.

It certainly caught Facebook’s attention. Last week, Facebook announced it will hire 3,000 moderators to sift through Facebook Live videos and flag any that violate the company’s standards before users see the horrors themselves. The social network already employs 4,500 moderators around the globe.

The announcement was Facebook’s most proactive step yet in curbing the horrors broadcast on Facebook Live, and it was a long time coming. While the worst Facebook Live broadcasts make up only the tiniest fraction of the ever-popular live-stream feature, the details put together read like a hellish house of horrors: a Thai man killing his 11-month-old daughter; a gang rape in Sweden; four people torturing a mentally disabled man in Chicago; and an ISIS terrorist stabbing a Parisian policeman to death.

Given that Facebook will not preemptively limit who can use Facebook Live, or when and where it can be used, beefing up the moderation team may look like the best available solution. But this solution comes with a heavy psychological toll on the moderators, and Facebook has so far declined to discuss that toll openly.

Content moderation is a booming online business that is largely outsourced to developing countries. Whether it be for Facebook Live broadcasts or YouTube videos, moderators filter through mountains of gruesome videos. Artificial intelligence — while getting better — has yet to reach the sophistication required for the job, so humans do the dirty work.

Humans are not designed to absorb shocking images so quickly and so frequently. Many moderators leave their jobs with post-traumatic stress disorder, intense paranoia and alcohol dependency, having been paid poor wages and offered little to no mental health support. In January, two Microsoft employees on the company’s “online safety team” sued Microsoft, alleging that being forced to watch videos of sexual assault and brutality left them with severe PTSD.

In 2014, tech journalist Adrian Chen wrote for Wired about the working conditions of content moderators in the Philippines, an increasingly popular outsourcing destination for tech companies because of its low wages and relatively close cultural ties to the United States. Paid as little as $300 a month, the workers stare into the darkest, most sadistic moments of humanity for several hours a day. Most last only a few months before burning out.

Though these people are the unsung heroes who keep our internet palatable, at great mental sacrifice, the general public knows almost nothing about them. Many moderators must sign non-disclosure agreements forbidding them from discussing their work. Tech companies have not disclosed how many content moderators exist (one estimate puts the number well over 100,000 worldwide), what their working conditions are or what help they are given to cope with witnessing bestiality, pedophilia, torture and other unspeakable acts on a regular basis.

Most people choose to believe robots do the work, but that is simply not true. “The public needs to understand that this work is not being done by a computer,” the attorney for the Microsoft content moderators told The Guardian. The free, open internet comes with a heavy price, and the first step in mitigating that price is for you and me to acknowledge these people exist.

The second step is for tech companies to be transparent and to treat these digital foot soldiers with the respect and care they need. After announcing the expansion of its Facebook Live moderation team, Facebook declined to comment on whether the jobs will be outsourced, what the pay and benefits will be, and whether the moderators will be full-time employees or independent contractors.

Facebook founder Mark Zuckerberg emphasized that the expansion was to keep the Facebook community “safe.” The safety of the moderators should be a high priority as well.

I am often wary of artificial intelligence and algorithms taking human jobs, but content moderation is an exception, because the job itself is inhumane. The technology is not yet capable of fully replacing human moderators, and even if it gets there, we may still need human eyes as the final arbiter.

But until we reach that point, we need a better system to protect the moderators from slipping into the same void where the demented, attention-hungry killers and rapists already reside.

The Nexus covers the intersection of technology, business and culture in San Francisco and beyond. Write to Seung at seungylee14@gmail.com.
