'Without Our Work, Facebook Is Unusable': Content Moderators Demand Safer Offices

Facebook's content moderators say the company is putting their health at risk by pressuring them to return to the office.
Josh Edelson/AFP via Getty Images

Updated at 3:12 p.m. ET

More than 200 Facebook workers say the social media company is making content moderators return to the office during the pandemic because the company's attempt to rely more heavily on automated systems has "failed."

The workers made the claim in an open letter Wednesday to Facebook CEO Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg as well as the heads of two companies, Accenture and CPL, to which Facebook subcontracts content moderation.

"Without our work, Facebook is unusable," they wrote. "Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can."

The letter was organized by Foxglove, a law firm representing content moderators, and was signed by both contractors and full-time Facebook employees.

In August, Facebook told employees they could keep working from home until July 2021. The open letter accused the company of putting content moderators' lives at risk by pressuring them back into offices. It follows a report from The Intercept that a worker at a Facebook content moderation facility in Austin, Texas, tested positive for the coronavirus just days after returning to the office in October.

The workers called on the company and its outsourcing partners to improve safety and working conditions. Their list of demands includes hazard pay for moderators who must return to the office. They also want Facebook to hire all of its moderators directly, let those who live with high-risk people work from home indefinitely and offer better health care and mental health support. And they said Facebook should "maximize" the amount of work that can be performed at home.

Facebook relies on more than 15,000 people around the world to review which posts break its rules and must come down, and which can stay up — ranging from spam and nudity to hate speech and violence. Most moderators are not Facebook employees but contractors who work for third parties, including Accenture and CPL.

Facebook spokesman Drew Pusateri said in a statement that the majority of content moderators continue to work from home during the pandemic and that "Facebook has exceeded health guidance on keeping facilities safe for any in-office work."

"We appreciate the valuable work content reviewers do and we prioritize their health and safety," he said. Moderators have access to health care and "confidential wellbeing resources" from their first day on the job, he added.

Accenture said in a statement: "We are gradually inviting our people to return to offices, but only where there is a critical need to do so and only when we are comfortable that we have put the right safety measures in place, following local ordinances. These include vastly reduced building occupancy, extensive social distancing and masks, daily office cleaning, individual transportation and other measures." The company said it is also making alternative arrangements for workers who are vulnerable or who live with someone who is.

CPL did not respond to a request for comment.

As the coronavirus began spreading earlier this year, Facebook, like other tech companies, sent most of its workers home, including contractors.

The company said that decision meant it would increase its use of automated systems to flag violating content — and Zuckerberg acknowledged those systems would not always get things right. "We may be a little less effective in the near term while we're adjusting to this," he told reporters in March.

Workers who signed the open letter said the pandemic had revealed the automated systems' shortcomings. "Important speech got swept into the maw of the Facebook filter — and risky content, like self-harm, stayed up," they wrote. "Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there."

Facebook has invested significantly in artificial intelligence and machine learning to review content, and the company has made no secret of the fact that it wants automated systems to take on more work over time.

It is under pressure to keep its platform free of harmful and misleading content — and has faced extensive criticism over how it treats contractors, who say they perform a grueling job that takes a big toll on their mental health.

"Facebook needs us," the workers wrote in Wednesday's letter. "It is time that you acknowledged this and valued our work."

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2020 NPR. To see more, visit https://www.npr.org.