About 95% of hate speech on Facebook is caught by algorithms before anyone can report it, Facebook said in its latest Community Standards Enforcement Report. The remaining 5% of the roughly 22 million posts flagged in the past quarter were reported by users.
The report also tracks a new hate speech metric: prevalence. To measure prevalence, Facebook takes a sample of content and works out how often the thing it is measuring – in this case, hate speech – is seen, expressed as a percentage of viewed content. Between July and September of this year, the figure was between 0.10% and 0.11%, or roughly 10 to 11 views out of every 10,000.
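To make that arithmetic concrete, here is a minimal sketch of how a prevalence figure like this works out; the function name and the sample counts are illustrative assumptions, not Facebook's published methodology.

```python
# Minimal sketch of the prevalence arithmetic described above.
# The sample sizes and view counts are illustrative assumptions,
# not figures from Facebook's actual sampling methodology.

def prevalence(violating_views: int, sampled_views: int) -> float:
    """Fraction of sampled content views that contained violating content."""
    return violating_views / sampled_views

# Roughly 10 to 11 hate-speech views per 10,000 sampled views
low = prevalence(10, 10_000)   # 0.0010
high = prevalence(11, 10_000)  # 0.0011
print(f"{low:.2%} to {high:.2%}")  # prints "0.10% to 0.11%"
```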
Facebook also noted – both in its press release and in a call with the press – that while its in-house AI is making progress in several categories of content enforcement, COVID-19 continues to affect its ability to moderate content.
“As the COVID-19 pandemic continues to disrupt our content review workforce, we are seeing some enforcement actions returning to pre-pandemic levels,” the company said. “Even with reduced review capacity, we still prioritize the most sensitive content for review, which includes areas like suicide, self-harm, and child nudity.”
The human workforce
Humans remain critical, Facebook vice president of integrity Guy Rosen said on a call with reporters. “People are an important part of the content enforcement equation,” he said. “These are incredibly important workers who do an incredibly important part of the job.”
Full-time employees who work for Facebook itself have been told to work from home until July 2021, and perhaps permanently.
During the call with reporters, Rosen noted that Facebook employees who must come to work in person, such as those managing essential functions in data centers, are brought in under strict safety precautions, with personal protective equipment such as hand sanitizer made available.
Moderation, Rosen said, is one of those jobs that can’t always be done from home. Some content is simply too sensitive to review outside a dedicated workspace, where other household members could see it, he explained, adding that some Facebook content moderators are being brought back to offices “to ensure that this balance between people and AI works in those areas” that require human judgment.
However, the majority of Facebook’s content moderators don’t work for Facebook. They work for third-party contracting firms around the world, often with woefully insufficient support to do their jobs. Reporters from The Guardian, The Verge, The Washington Post, and BuzzFeed News, among others, have spoken with these contractors, who describe relentless performance expectations and widespread workplace trauma. Earlier this year, Facebook agreed to a $52 million settlement in a class-action lawsuit brought by former content moderators who alleged the work gave them “debilitating” post-traumatic stress disorder.
That was all before COVID-19 spread around the world. In the face of the pandemic, the situation looks even worse. More than 200 moderators who are being told to return to the office have signed an open letter accusing Facebook of “risking the lives of moderators unnecessarily” without even offering hazard pay to the workers being sent back.
“Now, on top of psychologically toxic work, keeping the job means walking into a hot zone,” the letter reads. “In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.”
“This raises a difficult question,” the letter adds. “If our work is so essential to Facebook’s business that you will ask us to risk our lives for the sake of Facebook’s community – and for profit – aren’t we, in fact, the heart of your business?”
Scrutiny grows
Meanwhile, state and federal scrutiny of Facebook is only growing. This week, company CEO Mark Zuckerberg testified before the Senate for the second time in just three weeks. House members have also complained that Facebook has failed to moderate content properly or safely amid rampant election-related misinformation.
Other regulators are likely to come for Facebook – and soon. The many antitrust investigations launched in 2019 are drawing to a close, according to media reports. The Federal Trade Commission reportedly plans to file suit within the next two weeks, and a coalition of nearly 40 states, led by New York Attorney General Letitia James, is likely to follow in December. These suits will probably argue that Facebook unfairly stifles competition through its acquisition and data strategies, and they may ultimately seek to force the company to divest Instagram and WhatsApp.