Activists welcomed the new policies, but said it would be up to Facebook to enforce them. And some experts were skeptical that they would really make a difference.
Siva Vaidhyanathan, a Facebook expert at the University of Virginia, said the company once again proved its inability to effectively eliminate dangerous fake news last week by failing to suppress posts from right-wing militia organizers urging supporters with guns to converge on Kenosha, Wisconsin.
“Facebook’s biggest problem has always been enforcement,” he said. “Even when it creates reasonable policies that seem well intentioned, it is defeated by its own scale. So I am not optimistic that this will be terribly effective.”
Concerns about civil unrest
Facebook and other social media companies are under scrutiny for how they handle disinformation, given issues with US President Donald Trump and other candidates posting false information and Russia’s continued attempts to interfere in American politics.
WATCH | The 5 big Canadian banks join the advertising boycott of Facebook:
Facebook has long been criticized for failing to verify the facts about political ads or for limiting how they can be targeted to small groups of people.
With the nation divided and election results potentially taking days or weeks to be finalized, Zuckerberg said there could be “an increased risk of civil unrest across the country.”
Civil rights groups said they directly invited Zuckerberg and other Facebook executives to make most of the changes announced Thursday.
“These are really important steps, but it will all depend on enforcement,” said Vanita Gupta, who was the head of the civil rights division of the Justice Department under President Barack Obama and now leads the Leadership Conference on Civil and Human Rights. “I think they’ll be tested on it soon.”
In July, Trump refused to publicly commit to accepting the results of the next election, as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden.
Trump has also made false claims that the increased use of mail-in voting due to the coronavirus pandemic would enable voter fraud.
This has raised concerns about the willingness of Trump and his supporters to respect the election results.
As part of the new measures, Facebook said it will ban politicians and campaigns from running new election ads in the week leading up to the election. However, they can still run existing ads and change their targeting. And many voters are expected to vote by mail well before election day.
Trump campaign spokeswoman Samantha Zager criticized the ban on new political ads, saying it would prevent Trump from defending himself on the platform in the last seven days of the presidential campaign.
Posts containing obvious misinformation about voting policies and the coronavirus pandemic will also be removed. In addition, users will only be able to forward articles to a maximum of five other people on Messenger, Facebook’s messaging app.
The company will also work with Reuters to provide official election results and make the information available both on its platform and with push notifications.
Internal dissent may have prompted action
After being caught off guard by Russia’s efforts to interfere in the 2016 U.S. presidential election, Facebook, Google, Twitter and other companies put in place safeguards to prevent it from happening again.
This includes the removal of posts, groups and accounts that engage in “coordinated inauthentic behavior” — defined by Facebook as “when groups of pages or people work together to mislead others about who they are or what they are doing” — and strengthened procedures for checking political advertisements.
Last year, Twitter completely banned political ads.
WATCH | Facebook Removes Trump Post Following COVID-19 False Claim:
Zuckerberg said Facebook has removed more than 100 networks around the world engaging in such interference in recent years.
“This week alone, we took down a network of 13 accounts and two pages that tried to mislead Americans and amplify division,” he said.
But experts and Facebook’s own employees say the measures are not enough to stop the spread of disinformation — including from politicians and in the form of doctored videos.
This internal dissent among Facebook employees may have helped influence Zuckerberg’s decision to do something, said Joan Donovan, a disinformation researcher at Harvard University.
“It’s a huge about-face for Facebook right now, because for so long they’ve been saying they don’t want to moderate political speech, and now at this point they’re drawing very sharp lines, and I think that’s because their business can’t survive another four-year scandal,” she said.
Facebook had previously been criticized for its advertising policy, which cited free speech as the reason politicians like Trump could post false information about voting.