SAN FRANCISCO — Facebook is planning to enact new measures to make it more difficult for election misinformation to spread virally across its platform, two people with knowledge of the matter said Thursday, as the outcome of the presidential race remained uncertain.
Facebook plans to add more “friction” — such as an additional click or two — before people can share posts and other content, said the people, who requested anonymity because they were not authorized to speak publicly. The company will also demote content on the News Feed if it contains election-related misinformation, making it less visible, and limit the distribution of election-related Facebook Live streams, the people said.
The measures, which could be rolled out as soon as Thursday, are a response to heightened strife and discord on Facebook after Tuesday’s election, these people said. They said users and Facebook groups had increasingly tried to coordinate potentially violent actions over issues such as voter fraud. President Trump has falsely claimed on social media and in remarks from the White House over the past few days that the election was being “stolen” from him, even as a final result remained unclear.
The changes would be some of the most significant steps taken by Facebook, which has in the past tried to make sharing information as easy as possible so that it can increase engagement on its site. The moves would most likely be temporary, said the people with knowledge of them, and were designed to cool down angry Americans who were clashing on the network.
“As vote counting continues, we are seeing more reports of inaccurate claims about the election,” Facebook said in a statement. As a result, it said, it is “taking additional temporary steps.”
Facebook has been more proactive about clamping down on misinformation in recent months, even as its chief executive, Mark Zuckerberg, has said he does not want to be the arbiter of truth. The company spent months preparing for the election, running through dozens of scenarios for what might happen on Nov. 3 and afterward in case political candidates or others tried to use the platform to delegitimize the results. The new measures were part of that planning, the people said.
This week, Facebook also suspended political advertising for an indefinite period and introduced notifications at the top of the News Feed saying that no winner had been called in the election.
Other social media companies have also made changes to slow the flow of information on their networks and to highlight accurate information on their sites. Twitter, which Mr. Trump uses as a megaphone, had labeled 38 percent of his 29 tweets and retweets since early Tuesday with warnings that he had made misleading claims about the electoral process, according to a tally by The New York Times. Last month, Twitter also made it harder for people to retweet posts or to share links to articles they had not yet read.
TikTok said it was broadening its fact-checking partnerships for election disinformation and updating its policy to make clearer what types of content are not allowed on the app. YouTube has used its home page to show people accurate information about the election.
Republicans and Democrats have long criticized Facebook and Mr. Zuckerberg for their stance on misinformation. Mr. Trump and other Republicans have accused Facebook of suppressing and censoring conservative speech, while Democrats have railed against the tech companies for not doing enough to clean up the glut of toxic online misinformation.
On Thursday, as part of a heightened campaign against election-related disinformation and calls to violence, the company also took down a new Facebook group, “Stop the Steal,” which had more than 320,000 members.
Facebook said that the group had been “organized around the delegitimization of the election process,” and that some of its members had made calls for real-world violence.
Some of Facebook’s new measures have precedents. In June, the company began adding context to posts about the coronavirus and highlighting accurate information about Covid-19 from health authorities to keep falsehoods from spreading. WhatsApp and Messenger, Facebook’s two messaging apps, have capped the number of times a message can be forwarded and have limited reshares of private messages to a maximum of five people at a time.