Facebook is tightening its rules on content concerning the U.S. presidential election next month. (AFP/Getty Images)

Facebook Inc. teams have planned for the possibility of trying to calm election-related conflict in the U.S. by deploying internal tools designed for what it calls “at-risk” countries, according to people familiar with the matter.

The emergency measures include slowing the spread of viral content and lowering the bar for suppressing potentially inflammatory posts, the people said. Previously used in countries including Sri Lanka and Myanmar, they are part of a larger tool kit Facebook developed to prepare for the U.S. election.

Facebook executives have said they would deploy the tools only in dire circumstances, such as election-related violence, but that the company needs to be prepared for all possibilities, the people familiar with the planning said.

The potential moves include an across-the-board slowing of the spread of posts as they start to go viral and tweaking the news feed to change what types of content users see, the people said. The company could also lower the threshold for detecting the types of content its software views as dangerous.

Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures. But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said.

An expanded version of this report appears on WSJ.com.

