"They are deliberately creating this toxic atmosphere," Facebook's researchers wrote of the administrators of the " Kayleigh McEnany Fan Club," named after -- but not associated with -- the Trump administration press secretary. The researchers said the Group largely functioned as a distribution system for "low-quality, highly divisive, likely misinformative news content" from a handful of partisan publishers.

The comments in the Group included death threats against Black Lives Matter activists and members of Congress, researchers said, and Facebook had flagged it 174 times for misinformation within three months. Among the comments under one post about U.S. Rep. Ilhan Omar (D., Minn.) cited in the presentation were:

"I hope someone shoots her but she lives and is paralyzed."

"Maybe a bullet would do her good."

"Bring back public hangings."

The Journal contacted five club administrators -- most of whom appear to be affiliated with for-profit right-wing digital-media sites -- through Facebook and the contacts listed by those outlets when available but didn't receive a response.

Facebook's public-policy team balked at taking action against prominent conservative Groups, and managers elsewhere in the company questioned proponents of the proposed restrictions about the effects on growth, according to internal documents and the people familiar with the decisions. To try to overcome resistance to further crackdowns, Facebook integrity staffers began sending daily analyses to Mr. Rosen and other senior executives, showing how Facebook's methods for policing major Groups were failing to catch obvious violations of the company's community standards.

Facebook's platform-wide rules forbid hate speech and speech that incites violence. The company advises Group moderators on how to maintain community rules. But rather than helping foster a civil tone, leaders of major politics-focused Groups encouraged members to break Facebook's rules, threatened to ban anyone who reported such content and directed users to post their most outrageous material as comments on other posts -- a tactic meant to confuse Facebook's automated moderation systems.

Facebook declined to discuss the specifics of its handling of the researchers' findings.

On Oct. 20, the Mozilla Foundation, which makes the Firefox browser and says it promotes a healthy internet, ran a full-page ad in the Washington Post calling for Facebook to disable its algorithmic Group recommendation systems. "Countless experts -- and even some of your own employees -- have revealed how these features can amplify disinformation," said the letter, which also urged Twitter Inc. CEO Jack Dorsey to suspend the company's algorithmically driven Trending Topics feature.

Twitter didn't suspend the feature, though it has sought to add more context and has manually intervened to remove incendiary trends such as "Hang Mike Pence." A Twitter spokesman said the company had moved quickly to take down calls for Mr. Pence's death and begun adding factual context to its trending topics feature.

Ashley Boyd, Mozilla's vice president for advocacy and engagement, said she had discussed the foundation's concerns with employees from Facebook's public policy, product development and communications staff before the letter's publication. "They didn't say we were crazy," she said. "They said, 'This is very similar to conversations we're having internally.' "

Even before Mozilla published its letter, Facebook had temporarily stopped making algorithmic recommendations to Groups dedicated to political or civic issues, a Facebook spokesman said.

Facebook also halted showing previews of Group content to prospective new members, capped the number of invitations members could send each day, and began freezing comment threads when they repeatedly triggered automated filters for hate speech and violence, internal documents show. Mr. Rosen confirmed the pre-election moves.

The new rules, which Facebook designed to be temporary and largely didn't announce publicly, couldn't contain the viral growth of some Groups after the election. Most notably, a Group called "Stop the Steal" that was organizing election protests around the country grew to 361,000 members in less than 24 hours without any promotion from Facebook's algorithms. When Facebook took it down Nov. 5, the company said it "was organized around the delegitimization of the election process, and we saw worrying calls for violence from some members of the group."

In response to rising fears of political bloodshed, Mr. Zuckerberg that day approved additional "break glass" emergency measures, including further restrictions on Groups with a history of bad behavior, according to internal documents and people familiar with the decisions.

After violence related to ballot counting failed to materialize in the following days, Facebook began to loosen some of the restrictions on Groups, according to internal documents. It reminded employees and reporters that the measures had always been temporary.

On Jan. 6, after the rally organized by Amy and Kylie Kremer -- the mother-daughter creators of the original "Stop the Steal" Group, which Facebook closed Nov. 5 -- a group of Trump supporters stormed the Capitol. The Kremers didn't respond to requests for comment. In the wake of the riot, Facebook deleted other Groups that had cropped up using "Stop the Steal" in their names and espoused the same goal.

Mr. Zuckerberg approved instituting the break-glass measures Facebook had recently lifted and added more restrictions on Groups, internal documents show. In a public blog post, he blamed President Trump for trying to use Facebook "to incite violent insurrection." Facebook required administrators to approve more posts in Groups with histories of violating its rules -- a technique Facebook's integrity staff had recommended in August but that the company hadn't fully implemented.

Facebook Chief Operating Officer Sheryl Sandberg publicly cast blame for the riot's organization on smaller social-media platforms, even as Facebook continued to rein in Groups; the company has dissolved 40 of the top 100 Groups listed in the August presentation. Ms. Sandberg declined to comment.

Beyond the permanent ban on algorithmic civic and health Group recommendations, Facebook will prevent Groups of any sort from being promoted within their first 21 days of existence. Other temporary measures -- such as the freezing of comment threads classified as turning vile and daily limits on Group invitations -- remain in place and may become permanent.

Facebook itself restricted political discussions on its own internal message boards last year amid debate over the platform and the U.S. presidential election, handing oversight to professional moderators, according to two people familiar with the decision.

"Growing fast isn't in and of itself an indication of something good or bad," Mr. Rosen said. When it comes to managing the risks of Facebook products, he said, "the balance moves all the time."

Write to Jeff Horwitz at Jeff.Horwitz@wsj.com
