Employees at Mark Zuckerberg’s Facebook took to a chat room to question policies that helped enable the Jan. 6 insurrection on the Capitol building. (Drew Angerer/Getty Images/TNS)

WASHINGTON (Tribune News Service) — As rioters breached barricades and bludgeoned police with flagpoles before storming the U.S. Capitol on Jan. 6, some employees at Facebook Inc. took to an internal discussion board to express shock and outrage.

Many of the posts were imbued with a dawning sense that they and their employer — whose platforms for weeks had spread content questioning the legitimacy of the election — bore part of the blame. “I’m struggling to match my value to my employment here,” said an employee as the violence of the afternoon continued. “I came here hoping to affect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.”

The remarks, directed at Chief Technology Officer Mike Schroepfer, were included in a package of disclosures provided to Congress in redacted form by lawyers representing Frances Haugen, a former Facebook product manager. The documents, obtained by a consortium of news organizations including Bloomberg, provide a unique window into a Facebook few outsiders ever see: a stunningly profitable technology company showing signs of sagging morale and internal strife.

“All due respect, but haven’t we had enough time to figure out how to manage discourse without enabling violence?” said one employee, according to a copy of posts by staff from Jan. 6. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control.” The remark was previously quoted in an article in The Wall Street Journal.

In a statement, Facebook said it ran the largest voter information campaign in U.S. history and took numerous steps to limit content that sought to delegitimize the election, including suspending Donald Trump’s account and removing content that violated company policies.

“In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures,” according to a Facebook representative.

Facebook spent more than two years preparing for the 2020 election with more than 40 teams across the company and removed more than 5 billion fake accounts that year, according to a company statement. In addition, from March to Election Day, the company removed more than 265,000 pieces of Facebook and Instagram content in the U.S. for violating voter interference policies. It also deployed measures before and after Election Day to keep potentially harmful content from spreading before content reviewers could assess it, which the company likened to shutting down an entire town’s roads and highways to respond to a temporary threat, according to the statement.

Over the 24 hours that followed the insurrection, employees — whose names are redacted in the documents — used the internal version of Facebook to debate the company’s performance in frank terms. Among the criticisms: that Facebook failed to aggressively act against “Stop the Steal” groups that coalesced around the false notion that former President Donald Trump had won the election. And that the company’s leaders repeatedly let down rank-and-file employees fighting to more aggressively curtail misinformation and other harms.

Some of those observations were later backed by research carried out by Facebook earlier this year, according to the documents. Enforcement policies that focused on individual posts weren’t applied quickly enough to a coordinated movement of users and groups spreading quickly across the platform, a 2021 analysis of company failures around Stop the Steal found.

“It wasn’t until later that it became clear just how much of a focus point the catchphrase would be, and that they would serve as a rallying point around which a movement of violent election delegitimization could coalesce,” the analysis concluded.

Stop the Steal and other election misinformation also spread on social media sites besides Facebook.

The company’s look back at Jan. 6 emphasized that few people, including crisis managers at Facebook, had expected the day’s events to explode in violence. (In fact, the level of violence surprised nearly everyone, including government officials and law enforcement.)

But it also listed a number of failures of policy and technology, as well as ways that Stop the Steal organizers gamed Facebook’s enforcement mechanisms.

On Nov. 5, the company shut down the very first Stop the Steal Facebook group after multiple violations for posts inciting violence, and began tracking its replacements.

The pages that emerged in its wake had some of the most rapid growth for Facebook groups in history. Organizers appeared to be consciously avoiding known enforcement triggers, internal documents say, by carefully using language and, at least in one case, deploying the Facebook Stories feature, with posts that disappear after 24 hours.

The disclosures also highlight other decisions made by Facebook before Jan. 6 that, in the view of some employees, fueled the rancor that spilled over in the insurrection. Facebook had set up safeguards that were aimed at combating misinformation and other forms of platform abuse in the run-up to the 2020 election, but it dismantled many of them by mid-December, the documents indicate. Some measures, like a war room stood up for the November election, remained in place until after the inauguration.

And in early December, Facebook disbanded a 300-person squad known as Civic Integrity, which had the job of monitoring misuse of the platform around elections. Those experts were dispersed elsewhere even as efforts to delegitimize the election intensified.

Meanwhile, Stop the Steal groups were “amplifying and normalizing misinformation and violent hate in a way that delegitimized a free and fair election,” Facebook’s internal analysis concluded. “Early focus on individual violations made us miss the harm in the broader network.”

Haugen’s cache of documents also suggests that Facebook failed to apply the most powerful levers it uses to slow the spread of harmful content — what it calls “break-the-glass” protocols — until after the violence began. Rioters breached the final police barricades and began entering the Capitol just after 2 p.m. Eastern time on Jan. 6.

At emergency meetings later that day, Facebook managers approved several additional measures, including restricting certain videos from rapidly going viral and “demoting” posts that incite violence, a tweak to the algorithm that keeps them from spreading quickly between users, according to one document, called “Capitol Protest BTG Response.”

By that evening, Facebook announced that it would ban then-President Trump from the platform for 24 hours, an unprecedented step for the company, and one which it extended the following day to Jan. 20. Several employees said that the response was too little, too late.

“I do acknowledge that a 24-hour ban is a pretty big deal, but that’s only because up until now, our response has been completely tepid,” one person wrote.

A few hours later, another employee pointed the group to a fateful policy exception the company made several years earlier.

“Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the U.S. We determined that it violated our policies, and yet we explicitly overrode the policy and didn’t take the video down,” the employee said on the chat. “There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy and self-governance.

“Would it have made a difference in the end?” the person asked. “We can never know, but history will not judge us kindly.”

©2021 Bloomberg L.P. Visit bloomberg.com.

Distributed by Tribune Content Agency, LLC.
