Facebook said Friday that it plans to suspend former president Donald Trump for two years following his comments inciting violence in the wake of the Capitol insurrection on Jan. 6.

The social media giant will only reinstate him "if the risk to public safety has receded," according to a blog post on the company's website.

Facebook's new policy refers specifically to the behavior of public figures during periods of heightened violence or unrest, according to the blog post. Facebook says it will now initiate a series of time-bound suspensions for violators, starting with a one-month suspension, and look to experts to help reevaluate the situation at the end of each period.

The announcement, part of a set of responses to the Facebook Oversight Board's recommendations in May regarding its suspension of the former president, is likely to have major implications for how the platform treats controversial public figures going forward.

In ruling on whether the social network should reinstate Trump's account, the largely independent, Facebook-funded Oversight Board said the company was correct in suspending him in the moment but lacked a clear rationale for keeping him off the platform indefinitely.

The company's announcement Friday is an attempt to clarify Trump's penalty and to make the procedures of the powerful social network, which 3.45 billion people around the world use each month, appear less arbitrary and opaque to the public. It is also the first major test of how a nongovernmental watchdog might act as a check on Facebook's power.

"We know that any penalty we apply — or choose not to apply — will be controversial," said Nick Clegg, Facebook's vice president of global affairs, in the post. "There are many people who believe it was not appropriate for a private company like Facebook to suspend an outgoing President from its platform, and many others who believe Mr. Trump should have immediately been banned for life. We know today's decision will be criticized by many people on opposing sides of the political divide — but our job is to make a decision in as proportionate, fair and transparent a way as possible, in keeping with the instruction given to us by the Oversight Board."

President Donald Trump speaks to a gathering of mayors from around the U.S. in the East Room of the White House, January 25, 2020. (STARS AND STRIPES)

Trump said in an emailed statement the ruling was an insult to the people who voted for him last year. Facebook "shouldn't be allowed to get away with this censoring and silencing, and ultimately, we will win," he added. "Our Country can't take this abuse anymore!"

Facebook on Friday fell short of saying it would comply with another board recommendation to publish a full public accounting of its role in fomenting the events that took place on Jan. 6. Instead, the company said it had created a partnership to exchange data about what took place on the platform during the election with 20 academic researchers, and would continue to cooperate with law enforcement. The researchers are planning to make their findings public.

The company also said it would no longer automatically give politicians a pass when they break the company's hate speech rules, a major reversal after years of criticism that it was too deferential to influential people during the Trump presidency.

Since the 2016 election, the company has applied a test to political speech that weighs the newsworthiness of the content against its propensity to cause harm. Now the company will scrap the presumption that politicians' speech is inherently newsworthy and will no longer weigh newsworthiness automatically in their favor.

But Facebook doesn't plan to end the newsworthiness exception entirely. In the cases where an exception is made, the company will now disclose it publicly — after years of such decisions being closely held. And it will also become more transparent about its strikes system for people who violate its rules, committing to telling users how many strikes they have along with the consequences.

Facebook's critics said the two-year ban didn't go far enough, and noted that the timing would allow Trump to come back onto the platform before the 2024 election. That would allow him not only to rebuild a passionate audience, but also to use the service for the fundraising, list-building and event promotion that are key to political campaigns.

"He will be back just in time to load a hundred million into Facebook ads," said Joan Donovan, research director of the Technology and Social Change Research Project at the Shorenstein Center on Media, Politics and Public Policy at Harvard's Kennedy School. "Even if he doesn't run, he is a huge bank for the GOP so he will be a shot caller."

James Steyer, CEO and founder of Common Sense, an advocacy group focused on children and technology and a frequent Facebook critic, said in an email that the timeline extends only just past the 2022 election cycle.

It "does not protect Americans from his interference in the next presidential election, which is why Facebook should, and can, permanently ban Trump," he said. "He incited a violent attack on our Capitol that resulted in five deaths. There is no justification to ever reinstate Trump on Facebook. Period."

Soon after Jan. 6, Facebook turned its decision on Trump — which it said would be enforced indefinitely — over to the Oversight Board to decide whether the company made the right call.

After four months of deliberations, the Oversight Board unexpectedly kicked the Trump decision back to the social network, giving it six months to decide whether to ban Trump permanently or reinstate him. It issued 19 recommendations, including that the company publish a report about its role in the Jan. 6 riot and make changes to its newsworthiness exception. Facebook plans to fully implement 15 of them.

Publicly, Facebook executives have deflected blame for the events at the Capitol onto other companies. The Washington Post and others have reported that rioters used Facebook to help organize.

The Post reported last year that the newsworthiness exception was first created in response to Trump's inflammatory remarks about Muslims during his candidacy. Since then, the company has maintained that it rarely used the exception and has acknowledged using it only six times. Those incidents were all outside the United States and include political speech in Hungary, Vietnam and Italy. Facebook has not disclosed the names of the political figures who were given exceptions, despite repeated requests for the information.

In practice, however, Facebook has appeared to give politicians and political leaders a pass in many more instances. In 2019, CEO Mark Zuckerberg said the company would not apply its fact-checking to political ads, for example.

And throughout his presidency, Trump repeatedly flooded the platform with misinformation. He promoted baseless claims of voter fraud and repeatedly stated without evidence that the 2020 election was stolen. Facebook chose to append a generic label to most of that content rather than ban it.

In its responses to the Oversight Board, Facebook also acknowledged that it had inaccurately told the board that it had never applied the newsworthiness exception to Trump. Facebook said that in August 2019, the company issued a newsworthiness exception for Trump when he insulted a man at a New Hampshire rally, saying, "That guy's got a serious weight problem."

Facebook denied the Post's earlier report, which cited documents and several sources, that the exception was initially crafted in response to Trump's behavior.

The strikes system is an even more opaque area of Facebook's policies and practices than the newsworthiness exception. Users can be censored or demoted after a certain number of strikes for breaking rules, but the company has said it does not want to share its policing strategies for fear that doing so would let people exploit loopholes.

The result, critics said, was an arbitrary system. People whose content was removed often did not know which rule they had broken, and seemingly routine violators sometimes appeared to be treated with kid gloves.

Facebook's response to the Oversight Board is being watched as a key test for the possibility of self-regulation by powerful social media companies. Facebook and other Silicon Valley giants are facing a wave of potential new regulation over issues such as privacy and algorithmic transparency all over the world, as well as a major antitrust lawsuit in the United States.

If the Facebook-created board is viewed as a legitimate check on the company's power, experts have said it could become a model for countries looking at ways to regulate how social media companies police content on their platforms, or for other companies in a similar position. But it also could make the need for regulation seem less urgent because a solution already exists, they said.

In its responses, Facebook noted that the Oversight Board should not be seen as a "replacement" for regulation.

In 2018, Zuckerberg — under immense political pressure over the company's content moderation practices — presented the idea for an independent body that would oversee controversial decisions made by the social network. The idea was to put a check on the social network's power, which was being roundly criticized by government officials, academics and the public over allowing the spread of Russian disinformation, inflammatory political discourse and hate speech.

Facebook funded the Oversight Board through an independent trust and selects its members but has given it the power to make binding decisions on content that the board determines has been wrongly removed or kept up. The 20-member board also can issue voluntary policy recommendations. Members include a Nobel laureate, free-speech experts, and a former Danish prime minister.

Trump also has been suspended indefinitely from YouTube, the gaming platform Twitch, Snapchat and other platforms, and has been banned from Twitter over the same set of comments from Jan. 6.

Trump built one of the world's most powerful and passionate online audiences during his tenure as president. But researchers have shown that he has not been able to garner the same level of online attention since he was taken off mainstream platforms. He recently turned to using his own website to put out statements, but his team shut it down this week.

The public is divided about whether Trump should be reinstated across social media, according to the nonpartisan Pew Research Center. In April, Pew published a report finding that 50 percent of Americans think Trump should not be permanently banned while 49 percent believe in a permanent ban.

In response to the decision, President Joe Biden's press secretary, Jen Psaki, said the administration continues to believe that "every platform, whether it's Facebook, Twitter and any other platform that is disseminating information to millions of Americans, has a responsibility to crack down on disinformation, to crack down on false information, whether it's about the election or even about the vaccine."

Regarding Trump's demonstrated use of social media, she added that it "feels pretty unlikely that the zebra is going to change his stripes over the next two years."

The Washington Post's Gerrit De Vynck, Heather Kelly, Rachel Lerman and Donna Cassata contributed reporting.
