The Supreme Court ruled for Google and Twitter in a pair of closely watched liability cases Thursday, saying families of terrorism victims had not shown the companies helped foster attacks on their loved ones.

“Plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack,” Justice Clarence Thomas wrote in a unanimous decision in the Twitter case. The court adopted similar reasoning in the claim against Google.

The court’s narrowly focused rulings sidestepped requests to limit a law that protects social media platforms from lawsuits over content posted by their users, even if the platform’s algorithms promote videos that laud terrorist groups. That law, Section 230, has emerged as a lightning rod in the politically polarized debate over the future of online speech, as tech companies come under increased pressure to police offensive, harmful and violent posts on their platforms.

The court’s decision was a victory for tech companies and their surrogates, which have been running extensive lobbying and advocacy campaigns to defend Section 230 in Washington. Changes to the law, they say, could open a floodgate of litigation that would squash innovation and make it impossible for many popular websites to operate.

The claim against Google specifically focused on whether Section 230 protects recommendation algorithms. Tech companies said any novel interpretations of the legal shield could have wide-ranging effects on the technology that underlies almost every interaction people have online, from innocuous song suggestions on Spotify to prompts to watch videos about conspiracy theories on YouTube.

“Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” Google general counsel Halimah DeLaine Prado said in a statement. “We’ll continue our work to safeguard free expression online, combat harmful content, and support businesses and creators who benefit from the internet.”

Section 230 has been denounced by politicians from both parties who are increasingly concerned over the power that tech companies have over what posts and videos people see online.

Democrats, wary of the ways social media has been weaponized to spread falsehoods about elections and public health, want to change the provision to ensure that tech companies bear more responsibility for harmful and offensive content on their websites. Republicans are concerned that Section 230 protects companies from lawsuits over decisions to remove content or suspend accounts, especially since the companies took the historic step of suspending Donald Trump and individuals involved in the Jan. 6, 2021, attack on the U.S. Capitol. (Meta, YouTube and Twitter have reinstated the former president's accounts in recent months.)

It was clear at oral arguments that the justices were reluctant to make significant changes to the law. “We’re a court,” Justice Elena Kagan said at the time, adding that she and her colleagues “are not like the nine greatest experts on the internet.”

In the Twitter case, American relatives of Nawras Alassaf said the company failed to properly police its platform for Islamic State-related accounts in advance of a Jan. 1, 2017, attack at the Reina nightclub in Turkey that killed Alassaf and 38 others. In the Google case, the family of an exchange student killed in an Islamic State attack in Paris said Google’s YouTube should be liable for promoting content from the group.

The relatives in both cases based their lawsuits on the Anti-Terrorism Act, which imposes civil liability for assisting a terrorist attack. At issue was whether the company provided substantial assistance to the terrorist group.

But Thomas, writing in the Twitter case, said the link was too attenuated.

“As alleged by plaintiffs, defendants designed virtual platforms and knowingly failed to do ‘enough’ to remove ISIS-affiliated users and ISIS related content — out of hundreds of millions of users worldwide and an immense ocean of content — from their platforms,” he wrote. “Yet, plaintiffs have failed to allege that defendants intentionally provided any substantial aid to the Reina attack or otherwise consciously participated in the Reina attack - much less that defendants so pervasively and systemically assisted ISIS as to render them liable for every ISIS attack.”

The Google case specifically raised the issue of Section 230. But the short, unsigned decision said the justices “decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief.”

The statute was enacted in 1996, years before YouTube, Facebook and other social networks existed. It has proved a potent legal shield for the companies, which regularly use it to seek the dismissal of lawsuits. Both President Biden and former president Donald Trump have criticized Section 230, at times calling for it to be revoked.

Despite a flurry of congressional hearings, however, there’s been little consensus among lawmakers about how to change it.

Sen. John Cornyn (R-Texas) said that the court’s decisions on Thursday put the onus back on Congress to take action. “One reason [they declined to take it up] might be that they want the Congress to do our job,” Cornyn said in a brief interview. “It’s a complex issue and I hope we take them up on it.”

Sen. Ron Wyden (D-Ore.), who co-wrote Section 230 as a member of the House nearly three decades ago and filed a brief in its defense in these cases, said he appreciated the court’s “thoughtful rulings that even without Section 230, the plaintiffs would not have won their lawsuits.”

“Despite being unfairly maligned by political and corporate interests that have turned it into a punching bag for everything wrong with the internet, the law … remains vitally important to allowing users to speak online,” Wyden said in a statement.

Tech industry-funded groups also celebrated the court’s decision. Chamber of Progress, which receives funding from Meta, Google and other companies and filed a brief supporting Google in the case, called the ruling an “unambiguous victory for online speech and content moderation.”

“While the Court might once have had an appetite for reinterpreting decades of Internet law, it was clear from oral arguments that changing Section 230’s interpretation would create more issues than it would solve,” Jess Miers, a lawyer for the group, said in a statement.

Even some legal experts who submitted briefs in support of the Gonzalez family said they were pleased with the Supreme Court’s opinion. Mary Anne Franks, the president of the Cyber Civil Rights Initiative, had called for the court to interpret Section 230 more narrowly, arguing that lower courts had applied it incorrectly.

But she said that the justices’ opinion in the Twitter case shows that such litigation can be decided in the regular course of the law, taking “the wind out of the sails” of industry arguments that companies need a dedicated shield to protect them from “bad” lawsuits.

She also said the ruling provides a “green light” to Congress to move forward.

“It does clear up some of the underbrush when it comes to Section 230 reform,” she said. “If we were holding off to see what court might do, the court has answered.”

The cases are Twitter v. Taamneh and Gonzalez v. Google.

The Washington Post’s Cristiano Lima contributed to this report.

The Supreme Court building in Washington. (Jonathan Newton/The Washington Post)
