Clumsy or even careless content moderation is no crime, the Supreme Court held in a unanimous opinion posted Thursday—the latest in a long line of court rulings to uphold Section 230 of the Communications Decency Act.
That 1996 statute says online forums generally aren’t liable for the content their users post—the users are—and can police their spaces as aggressively as they want without becoming responsible for what shows up there.
Thursday’s case, Twitter, Inc. v. Taamneh, could have cut a large hole in “CDA 230” protections. It alleged that by not evicting the ISIS terrorist cult and then automatically recommending its content, Twitter, Facebook, and Google provided material help that led to an ISIS follower murdering 39 people on Jan. 1, 2017, at an Istanbul nightclub called Reina. Under the 2016 Justice Against Sponsors of Terrorism Act, such assistance, if established, would open those platforms to lawsuits for damages from victims.
As Justice Clarence Thomas writes in the court’s opinion (PDF): “The key question, therefore, is whether defendants gave such knowing and substantial assistance to ISIS that they culpably participated in the Reina attack.”
The court’s answer: No, that didn’t happen under any reasonable understanding of what it means to “aid and abet” a criminal act or enterprise.
“Yet, there are no allegations that defendants treated ISIS any differently from anyone else,” Thomas writes. “Rather, defendants’ relationship with ISIS and its supporters appears to have been the same as their relationship with their billion-plus other users: arm’s length, passive, and largely indifferent.”
And that includes any automated recommendation or monetization of ISIS content (the original complaint, appealed by Twitter to the Supreme Court, included an allegation that Google’s YouTube shared some advertising revenue with an ISIS account).
“As presented here, the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content,” the opinion continues. “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.”
The court’s opinion doesn’t mention CDA 230, although the subtext there seems clear enough: Even if that shield didn’t exist, there still would be no grounds for a successful lawsuit here.
The only caveat to the court’s unanimity came in a concurring opinion from Justice Ketanji Brown Jackson, who warns against reading the decision too broadly in future cases over liability for terrorism: “Other cases presenting different allegations and different records may lead to different conclusions.”
The Supreme Court applied the same logic to dismiss a related case, Gonzalez v. Google LLC, in an unsigned opinion (PDF) that said the complaint “states little if any claim for relief.”
Sen. Ron Wyden (D-OR), who co-wrote “CDA 230” while representing Oregon in the House, endorsed the court’s opinion in a statement posted Thursday.
“I appreciate the Supreme Court’s thoughtful rulings that even without Section 230, plaintiffs would not have won their lawsuits,” he says. “As is the case with a majority of suits blocked on 230 grounds, the First Amendment or an inability to prove the underlying claims would lead to the same result.”
Wyden didn’t let social platforms off the hook for failing to boot terrorist content but said Congress should resist the temptation to punish them by taking away 230’s protections:
“I urge Congress to focus on things that will truly address abusive practices by tech companies, including passing a strong consumer privacy law, reining in unethical data brokers, and tackling harmful design elements in ways that don’t make it harder for users to speak or receive information,” he says.
The Supreme Court may soon have another chance to weigh in on 230 if, as most court watchers expect, it takes up challenges to Texas and Florida laws that ban social platforms from moderating content based on political viewpoints. Those statutes can easily be read to compel a Facebook, Google, or Twitter to host ISIS advocacy.