Social media firms score Supreme Court win in case that could have changed the web
Internet platforms notched a major Supreme Court victory Thursday when the high court's justices unanimously decided a case that could have rewritten the rules of the web. The court also sent a parallel case back to a lower court.
Both cases had to do with whether social media companies could be held liable for hosting and promoting terrorist content on their platforms.
Both cases, Twitter v. Taamneh and Gonzalez v. Google, took aim at Section 230 of the Communications Decency Act, the law that broadly insulates social media platforms, including Google's (GOOG, GOOGL) YouTube, Meta's (META) Facebook and Instagram, and Twitter, from legal liability when they show, recommend, and promote content posted by their users.
"Defendants’ mere creation of their media platforms is no more culpable than the creation of email, cell phones, or the internet generally," the court stated in its opinion in the Taameneh case. In characterizing the company's algorithms as "agnostic" to the nature of hosted content, the justices reasoned that they could not impose liability on YouTube for knowingly aiding and abetting terrorism.
Separately, in the Gonzalez opinion, the court sidestepped questions about YouTube's Section 230 liability and sent the case back to the Ninth Circuit Court of Appeals to reconsider it in light of the Taamneh ruling. The court noted, however, that the plaintiffs' aiding-and-abetting claims would likely fail.
Anna Diakun, an attorney for the Knight First Amendment Institute at Columbia University, said, "The Court correctly recognized in Taamneh that the platforms' alleged conduct was too attenuated and passive to rise to the level of aiding and abetting."
Although the attack was terrible and tragic, Diakun said, imposing liability on the platforms in these circumstances would have had far-reaching implications for free speech online. Still, Diakun expects that the court will eventually have to answer important questions that it avoided in today’s opinions.
"Questions about the scope of platforms’ immunity under Section 230 are consequential and will certainly come up soon in other cases," she said.
The claims in the two cases arose from separate ISIS terrorist attacks and were brought by victims of the attacks. The plaintiffs in Gonzalez claimed that YouTube caused injuries sustained in an attack because the platform aided and abetted terrorism by knowingly sharing and promoting ISIS content.
In the Taamneh case, the plaintiffs argued that Facebook, Twitter, and Google should be held liable for providing material aid to ISIS terrorists.
According to the plaintiffs, the companies should be stripped of Section 230 immunity because they failed to block the allegedly harmful posts.
In the Taamneh decision, the court went on to say that the plaintiffs failed to allege that the platforms here do more than "transmit information by billions of people—most of whom use the platforms for interactions that once took place via mail, on the phone, or in public areas..."
Had the justices ruled in favor of the plaintiffs, the decision could have upended portions of Section 230 of the Communications Decency Act, which provides online platforms with broad liability protections for content their users post.
In their arguments, the plaintiffs claimed that YouTube's creation of thumbnails, the images that appear in internet search results as representations of available third-party content, converted the company from a passive host of third-party content into a recommender of content, more akin to publishers or speakers that fall outside Section 230's liability shield.
But the court found that the plaintiffs' arguments didn't mean YouTube was liable for the attack.
"Defendants’ mere creation of their media platforms is no more culpable than the creation of email, cell phones, or the internet generally," Thomas wrote. "And defendants’ recommendation algorithms are merely part of the infrastructure through which all the content on their platforms is filtered. Moreover, the algorithms have been presented as agnostic as to the nature of the content."
In Gonzalez, family members and the estate of Nohemi Gonzalez, a 23-year-old U.S. citizen killed in the November 2015 ISIS shooting at Paris' La Belle Equipe bistro, argued that Google should be held at least partially liable for her death. That's because, they alleged, the company's YouTube service knowingly permitted and recommended, via algorithms, inflammatory ISIS-created videos that allegedly played a key role in recruiting the attackers.
Google had argued its side before the U.S. Supreme Court in February in a case that could have reformed the internet, and especially the business models of social media companies, by further defining how much risk comes with hosting third-party content.
During those arguments, the justices probed both the plaintiffs' contention that, under Section 230, site owners could be held liable when their algorithms recommend particular content by generating thumbnails for suggested videos, and Google's claim that the company's mere organizational choices can't strip it of Section 230 protection.
The justices centered many of their questions around how to draw a line between publishing and hosting.
"The question is what you do with the algorithm," Gonzalez's lawyer Eric Schnapper told the court. "It's the recommendation practice that we think is actionable."
Google's lawyer Lisa Blatt objected, arguing that for content providers, there's no way around making choices to organize and provide search results to their users.
Before the Supreme Court took up the Gonzalez case, the U.S. District Court for the Northern District of California dismissed the lawsuit at Google's request, concluding that Section 230 barred the claims because ISIS, not Google, created the videos. Meanwhile, judges in separate jurisdictions, faced with similar claims, applied varying interpretations to Section 230's liability shield.
Alexis Keenan is a legal reporter for Yahoo Finance. Follow Alexis on Twitter @alexiskweed.
Daniel Howley is the tech editor at Yahoo Finance. Follow him @DanielHowley