Google warns US Supreme Court that tampering with Section 230 of the Communications Decency Act could bankrupt the internet and trigger devastating fallout

Last Thursday, Google filed a key defense brief in a US Supreme Court case that could reshape the legal landscape for online publishers and services. Google told the court that altering Section 230 of the Communications Decency Act, which shields companies from liability for content their users post, would “undermine a central element of the Internet.” Gonzalez v. Google, the case the Supreme Court will hear next month, will decide whether Section 230 protections apply to the algorithms that YouTube and other platforms use to select content to show users. An unfavorable ruling against Google over YouTube’s recommendation engine could have unintended consequences for much of the internet, the search giant said.

Section 230 of the Communications Decency Act, which protects companies from liability for the content their users post, allows online platforms to engage in good-faith content moderation while shielding them from responsibility for their users’ messages. Tech platforms argue that this protection is essential, especially for smaller platforms that could otherwise face costly legal battles, because the nature of social media makes it difficult to promptly address every harmful post.

But the law has been hotly debated in the US Congress, with lawmakers on both sides arguing that its liability protections should be significantly narrowed. Many Republicans believe the law’s content moderation provisions should be scaled back to reduce what they see as censorship of conservative voices, while many Democrats question why the law should protect platforms that host misinformation and hate speech.

The Supreme Court case, known as Gonzalez v. Google, was brought by family members of US citizen Nohemi Gonzalez, who was killed in a 2015 Paris terror attack for which ISIS claimed responsibility. The complaint alleges that Google-owned YouTube did not do enough to prevent ISIS from posting content on the video-sharing site to support its propaganda and recruitment efforts. The plaintiffs sued Google under the Anti-Terrorism Act of 1990, which allows US nationals harmed by terrorism to seek damages. The law was updated in 2016 to add secondary civil liability for anyone who aids and abets an act of international terrorism by knowingly providing substantial assistance.

Today, the Gonzalez family hopes the high court will agree that Section 230 protections, designed to shield websites from liability for hosting third-party content, should not also shield platforms when they recommend harmful content. But Google thinks that is exactly how the liability protection should work. In its brief, Google argued that Section 230 protects YouTube’s recommendation engine as a legitimate tool to facilitate other people’s communication and content.

Section 230 broadly protects technology platforms from lawsuits over their content moderation decisions. But a Supreme Court ruling that algorithm-based recommendations lack these protections could threaten essential functions of the internet, Google writes in its filing. Websites like Google and Etsy depend on algorithms to sift through mountains of user-generated content and display the content most likely to be relevant to each user. If plaintiffs could evade Section 230 by targeting how websites sort content, or by trying to hold users liable for liking or sharing articles, the internet would become a disorganized mess and a litigation minefield, the company writes.

Faced with such a decision, websites might have to choose between intentionally over-moderating their sites, stripping out virtually anything that could be perceived as objectionable, or doing no moderation at all to avoid any risk of liability, Google argued. In its brief, Google said YouTube abhors terrorism and cited its increasingly effective efforts to limit the spread of terrorist content on its platform, before insisting that the company cannot be sued for recommending the videos because of its liability protection under Section 230.

Gonzalez v. Google is considered a landmark case for content moderation and one of the first Supreme Court cases to examine Section 230 since it was passed in 1996. Several Supreme Court justices have expressed a desire to rule on the law, which has been interpreted broadly by the courts, championed by the tech industry, and sharply criticized by politicians of both parties.

Google argues that it is for Congress, not the Supreme Court, to decide whether to reform Section 230. In a legal brief filed last month, the Biden administration argued that Section 230 protections should not extend to recommendation algorithms. President Joe Biden has long called for changes to Section 230, saying technology platforms should take more responsibility for the content that appears on their websites. As recently as Tuesday, Biden published an op-ed in The Wall Street Journal urging Congress to change Section 230.

But in a blog post on Thursday, Halimah DeLaine Prado, Google’s general counsel, argued that narrowing Section 230 would increase the threat of litigation against online businesses and small businesses, which would curb free expression and economic activity on the internet. Services could become less useful and less reliable as efforts to eliminate scams, fraud, conspiracies, malware, violence, harassment and more would be stifled, DeLaine Prado wrote.

Source: US Supreme Court

And you?

What is your opinion on the subject?

See also:

Supreme Court blocks Texas social media moderation ban, legal battle over HB 20 continues



New algorithm bill could force Facebook to change how News Feed works without changing Section 230



Experts Tell US Senators Social Media Algorithms Threaten Democracy, But Facebook, YouTube and Twitter Challenge This Allegation