Washington –
Internet giants such as Google, Facebook, YouTube, and Twitter owe much of their success to the legal shield that was erected by Congress in 1996.
Known as Section 230, it has been called the legal foundation that launched Big Tech. Although it drew little attention at the time, it is now seen as one of the pillars of the massively open, global internet we know today.
While newspapers and television stations can be held liable for any false and harmful content they publish or broadcast, online platforms are treated differently under Section 230.
Congress passed a special free-speech rule to protect the new world of online communications. It says: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Law professor and author Jeff Kosseff called Section 230 “the twenty-six words that created the internet” because it allowed websites to develop freely as platforms for other people’s words, images, and videos.
The law had gone untested in the Supreme Court – until now.
This week, the justices will hear two cases that could finally breach that legal shield and dramatically change the rules of the online game.
They are expected to hear a third case later this year related to Internet companies’ First Amendment rights amid government efforts to regulate them.
The first case, scheduled to be heard Tuesday, began with a lawsuit filed by a California family against Google and YouTube for allegedly aiding and abetting an act of international terrorism. Their daughter, Nohemi Gonzalez, a 23-year-old student, was killed in Paris in November 2015 when Islamic State terrorists opened fire on a restaurant where she was dining with two of her friends. It was part of a wave of ISIS attacks across the city that killed 129 people.
Their lawsuit alleged that Google, which owns YouTube, “knowingly allowed ISIS to publish hundreds of extremist videos inciting violence and recruiting potential supporters to join ISIS forces.” Furthermore, they claimed that YouTube “recommended ISIS videos to users.”
Only their second claim is at issue on Tuesday. Can YouTube be sued over the algorithms it created to steer users toward similar content – in this case, allegedly steering potential terrorists toward other ISIS videos? Or does Section 230 shield it from such claims?
More than forty technology companies, internet scholars and free speech advocates filed amicus curiae briefs arguing that internet companies should not be held liable for using computer programs that direct users to content they may find interesting.
“It’s the recommendation algorithms that make it possible to find the needles in humanity’s largest haystack,” said Washington attorney Lisa S. Blatt, who represents Google and YouTube. She warned that opening the door to lawsuits over algorithms “threatens to upend the modern internet.”
A federal judge dismissed the family’s lawsuit based on Section 230, and the divided Ninth Circuit Court of Appeals affirmed that decision in 2021.
Until now, the Supreme Court had refused to hear appeals involving the law. However, Justice Clarence Thomas has on several occasions called for curtailing the “sweeping immunity” that courts have read into Section 230, particularly in cases where websites were known to be spreading dangerous falsehoods or criminal schemes.
Some prominent liberals, including Ninth Circuit Judges Marsha Berzon and Ronald Gould, have also called for narrowing the scope of Section 230.
They are joined by advocates — liberal and conservative — who portray the internet as a hotbed of misinformation and hate speech, home to stalkers and scammers, and a contributor to teen suicides and mass shootings. Critics also say that social media companies get rich and keep viewers online by amplifying the most extreme claims and the angriest voices.
Google and other tech companies were caught off guard in October when the Supreme Court voted for the first time to hear a direct challenge to Section 230 and determine whether websites like YouTube could be sued for their use of algorithms and targeted recommendations.
Their alarm grew in December when the Biden administration sided with the plaintiffs in Gonzalez v. Google and said YouTube could be sued over algorithms that “recommend” more videos to viewers.
Justice Department attorneys said the Ninth Circuit erred in dismissing the suit and urged a new understanding of Section 230. They agreed that websites are shielded from liability for displaying content provided by others, including ISIS videos, but said they are not protected for their own “conduct” in recommending more videos to watch.
They wrote in their brief: “When YouTube presents a user with a video they did not ask to see, it implicitly tells the user that they will be interested in that content based on the characteristics of the video and of the user’s account.”
Several experts in Internet law said they were baffled by the Supreme Court’s decision to hear the case and disturbed by what it might mean.
“The internet depends on curation. We need to be able to find what we’re looking for,” said Eric Goldman, a professor of law at Santa Clara University. If websites cannot sort content with algorithms, he said, “the internet won’t work efficiently.”
Blatt, the Google attorney, said, “YouTube does not ‘recommend’ videos in the sense of endorsing them, any more than Google Search endorses search results. YouTube displays videos that may be most relevant to users.”
On Wednesday, the court will hear a related case, but one focused solely on whether Facebook, Google and Twitter can be sued for aiding international terrorists.
Congress in 2016 expanded the Anti-Terrorism Act to allow lawsuits to be brought by victims or survivors against anyone who “knowingly provided substantial assistance” to someone who committed an act of international terrorism.
The American family of a Jordanian national who was killed in an ISIS attack on the Reina nightclub in Istanbul in 2017 has sued Facebook, Twitter and YouTube, accusing them of aiding and abetting the killing. They said ISIS maintained public accounts on all three social media platforms and used them to recruit members.
The Ninth Circuit allowed that claim to move forward, but the Justice Department and the social media companies said that was a mistake. They said the suit should be dismissed because the plaintiffs could not show that the online platforms provided “substantial assistance” to the terrorist who carried out the mass shooting.
It’s not entirely clear why the court agreed to hear the second case, Twitter v. Taamneh, but the justices may have decided they faced two questions: Can a social media site be sued for aiding terrorists? And if so, can it be held liable for steering viewers to ISIS videos?
It’s unclear whether the justices will split along the usual ideological lines when they debate Section 230, since its critics include liberals and conservatives alike.
The biggest question still pending before the court may be: Can states regulate the internet and penalize social media companies for what they post or remove from their sites?
That clash began with a sharp partisan tone. Republican leaders in Texas and Florida adopted laws two years ago that allow fines and damages suits against Facebook, Twitter and other large social media sites if they “censor” or discriminate against conservatives. In signing the measure, Florida Governor Ron DeSantis said the law was intended to “protect against Silicon Valley elites”.
Before the laws took effect, they were challenged on free-speech grounds and blocked under the First Amendment, not Section 230.
The justices will almost certainly agree to review one or both of the laws because appeals court judges, both appointed by President Trump, have been divided on a key constitutional issue.
Judge Kevin Newsom of the Eleventh Circuit in Atlanta blocked most of the Florida law from taking effect. He said the First Amendment “constrains governmental actors and protects private actors.” Social media sites are private companies, and, “with minor exceptions, the government can’t tell a private person or entity what to say or how to say it.”
Shortly thereafter, Judge Andrew Oldham of the Fifth Circuit in New Orleans upheld the Texas law on the grounds that the state sought to protect Texans’ free speech rights. Oldham, a former counsel to Texas Governor Greg Abbott and law clerk to Justice Samuel A. Alito Jr., said it was “a rather odd inversion of the First Amendment” to hold that social media platforms have a right to silence speech. “We reject the idea that corporations have a freewheeling First Amendment right to censor what people say,” he wrote.
Last month, the Supreme Court asked the Justice Department to weigh in on the issue, which is likely to delay the cases until the fall.
If, as expected, the solicitor general’s office files its views by June, the justices will probably schedule one or both cases for argument in the fall.