Supreme Court assesses tech company protections in Google case

WASHINGTON — In a case that could change the very structure of the Internet, the Supreme Court on Tuesday explored the limits of a federal law that protects social media platforms from legal liability for what users post on their sites.

The justices appeared to view the positions taken by both sides as too extreme and expressed doubts about their own competence to find a middle ground. “These are not, like, the nine greatest experts on the internet,” Justice Elena Kagan said of the Supreme Court.

Others had practical concerns. Justice Brett M. Kavanaugh said the court should not “crush the digital economy.”

The case was brought by the family of Nohemi Gonzalez, a 23-year-old student killed in a restaurant in Paris during the November 2015 terrorist attacks, which also targeted the Bataclan concert hall. Eric Schnapper, a lawyer for the family, argued that YouTube, a Google subsidiary, was responsible because it had used algorithms to push Islamic State videos to interested viewers, drawing on information the company had collected about them.

“We focus on the recommender function,” Schnapper said.

But Justice Clarence Thomas said the recommendations were key to making internet platforms useful. “If you’re interested in cooking,” he said, “you don’t want light jazz.” He later added, “I see these as suggestions and not really as recommendations.”

The federal law at issue in the case, Section 230 of the Communications Decency Act, also protects online platforms from lawsuits over their decisions to remove content. The case gives the justices the chance to narrow the scope of that shield and expose the platforms to lawsuits over whether they steered people to posts that promote extremism, advocate violence, damage reputations and cause emotional distress.

Mr. Schnapper said YouTube should be held responsible for its algorithm, which he said systematically recommended videos inciting violence and supporting terrorism. The algorithm, he said, was YouTube’s own speech, separate from what users of the platform had posted.

Justice Kagan pressed Mr. Schnapper on the limits of his argument. Did he also take issue with the algorithms Facebook and Twitter use to generate users’ feeds? What about search engines?

Mr. Schnapper said all of them could lose their protection under certain circumstances, an answer that seemed to surprise Justice Kagan.

Justice Amy Coney Barrett asked whether Twitter users could be sued for retweeting ISIS videos. Mr. Schnapper said the law at issue in the case could allow such a lawsuit. “It’s the content that you created,” he said.

Section 230 was enacted in 1996, in the early days of the Internet. It was a reaction to a ruling that held an online message board liable for what a user had posted because the service had engaged in some moderation of content.

The provision states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

That provision enabled the rise of social networks like Facebook and Twitter by ensuring that the sites did not assume legal liability for every post.

Malcolm L. Stewart, a lawyer for the Biden administration, argued in support of the family in Gonzalez v. Google, No. 21-1333. He said successful lawsuits based on recommendations would be rare, but that Section 230 immunity was generally unavailable for them.

Lisa S. Blatt, a lawyer for Google, said the provision gives the company complete protection from lawsuits like the one brought by Ms. Gonzalez’s family. YouTube’s algorithms are a form of editorial curation, like search engine results or Twitter feeds, she said. Without the ability to deliver content of interest to users, she said, the internet would be a useless mess.

“All publishing requires organization,” she said.

A ruling against Google, she said, would force sites either to preemptively remove any arguably problematic content or to allow all content, no matter how vile. “You’d have ‘The Truman Show’ versus a horror show,” she said.

Justice Kagan asked Ms. Blatt whether Section 230 would protect a “pro-ISIS” algorithm or one that promoted defamatory speech. Ms. Blatt said yes.

Section 230 has drawn criticism from across the political spectrum. Many liberals say it has shielded tech platforms from liability for misinformation, hate speech and violent content. Some conservatives say it has allowed the platforms to grow so powerful that they can effectively exclude right-wing voices from the national conversation.

The justices will hear arguments on Wednesday in a related case, also arising from a terrorist attack. That case, Twitter v. Taamneh, No. 21-1496, was brought by the family of Nawras Alassaf, who was killed in a terrorist attack in Istanbul in 2017.

At issue in that case is whether Twitter, Facebook and Google can be sued under the Antiterrorism Act of 1990 on the theory that they abetted terrorism by allowing the Islamic State to use their platforms. If the justices were to say no, the case against Google argued on Tuesday could be moot.

Whatever happens in the cases argued this week, both of which turn on the interpretation of statutes, the court is very likely to agree to take up a looming First Amendment question arising from laws enacted in Florida and Texas: Can states stop big social media companies from deleting posts based on the views they express?
