The future of the Internet is at stake in the Supreme Court of the United States

This Tuesday and Wednesday, lawyers for Google, Facebook and Twitter come to the defense of their companies in the Supreme Court of the United States. With them, the future of the Internet has a date before the justices. Two oral sessions will be held, for the cases Gonzalez v. Google and Twitter v. Taamneh. They question the scope of Section 230, the rule that has been the cornerstone of the Internet as we know it today. That rule essentially gives tech companies the ability to moderate the content their users generate while at the same time protecting them from liability for it.

Both cases are related to terrorism, and the question behind them is: are social networks like YouTube (owned by Google), Facebook and Twitter responsible for preventing terrorist propaganda from spreading on the Internet? The choice of these two cases indicates that the justices want to clarify the liability exemption that the law grants to technology companies for third-party content.

The key provision of Section 230 of the Communications Decency Act reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Accordingly, the platforms are absolved of liability for the content of their users. It is a law from 1996, when internet companies were still small and it seemed appropriate to protect them.

The rule applies to social networks like Facebook, YouTube, Twitch or Twitter, but it goes much further than that. Many features of Google, TripAdvisor, Yelp, Reddit, Craigslist, Apple or Microsoft depend in some way on the contributions of their users, and this liability shield has been key to letting their content flourish. These companies have filed briefs in the cases to defend their position on a common front.


Nohemi Gonzalez, a 23-year-old American university student, was one of the 131 people killed by Islamic State terrorists in the series of attacks that rocked Paris on November 13, 2015, at the Bataclan concert hall and other locations in the French capital. Gonzalez was killed in a restaurant where she was dining that day, and her relatives sued Google.

Reynaldo Gonzalez argues that YouTube has not merely played a passive role, letting users search for what to watch: its algorithm recommends videos based on each user’s history. As a result, those who watched Islamic State propaganda videos received more content of the same kind, which facilitated their radicalization. Nohemi’s relatives contend that the Google subsidiary, whose parent company is now Alphabet, allowed the spread of extremist propaganda videos that incite violence. The family believes Google violated anti-terrorism law by allowing these videos to be posted, placing ads on them and sharing the revenue.

Gonzalez lost in the lower courts. The question for the Supreme Court is whether the liability waiver extends to the recommendations made by the algorithm. Google argues in its latest brief that algorithms are the only way to organize the vast amount of information dumped onto the web every day: “Categorizing and grouping videos is the core of publishing.” If the ruling removes the liability shield, the company says, there will be no way to preserve “search recommendations and other essential tools” that organize a flood of websites, videos, reviews, messages, product listings, files and other information that would otherwise be impossible to navigate.

Dystopia risk

According to Google, if the company is held responsible, the Internet “will become a dystopia in which service providers will face legal pressure to censor any objectionable content.” “Some may comply; others may try to evade responsibility by turning a blind eye and letting everything go public, no matter how objectionable. This court must not undermine an essential element of the modern internet.”


The other case under consideration this week, to be heard on Wednesday, Twitter v. Taamneh, has nothing to do with algorithmic recommendations. It raises the more general question of whether social networks can be sued for alleged complicity in an act of terrorism for hosting content from users expressing generic support for the group behind the violence, without reference to a specific attack.

The lawsuit stems from the terrorist attack on an Istanbul nightclub in which 39 people were killed during a New Year’s Eve party in the early hours of January 1, 2017. Although the case bears the name of the social network now owned by Elon Musk, Google and Facebook are also defendants alongside Twitter. In this case the lower court ruled against the technology companies, which appealed to the Supreme Court.

Several Supreme Court justices, including Clarence Thomas and Samuel Alito, have expressed interest in taking up cases related to online content moderation. Tuesday’s oral arguments will give a first glimpse of their positions, though they have until the end of June to issue a ruling. The two rulings and the reasoning that accompanies them could have enormous reach, opening the way to an avalanche of lawsuits if they crack that traditional shield.

Tech companies have also been caught in the crossfire between the political parties for some time. Republicans accuse them of practicing censorship with a progressive bias. Democrats, led by President Joe Biden, attack the shield that absolves them of responsibility when they spread hate speech or misinformation. Biden published an op-ed last month in The Wall Street Journal, the conservative-leaning business daily, in which he called on Republicans and Democrats to “unite against the abuses of Big Tech.” He made his position in the Section 230 debate clear, asking for the law to be reformed: “We need Big Tech companies to take responsibility for the content they spread and the algorithms they use.”


The liability shield now on the line is one of the two great advantages tech companies enjoy. The other, the power to set their own moderation policies deciding what can and cannot be posted, is also at stake: Florida and Texas have passed laws preventing platforms from refusing to carry certain political content.

Content, moreover, is not the only battle front. Big tech companies are under increasing regulatory, tax and antitrust scrutiny, with episodes ranging from a Justice Department lawsuit against Google for abuse of a dominant position to the challenge to Microsoft’s purchase of Activision and lawsuits accusing social networks of contributing to a youth mental health crisis.

In principle, Section 230 does not cover intellectual property rights and is not a carte blanche for infringing them, although in practice these social networks have built their success on systematic copyright infringement. Millions of photos and videos that users hold no rights to are shared every day with near-total impunity. In practice, only extreme cases of pirated content of high economic value, such as sports broadcasts and first-run movies, are prosecuted.
