The United States Supreme Court seems inclined to preserve social networks' liability shield for content | Technology



Justice Elena Kagan admitted on Tuesday: "We're a court. We really don't know about these things. You know, these are not, like, the nine greatest experts on the Internet." The United States Supreme Court was weighing the scope of "the 26 words that created the Internet," as Google's attorney put it, borrowing an expression repeated several times during the hearing. This is the provision that allows platforms and social networks to remove content they deem inappropriate while at the same time protecting them from liability for content uploaded by third parties. In Tuesday's session and Wednesday's, the justices seemed inclined to keep that shield in place.

Two different cases are being heard. On Tuesday, the question in the spotlight was whether YouTube's algorithmic recommendations (and by extension those of any social network) are protected in the same way as third-party content itself. Relatives of Nohemi Gonzalez, one of the victims of the Islamic State attacks that shocked Paris on November 13, 2015, at the Bataclan concert hall and other venues in the French capital, sued Google, the owner of YouTube, for hosting Islamic State videos.

The case heard on Wednesday examined whether social networks in general, and Twitter as the lead defendant in particular, aided the development of certain terrorist organizations. Here the lawsuit was filed by relatives of one of the victims of the terrorist attack on the Reina nightclub in Istanbul, in which 39 people were killed in the early hours of January 1, 2017.

The rule under discussion, on whose interpretation the future of the Internet depends, is the famous Section 230 of the Communications Decency Act of 1996, which states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

The Supreme Court justices may not be the Internet's greatest experts, as Kagan admitted, but they have done their homework. The fact that they accepted both cases at once, along with doubts some had expressed in the past about this liability shield, suggested they might be willing to change the interpretation of the rule. Yet from what was heard over these two days, it appears they lean toward maintaining the protection.

In the first case, Gonzalez v. Google, the attorney for Nohemi Gonzalez's relatives shifted his arguments. Ultimately, his complaint centered on the way YouTube's algorithm invites viewers of Islamic State videos to watch similar videos. "In some circumstances, the way third-party content is organized or presented could convey information from the defendant itself," he said, narrowing his argument to the thumbnails the platform suggests while a user watches another video.


The justices immediately voiced their doubts. Clarence Thomas and others said they found it normal for YouTube to show cat videos to those who watch cat videos, or cooking videos to those who watch cooking, or racing, or ISIS videos. "I think you're going to have to give us a clearer example of what exactly that means," he challenged. Chief Justice John Roberts likewise said he had difficulty holding Google liable when the algorithm is generic, rather than one designed to promote terrorist content.

Elena Kagan spoke along similar lines: "This was a pre-algorithm statute, and everyone is trying their best to figure out how this statute applies in a post-algorithm world," she said, later observing that "algorithms are endemic to the Internet; every time anyone looks at anything on the Internet, there is an algorithm involved."

Justice Sonia Sotomayor admonished the lawyer for asserting in his lawsuit, paragraph by paragraph, that YouTube was liable for not removing the videos from its platform, only to concede at the hearing that it did not have to do so and that the problem was that its algorithm recommended them. "What can Internet providers be liable for? For showing me the next video, similar to the one I'm watching?" she asked skeptically.

Although the Gonzalez family's attorney insisted that when the platform starts suggesting things a user has not explicitly asked for it should no longer be protected, he did not seem to convince the justices. "I don't see how a neutral suggestion about something you've expressed an interest in is aiding and abetting [terrorism]. I'm trying to get him to explain to us how something that is standard on YouTube for virtually anything you're interested in suddenly amounts to complicity because you're in the ISIS category."

Justice Samuel Alito also seemed to reject this argument: "I'm afraid I'm completely confused by whatever argument you're making. (…) Does [YouTube] act as a publisher just by displaying these thumbnails of ISIS videos after a search for an ISIS video?"

Sonia Sotomayor and Elena Kagan suggested there is probably a middle ground between that weak thumbnail argument and setting no boundaries at all. Faced with the need to draw that line, Kagan asked: "Isn't that something for Congress to do, not the court?"


The rule was enacted in 1996, when there was no Google, YouTube, Facebook, or Twitter. Democrats and Republicans alike have called for changing it, for different reasons. And, as Justice Neil Gorsuch pointed out, the next frontier is artificial intelligence: in a post-algorithm world, AI can generate some forms of content by selecting, analyzing, or digesting it, and that, in his view, would not be protected by the famous Section 230.

Justice Brett Kavanaugh echoed the concern, raised in the briefs filed in the case, that removing such protections would endanger the Internet as we know it, wondering whether "to pull back now from the interpretation that's been in place would create a significant economic dislocation, really crash the digital economy, with all sorts of effects on workers, consumers, pension plans…" He concluded: "We are not in a position to account for that. (…) Are we really the right body to draw back from what had been the consistent text and interpretation in the courts of appeals?", likewise deferring to Congress.

Roberts further noted that he fears a sharp increase in lawsuits if the liability shield protecting technology companies from third-party content is withdrawn.

Google's attorney, Lisa Blatt, defended the use of algorithms and recommendations as inherent to the business of Internet services. "All publishing requires organization and inherently conveys that same implicit message," she said. According to Blatt, Section 230(c)(1) "reflects Congress's choice to protect websites for publishing others' speech, even if they knowingly publish speech that is harmful to other people." Congress made that choice to prevent lawsuits from choking the Internet in its infancy. "The result was revolutionary. Innovators opened new horizons for the world to share infinite amounts of information, and websites necessarily pick, choose, and curate which third-party information users see first," she added, warning: "Exposing websites to liability for implicitly recommending third-party content defies the text and threatens today's Internet."

Recruiting terrorists

Wednesday's case, Twitter v. Taamneh, has elements in common with the first as well as differences. The lawsuit, filed by relatives of one of those killed in the Istanbul attack, asserts that Twitter, Facebook, and Google are responsible for the attack because the Islamic State used social networks to build its reputation, spread its messages, and ultimately recruit terrorists. A lower court allowed the lawsuit to go forward, and the Supreme Court justices must now decide whether it can proceed.

Again, the justices sided more with the tech companies than with the plaintiffs, though Justice Kagan intervened to compare tech companies to banks, which are prosecuted in money laundering and terrorist financing cases: "We're used to thinking of banks as providing very important services to terrorists. We may not be used to it, but it seems true that various kinds of social media platforms also provide services to terrorists."


Most of the justices seemed to lean toward the position outlined by Justice Alito, who noted that it would make no sense to hold phone companies responsible for the criminal activity of people who use their phones. But what if the company knows that a certain person has a criminal record, is likely to engage in criminal activity, and is using the phone to communicate with other gang members? Is that complicity in the crimes they commit?

The anti-terrorism law punishes anyone who knowingly helps a person commit a terrorist act. Justice Amy Coney Barrett stressed that there were not enough concrete facts to support the claim beyond general statements that the networks, particularly Twitter, were used to publicly recruit or radicalize followers. The lawsuit offers no concrete evidence, such as direct messages, comment threads, or other indications that the network was used to coordinate activities for a terrorist attack.

At one point, Justice Sonia Sotomayor asked Seth Waxman, the attorney for Twitter, Facebook, and Google, to help her figure out what an opinion ruling in favor of his clients would look like: "Write it for me," she told him. Justice Brett Kavanaugh summed up Waxman's argument this way: "Where there's a legitimate business that provides services on a widely available basis (…) you will not be liable under this statute, even if you know that bad people are using your services for bad things." Waxman agreed: "That's true, unless you have specific knowledge that particular accounts or messages are, in fact, being used to plan or commit a terrorist act, including an attack like the one that struck the plaintiffs. That is, there must be concrete knowledge in that context. That is our standard."

Rulings in the two cases are expected by next June, when the court's term ends before the summer recess. That is the month when the most significant decisions tend to be handed down.
