The European Union wants to force internet companies to review private communications for child sexual abuse | technology



The line between security and privacy on the Internet is a fine one, and the debate over how far the latter may be curtailed to guarantee the former never closes. If the scheduled timeline is met, the European Union will debate and approve during the Spanish presidency a regulation (directly applicable in member states) obliging web service companies to “assess, mitigate and, where necessary, detect, report and remove child sexual abuse on their services”. EU institutions and dozens of children’s rights associations support the initiative as a response to a problem that, according to a Council of Europe study, affects “at least one in five children”. Conversely, entities such as Xnet, a network of digital rights advocates and other professionals, argue that the proposal is unworkable, that it seeks to end the confidentiality of communications and that it threatens fundamental freedoms. The regulation itself acknowledges that these freedoms will be affected, but justifies the restrictions.

The regulation proposed by the Commission has been championed by Ylva Johansson, the Swedish Social Democrat who serves as EU Commissioner for Home Affairs, while the European Parliament’s rapporteur for the regulation is the Spanish MEP Javier Zarzalejos (PP). The main argument, according to the proposal’s explanatory memorandum, is that “one in three respondents to a survey said they had been asked to do something sexually explicit online during their childhood, and more than half had experienced some form of sexual abuse”. Reports of online abuse have increased by 6,000% in the past 10 years, according to the Eurochild association’s platform. “We, as a group of organizations campaigning for the rights, safety and protection of children online and offline, support the European Commission’s proposal as a decisive step towards better protection of minors,” the organizations stress in an open letter.

Until now, in Europe, the detection of child sexual abuse has been voluntary for companies. The European Union notes that “the vast majority of complaints come from a handful” of providers and that “a large number of them take no action”, while the problem of online abuse continues to grow. The National Center for Missing & Exploited Children (NCMEC), to which US providers are required to report child abuse, receives 30 million reports a year, of which more than one million concern EU member states.

The regulation states that a new EU Centre on Child Sexual Abuse will provide access to “reliable technologies” and to the “indicators” that companies must use to “detect, report, block and remove” child sexual abuse material. According to the text, these detection algorithms will seek to avoid false positives (unfounded reports) and will comply with standards “verified by independent judicial or administrative authorities in member states”.
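Purely by way of illustration, detection of already-known material is usually described as matching files against a list of vetted “indicators”. The Python sketch below shows the general shape of that idea under stated assumptions: the indicator list and function names are hypothetical, and a cryptographic SHA-256 digest stands in for the perceptual hashes (PhotoDNA-style fingerprints) that real systems use; nothing here reflects the regulation’s actual technical specification.

```python
# Illustrative sketch only (hypothetical names; SHA-256 stands in for the
# perceptual hashes real systems use). Not the regulation's actual spec.
import hashlib

# A vetted "indicator" list would be distributed by a central body; here it is
# simply a set of hex digests of files already confirmed as abuse material.
KNOWN_INDICATORS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # example digest
}

def matches_known_indicator(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears on the vetted indicator list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_INDICATORS

if __name__ == "__main__":
    sample = b"harmless example content"
    # Only a match (never the file itself or the user's other data) would be
    # escalated for human review and reporting.
    print(matches_known_indicator(sample))  # prints False for this sample
```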

See also  "It works very well, but it's not magic": This is ChatGPT, the new artificial intelligence that pushes boundaries | technology

However, among the regulatory options considered, the proposal adopts the most far-reaching: it requires companies to detect not only known (already verified) material, but also new material, that is, content that “could constitute abuse material but has not yet been confirmed as such by an authority”, as well as attempts to solicit minors (grooming).

“The measures contained in the proposal affect, first and foremost, the exercise of the fundamental rights of the users of the services concerned,” the text of the regulation acknowledges, “namely respect for privacy (including the confidentiality of communications, as part of the right to respect for private and family life), the protection of personal data, and freedom of expression and information.”

Rights restrictions

“Despite their great importance, none of these rights is an absolute prerogative, and they must be considered in relation to their function in society (…). Article 52(1) of the Charter of Fundamental Rights of the European Union allows them to be limited, subject to the conditions laid down in that provision,” the European Union argues.

To that end, the proposed regulation states that the technologies applied will be the “least intrusive to privacy” and will be run “anonymously”. It also “guarantees the right to effective judicial protection at all stages, from disclosure to adjudication, and limits the retention of material and data to what is strictly necessary”.

The EU also recognizes that “the freedom to conduct a business comes into play”, since the regulation imposes on operators “an excessive burden” and “may affect the free choice of customers and suppliers or contractual freedom”. This right is not absolute either: the EU insists that a wide range of interventions restricting the exercise of economic activity in the public interest is permissible.

Rejection

The initiative has been rejected by companies in the sector and by digital rights groups. “Abuse of minors is appalling, but the regulation does not solve the problem; instead it establishes mass surveillance, abolishes the confidentiality of communications, weakens effective judicial protection, attacks small and medium-sized technology companies in favor of the monopolies, and will generate false positives that can affect anyone and hinder investigators,” argues Xnet.

The idea is old: get rid of internet encryption and communications privacy

Sergio Salgado, Xnet activist

Sergio Salgado, one of the members of the platform, doubts that the initiative is really about combating child sexual abuse on the Internet: “The idea is old: to end the encryption of the Internet and the privacy of communications.” Salgado describes the regulation as a perversion of the first one, known as Chat Control 1: “It is the same, but it makes the control compulsory: it is the old dream of power, for which encryption and privacy are, in general, a problem.”

“There is only one fundamental right that admits no restriction of any kind: freedom of thought. The rest do admit restrictions, but they must be minimal and subject to effective judicial protection. In no case can the Internet be turned into a space of exception. It is like saying the police can enter anyone’s home without a warrant, but they promise they won’t.”

Salgado points to false positives as one of the biggest problems. “Innocent people will be harmed,” he says, adding that “these unfounded accusations will also swamp the investigators who are actually prosecuting child abuse.”

False positives

Google suspended the accounts of two users in the US who had sent pictures of their children to pediatricians to diagnose an infection. In Spain, the Valencian teacher David Barberá was left without access to thousands of private files in the cloud because of images that the same company deemed suspicious.
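To make the scale of the concern concrete, here is a back-of-the-envelope calculation in Python. Every figure in it (message volume, prevalence, error rates) is an assumption chosen only to illustrate the base-rate effect; none comes from the regulation, the Commission or Xnet.

```python
# Back-of-the-envelope arithmetic on false positives. All numbers below are
# assumptions made up for this illustration.
scanned_items_per_day = 10_000_000_000   # assumed volume of scanned messages/files
prevalence = 1e-6                        # assumed fraction that is actually abusive
true_positive_rate = 0.90                # assumed: 90% of abusive items are flagged
false_positive_rate = 0.001              # assumed: 0.1% of innocent items are flagged

true_hits = scanned_items_per_day * prevalence * true_positive_rate
false_alarms = scanned_items_per_day * (1 - prevalence) * false_positive_rate

print(f"True detections per day: {true_hits:,.0f}")      # 9,000 under these assumptions
print(f"False alarms per day:    {false_alarms:,.0f}")   # roughly 10,000,000
print(f"Share of flags that are wrong: {false_alarms / (true_hits + false_alarms):.1%}")
```

Under these assumed numbers, even a seemingly small error rate means that the overwhelming majority of automated flags would point at innocent content, which is the review burden critics describe.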

“The big corporations will become a private police force, and they will apply conservative criteria: they do not want problems or legal risks. When in doubt, the content is deleted. Preserving free speech and open conversation on the Internet will not be their priority. In addition, effective judicial protection boils down to the fact that it is you who has to go to court,” says Salgado.

“It is legislation that uses pedophilia to restrict digital rights, and things simply do not work that way,” he concludes: freedom cannot be sacrificed for security.

European Digital Rights, an international group of civil rights organisations, agrees with Xnet and is calling for the regulation to be withdrawn: “It would force the providers of all our digital conversations, messages and emails to know what we write and share at all times, and it would remove the possibility of anonymity from many legitimate online spaces.”

In favor

Javier Zarzalejos, the European Parliament’s rapporteur for the regulation, rejects these reservations and defends the benefits of the proposal. He stresses that every company will have to “carry out a risk analysis” and that the outcome will not be the same in every case: “A payment service with effective age-verification mechanisms is not the same as a chat service where anyone can interact with anyone else freely and anonymously. It will be the competent authority in each country that assesses whether the risk analysis is sufficient and realistic, and that has to approve the protection measures. There is no action companies can take without the permission of the competent authorities.”

He also rejects the idea that the obligation to screen communications violates their privacy: “Encryption is preserved. The technology applied is very reliable and works in a way very similar to the one that detects spam emails. Based on certain indicators, the algorithms detect cases. There is no access to the content of the communication; they act as classifiers.”

All letters go through a scanner to prevent them from carrying illegal products. The fact that a letter goes through the scanner does not mean that Correos knows what I have written in it

Javier Zarzalejos, European Parliament rapporteur for the proposed regulation

“Companies will have to introduce human verification to confirm that a detected pattern corresponds to a case of child sexual abuse. Suspicious cases will have to be referred to the state security forces and agencies. Nobody is going to conclude, from a sentence or two or from my sending pictures of my grandchildren on the beach, that we are dealing with a minor who is being harassed or groomed for sexual abuse,” the MEP adds. Zarzalejos draws an analogy: “All letters pass through a scanner to prevent them from carrying illegal products. The fact that they pass through the scanner does not mean that the post office knows what I write in a letter.”
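As a purely illustrative sketch of the workflow Zarzalejos describes (a spam-filter-like classifier flags an item, a human reviewer confirms it, and only then is it referred), the Python below strings the three stages together. The threshold, names and scores are invented for the example; the proposed regulation does not prescribe this design.

```python
# Hypothetical sketch of the "automated flag -> human review -> referral"
# workflow described above. Threshold, names and scores are invented here.
from dataclasses import dataclass
from typing import Optional

REVIEW_THRESHOLD = 0.95  # assumed: only high-confidence flags reach a reviewer

@dataclass
class Flag:
    item_id: str
    score: float          # classifier confidence, analogous to a spam score
    confirmed: bool = False

def automated_screening(item_id: str, score: float) -> Optional[Flag]:
    """Stage 1: the classifier emits a flag only above the threshold."""
    return Flag(item_id, score) if score >= REVIEW_THRESHOLD else None

def human_review(flag: Flag, reviewer_confirms: bool) -> Flag:
    """Stage 2: a trained reviewer confirms or discards the automated flag."""
    flag.confirmed = reviewer_confirms
    return flag

def report_to_authority(flag: Flag) -> None:
    """Stage 3: only human-confirmed flags are forwarded to law enforcement."""
    if flag.confirmed:
        print(f"Referring item {flag.item_id} to the competent authority.")

flag = automated_screening("msg-001", score=0.97)
if flag is not None:
    report_to_authority(human_review(flag, reviewer_confirms=True))
```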

He also shares the EU’s view that restrictions on these freedoms are possible: “There are no absolute fundamental rights, and here we are talking about a very serious crime. Fortunately, we have a very broad framework of safeguards and judicial guarantees. The idea is to have a regulation that is sufficiently adaptable, future-proof against technological change, and that provides a legal framework we can maintain in a shifting environment,” he adds.

“We are not establishing blanket control over all communications. We are talking about tools that do not require access to content and that are subject to review, under specific conditions imposed by law. This is already applied in the fight against terrorism, with an effective system that has many similarities with this one.”

Ylva Johansson, the Swedish Social Democrat who presented the proposal as European Commissioner for Home Affairs, also defends it: “As adults, we have a duty to protect children. Child sexual abuse is a real and growing danger. The proposal sets out clear obligations for companies to detect and report abuse of minors, with strong safeguards guaranteeing privacy for everyone, including children.”
