María and Alejandra use the same social network, but the content they see is completely different. It all depends on the information they enter into their searches: with this data, recommendation systems can serve up content so highly personalized that each of them ends up with a different version of the facts, with added risks both for them and for the community. This is a practical case already worked through by a multidisciplinary team at the European Centre for Algorithmic Transparency (ECAT), which was officially inaugurated this Tuesday in Seville and is called upon to become the European Commission’s main tool for ensuring that platforms comply with the principles of proper functioning and the control of illegal and harmful online content contained in the Digital Services Act (DSA), which came into force last November.
The transparency mechanisms established in the DSA for the large digital platforms (those with more than 45 million users, representing 10% of the potential European market), which oblige their owners to identify and control the risks of their algorithmic systems, could have alerted María and Alejandra. Verifying that the Internet giants have performed this self-assessment correctly, and anticipating any attempt at evasion on their part, will be one of ECAT’s main functions. The centre carries a major responsibility: to open the black boxes of algorithmic systems, decipher them and ensure they meet the standards of European regulation.
“This is a big step because there are big expectations: we are showing the world that enforcing the rules makes a difference on the ground, and that this can change the attitude of the big platforms, which have to assume their social responsibility and not just pursue the growth of their business,” says a senior European Commission official responsible for DSA implementation. Algorithms and their uses are debated in many forums; what sets ECAT apart is that its research is carried out to support the Commission’s legal services in ensuring compliance with the regulation.
ECAT is based at the Commission’s Joint Research Centre (JRC) site in Seville. For now it has 10 experts in various fields, related not only to computing, artificial intelligence or big data but also to specialisations of a social nature, because one of its goals is to verify that large platforms and search engines comply with obligations such as protecting minors, not inciting violence against women, and ensuring that content does not pose a risk to mental health. The plan is for the team to grow to 30 professionals in the coming months and, in a second phase, to 40.
In addition to overseeing the documentation that the Internet giants submit in their self-assessments, which must first be audited by a third-party firm, ECAT can also open investigations in response to complaints from third parties or when it detects possible non-compliance with the DSA. All this information can be used to build legal cases against these companies if there is evidence of malpractice. The Commission, however, wants to make clear that ECAT does not claim to be a “Ministry of Truth”. “If we discover that illegal content or disinformation is being hosted or distributed and has not been removed or corrected, we can act. The burden of proving that they acted properly falls on the companies.” In this vein, Brussels is working to set standards so that large companies can carry out self-assessments of risks in matters such as privacy or the protection of mental health…
In this task of opening black boxes, there are three levels of transparency that large platforms and search engines must achieve. The first concerns the average user. “Companies should give warnings in language that the minors accessing their networks can understand, so as to offer them better browsing options,” says another ECAT official. “This gives users greater capacity and autonomy.” The second level concerns civil society: experts, NGOs… “This will provide a different kind of information, with reports aimed at experts,” says the official. This second level also covers the outreach work that falls to ECAT: the new centre wants to act as a bridge between academia (universities and research groups) and private companies, through agreements yet to be defined.
The third level is the most important for researchers: it means getting into the guts of the algorithms themselves, and in order to investigate and gain access to them, the Commission can appoint experts or auditors.
The conclusions of investigations into the operation and impact of algorithmic systems will also be shared with the public, both through the publications that ECAT itself will continue to produce and out of a desire to be able to portray the behaviour of the large companies. The Commission avoids delving into the potential disputes it may face with the Internet giants, accustomed to competing in a more permissive environment such as the United States, and prefers to frame the new rules as an opportunity for them to improve their image and build trust by offering guarantees in the digital world.
ECAT, together with its experts, is called upon to become the Commission’s arm not only for understanding and revealing how algorithmic systems work, in order to prevent and avoid unwanted effects, but also for ensuring, through transparency, that what is illegal offline is also illegal on social media.
You can follow EL PAÍS Tecnología on Facebook and Twitter, or sign up here to receive our weekly newsletter.