The Dirty Secret of Artificial Intelligence


Indeed, everyday actions such as looking up the best route to a destination or translating a text require large amounts of energy, water and mineral resources. These applications run in the cloud, which is a euphemism for millions of powerful computers housed in sprawling data centers. For mobile applications to work, a huge number of computers are needed to store trillions of pieces of data and perform operations in milliseconds (for example, calculating distances while taking traffic into account). Data center energy consumption is estimated to account for between 1% and 2% of the global total, and everything indicates that those numbers will keep rising.

Generative artificial intelligence (AI), the technology behind chatbots such as ChatGPT and tools that create original artwork or music from a text prompt, needs a lot of computing power. Major technology companies, with Microsoft and Google at the helm, have decided to integrate these functions into search engines, text editors and email. Our relationship with commonly used software will change: until now, we have issued a series of commands to perform certain tasks; soon we will find ourselves talking to the machine, asking it to do things we used to do ourselves.

What is the impact of this paradigm shift on the environment? Nobody knows, but all the estimates point upward. "Artificial intelligence may seem ethereal, but it shapes the physical world," says Kate Crawford, author of Atlas of AI. The Australian researcher, a principal investigator at Microsoft Research and director of the AI Now Institute, warned two years ago that the "planetary costs" associated with this technology keep growing. Some scientists calculated four years ago that the technology sector would be responsible for 14% of global emissions by 2040; others, that energy demand in data centers would increase fifteenfold by 2030.


All of these forecasts may fall short: they predate the arrival of ChatGPT. Google and Microsoft together serve hundreds of millions of users. What happens if they all start using generative AI-powered tools? Canadian Martin Bouchard, co-founder of the Qscale data centers, believes that at least four to five times more computing power will be needed for each search. When asked about their current consumption levels and growth projections in the age of generative AI, Google and Microsoft declined to provide this newspaper with specific data, other than to reaffirm their intention to achieve carbon neutrality by 2030. For Crawford, that means they offset their emissions by buying carbon credits through actions such as planting trees and other similar initiatives.

One of the corridors of the data center owned by Google in Douglas, Georgia (USA).

"AI generates even more emissions than search engines, because it uses neural network-based architectures, with millions of parameters that need to be trained," says Carlos Gómez Rodríguez, professor of computing and artificial intelligence at the University of A Coruña.

How much does AI pollute?

Two years ago, the carbon footprint of the computer industry had already matched that of aviation at its peak. Training a single natural language processing model generates as many emissions as five gasoline cars over their entire lifetimes, manufacturing included, or 125 round-trip flights between Beijing and New York. Beyond emissions, the water consumed to cool the systems (Google used 15.8 billion liters in 2021, according to a study published in Nature, while Microsoft reported 3.6 billion), together with the reliance on rare earths to make electronic components, turns artificial intelligence into a technology with major environmental implications.

Training a natural language processing model generates as many emissions as five petrol cars over their entire lifetimes

There is no public data on how much energy, or what kind, the big tech companies consume, and they are the only ones with infrastructure powerful enough to train and run the large language models on which generative AI relies. Nor are there exact figures on how much water is used to cool the systems, an issue that is already causing tensions in countries such as the US, Germany and the Netherlands. Companies are not required to provide such information. "What we have are estimates. For example, training GPT-3, the model on which ChatGPT is based, would have generated about 500 tons of carbon, the equivalent of driving to the Moon and back. That may not seem like much, but bear in mind that the model has to be retrained periodically to incorporate updated data," says Gómez. OpenAI has just introduced a more advanced model, GPT-4. And the race will continue.

Another estimate suggests that the electricity OpenAI, the company behind ChatGPT, used in January 2023 could equal the annual consumption of about 175,000 Danish families, who are not exactly heavy consumers. "This is a projection based on ChatGPT's current numbers; if its use becomes more widespread, we could be talking about electricity consumption equivalent to that of millions of people," adds the professor.

An aerial view of the Google data center in Saint-Ghislain, Belgium.

This opacity around the data may begin to dissipate soon. The European Union is aware of the growing energy consumption of data centers. Brussels is preparing a directive, which it will begin debating next year (meaning it will take at least two years to come into force), that sets out energy efficiency and transparency requirements. The United States is working on similar regulation.

Costly training for algorithms

"AI's carbon emissions can be broken down into three factors: the power of the hardware used, the carbon intensity of the energy source that powers it, and the time it takes to train the model," explains Alex Hernández, a postdoctoral researcher at the Quebec Institute for Artificial Intelligence (MILA).
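
A rough sense of how those three factors combine can be had with back-of-envelope arithmetic: energy is hardware power multiplied by training time, and emissions are that energy multiplied by the carbon intensity of the local grid. The sketch below uses made-up, illustrative numbers, not figures from any real training run.

```python
# Back-of-envelope estimate of training emissions (all numbers are illustrative assumptions).
# emissions = hardware power draw x training time x grid carbon intensity

gpus = 1000                   # number of GPUs (assumption)
power_per_gpu_kw = 0.4        # average draw per GPU in kW (assumption)
training_days = 30            # wall-clock training time (assumption)
coal_intensity = 0.5          # kg CO2 per kWh on a coal-heavy grid (assumption)
hydro_intensity = 0.03        # kg CO2 per kWh on a hydro-heavy grid like Quebec's (assumption)

energy_kwh = gpus * power_per_gpu_kw * training_days * 24
print(f"Energy used: {energy_kwh:,.0f} kWh")
print(f"Coal-heavy grid:  {energy_kwh * coal_intensity / 1000:,.1f} t CO2")
print(f"Hydro-heavy grid: {energy_kwh * hydro_intensity / 1000:,.1f} t CO2")
```

The same training run produces very different emissions depending only on the last factor, which is why the location of the data center matters so much.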

Most emissions are concentrated in training, a key process in developing machine learning models, the type of artificial intelligence that has grown fastest in recent years. It consists of showing the algorithm millions of examples, from which it learns patterns that allow it to make predictions. In the case of language models, for example, after seeing the words "the Earth is" enough times, it learns that the next word should be "round."
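
As a toy illustration of that idea (nothing remotely like the scale of a real language model), a few lines of Python can "learn" from example sentences by counting which word tends to follow which, and then use those counts to predict the next word. The example sentences are invented.

```python
from collections import Counter, defaultdict

# Toy "training": count which word follows which in a handful of example sentences.
examples = [
    "the earth is round",
    "the earth is round and blue",
    "the moon is round",
]

follows = defaultdict(Counter)
for sentence in examples:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the examples, if any."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("is"))  # -> "round"
```

Real language models learn patterns over billions of parameters rather than a table of word counts, which is precisely why their training consumes so much energy.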

OpenAI, the company behind ChatGPT, may have used as much electricity in January 2023 as about 175,000 Danish families do in a year

Most data centers use specialized processors called GPUs to train AI models. GPUs need a lot of power to run: training large language models requires tens of thousands of them, running around the clock for weeks or months, according to a recent Morgan Stanley report.

"Large language models have very large architectures. A machine learning algorithm that helps make a decision about someone might handle 50 variables: where you work, what your current salary is, previous experience, and so on. ChatGPT has more than 175 billion variables," explains Anna Valdivia, a postdoctoral researcher in computing and artificial intelligence at King's College London. "You have to retrain all of these structures, as well as host and exploit the data they work with. That storage also consumes resources," she adds.

Hernández, of MILA, has just submitted a paper analyzing the energy consumption of 95 models. "There is some variance in the hardware used, but if you train your model in Quebec, where most electricity comes from hydropower, you cut carbon emissions by a factor of 100 or more compared with places where coal, gas or other sources predominate," the researcher asserts. Chinese data centers are known to draw 73% of their electricity from coal, which emitted at least 100 million tons of carbon dioxide in 2018.

Led by Yoshua Bengio, whose contributions to deep neural networks earned him a Turing Award (considered the Nobel Prize of computer science), MILA has developed a tool, CodeCarbon, that measures the carbon footprint of those who program and train algorithms. The goal is for professionals to incorporate it into their code to see how much they are emitting, and for that information to help them make decisions.
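
CodeCarbon is distributed as an open-source Python package. A minimal sketch of how a developer might wrap a training run with it could look like the following; the training function here is just a stand-in.

```python
# pip install codecarbon
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder for an actual training loop (assumption for illustration).
    for _ in range(1_000_000):
        pass

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
train_model()
emissions_kg = tracker.stop()  # returns an estimate in kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

The tracker estimates emissions from the measured energy use of the machine and the carbon intensity of the local electricity mix, so the same code reports different figures depending on where it runs.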

More computing power

An additional problem is that the computing power required to train the largest AI models doubles every three to four months. This was already noted in 2018 by an OpenAI study, which also warned that it is worth preparing for systems with far greater capabilities than today's. That pace is much faster than Moore's Law, according to which the number of transistors (and thus the power) of a microprocessor doubles every two years.
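
To see how different those two rhythms are, a couple of lines of arithmetic are enough: compounded over the same two years, doubling every three and a half months (the midpoint of the range cited above) dwarfs doubling every two years.

```python
# Compare growth over two years at the two doubling rates.
months = 24
ai_compute_growth = 2 ** (months / 3.5)   # doubling every ~3.5 months
moores_law_growth = 2 ** (months / 24)    # doubling every 24 months (Moore's Law)

print(f"AI training compute: ~{ai_compute_growth:,.0f}x in two years")
print(f"Moore's Law:         ~{moores_law_growth:.0f}x in two years")
```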

"Considering the models currently being trained, more computing capacity will be needed just to keep them running. Big tech companies are certainly already buying more servers," Gómez predicts.

For Hernández, emissions from using AI are less of a concern, for several reasons. "There is a lot of research aimed at reducing the number of parameters and the computational complexity that models need, and that will improve. However, there are not many ways to reduce consumption during training: it requires weeks of heavy use. The first is relatively easy to improve; the second, not so much."

One possible solution for making training less polluting is to reduce the complexity of the algorithms without losing effectiveness. "Do you really need so many millions of parameters to get well-functioning models? ChatGPT, for example, has been shown to have many biases. Ways of achieving the same results with simpler architectures are being investigated," Valdivia reflects.
