ChatGPT, MidJourney… AI, an energy cost that is difficult to calculate

Denouncing the energy profligacy of artificial intelligence is a bit like complaining about the smoke from a merguez stand at the Fête de l'Huma. It feels incongruous. In the abundant literature on conversational AIs such as ChatGPT, or on generative AIs that churn out images by the mile, it takes the soul of a speleologist to dig up information on the environmental toll of this digital exuberance.

At the end of last fall, a group of Microsoft engineers received an order from above: they were to suspend their compute-intensive work and hand over the use of thousands of servers to a task the company deemed a priority. Only a few weeks later did they realize the directive concerned integrating ChatGPT into the Bing search engine. Because the model's huge artificial neural network had to be retrained to bring it up to date, as many specialized servers as possible had to be lined up.

The energy cost of AI

The problem is only beginning. Today, Internet searches that go through ChatGPT-style systems account for only a tiny share of queries. But these interfaces are so enticing that the industry is gearing up for phenomenal growth. Measuring the environmental impact of artificial intelligence is extremely complex. Computer-science researchers disagree among themselves; as for environmental advocacy outfits, their rigor tends to be inversely proportional to their reflexive indignation.

Let's try to sum up the problem. Artificial intelligence is first and foremost data, used to train models. The other key ingredient is what are called parameters: the weights used to analyze the data, which act a bit like an immense sieve whose meshes are of different sizes. The size of a model is defined by the volume of data used to train it and by the number of parameters, which expresses its sophistication. The more a model has learned and the more parameters it has, the better it performs. In this field, the law of large numbers reigns. Four years ago, the counts of words and parameters were in the tens or hundreds of millions. Today we are talking about hundreds of billions for both.
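
To make the notion of "parameters" concrete, here is a minimal sketch, with made-up layer widths, of how the weights of a small fully connected network add up; large language models simply repeat this kind of counting at a vastly bigger scale.

```python
# Illustrative only: counting the "parameters" (weights and biases) of a
# tiny fully connected network. The layer widths below are made up.

layer_sizes = [768, 3072, 768]  # hypothetical layer widths

def count_parameters(sizes):
    total = 0
    for n_in, n_out in zip(sizes, sizes[1:]):
        total += n_in * n_out  # the weight matrix between two layers
        total += n_out         # one bias per output unit
    return total

print(f"{count_parameters(layer_sizes):,} parameters")  # 4,722,432 for this toy stack
```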

This growth comes at an astronomical energy and financial cost. In their academic paper "Energy and Policy Considerations for Deep Learning in NLP", three researchers from the University of Massachusetts Amherst offer comparisons: counting one ton of CO2 for a New York–San Francisco flight and 57 tons for the complete life cycle of a car, they estimate that training a large language model produces 284 tons of greenhouse gases. And their paper dates from 2019. Since then, the size of GPT has grown geometrically: 120 million parameters for the first version, 1.5 billion for the second, 175 billion for version 3. And this year a new version is expected, said to be 500 times more powerful, with 100,000 billion parameters, or about as many as there are connections between the neurons of the human brain. As for the volume of data, it has grown in the same proportions.
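
For readers who want to check the orders of magnitude, here is a quick sketch of the growth factors between the parameter counts cited above; the figure for the upcoming version is the rumor reported here, not a confirmed number.

```python
# Growth factors between the parameter counts cited in the article.
# The figure for the upcoming version is a rumor, not a confirmed number.

param_counts = {
    "version 1": 120e6,                # 120 million
    "version 2": 1.5e9,                # 1.5 billion
    "version 3": 175e9,                # 175 billion
    "next version (rumored)": 100e12,  # 100,000 billion
}

versions = list(param_counts)
for prev, curr in zip(versions, versions[1:]):
    factor = param_counts[curr] / param_counts[prev]
    print(f"{prev} -> {curr}: x{factor:,.1f}")
# version 1 -> version 2: x12.5
# version 2 -> version 3: x116.7
# version 3 -> next version (rumored): x571.4
```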

ChatGPT power consumption

Admittedly, these models do not need to be trained continuously, but one then has to reckon with the cost of running them. Each image generated by an AI such as Midjourney calls, for a few seconds, on a server that consumes as much as a French resident (6 kWh per day). For the ChatGPT text generator, its operation during the month of last January was estimated to be equivalent to the consumption of a city of 175,000 inhabitants. And at that point it was recording fewer than 20 million visits per day. Even if it captures only a tiny fraction of Google's ten billion daily queries, that gives an idea of the coming explosion on the electricity meters.
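
Here is a rough back-of-envelope calculation built only on the figures quoted above; it assumes, crudely, that one visit corresponds to one query and that the city comparison uses the same 6 kWh per day per resident.

```python
# Rough back-of-envelope using only the figures quoted in the article.
# Crude assumptions: one visit ~ one query; the "city of 175,000 inhabitants"
# comparison uses the same 6 kWh per day per resident.

residents = 175_000
kwh_per_resident_per_day = 6
visits_per_day = 20_000_000              # upper bound cited for last January
google_queries_per_day = 10_000_000_000  # the article's figure for Google

chatgpt_kwh_per_day = residents * kwh_per_resident_per_day  # ~1.05 GWh/day
kwh_per_visit = chatgpt_kwh_per_day / visits_per_day        # ~0.05 kWh/visit

# Purely illustrative: if every Google query cost as much as a ChatGPT visit
scaled_gwh_per_day = google_queries_per_day * kwh_per_visit / 1e6

print(f"~{kwh_per_visit * 1000:.1f} Wh per visit")           # ~52.5 Wh
print(f"~{scaled_gwh_per_day:.0f} GWh/day at Google scale")  # ~525 GWh/day
```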

Granted, all this growth may have only a marginal impact on electricity consumption, because spectacular improvements in the ratio of performance to energy consumption have to be factored in. Finally, the large data centers of the GAFAM are decarbonizing rapidly: Google estimates it will be carbon neutral by 2030. But these forecasts were made before the surge in the use of these new forms of artificial intelligence.
