Data centers facing the challenges of AI – L’Express

From its offices, the view is breathtaking: the giant letters spelling out “Marseille,” installed Hollywood-style on the heights of the city. Like many data center operators, Free Pro chose Marseille for its location. France’s second most populous city enjoys direct access to the many undersea Internet cables arriving from the Mediterranean. Connectivity is excellent, and the climate is favorable, the mistral helping to cool the air around the machines. So much so that the company plans to install a second data center on site. It has already begun expanding its existing building: an additional 2,000 square meters, with four new computer rooms. Between the democratization of cloud computing, the advent of 5G and the proliferation of connected objects, the big-data market doubles roughly every two years globally.

In recent months, another innovation has joined the list: generative artificial intelligence (AI). Gilles Elzière, the site’s director, does not yet know how much space this technology, which relies in particular on specialized chips called GPUs to create images, text, sound or video, will occupy within its walls. One thing is certain: electricity consumption will skyrocket. AI requires equipment that is “heavier, and which gives off tremendous heat that will have to be cooled,” the manager says. What will that energy bill look like?

To date, the best-documented footprint of AI is that of training the large language models (LLMs) that power conversational agents, or chatbots. For GPT-3, the model behind ChatGPT, this process required the equivalent of the annual electricity consumption of 120 American homes, plus 34.5 million liters of water. Since then, OpenAI has unveiled another model presented as “ten times more powerful.” In France, the LLM Bloom, trained on the Jean Zay supercomputer near Paris, consumed the electricity equivalent of around a hundred French homes over a year. But this period of intensive computing is limited in time, from a few weeks to a few months. More mysterious is the cost of “inference”: the everyday use of AI by everyone, from asking a chatbot to plan a vacation to, perhaps tomorrow, providing a medical diagnosis.
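The “120 American homes” figure can be sanity-checked with a back-of-envelope calculation. The per-home consumption below (roughly 10.7 MWh per year, close to the U.S. average) is an assumption not stated in the article:

```python
# Rough check on the GPT-3 training figure cited above.
# Assumption (not from the article): an average U.S. home uses
# roughly 10.7 MWh of electricity per year.
MWH_PER_US_HOME_PER_YEAR = 10.7
homes_equivalent = 120

training_energy_mwh = homes_equivalent * MWH_PER_US_HOME_PER_YEAR
print(f"Implied GPT-3 training energy: ~{training_energy_mwh:.0f} MWh")
```

This lands around 1,280 MWh, in the same range as published independent estimates of GPT-3’s training energy, which suggests the household comparison is internally consistent.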

These requests are expected to multiply in the years to come, distributed across thousands of data centers around the world. Among the large operators contacted by L’Express (the companies that host, secure and manage the equipment used to store and process corporate data), none provides definitive figures. Some expect a three-, four- or even five-fold increase in their electricity consumption. “A classic rack (24, 48 or 96 servers) requires a 10-kilowatt (kW) power allocation. With GPUs, we are closer to 50 kW,” notes Gregory Lebourg, environment director at OVH Cloud. It is difficult, for the moment, to be more precise. Everything will depend on usage, but also on the cooling techniques data centers use to dissipate the heat the servers give off. “We will learn as we go,” acknowledges Jérôme Totel, vice-president of strategy at Data4, which manages around thirty sites in France and Europe.

How a current data center functions (Paris Region Institute, September 2023 study)

Data centers under scrutiny

However, global estimates are beginning to appear. According to research firm SemiAnalysis, OpenAI would need 3,617 servers running Nvidia’s HGX A100 chips to support ChatGPT, implying an energy demand of 564 megawatt-hours (MWh) per day. As much as, or even more than, the training alone of certain models. At the beginning of October, an analysis by economist Alex de Vries, founder of Digiconomist, known for his work on the impact of cryptocurrency mining, estimated the consumption of AI servers at between 85 and 134 terawatt-hours (TWh) per year by 2027. That is roughly what Argentina or Sweden consumes today to run its entire economy.
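The SemiAnalysis numbers can be reconstructed with simple arithmetic. The per-server draw used below (about 6.5 kW for an eight-GPU HGX A100 chassis) is an assumption not given in the article, chosen because it makes the two cited figures line up:

```python
# Reconstruction of the SemiAnalysis estimate cited above.
# Assumption (not from the article): each HGX A100 server draws
# about 6.5 kW under load.
servers = 3617
kw_per_server = 6.5
hours_per_day = 24

daily_mwh = servers * kw_per_server * hours_per_day / 1000
print(f"Implied daily demand: ~{daily_mwh:.0f} MWh")
```

The result, about 564 MWh per day, matches the figure quoted in the article, so the estimate is essentially servers times per-server power times hours.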

This measurement exercise should logically be refined in the months and years to come. It will also need to factor in the gains AI could bring to the management of autonomous transport and to the optimization, or even automation, of professional tasks, particularly in agriculture and industry. “Digital technology is an essential cog in the decarbonization of the planet,” Jérôme Totel maintains. In France, data centers benefit from low-carbon, sometimes renewable, energy. But part of the environmental cost is tied to backup systems, mainly fueled by fuel oil.

These early statistics portend challenges to come, as data centers come under increased scrutiny. They already account for almost 2% of global electricity consumption. Studies carried out before the current AI boom estimated that this share would rise to 13% by 2030. “We are being asked to consume less,” confirms Gilles Elzière when asked about his relations with the municipality of Marseille. In an op-ed published in Libération at the beginning of September, Sébastien Barles, the city’s deputy mayor for ecological transition, called for a moratorium on the subject. “In Marseille, the 30,000 m² of data centers consume the electricity equivalent of a city of 150,000 inhabitants. The cumulative demands of the various operators for projects in progress are dizzying: the equivalent of the consumption of 600,000 inhabitants […] This is how, this winter, districts of Marseille risk falling victim to possible load shedding while the energy-hungry data centers go unpenalized,” he lamented, also pointing to the few jobs these centers generate. The debate is just as lively outside France. In the Netherlands, Amsterdam has given up on hosting so-called “hyperscale” data centers, the largest of them all.

In Ile-de-France, the country’s main data center hub (around 160 of France’s 264 are installed there), a study unveiled at the end of September by the Paris Region Institute, the region’s urban planning agency, also warns of possible future tensions. “The growth of the data center market could have a considerable impact on electricity consumption and on the robustness of the electricity network, in a context of rising needs, linked in particular to the electrification of mobility, and of uncertainty over production capacity,” its authors note. Contacted by L’Express, they indicate that they did not take into account the additional weight of generative AI, for lack of information on the issue.

Expertise

A new era is dawning for data centers. It starts with investment. “For every dollar spent on a GPU, about a dollar must be spent on energy costs to run that GPU in a data center,” investment firm Sequoia Capital recently warned. France’s data center operators insist they did not wait for ChatGPT to begin their transformation. The next generation will bring “much larger and more powerful buildings,” observes Fabrice Coquio. The head of Interxion is overseeing a titanic project in La Courneuve, with “130 MW of electrical power, a sixth of that of a nuclear power plant.” Data4 plans to quadruple its capacity by 2029. The company recently took on 2.2 billion euros of debt, with a further 1 billion available as an option.

These players tout their know-how in energy management and density, which has so far yielded fairly satisfactory results. Despite a 600% increase in Internet traffic between 2015 and 2022, data centers’ energy use grew by only 20 to 70% over the same period, according to the International Energy Agency (IEA). This is thanks to efforts on server cooling, which today increasingly relies on closed-circuit water. The ratio between the total energy needed to run an entire data center, including cooling, and the energy consumed by the machines themselves has steadily declined. This PUE, for Power Usage Effectiveness, a basic reference in the sector, today averages 1.55 worldwide, down from 2 in 2011. In France, figures of 1.2 to 1.4 are often cited.
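The PUE metric mentioned above is a simple ratio, which a minimal sketch makes concrete (the kWh values below are illustrative, not from the article):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    the energy consumed by the IT equipment alone. 1.0 is the ideal,
    meaning every kWh goes to the servers rather than to overhead."""
    return total_facility_kwh / it_equipment_kwh

# A site whose servers draw 1,000 kWh while the whole facility
# (cooling, lighting, power-conversion losses) draws 1,550 kWh:
print(pue(1550, 1000))  # 1.55, the global average the article cites
```

The 2011 average of 2 thus meant one kWh of overhead for every kWh of computing; at France’s cited 1.2, overhead is only a fifth of the IT load.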

This quest for performance should continue in the AI era. CNRS researcher Abdoulaye Gamatié, deputy director of the Montpellier Laboratory of Computer Science, Robotics and Microelectronics, indicates that “various avenues are already being explored in academia and industry to take advantage of AI with little energy.” Data centers are not the only ones in the loop. Guillaume Avrin, France’s national coordinator for artificial intelligence, deplores the use of “components that are still poorly optimized” for these AI tasks. All eyes are therefore turning to Big Tech, and first and foremost to the leader in the market for AI chips: the American Nvidia.
