Growing environmental concerns over generative AI ahead of Paris summit
As the rapid rise of generative artificial intelligence (AI) continues to transform industries, growing environmental concerns are fueling discussions ahead of a global summit in Paris on February 10-11, 2025.
The event will focus on the ecological footprint of AI, with experts warning that the technology’s expansion could exacerbate already significant environmental challenges.
Generative AI, which includes systems like OpenAI’s ChatGPT, is increasingly integrated into daily life. The chatbot alone, which now boasts 300 million weekly users, handles an astounding one billion requests per day.
Yet each request consumes 2.9 watt-hours of electricity—ten times the energy required for a single Google search, according to the International Energy Agency (IEA). This stark comparison highlights the mounting power demands of AI applications as they become more widespread.
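To gauge the scale those two figures imply, here is a rough back-of-envelope sketch in Python. It is purely illustrative: the 0.3 watt-hour value for a Google search is not stated above, only implied by the "ten times" comparison, and actual per-request energy varies by model and hardware.

```python
# Back-of-envelope estimate of ChatGPT's electricity use, based on the
# reported figures: 2.9 Wh per request and one billion requests per day.
# Illustrative only; real per-request energy varies by model and hardware.

WH_PER_REQUEST = 2.9             # watt-hours per ChatGPT request (IEA estimate)
REQUESTS_PER_DAY = 1_000_000_000
WH_PER_GOOGLE_SEARCH = 0.3       # implied by the "ten times" comparison

daily_gwh = WH_PER_REQUEST * REQUESTS_PER_DAY / 1e9   # 1 GWh = 1e9 Wh
annual_twh = daily_gwh * 365 / 1000                   # 1 TWh = 1,000 GWh

print(f"Daily consumption:  {daily_gwh:.1f} GWh")     # ~2.9 GWh per day
print(f"Annual consumption: {annual_twh:.2f} TWh")    # ~1.06 TWh per year
print(f"Ratio vs. Google search: {WH_PER_REQUEST / WH_PER_GOOGLE_SEARCH:.0f}x")
```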
The surge in AI usage is not limited to individual chatbots. A survey by French pollsters Ifop revealed that 70 percent of 18- to 24-year-olds in France use generative AI, with similar numbers in the United States, where a Morning Consult poll found 65 percent of teenagers are regular users.
Data centers consume 1.4 percent of global electricity
However, the power needs of AI go far beyond just end-user applications. Data centers, which host vast networks of information and computing resources, are essential to the functioning of generative AI.
These facilities currently consume almost 1.4 percent of global electricity, a figure that is expected to rise significantly. Deloitte projects that by 2030, data centers will account for 3 percent of global electricity consumption, equivalent to the combined annual usage of France and Germany. Additionally, the IEA forecasts a 75 percent increase in data center energy consumption by 2026.
AI training and greenhouse gas emissions
The environmental toll extends beyond electricity consumption. AI training sessions for large language models (LLMs), such as OpenAI’s GPT-3, have been found to release substantial amounts of greenhouse gases. In 2019, researchers at the University of Massachusetts Amherst estimated that training one LLM could produce approximately 300 tonnes of carbon dioxide, equivalent to 125 round-trip flights between New York and Beijing. Despite advances in the technology, experts say it remains difficult to fully assess the cumulative environmental impact of AI due to a lack of transparency in production processes and inconsistent measurement standards.
Water usage is another concern. Cooling the hardware required for AI models demands significant amounts of water. GPT-3 alone consumes about half a liter of water for every 10 to 50 responses it generates.
The total water consumption across the industry could reach up to 6.6 billion cubic meters annually—four to six times the annual water usage of Denmark, according to a 2023 study by the University of California Riverside and the University of Texas at Arlington.
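The arithmetic implied by these figures can be sketched quickly. The calculation below is purely illustrative; the per-response and industry-wide numbers come from different scopes and should not be read as a single chain of measurement.

```python
# Rough context for the water figures cited above (illustrative only).

LITERS_PER_BATCH = 0.5           # about half a liter of water...
RESPONSES_PER_BATCH = (10, 50)   # ...per 10 to 50 GPT-3 responses

per_response_ml = [LITERS_PER_BATCH / n * 1000 for n in RESPONSES_PER_BATCH]
print(f"Water per response: {per_response_ml[1]:.0f}-{per_response_ml[0]:.0f} ml")

INDUSTRY_TOTAL_M3 = 6.6e9        # projected annual industry total, cubic meters
DENMARK_MULTIPLE = (4, 6)        # "four to six times" Denmark's annual usage
denmark_m3 = [INDUSTRY_TOTAL_M3 / m for m in DENMARK_MULTIPLE]
print(f"Implied Danish annual usage: {denmark_m3[1]/1e9:.2f}-{denmark_m3[0]/1e9:.2f} billion m3")
```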
The rapid growth of generative AI has also led to an increase in electronic waste. A study published in Nature Computational Science found that AI applications produced 2,600 tonnes of e-waste in 2023. If current trends persist, that figure could soar to 2.5 million tonnes by 2030, equivalent to discarding 13.3 billion smartphones. The production of AI hardware, including chips and memory cards, further compounds the problem, as mining for the rare metals involved often relies on environmentally damaging practices, particularly in parts of Africa.