Ecological Price of AI: Energy Consumption of Neural Networks, Water, and Investor Risks

Artificial Intelligence Becomes a Major Consumer of Energy and Water: How the Growth of Neural Networks Affects the Climate and What Risks and Opportunities It Creates for Investors and the Global Economy

Artificial intelligence (AI) is rapidly turning into a significant consumer of resources. By 2025, AI systems alone are estimated to consume so much electricity that the associated CO2 emissions could reach around 80 million tons, on the order of the annual emissions of a metropolis like New York City. Cooling the servers behind these neural networks could, in addition, consume up to 760 billion liters of water. Notably, exact figures remain unknown: technology giants do not disclose detailed statistics, so scientists must rely on indirect data. Experts warn that without transparency and sustainability measures, these trends could grow into a serious environmental problem.

The Rapid Growth of AI and Its Appetite for Energy

The demand for AI computing power has soared in recent years. Since the launch of public neural networks such as ChatGPT in late 2022, businesses worldwide have accelerated their adoption of AI models, which require vast amounts of data processing. Industry estimates suggest that by 2024 AI already accounted for approximately 15–20% of the total energy consumption of data centers globally. The power needed to operate AI systems could reach 23 GW by 2025, comparable to the total electricity consumption of a country like the United Kingdom. That figure also exceeds the energy consumption of the entire Bitcoin mining network, indicating that AI has become one of the most energy-intensive forms of computing.

This exponential growth is driven by large-scale investments by technology companies in infrastructure: new data centers open almost every week, and specialized machine learning chips are released every few months. The expansion of this infrastructure directly increases the electricity needed to power and cool the thousands of servers that support modern neural networks.

Emissions on the Scale of a Metropolis

Such high energy consumption inevitably produces significant greenhouse gas emissions wherever the energy is partly generated from fossil fuels. According to recent research, AI could be responsible for 32–80 million metric tons of carbon dioxide (CO2) emissions per year by 2025. This effectively puts AI's "carbon footprint" at the level of an entire city: New York's annual emissions, for instance, are around 50 million tons of CO2. For the first time, a technology that once seemed purely digital is demonstrating a climate impact on par with large industrial sectors.
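A back-of-envelope calculation shows how a range like this arises. The sketch below assumes (illustrative values, not taken from the article's sources) a constant 23 GW power draw and a grid carbon intensity between 0.16 and 0.40 kg CO2 per kWh, roughly spanning low-carbon and fossil-heavy grids:

```python
# Back-of-envelope check of the 32-80 Mt CO2 range cited above.
# All inputs are assumptions for illustration, not measured data.

HOURS_PER_YEAR = 8760

def annual_co2_megatons(power_gw: float, intensity_kg_per_kwh: float) -> float:
    """Annual CO2 in megatons for a constant power draw."""
    energy_kwh = power_gw * 1e6 * HOURS_PER_YEAR   # GW -> kW, times hours in a year
    return energy_kwh * intensity_kg_per_kwh / 1e9  # kg -> megatons

low = annual_co2_megatons(23, 0.16)   # cleaner grid mix
high = annual_co2_megatons(23, 0.40)  # fossil-heavy grid mix
print(f"{low:.0f}-{high:.0f} Mt CO2 per year")
```

With these assumed intensities the result lands close to the published 32–80 Mt range, which suggests the estimates are driven mainly by which grid mix one assumes the data centers run on.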

Importantly, these estimates are considered conservative. They primarily account for emissions from electricity production for server operation, while the complete life cycle of AI — from equipment production (servers, chips) to disposal — creates an additional carbon footprint. If the AI boom continues at its current pace, the associated emissions will grow rapidly. This complicates global efforts to reduce greenhouse gases and places the burden on technology companies to reconcile the explosive growth of AI with their commitments to achieving carbon neutrality.

The Water Footprint of Neural Networks

Another hidden resource appetite of AI is water. Data centers consume vast amounts of water to cool servers and equipment: evaporative cooling and air conditioning cannot run without it. Beyond this direct consumption, significant volumes of water are required indirectly, at power plants, to cool the turbines and reactors that generate the very electricity the computing clusters consume. Experts estimate that AI systems alone could consume between 312 and 765 billion liters of water in 2025, comparable to the total volume of bottled water consumed by humanity in a year. Neural networks thus create a colossal water footprint that has gone largely unnoticed by the public until now.
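The same kind of rough arithmetic applies to water. The sketch below assumes (illustrative values, not from the article's sources) about 200 TWh of annual AI electricity use and a combined water intensity of 1.55–3.8 liters per kWh, covering both on-site cooling and water used at power plants:

```python
# Rough reconstruction of a water-use range like the 312-765 billion
# liters cited above. All inputs are assumptions for illustration.

def annual_water_billion_liters(energy_twh: float, liters_per_kwh: float) -> float:
    """Water use in billions of liters; 1 TWh at 1 L/kWh is 1 billion liters."""
    return energy_twh * liters_per_kwh

low = annual_water_billion_liters(200, 1.55)   # efficient cooling, low indirect use
high = annual_water_billion_liters(200, 3.8)   # evaporative cooling plus power-plant water
print(f"{low:.0f}-{high:.0f} billion liters per year")
```

The spread between the low and high assumed intensities is wide precisely because, as the next paragraph notes, indirect water use at power plants is often excluded from official figures.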

Official estimates often fail to reflect the complete picture. For example, the International Energy Agency reported approximately 560 billion liters of water consumed by all data centers worldwide in 2023; however, this statistic did not include water used at power plants. The actual water footprint of AI could be several times higher than formal estimates. Major industry players are still hesitant to reveal details: in a recent report on its AI system, Google explicitly stated that it does not account for water consumption at third-party power plants in its metrics. Such an approach has faced criticism, as a significant portion of water is indeed consumed to meet the electrical needs of AI.

The scale of water consumption is already causing concern in several regions. In arid areas of the U.S. and Europe, communities oppose the construction of new data centers, fearing they will drain scarce water from local sources. Corporations themselves note the growing "thirst" of their server farms: Microsoft reported that its global water consumption surged by 34% in 2022 (to 6.4 billion liters), largely due to the increased load from training AI models. These facts underscore that water is rapidly becoming a prominent factor in assessing the environmental risks of digital infrastructure.

Opacity Among Tech Giants

Paradoxically, despite the scale of AI's impact, publicly available data on energy and water consumption is extremely limited. Major tech companies (Big Tech) typically report aggregate figures on emissions and resource use in their sustainability reports, without separately detailing the share related to AI. Detailed information about data center operations — for instance, how much energy or water is consumed specifically for neural network computations — often remains within companies. There is virtually no data on "indirect" consumption, such as water expended during electricity generation for data center needs.

Consequently, researchers and analysts often act like detectives, reconstructing the picture from fragmented data: snippets from corporate presentations, estimates of the number of AI server chips sold, data from energy companies, and other indirect indicators. This opacity complicates understanding the full scale of AI's ecological footprint. Experts call for strict disclosure standards: companies should report on the energy and water consumption of their data centers, broken down by key areas, including AI. Such transparency would allow society and investors to objectively assess the impact of new technologies and push the industry to seek ways to reduce its environmental load.

Imminent Environmental Risks

If current trends persist, the growing appetite of AI could exacerbate existing environmental problems. Additional tens of millions of tons of greenhouse gas emissions per year will make it harder to meet the goals of the Paris Agreement on climate change. The consumption of hundreds of billions of liters of freshwater will occur amid a global shortage of water resources, with demand projected to exceed supply by 56% by 2030. In other words, without sustainability measures, the expansion of AI risks colliding with the planet's ecological limits.

If nothing changes, such trends could lead to the following negative consequences:

  1. Accelerated global warming due to increased greenhouse gas emissions.
  2. Worsening freshwater shortages in several already arid regions.
  3. Heightened strain on energy systems and socio-ecological conflicts over limited resources.

Local communities and authorities are already beginning to respond to these challenges. In some countries, restrictions are being placed on the construction of "energy-hungry" data centers, requiring water recycling systems or the purchase of renewable energy. Experts note that without radical changes, the AI industry risks turning from a purely digital sphere into a source of tangible environmental crises, from droughts to derailed climate plans.

Investors' Perspective: The ESG Factor

The environmental aspects of AI's rapid development are becoming increasingly important to investors. In an era when ESG (Environmental, Social, and Governance) principles are coming to the forefront, the carbon and water footprint of a technology directly influences company valuations. Investors are asking: will a "green" policy shift raise costs for companies betting on AI? Tighter carbon regulation or new water usage fees, for instance, could drive up expenses for companies whose neural network services consume significant amounts of energy and water.

Conversely, companies already investing in mitigating AI's environmental impact could gain an advantage. Transitioning data centers to renewable energy, improving the energy efficiency of chips and software, and implementing water reuse systems reduce risks and enhance reputation. The market values progress on sustainability: investors worldwide increasingly incorporate environmental metrics into their business valuation models. For technology leaders, the pressing question is therefore how to keep scaling AI capacity while meeting societal expectations on sustainability. Those who find a balance between innovation and responsible resource management will thrive in the long term, both in reputation and in business value.

A Path to Sustainable AI

Despite the scale of the problem, the industry has opportunities to steer AI growth toward sustainability. Global tech companies and researchers are already working on solutions capable of reducing AI's ecological footprint without stifling innovation. Key strategies include:

  • Improving energy efficiency of models and equipment. Developing optimized algorithms and specialized chips (ASICs, TPUs, etc.) that perform machine learning tasks with lower energy consumption.
  • Transitioning to clean energy sources. Powering data centers with electricity from renewable sources (solar, wind, hydro) and low-carbon nuclear energy, thereby sharply cutting the operational carbon emissions of AI. Many IT giants are already signing "green" contracts to procure clean energy for their needs.
  • Reducing and recycling water consumption. Deploying new cooling systems (liquid, immersion) that require significantly less water, and reusing technical water. Choosing data center locations with water availability in mind: favoring regions with cooler climates or ample water resources. Research indicates that judicious site selection and cooling technologies can reduce a data center's water and carbon footprint by 70–85%.
  • Transparency and accountability. Introducing mandatory monitoring and disclosure of energy and water consumption data by AI infrastructure. Public accountability encourages companies to manage resources more efficiently and allows investors to track progress in reducing ecosystem burden.
  • Using AI for resource management. Paradoxically, AI itself may help solve this problem. Machine learning algorithms are already being used to optimize cooling in data centers, forecast loads, and allocate tasks to minimize peak loads on networks and improve server utilization efficiency.
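The 70–85% reduction figure mentioned in the list above can be illustrated with a toy comparison of two hypothetical sites. All water-usage effectiveness (WUE) and carbon intensity values here are assumed for the sketch, not measured data:

```python
# Illustrative comparison of two hypothetical data center sites, showing
# how site selection and cooling choices drive large footprint reductions.
# All numbers are assumptions for the sketch.

sites = {
    "hot climate, evaporative cooling, fossil-heavy grid":
        {"wue_l_per_kwh": 1.8, "carbon_kg_per_kwh": 0.50},
    "cool climate, liquid cooling, low-carbon grid":
        {"wue_l_per_kwh": 0.3, "carbon_kg_per_kwh": 0.10},
}

baseline, optimized = sites.values()
water_cut = 1 - optimized["wue_l_per_kwh"] / baseline["wue_l_per_kwh"]
carbon_cut = 1 - optimized["carbon_kg_per_kwh"] / baseline["carbon_kg_per_kwh"]
print(f"water footprint reduced by {water_cut:.0%}")
print(f"carbon footprint reduced by {carbon_cut:.0%}")
```

Under these assumed values both reductions land in the 70–85% band, which is why site selection appears alongside cooling technology as a first-order sustainability lever.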

The next few years will be crucial for embedding sustainability principles at the core of the rapidly growing AI sector. The industry stands at a crossroads: either continue on inertia and risk running into environmental barriers, or turn the challenge into a catalyst for new technologies and business models. If transparency, innovation, and responsible resource management become integral parts of AI strategies, the "digital mind" can evolve hand in hand with care for the planet. Such a balance is what investors and society as a whole expect from the new technological era.

