Blog / Nov 19, 2024 / 7 min. read

Understanding the carbon footprint of AI and how to reduce it

Data center construction

The rapid growth of artificial intelligence (AI), particularly large language models (LLMs) and generative AI, has taken many by surprise. This surge has led to escalating electricity demand at data centers and raised concerns about strain on the power grid. It has also sparked the construction of new, larger data centers, resulting in growing embodied emissions tied to building and maintaining AI's physical infrastructure.

Managing the risks of increased greenhouse gas (GHG) emissions from AI requires investment, expertise, and new approaches to building and operating many aspects of AI operations and supply chains. The immediate task is to understand these risks, gather the necessary information, and avoid poor outcomes by proactively managing the construction, operation, and emissions associated with the growth of AI. In parallel, it’s important to recognize that AI can itself be a real force for reducing emissions, both incrementally and dramatically, across a wide range of sectors.

What is the carbon footprint of AI?

The carbon footprint of AI consists of two main parts: “embodied” emissions that come from manufacturing IT equipment and constructing data centers, and “operational” emissions that come from electricity consumed by computer chips as they perform AI-related calculations. Both of these aspects of emissions are growing as more data centers are built and existing data centers increase their share of power-hungry AI applications like generative LLM searches and AI image generation.

Understanding the electricity demand for data centers 

Today, the electricity demand from AI-specific applications is estimated to be less than 1% of global electricity use—and likely much lower. To understand this number, it helps to start with the electricity consumed by the 11,000 data centers worldwide, which was about 1.0 to 1.3% of global electricity consumption in 2022. (This excludes another 0.4% from cryptocurrency mining.) However, most of the computation at these data centers is not AI; instead, it’s more conventional applications like e-commerce, video streaming, social media, and online gaming.

The amount of AI-based computation at data centers is hard to determine because there are no good global datasets. Based on the number of AI-specific computer chips that have been sold (mostly graphics processing units, or GPUs) it’s likely that AI applications only consume about 0.04% of global electricity. That electricity use results in about 0.01% of global greenhouse gas emissions.
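The ~0.04% figure above is the kind of back-of-envelope estimate that can be reproduced from chip counts. The sketch below uses entirely hypothetical inputs (installed GPU count, average draw per accelerator, global consumption) chosen only to illustrate the arithmetic, not figures from the article:

```python
# Back-of-envelope estimate of AI's share of global electricity use.
# All input values are illustrative assumptions.

AI_GPUS_IN_USE = 1_000_000        # assumed installed AI accelerators
AVG_POWER_KW = 1.0                # assumed average draw per GPU, incl. overhead (kW)
HOURS_PER_YEAR = 8760             # continuous operation
GLOBAL_ELECTRICITY_TWH = 25_000   # rough global annual electricity consumption (TWh)

# kWh per year, converted to TWh (1 TWh = 1e9 kWh)
ai_twh = AI_GPUS_IN_USE * AVG_POWER_KW * HOURS_PER_YEAR / 1e9
share = ai_twh / GLOBAL_ELECTRICITY_TWH

print(f"AI electricity use: ~{ai_twh:.1f} TWh/yr (~{share:.2%} of global)")
```

With these assumed inputs, the result lands in the same ballpark as the ~0.04% share cited above, which shows how sensitive such estimates are to the chip-count and power-draw assumptions.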

Still, the demand for AI applications is rapidly growing, and this is likely to drive up the electricity used by data centers and the associated greenhouse gas emissions. The most important implications of this trend are in the US, which hosts about half the world’s data centers. Currently, data centers use about 4% of US electricity; projections for 2030 range from a low of 4.6% to a high of 9.1%.

How electricity sources influence data center emissions

A large increase in electricity use doesn’t necessarily result in a similarly large increase in greenhouse gas emissions. Currently, a significant portion of the electricity powering data centers comes from zero-carbon sources such as wind and solar. This is because of large, corporate power-purchase agreements (PPAs) signed by leading data center operators, particularly Amazon, Meta and Google. While much attention has focused on clean-energy deals such as Microsoft’s agreement with Brookfield Renewables to buy 10 GW of renewable energy starting in 2026, and a separate agreement with Constellation Energy to buy 0.8 GW of nuclear power starting in 2028, US technology companies have been buying renewable energy for years, and had already contracted over 35 GW of clean electricity by the end of 2022. 

The use of low-carbon power means that the net emissions from these data centers are smaller than the electricity consumption numbers might suggest. Of course, a crucial consideration is whether this low-carbon power is truly “additional,” meaning that it is being added to the grid and not simply taken away from other uses. Data center operators are expanding their traditional wind and solar PPAs and exploring novel approaches to try to meet this standard, including geothermal projects in the US and Kenya.

However, the projected electricity demand from AI applications at data centers will be difficult to meet entirely with low-carbon power, at least in the near term. Although over 30 GW of wind, solar, and battery projects were installed in 2023, new generators face an increasingly long wait for interconnection approval in many parts of the US. Geothermal and hydro power, which offer steady (“baseload”) low-carbon electricity, remain constrained in the near term. And the interest in scaling up nuclear power, from restarting full-scale reactors to novel small modular reactors (SMRs), faces significant regulatory, cost, and supply chain hurdles.

One important source of low-carbon electricity that has not received enough attention is natural gas-fired power equipped with carbon capture and storage (CCS). This technology has the potential to significantly reduce emissions from existing power plants and enable new projects to achieve near-zero emissions. 

The role of embodied emissions from data center construction

Embodied emissions include all emissions associated with the extraction, production, transportation, construction, and disposal of materials used in construction. 

The embodied emissions from constructing data centers are substantial, driven largely by concrete, steel, and IT hardware. Scope 3 GHG emissions for data centers—which include embodied emissions—range from approximately one-third to two-thirds of overall lifetime emissions. Between 2020 and 2023, Microsoft’s carbon footprint grew by 30%, largely due to the emissions associated with steel, concrete, and chip manufacturing. In response, Microsoft has started using wood in some data center construction to reduce this impact. While using wood offers a partial solution, it cannot fully offset the embodied emissions of even a single facility, and wood supply chains remain limited.

Major data center operators are working hard to address this challenge, including emphasizing the need for standardized emissions measurements and disclosures for key building materials. Ultimately, achieving deeper decarbonization will require further action to address both operational and embodied emissions.

Eight key strategies to reduce the carbon footprint of AI

  1. Adapt technology architecture.

Efficiency is the foundational strategy in any clean energy approach. As such, chipmakers are developing ways to cut energy use from the outset, such as incorporating more memory directly onto computer chips or hard-wiring basic calculations. These innovations have already reduced energy consumption in new computer chips substantially—in some cases a 96% improvement. Likewise, servers are being designed with new architectures that minimize internal data transfers, delivering additional efficiencies.

  2. Optimize training geography.

Carbon Direct believes there are also significant opportunities to manage AI’s energy use directly. For instance, a large portion of the energy consumption for LLMs occurs during the training phase, prior to the model's release. Because these training tasks are not location-dependent, they can be carried out in regions with abundant, low-cost, low-carbon electricity, as part of broader efforts to dynamically move computing tasks to reduce emissions (“carbon-aware computing”). Additionally, server requests for generative AI tasks, like ChatGPT searches, can be routed through systems powered by low-carbon electricity. Although this may add only a few milliseconds of latency, it can substantially reduce potential emissions.
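Carbon-aware scheduling of this kind can be reduced to a simple decision rule: given current grid carbon-intensity readings for each candidate region, place the job where intensity is lowest. The region names and intensity values below are hypothetical, and a real system would pull live data from a grid-data provider:

```python
# Minimal sketch of carbon-aware job placement: route a location-flexible
# training batch to the region with the lowest reported grid carbon intensity.
# Region names and gCO2/kWh values are hypothetical.

def pick_greenest_region(intensities: dict[str, float]) -> str:
    """Return the region with the lowest current carbon intensity."""
    return min(intensities, key=intensities.get)

current_intensity = {
    "us-midwest": 450.0,    # gas/coal-heavy grid
    "nordics": 40.0,        # hydro-heavy grid
    "us-southwest": 120.0,  # solar-heavy at midday
}

best = pick_greenest_region(current_intensity)
print(f"Schedule training batch in: {best}")  # -> nordics
```

The same rule applies to inference routing, where the cost of the decision is the few milliseconds of added latency mentioned above.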

  3. Select appropriately sized models.

Not all generative AI tasks, like ChatGPT queries, are equal in terms of energy demand. Leading AI companies are increasingly focusing on using smaller, more efficient AI models to perform these tasks, achieving nearly equivalent quality for far less energy consumption. Similarly, many AI applications, such as digital twinning and satellite-based pattern recognition, consume far less electricity than generative AI systems like large language models. For instance, some of the most advanced AI-driven weather prediction models require far less energy than traditional weather simulations, running on a laptop rather than a supercomputer.

  4. Address fugitive methane emissions.

As data center operators consider using natural gas for new electricity supply, reducing upstream emissions from gas production and transmission will be crucial. In the U.S., the EPA’s Methane Rule could cut these non-carbon dioxide greenhouse gas emissions by approximately 80% (although the future of the rule is now in doubt). Additionally, AI-powered tools from companies like Kayrros and organizations like Carbon Mapper help detect methane leaks and attribute them to specific operators. The best actors in the industry emit minimal methane—less than 0.5% of what is produced. This standard is achievable for nearly all gas producers.
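The sub-0.5% benchmark cited above is straightforward to operationalize as a leak-rate check. The producers and tonnages below are invented for illustration:

```python
# Illustrative methane-intensity check against the <0.5%-of-production
# benchmark cited for the industry's best operators.
# Producer names and (leaked, produced) tonnages are hypothetical.

BENCHMARK = 0.005  # 0.5% of methane produced

producers = {
    "OperatorA": (400.0, 100_000.0),
    "OperatorB": (1_500.0, 100_000.0),
}

for name, (leaked_t, produced_t) in producers.items():
    rate = leaked_t / produced_t
    status = "OK" if rate < BENCHMARK else "above benchmark"
    print(f"{name}: {rate:.2%} ({status})")
```

In this example, OperatorA (0.40%) clears the benchmark while OperatorB (1.50%) does not, which is the kind of attribution that satellite- and aircraft-based monitoring now makes possible.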

  5. Use carbon capture on new and existing power plants.

For both new and existing natural gas-fired power plants, carbon capture and storage technology offers the potential for generating low-carbon power with very high availability. While many plants currently in operation continue to emit unchecked, Carbon Direct believes this doesn't have to be the case. Their emissions can be captured and stored. Hyperscalers and project developers should pursue new investments and business models to reduce existing emissions by 95% or more. For new generation projects, options like NetPower, Arbor, and CES will soon enable emissions abatement of 100%—or even more, if combined with biopower to deliver carbon dioxide removal as well. Achieving this will require the development of carbon dioxide pipelines, barges, and storage facilities, which face their own challenges, such as permitting and community approval, that must be addressed directly.

  6. Add more zero-carbon to the grid.

Over 12,000 solar, wind, and battery projects in the U.S. are facing delays in connecting to the grid. These delays need to be addressed. Permitting reforms, such as the Manchin-Barrasso bill, could help expedite the process. One potential innovation is to use AI to accelerate the development of power flow models and streamline the paperwork required to complete the regulatory process.

  7. Invest in low-carbon building materials.

While wood is a promising low-carbon building material, we’ll also need glass, steel, aluminum, concrete, and computer chips with minimal embodied carbon emissions. Hyperscalers currently face significant challenges accessing low-carbon versions of these materials, which will eventually be produced using low-carbon hydrogen, carbon capture and storage, and low-carbon electricity. However, these systems require significant investment, workforce development, and permitting to be built. Without these advancements, the embodied emissions from data centers will increase rapidly and significantly in the U.S., Europe, and globally.

  8. Increase carbon dioxide removals.

It’s already clear that AI applications at data centers will generate emissions from electricity use and embodied carbon that cannot be avoided in the near term. Estimates of current greenhouse gas emissions exceed 300 million tons per year and are likely to grow this decade. These emissions should be measured using full life-cycle analysis and then offset through high-quality carbon removal projects, preferably those with high durability.
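The scale of such an offset program follows from simple arithmetic. The sketch below uses the 300-million-ton order of magnitude cited above, but the removal price is a hypothetical assumption (durable removal prices vary widely by pathway):

```python
# Illustrative cost arithmetic for pairing residual AI-related emissions
# with durable carbon removal. The price per tonne is an assumption.

residual_emissions_t = 300_000_000   # tCO2e/yr, the order of magnitude cited above
removal_price_usd_per_t = 300        # assumed price for durable removal (USD/tCO2e)

annual_cost_usd = residual_emissions_t * removal_price_usd_per_t
print(f"Removing {residual_emissions_t / 1e6:.0f} Mt/yr at "
      f"${removal_price_usd_per_t}/t costs ~${annual_cost_usd / 1e9:.0f}B per year")
```

Even at much lower assumed prices, the annual cost runs into the billions, which is why measuring emissions carefully with full life-cycle analysis matters before committing to removal purchases.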

To effectively reduce the environmental impact of AI, Carbon Direct believes all eight strategies discussed must prioritize the communities most affected: frontline communities near new infrastructure, consumers facing price increases, and tribal authorities with limited legal protections. Planning should begin by understanding the needs of these communities, ensuring that efforts focus on minimizing harm while maximizing benefits. Equity and justice must be embedded in every stage of planning, production, and permitting across all strategies. 

Conclusion

AI is just one part of a broader trend of rapidly growing electricity demands, including from electric vehicles, heat pumps, industrial electrification, green hydrogen, and various e-fuels. The challenges AI presents to hyperscalers, communities, regulators, and investors serve as a preview of the complex, far-reaching impacts emerging in other sectors.

Achieving net-zero isn’t about slowing down or forcing businesses into compliance; it’s about fostering innovation, building new solutions, and experimenting with different approaches. AI’s carbon footprint underscores the critical need for expertise in clean electricity, grid management, decarbonization, and carbon removal—expertise that will become increasingly vital as more companies realize the complexity and cost of the journey ahead.

Fortunately, AI itself can be part of the solution. With applications in grid management, material science, and advanced manufacturing, AI has the potential to play a powerful role in the climate response.


Read the full 2024 report, ICEF Artificial Intelligence for Climate Change Mitigation Roadmap.

Connect with an expert

Get answers to your decarbonization questions and explore carbon management solutions.