Decarbonizing AI While Using AI to Decarbonize 

Written by: Owen Curtin

Friday morning at the Clean Energy Conference, Marissa Galizia (SOM/YSE ’15, Senior Director of Partnerships at Voltus) and Jacob Mansfield (co-founder of Tierra Climate) discussed AI's ability to assist innovators in the energy transition. Both speakers, however, also pointed to AI's intense resource use, especially its energy demand, as being in tension with its usefulness in the climate space.

 

Galizia, who received her MBA and Master of Environmental Management at Yale, brought a unique interdisciplinary perspective to the discussion. Her background spans both business strategy and environmental science, positioning her to understand AI's dual role as both solution and challenge in the climate space. Mansfield, meanwhile, founded Tierra Climate to leverage computational tools for climate action, giving him firsthand experience with the practical applications and limitations of AI in clean energy deployment. 

 

Both speakers highlighted energy supply as a critical bottleneck facing AI expansion. As AI systems scale, they rely on infrastructure that currently runs primarily on electricity generated from fossil fuels, chiefly natural gas and coal. Data centers globally consumed around 460 terawatt-hours in 2022, representing approximately 1.5% of global electricity consumption, with projections showing this figure could more than double by 2030. In the United States alone, data centers consumed over 4% of the country's total electricity in 2024, comparable to the annual electricity use of the entire nation of Pakistan. The rapid growth of AI workloads has become a primary driver of this surge, with AI-specific computing expected to account for as much as half of data center energy use by decade's end.
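As a quick sanity check (ours, not the speakers'), the cited figures can be cross-checked with back-of-the-envelope arithmetic: 460 TWh at roughly 1.5% of the global total implies worldwide electricity consumption near 30,000 TWh, which is consistent with reported global totals for 2022.

```python
# Back-of-the-envelope check of the figures cited above.
data_center_twh_2022 = 460       # cited global data center consumption, TWh
share_of_global = 0.015          # cited ~1.5% share of global electricity

# Implied global electricity consumption (TWh)
global_twh_implied = data_center_twh_2022 / share_of_global

# "More than double by 2030" implies at least this much (TWh)
doubled_twh = data_center_twh_2022 * 2

print(round(global_twh_implied))  # ≈ 30,667 TWh, near reported global totals
print(doubled_twh)                # ≥ 920 TWh by 2030 under the projection
```

The implied global total of roughly 30,000 TWh lines up with commonly reported worldwide electricity consumption, so the two cited figures are internally consistent.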

 

Because these facilities draw their power largely from fossil fuels, AI carries a steep carbon footprint. Utilities are racing to build new generation capacity to meet demand. The concentration of data centers in specific regions creates additional grid integration challenges, potentially straining local infrastructure and delaying the broader transition to renewable energy.

 

Despite these concerns, both speakers emphasized AI's transformative potential for the energy sector. Machine learning algorithms excel at managing grid complexity, particularly as renewable energy sources introduce variability into power systems. AI can help regulate and allocate energy during peak demand periods by coordinating battery storage systems with renewable generation from solar and wind farms. These predictive capabilities allow grid operators to optimize resource deployment in real-time, reducing waste and improving system efficiency. 
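The coordination idea described above, charging storage on renewable surplus and discharging it during peaks, can be sketched with a toy greedy dispatch rule. Everything here (the numbers, the function, the greedy policy) is an illustrative assumption for exposition, not a description of any system the speakers discussed.

```python
def dispatch_battery(demand, renewables, capacity, max_rate):
    """Greedy battery dispatch sketch (illustrative, not a real EMS).

    Returns (battery_flow, grid_draw) per hour. battery_flow > 0 means
    discharging to serve demand; battery_flow < 0 means charging from
    surplus renewable generation.
    """
    soc = 0.0  # battery state of charge, MWh
    flows, grid = [], []
    for d, r in zip(demand, renewables):
        net = d - r  # positive: shortfall; negative: renewable surplus
        if net > 0:
            # Shortfall hour: discharge as much as limits allow
            discharge = min(net, soc, max_rate)
            soc -= discharge
            flows.append(discharge)
            grid.append(net - discharge)
        else:
            # Surplus hour: soak up excess renewables into the battery
            charge = min(-net, capacity - soc, max_rate)
            soc += charge
            flows.append(-charge)
            grid.append(0.0)
    return flows, grid

# Illustrative day (MW): midday solar surplus, evening demand peak.
demand     = [50, 55, 60, 80, 120, 90]
renewables = [40, 70, 90, 60,  30, 20]
flows, grid = dispatch_battery(demand, renewables, capacity=40, max_rate=30)
print(grid)  # peak grid draw drops from 90 MW (no battery) to 70 MW
```

Real grid optimization replaces this greedy rule with forecasting and mathematical optimization, but the structure, shifting renewable surplus into peak hours, is the same idea the speakers described.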

 

AI's strength lies in handling the logistics of modern energy systems—forecasting demand patterns, predicting equipment failures, and continuously optimizing the flow of electricity. These applications represent exactly the kind of repetitive, data-intensive tasks where machine learning demonstrates clear advantages over traditional approaches. 
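The forecasting-plus-anomaly pattern mentioned above can be illustrated with a deliberately simple sketch: predict the next reading as a trailing average, then flag readings that deviate sharply, a crude stand-in for the kind of signal a predictive-maintenance system might surface. The window size, threshold, and data are arbitrary assumptions for the sketch; production systems use far richer models.

```python
def rolling_forecast(series, window=3):
    """Forecast each point as the mean of the preceding `window` points."""
    return [
        sum(series[i - window:i]) / window
        for i in range(window, len(series))
    ]

def flag_anomalies(actual, forecast, threshold=0.25):
    """Return indices where actual deviates from forecast by > threshold."""
    return [
        i for i, (a, f) in enumerate(zip(actual, forecast))
        if abs(a - f) / f > threshold
    ]

# Toy hourly load trace (MW) with one suspicious spike at index 4.
load = [100, 102, 101, 103, 150, 104, 102]
forecast = rolling_forecast(load)        # forecasts for indices 3..6
flags = flag_anomalies(load[3:], forecast)
print(flags)  # the spike stands out against the trailing-average forecast
```

The point of the sketch is the workflow, forecast, compare, flag, rather than the model: machine learning earns its keep by making the forecast step far more accurate on the messy, high-dimensional data real grids produce.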

 

However, Galizia and Mansfield cautioned against viewing AI as a silver bullet. The technology's utility depends fundamentally on thoughtful application. Simply deploying AI tools across the grid without meaningful integration won't automatically yield efficiency gains. The speakers stressed that AI must be implemented with clear objectives, validated performance metrics, and careful consideration of its own energy footprint. Adding computational complexity for its own sake risks creating systems that consume more resources than they save.

 

The morning session underscored an essential truth about AI in the climate space: the technology offers powerful tools for optimizing energy systems, but only when deployed deliberately and with full awareness of its costs. As the energy transition accelerates, the challenge will be harnessing AI's capabilities while simultaneously decarbonizing the infrastructure that powers artificial intelligence itself.