
Conclusion: Can Data Centers Become Grid-Friendly And Green-Friendly?

Google’s Council Bluffs, Iowa, data center. Source: Google.


“The future of intelligence will be shaped not only by code, but by how we power it. Fueling the next era of AI requires an energy revolution.”
— Eric Schmidt, Former CEO and Chairman, Google

 

This story is bigger than power bills and server racks. It is about how we choose to grow.

First, the global picture. Some countries are easing climate rules to move faster on AI. Others are tightening them. If the United States relaxes guardrails, the question is whether Europe and parts of Asia can offset the extra emissions with stricter standards and cleaner grids. Offsetting is hard. Geography, timing, and politics do not align neatly. A split world risks higher total emissions even if some regions do very well.

Second, the race to lead in AI is also running into real-world limits. Hyperscalers want speed. Communities want quiet, clean water, fair electricity bills, and protected land. The existing grid wants time and investment to catch up. When these priorities collide, projects stall or shift. That slows the AI race no matter what the press releases say. Where companies bring their own clean energy, pay for grid upgrades, reuse waste heat, and share data openly, communities tend to support them. Where they don’t, public resistance grows and timelines stretch.

Third, the grid itself is the binding constraint. The core issue is transmission, not generation. We are seeing cases where ample low-cost generation is ready to build, but moving electrons to load is where systems struggle. The United States is a useful example: interconnection queues hold about 2.6 TW of mostly solar, wind, and storage waiting to connect, twice the country's total installed capacity, yet the current grid cannot absorb it. There are three broad responses:

1) Build generation near load centers, such as behind-the-meter renewables for industrial or data center applications. While appealing, this is spatially and economically limited.

2) Expand transmission infrastructure (e.g., high-voltage long-distance lines) to move electrons from generation to loads. This is the ideal long-term fix, but it will take years and enormous capital.

3) Store electrons at both ends and release them when needed. Batteries at the source can absorb excess renewable output, while batteries near consumers can smooth peaks and reduce strain on transmission and distribution (T&D) assets, lowering peak demand and trimming infrastructure requirements.

Aggregated through virtual power plants (VPPs), this third option's flexibility shows up at scale. Again using the U.S. to illustrate: peak demand today stands at 759 GW (the July 2025 record) and is projected to reach roughly 900 GW by 2030, an increase of about 141 GW. About 38 GW of VPP capacity operates today, and the DOE forecasts 80–160 GW by 2030. At the optimistic end, the roughly 120 GW of new VPP capacity would cover most of that projected increase, and the U.S. could theoretically avoid major transmission upgrades in the short to medium term; even at the low end, VPPs would substantially reduce upgrade needs. This does not eliminate the need for new wires, but it buys time, cuts costs, and reduces conflict while long-term assets are built.
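To make the VPP arithmetic concrete, here is a back-of-the-envelope sketch in Python. The peak-demand and VPP figures come from the text above; the assumption that the DOE forecast is a total from which today's 38 GW should be netted out is ours, made for illustration.

```python
# Back-of-the-envelope check: can projected VPP capacity absorb
# projected U.S. peak-demand growth? Figures are from the text;
# the netting assumption is illustrative.

peak_2025_gw = 759          # July 2025 record peak demand
peak_2030_gw = 900          # projected ~2030 peak demand
vpp_today_gw = 38           # VPP capacity operating today
vpp_2030_low_gw, vpp_2030_high_gw = 80, 160   # DOE 2030 forecast range

peak_growth_gw = peak_2030_gw - peak_2025_gw  # ~141 GW of new peak

for label, vpp_gw in [("low", vpp_2030_low_gw), ("high", vpp_2030_high_gw)]:
    added_vpp_gw = vpp_gw - vpp_today_gw      # new flexibility by 2030
    coverage = added_vpp_gw / peak_growth_gw  # share of growth offset
    print(f"DOE {label} case: {added_vpp_gw} GW of new VPP capacity "
          f"covers {coverage:.0%} of the ~{peak_growth_gw} GW peak growth")

# Prints ~30% coverage in the low case and ~87% in the high case,
# hence "most of the projected peak demand increase" at the
# optimistic end, and a substantial share even in the low case.
```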

Fourth, the generation mix has to shift slightly. The world does not lack potential megawatts; it needs megawatts that make the system dependable. In the U.S., most queued projects are solar and wind. Paired with batteries, they can handle daily intermittency, but long-duration reliability during multi-day or seasonal shortfalls still calls for resources like geothermal, nuclear, and long-duration storage. The generation mix should tilt toward these firm anchors while distributed flexibility does the near-term heavy lifting.
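A minimal simulation helps show why batteries sized for daily cycling cannot carry a multi-day shortfall. Every number here (the 72-hour lull, the flat 100 MW load, the 4-hour battery) is an illustrative assumption, not data from the text.

```python
# Minimal illustration of why daily-cycling batteries don't solve
# multi-day shortfalls. All numbers are illustrative.

hours = 72                       # a stylized three-day renewable lull
load_mw = 100                    # flat load for simplicity
renewables_mw = 20               # depressed wind/solar output during the lull
battery_mwh = 400                # a "4-hour" battery sized to the load
soc_mwh = battery_mwh            # start the lull fully charged

for h in range(hours):
    shortfall_mw = load_mw - renewables_mw   # 80 MW gap each hour
    discharge = min(shortfall_mw, soc_mwh)   # battery covers what it can
    soc_mwh -= discharge
    unserved = shortfall_mw - discharge
    if unserved > 0:
        print(f"Hour {h}: battery empty, {unserved} MW unserved")
        break

# The battery is drained after five hours of an 80 MW shortfall;
# the remaining ~67 hours of the lull need firm supply (geothermal,
# nuclear, long-duration storage), not more daily-cycling batteries.
```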

Fifth, the capital question. Many investors want to ride the AI wave. The risk is funding quick fixes that lock in fuel costs and carbon. Patient capital can underwrite transmission, storage, and true 24/7 clean supply and still earn a return: payback may come later, but risk is lower and the value is durable. Some projects will not need concessions at all, because clean portfolios now compete on price in many markets. Others will need blended finance to cross the first-mile gap. The discipline is to ask, project by project: is this asset resilient to fuel shocks, policy shifts, and community pushback over a 20-year life?

For policymakers, the next step is to give the market clear signals and consistent direction. Governments should publish open maps showing where the grid can handle large new loads so that data centers and factories can locate where capacity already exists. Faster permits should go to projects that supply verified clean power, reuse waste heat, or can flex their energy use when the grid is under stress. Regulators must also create clear market rules for VPPs and grid-enhancing technologies such that utilities are rewarded for efficiency, not just for building new infrastructure. These changes might help utilities, industry, investors, and communities work in tandem instead of working in silos.

For utilities, the real change is cultural. They have long earned profits by building more—more lines, more substations, more plants. That model must shift to one that values efficiency and performance. Utilities should treat distributed batteries, smart devices, and flexible demand as part of their core supply, not as small pilot projects. They can use digital tools like dynamic line ratings and better circuit management to carry more power through existing wires instead of waiting years for new ones. Regulators, in turn, should let utilities earn fair returns for saving money and improving reliability, not only for spending more.
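As a flavor of what dynamic line ratings do, here is a toy steady-state heat-balance calculation in the spirit of IEEE Std 738: the current a conductor can carry is whatever makes heating balance cooling at the maximum allowed conductor temperature. The coefficients and weather inputs below are illustrative placeholders, not standard values; real implementations use detailed conductor and weather models, and real-world uplifts are usually more modest than this toy suggests.

```python
import math

def line_ampacity(ambient_c, wind_mps,
                  conductor_max_c=75.0,     # max allowed conductor temp (C)
                  r_ohm_per_m=7.3e-5,       # AC resistance per meter (toy value)
                  solar_w_per_m=15.0):      # solar heat gain per meter (toy value)
    """Simplified steady-state heat balance:
    convective + radiative cooling = solar gain + I^2 * R,
    solved for the current I. Coefficients are illustrative."""
    dt = conductor_max_c - ambient_c
    q_convective = (1.0 + 0.8 * wind_mps) * 0.8 * dt  # W/m, grows with wind
    q_radiative = 0.2 * dt                            # W/m, linearized (toy)
    q_net = q_convective + q_radiative - solar_w_per_m
    return math.sqrt(max(q_net, 0.0) / r_ohm_per_m)   # amps

# A static rating assumes conservative worst-case weather; a dynamic
# rating uses the actual (often cooler, windier) conditions.
static = line_ampacity(ambient_c=40.0, wind_mps=0.6)
dynamic = line_ampacity(ambient_c=20.0, wind_mps=2.0)
print(f"static rating ~{static:.0f} A, dynamic rating ~{dynamic:.0f} A "
      f"(+{dynamic / static - 1:.0%})")
```

The design point is simple: the same wire safely carries more current in cool, windy hours, and measuring rather than assuming the weather unlocks that headroom without building anything new.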

For industry, the challenge is to treat power not as a bill but as strategy. Companies should match their power needs with clean supply every hour, not just annually. Batteries and clean backup systems should serve the grid, not only their own campuses. New facilities should help nearby communities through better infrastructure, lower bills, useful waste heat, and local jobs. Claims about sustainability must be proven with clear numbers, open to public review.
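The difference between annual and hourly matching is easy to see in a few lines of Python. The load and clean-supply profiles below are invented for illustration; the point is the metric, not the numbers.

```python
# Hourly vs. annual clean-energy matching for a toy 24-hour profile.
# Both series are made up for illustration.

load = [100] * 24                                  # flat 100 MW data-center load
# Solar-heavy clean supply: zero at night, strong at midday.
clean = [0] * 6 + [50, 150, 250, 300, 300, 300,
                   300, 300, 250, 150, 50, 0] + [0] * 6

annual_match = min(sum(clean) / sum(load), 1.0)    # volumetric, annual-style match
hourly_match = sum(min(l, c) for l, c in zip(load, clean)) / sum(load)

print(f"annual-style match:  {annual_match:.0%}")  # 100%: looks fully "matched"
print(f"hourly (24/7) match: {hourly_match:.0%}")  # ~42%: reveals the nighttime gap
```

Annual accounting lets a midday solar surplus paper over the nighttime gap; hourly matching exposes it, which is why it is the stricter and more honest claim.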

For investors, the best rule is to favor assets that stand the test of time. Every project should pass three checks: can it survive a carbon price, a spike in fuel costs, and a change in local politics? If it can, the investment is likely strong. Patient capital should focus on projects where delay, not direction, is the main risk. Quick returns from weak rules are a short-lived illusion. Fairness matters too. The rewards of AI are global, but the burdens of energy use are local. A fair system shares value with the places that host these massive facilities. That is not charity; it is smart risk management and good planning.
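The three checks can be read as a simple stress-test screen. Here is a toy version in Python; the capex, margins, and shock sizes are made-up placeholders, and a real screen would use project-specific scenarios.

```python
# Toy version of the three-check screen: does the project's NPV
# survive a carbon price, a fuel-cost spike, and a politics-driven
# delay? All inputs are illustrative placeholders.

def npv(cashflows, rate=0.08):
    """Net present value of yearly cashflows at a discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def project_cashflows(capex, annual_margin, fuel_cost=0.0, carbon_cost=0.0,
                      delay_years=0, life=20):
    """Year-0 capex, then `life` years of margin net of fuel and
    carbon costs, pushed back by any permitting/politics delay."""
    years = [-capex] + [0.0] * delay_years
    years += [annual_margin - fuel_cost - carbon_cost] * life
    return years

base = dict(capex=1000.0, annual_margin=130.0)
checks = {
    "carbon price":   project_cashflows(**base, carbon_cost=30.0),
    "fuel spike":     project_cashflows(**base, fuel_cost=40.0),
    "politics delay": project_cashflows(**base, delay_years=3),
}

for name, flows in checks.items():
    value = npv(flows)
    print(f"{name:>14}: NPV {value:+.0f} -> {'survives' if value > 0 else 'fails'}")
```

In this toy case the asset shrugs off a three-year delay but fails the carbon and fuel shocks, which is exactly the quick-fix exposure the screen is meant to catch.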

If this balance is achieved, the boom in data centers can strengthen power grids, speed up clean energy, and win public support. If not, it could bring more bans, higher costs, and deeper public frustration. The choice is not between progress and climate action; it is about building progress that can last. As James Simonelli, CTO at Schneider Electric, noted, “One thing that doesn’t exist yet for the data center industry is how to be grid-friendly.” But through a combination of industry initiatives and policy actions, data centers can gradually become more grid-friendly and green-friendly. The coming years will be critical in cementing standards and practices, much like auto efficiency standards in past decades, to ensure this digital revolution can be powered sustainably. Governments and industry leaders must engage in that task, crafting a future where our thirst for data doesn’t conflict with our duty to the planet and the public interest.