Data Center Energy Consumption: How Much Energy Did/Do/Will They Eat?
Plumes of steam rise above the cooling towers at Google’s data center at The Dalles, Oregon. Source: Google.
Data centers today consume a significant and growing share of electricity. Globally, data centers (excluding cryptocurrency mining) used an estimated 415 terawatt-hours (TWh) in 2024, about 1.5% of world electricity demand. This global demand has roughly doubled since 2010 (when usage was ~194 TWh) thanks to the explosion of digital services. Efficiency gains (better hardware, cooling, and power usage effectiveness) moderated growth for much of the 2010s, but the acceleration of cloud computing and AI has pushed energy use sharply upward in recent years.

The International Energy Agency (IEA) reported in 2025 that worldwide data center electricity demand is on track to more than double by 2030, reaching around 945 TWh (slightly above Japan's current consumption). This implies growth on the order of 10–15% per year in the latter 2020s. Indeed, one analysis forecasts a 165% increase in global data center power demand from 2023 to 2030. A major driver is artificial intelligence workloads, which require energy-intensive chips; AI-related computing alone is projected to quadruple its electricity use by 2030. By the end of this decade, data centers could be consuming as much power annually as a major industrialized country. In terms of power draw, analysts estimate the global data center fleet currently averages on the order of 50–60 GW, and this may rise to ~130 GW by 2028 (a 16% compound annual growth rate). Such growth would elevate data centers from about 1–2% to roughly 3–4% of world electricity use in the next 5–10 years, absent drastic efficiency or offsetting measures.
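The growth rates quoted above can be sanity-checked with the standard compound-annual-growth-rate formula; the TWh and GW figures below are the ones from the text, not new data.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Global data center demand: 415 TWh (2024) -> ~945 TWh (2030)
print(f"2024-2030 demand growth: {cagr(415, 945, 6):.1%} per year")  # 14.7% per year

# Average fleet power draw: ~60 GW (2023) -> ~130 GW (2028)
print(f"2023-2028 power growth:  {cagr(60, 130, 5):.1%} per year")   # 16.7% per year
```

Both results fall within the "10–15% per year" and "16% compound annual growth rate" ranges cited above (the first lands at the top of its range because it uses the high-end 945 TWh projection).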
In the United States, data centers are especially prominent: U.S. facilities consumed roughly 176 TWh in 2023, about 4.4% of national electricity sales. This share more than doubled from ~2% in 2010 and is rising quickly. For most of the 2010s, U.S. data center energy use hovered around 60–70 TWh annually (virtually flat from 2014–2016 at ~60 TWh), as efficiency gains balanced growing workloads. But since about 2017, consumption has risen steadily alongside the boom in hyperscale cloud centers and now AI computing.

Federal researchers project U.S. data centers could triple their electricity draw by the end of this decade. In fact, scenario analyses for the U.S. foresee data centers using anywhere from ~350 TWh up to ~580 TWh per year by 2028 in high-demand cases. The upper end of that range (~580 TWh) would represent roughly 12% of total U.S. power consumption by 2028 – a seismic shift in the electricity landscape. Even more conservative outlooks show data centers' share of U.S. electricity at 9% by 2030 (up from ~4% now). In absolute growth terms, this load is enormous: one utility think-tank names data centers as the "principal culprits" for skyrocketing U.S. power demand in the next five years, potentially adding 90 GW of new load and "ending an era of flat demand" for utilities. Indeed, the IEA expects that almost half of the growth in U.S. electricity demand through 2030 will come from data centers alone, as running servers for AI and cloud eclipses traditional industrial power uses. To put this in perspective, by 2030 the U.S. may consume more electricity for data processing than it does to manufacture all its steel, aluminum, cement, and other energy-intensive goods combined – a remarkable indicator of the economy's digital shift.
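The U.S. shares above imply totals that can be derived directly from the article's own figures (176 TWh at ~4.4% of sales, and ~580 TWh at ~12%); the national totals printed below are back-calculated, not independently sourced.

```python
# Implied total U.S. electricity sales in 2023, from the article's own numbers
dc_2023_twh = 176
implied_total_2023 = dc_2023_twh / 0.044
print(f"Implied 2023 U.S. total: {implied_total_2023:,.0f} TWh")  # 4,000 TWh

# Implied total in 2028 under the high-demand scenario (~580 TWh at ~12%)
dc_2028_twh = 580
implied_total_2028 = dc_2028_twh / 0.12
print(f"Implied 2028 U.S. total: {implied_total_2028:,.0f} TWh")  # 4,833 TWh
```

The two implied totals are mutually consistent with modest overall demand growth, which is why the data center share roughly triples even as total consumption rises.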
Looking globally again, similar trends are playing out. Europe’s grid operators, after decades of flat or declining demand, now report a surge of new load requests “mostly driven by data centres,” marking a return to load growth. Goldman Sachs analysts found European utilities have seen connection requests for new data centers skyrocket from a handful to thousands, indicating a pipeline of roughly 170 GW of data center capacity in planning across Europe – about one-third of Europe’s current power generation capacity. In Asia, markets like China, India, and Southeast Asia are also expanding their digital infrastructure; by some estimates, global data center energy use is rising ~12–16% annually in the mid-2020s, far above the growth rate of most other sectors. The consensus is that without dramatic efficiency improvements, data centers and digital networks will become an ever-larger piece of the energy puzzle. (Notably, these figures exclude cryptocurrency mining, which by itself was another ~100+ TWh globally in 2022 – a separate issue often concentrated in specific regions.) To put things in perspective, data centers already represent about 1 out of every 20 kilowatt-hours consumed in the U.S., and roughly 1 out of 70 globally; by 2030 those ratios could more than double. This unprecedented growth sets the stage for both challenges and opportunities in the energy sector.
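The "1 out of every N kilowatt-hours" framing above follows arithmetically from the percentage shares quoted earlier in the article (4.4% in the U.S., 1.5% globally):

```python
us_share = 0.044      # U.S. data center share of electricity (2023)
global_share = 0.015  # global share (2024)

print(f"U.S.:   1 kWh in every ~{1 / us_share:.0f}")      # 1 in ~23 (rounded to ~1 in 20)
print(f"Global: 1 kWh in every ~{1 / global_share:.0f}")  # 1 in ~67 (rounded to ~1 in 70)
```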
Under a high-demand scenario, U.S. data centers could approach ~580 TWh by 2028. Rapid growth after 2020 is driven largely by the emergence of AI-oriented hardware (purple segment).