The Wall Street Journal (WSJ) recently published an article entitled “Big Tech’s Latest Obsession Is Finding Enough Energy”; the subhead states that “the AI boom is fueling an insatiable appetite for electricity.” This piece is just the latest in a growing chorus of articles about the potential energy demand of AI, focusing on how AI could require the expansion of fossil fuel energy (mostly natural gas) or otherwise harm progress on decarbonization. Much of this analysis is flawed, however; the WSJ article is in many ways better than most but still manages to miss the mark. The argument goes something like this:
- AI will require a huge amount of power.
- The growth of AI-linked data centers and energy demand represents a departure from past growth patterns.
- Companies are turning to fossil fuel sources to meet demand because of these new growth patterns.
The basic flaw comes in point 2: the assumption that AI will represent a major turning point for energy demand growth. Let’s break this down. First, it’s worth setting a baseline on energy demand growth from data centers: you may be surprised to learn that despite the huge expansion of digital tech into every area of our lives over the last decade, data center energy usage has been almost flat over that same period, even as the number of data centers and the amount of computing power have kept growing. This is largely due to improved energy efficiency in chips, software, and the data centers themselves — there’s been a major shift to hyperscale centers, which are more energy efficient. Note that cryptocurrency mining is in its own category — Bitcoin alone consumes more than 140 TWh annually and has added around 100 TWh of demand in the past few years — but non-crypto data center energy demand has grown very modestly. So, does AI fit within the existing demand growth paradigm, or is it another cryptocurrency, set to make a huge impact on energy demand?
There’s a lot of evidence that AI is more likely to fit within existing demand growth patterns. Consider Google search: Google serves about 8.5 billion searches per day — but what if it wanted to integrate an AI response with each one of those searches, as it has already started doing? Querying ChatGPT likely consumes around 3–4 Wh of electricity; we’ll call it 5 Wh to be safe. Factoring in training, adding a query to a similar large language model (LLM), such as Google’s Gemini, for each search would add around 20,000 GWh of energy demand, roughly doubling Google’s total energy consumption. That’s a lot! This is the analysis that underpins a lot of the scaremongering about AI energy demand.
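To make the arithmetic behind that estimate explicit, here’s a minimal back-of-envelope sketch; the search volume and per-query figures are the rough assumptions quoted above, not measured values:

```python
# Back-of-envelope estimate: attach one LLM query to every Google search.
# All inputs are rough assumptions from the discussion above.

searches_per_day = 8.5e9   # ~8.5 billion Google searches per day
wh_per_query = 5           # generous per-query cost in Wh (estimates run 3-4 Wh)

inference_gwh_per_year = searches_per_day * wh_per_query * 365 / 1e9
print(f"Inference alone: ~{inference_gwh_per_year:,.0f} GWh/year")  # ~15,500 GWh

# Adding a rough allowance for amortized training cost lands near the
# ~20,000 GWh figure cited above, on the order of Google's entire
# annual electricity consumption.
```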
Leaving it there misses the mark, however. First, the models themselves are getting a lot more energy efficient: LLMs optimized for energy efficiency have already demonstrated 10× improvements in energy demand per query; factoring this in brings the total energy consumption of Google search LLM integration down to about 2,000 GWh, even allowing for future demand growth for search. This 2,000 GWh of incremental energy consumption is well in line with Google’s trend of doubling its energy consumption every three years or so.
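Applying that assumed 10× per-query efficiency gain to the same sketch:

```python
# Same estimate with an assumed 10x improvement in energy per query.

naive_gwh = 20_000      # the unoptimized estimate from above
efficiency_gain = 10    # assumed 10x per-query efficiency improvement

optimized_gwh = naive_gwh / efficiency_gain
print(f"Optimized: ~{optimized_gwh:,.0f} GWh/year")  # ~2,000 GWh

# Doubling every ~3 years works out to roughly 25% annual growth, so a
# ~2,000 GWh increment sits comfortably inside that existing trend.
```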
On top of the energy efficiency improvements in the models, there are also plenty of real-world reasons to think energy demand growth won’t be that significant. Things like Google searches don’t need to be re-run from scratch every time, since many common searches are repeated and their results can be reused. Many large-volume tasks will end up using smaller, purpose-built, custom-optimized models that don’t need as much energy per query. And AI simply doesn’t have the extremely anti-efficiency architecture of Bitcoin mining, which gets programmatically less energy efficient over time — AI is the opposite, as developers are highly incentivized to minimize computational demands to optimize their own costs and limited GPU time.
It’s not clear to me that AI will expand demand for these services either — demand growth for web search has already slowed down to a modest 5% annually. AI-generated photos and videos are kind of the wild card here, but our estimates show that it would take a huge amount of image generation to really move the needle on total data center energy demand. The WSJ takes for granted that AI will continue to grow and be adopted, despite the fact that there’s no clear plan on how to turn a profit from large language models. LLMs could just turn out to not make money! Energy problem solved in that case.
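For a rough sense of scale on image generation (the figures below are illustrative assumptions, not the estimates referenced above):

```python
# Illustrative only: how many generated images would it take to add 10 TWh
# of annual demand, assuming roughly 3 Wh per image? Both numbers are
# assumptions chosen for illustration, not measurements.

wh_per_image = 3   # assumed per-image generation cost, in Wh
target_twh = 10    # a few percent of global data center electricity use

images_per_year = target_twh * 1e12 / wh_per_image
print(f"~{images_per_year / 1e12:.1f} trillion images/year")      # ~3.3 trillion
print(f"~{images_per_year / 365 / 1e9:.0f} billion images/day")   # ~9 billion
```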
What the WSJ does get right, however, is point 3: companies are increasingly turning to fossil fuels to support demand growth. But this doesn’t have much to do with AI. As the article points out, the U.S. is coming out of a long period of electricity demand stagnation; thanks to new government incentives for electrifying homes, transportation, and industry, and for more domestic manufacturing, demand for electricity is now surging. Utilities that have been in maintenance mode for a decade are now scrambling to add capacity, even as some older fossil fuel generation needs to be phased out. The crunch is further compounded by the long lead times needed to connect renewable energy to the grid in America, and by the concentration of industry (including data centers) in a handful of regions, which heightens the issue for certain utilities. A data center can be built in just a few years, a timeline that renewables currently can’t match (mostly due to permitting), which makes natural gas attractive.
What we need are policies that make it easier to add renewable energy to the grid for any application, including other industrial applications. Permitting reform in the U.S. would be a good place to start. In addition, we need to focus not on AI’s energy footprint, but on the other forms of waste from data centers, including water consumption and the Scope 3 emissions and e-waste created by building and equipping those data centers. Also, ban cryptocurrency mining! Crypto is already what people are afraid AI will become — the International Energy Agency notes that electricity demand from crypto will be far larger than from AI at least through 2026 — and crypto is a net negative for society. There are plenty of potential benefits to AI, including its ability to help optimize energy consumption or reduce manufacturing waste; it’s entirely possible AI will be net-neutral on carbon emissions.