AI: The cause of, and solution to, all of life’s (energy) problems
The Hidden Energy Costs of AI
Without question, generative AI models (aka large language models, aka LLMs) are the next defining technology of our time. Whether you need an impartial friend to bounce ideas off of, a snippet of code, a book recommendation or a new cocktail recipe, AI is there for you, chipper AF and with all the answers (even wrong ones). Yet this newfangled digital wizardry is not without its costs: the average ChatGPT question uses roughly 10x the electricity of a typical Google search query. It’s not enough that AI is taking all the jobs; it hungers for our energy as well. How will I air fry my vegetables?
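To put that 10x figure in household terms, here’s a quick back-of-envelope sketch. The per-query numbers are rough public estimates I’m assuming for illustration (about 0.3 Wh per Google search and about 3 Wh per ChatGPT query), not measured values:

```python
# Back-of-envelope comparison of per-query energy use.
# These figures are assumed rough estimates, not measured values.
GOOGLE_WH_PER_QUERY = 0.3   # assumed: ~0.3 Wh per Google search
CHATGPT_WH_PER_QUERY = 3.0  # assumed: ~10x Google, per the text

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY

# Scale to a year of heavy personal use: 50 queries per day.
queries_per_year = 50 * 365
extra_kwh = (CHATGPT_WH_PER_QUERY - GOOGLE_WH_PER_QUERY) * queries_per_year / 1000

print(f"ChatGPT uses ~{ratio:.0f}x the energy per query")
print(f"Switching 50 daily searches to ChatGPT adds ~{extra_kwh:.1f} kWh/year")
```

Under these assumptions, a heavy user switching entirely to ChatGPT adds on the order of 50 kWh per year — real, but a rounding error next to an air fryer habit. The story only gets scary at the scale of billions of queries.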
Why AI Workloads Are So Power Hungry
Goldman Sachs projects that global data center energy consumption will double by the end of the decade, reaching 3-4% of total global power demand! The good folks at SemiAnalysis.com have modeled closer to 4.5% by 2030. Roughly 90% of this increased consumption will be driven by AI-related data center activity (see graph below). While society’s data needs have grown dramatically over time, that growth has been fairly steady. With the introduction of LLMs, things are shifting dramatically due to two main factors.
First, LLMs are built on neural nets: essentially a digital brain consisting of many digital neurons (or nodes), through which training data is passed many times until connections form that help the program find patterns. To be reductive, LLMs are this same process on steroids. Training these models requires weeks or months of data processing, and these workloads are extremely power hungry (and that’s before accounting for the power required by the inference phase, i.e., actually answering your questions).
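That training loop can be shown in miniature. Here’s a toy sketch (my own illustrative example, not any real LLM’s code) of a single “digital neuron” learning the logical OR pattern by passing the same four examples through thousands of times, nudging its connection weights each pass — multiply the node count by billions and the data by trillions of tokens and you get the power bill:

```python
import numpy as np

# One "digital neuron" learning logical OR via repeated passes (epochs)
# over the training data, strengthening its connections (weights).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0, 1, 1, 1], dtype=float)                      # OR targets

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # connection weights, randomly initialized
b = 0.0                  # bias term

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Pass the data through 5000 times, adjusting connections each pass.
for epoch in range(5000):
    pred = sigmoid(X @ w + b)                # forward pass
    error = pred - y                         # how wrong are we?
    w -= 1.0 * (X.T @ error) / len(y)        # nudge weights (lr = 1.0)
    b -= 1.0 * error.mean()                  # nudge bias

print(np.round(sigmoid(X @ w + b)))  # → [0. 1. 1. 1.]
```

Every one of those passes is arithmetic that burns watts; an LLM does the analogous thing across billions of weights for weeks on end.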
Second, these training sessions push AI hardware much closer to its thermal design power (TDP), the upper limit of heat the system can shed before it overheats (basically it’s like that guy at the gym going uncomfortably hard, making noises that no one asked for). All of this means that AI-focused data centers need a lot of reliably generated power that scales quickly. In the past, power utilities typically received one or two large customer power requests per year (e.g., exceeding 20 MW). However, according to Brian Janous, currently the CEO of Cloverleaf Infrastructure and formerly the head of energy strategy at Microsoft, utilities are now fielding such requests once or twice per week!
Green Opportunities for Decarbonization
These numbers sound big, and to be clear, action is absolutely necessary, but there is also enormous opportunity here. On the demand side, companies like Google have committed to achieving net-zero emissions by 2030, and Microsoft has said it will be carbon-negative (i.e., actually removing carbon from the atmosphere) by then. On the supply side, there is also decarbonization pressure: the Biden Administration set a goal of a completely carbon pollution-free electricity grid by 2035 and full net-zero emissions by 2050 (yes, I’m aware of the casserole of incompetence and narcissism that is set to inhabit the White House). Here’s where things get exciting, though. Take a look at the two charts below (as provided by this Epoch AI report). In the first, we see that the cost of training these LLMs is rising rapidly (that Y-axis is in log scale so…yeah…10x for each increment) as each company tries to cram more data into its model and push the limits of its hardware.
But, more importantly, look at the second chart, which breaks down the costs of training the major AI models. Energy is a fairly trivial 2-6% of the overall cost. Since tech giants want guaranteed green power supplies and can afford premium rates, they are an ideal funding source for smaller utilities facing costly grid decarbonization projects. Let’s do this thing already!
The issue here is that the tech companies need this capacity today, and most power plants require years to build out and bring online. Still, folks are getting creative. In June of 2024, Google signed a deal with NV Energy to build out 115 MW of geothermal energy in Nevada. In October of 2024, Google and Amazon announced investments in small modular nuclear reactors (another cool topic I’ll cover in a future article) to support their data center expansion. That was only a couple of weeks after Microsoft announced a deal with Constellation Energy to restart a reactor at Three Mile Island to power its data center plans. Additionally, AI chip efficiency has doubled every 2.5 to 3 years, with modern processors using just 1% of the power required by their 2008 counterparts for equivalent computations.
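That efficiency claim is easy to sanity-check with compound-doubling arithmetic. Assuming a 2008-to-2024 window (my assumption for “modern”), doubling efficiency every 2.5 to 3 years implies:

```python
# Sanity check: if chip efficiency doubles every 2.5-3 years, what
# fraction of a 2008 chip's power does a modern chip need for the
# same computation? (Illustrative arithmetic; 2024 endpoint assumed.)
years = 2024 - 2008  # 16 years of improvement

for doubling_period in (2.5, 3.0):
    doublings = years / doubling_period
    relative_power = 1 / 2 ** doublings
    print(f"Doubling every {doubling_period} yrs: "
          f"{relative_power:.1%} of 2008 power")
```

At the faster 2.5-year pace that works out to roughly 1.2% of the 2008 power draw, so the “just 1%” figure lines up with the optimistic end of the doubling range (the 3-year pace gives closer to 2.5%).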
Innovative Policies To Accelerate Green Energy Generation
Still, we need to do more: policies that bring folks to the table, accelerate build-outs, cut red tape and hold these companies accountable for their climate commitments. One option is for utilities to offer energy tariffs like Duke Energy’s Accelerated Clean Energy (ACE) tariffs, essentially an agreement in which a large energy customer commits to a higher, but stable, energy rate for a fixed period of time. This extra money is dedicated to building out the additional clean energy generation and transmission needed to supply the data center. This serves several functions: first, it requires that the energy capacity is ‘additional’ (i.e., the data center isn’t just consuming existing clean energy). Second, the higher costs are paid by the tech company, not pushed onto local residents. Third, these agreements guarantee demand and therefore lower the risk of building out clean energy projects. Finally, the participants typically see long-term savings because they are more insulated from overall energy price volatility.
Another policy option is interconnection fast tracks: utilities could expedite grid connections for tech companies that agree to fund interconnection studies, finance transmission upgrades, or install on-site renewable energy and storage systems. These fast tracks standardize the permitting and connection process, creating frameworks that others can reuse to rapidly add clean energy capacity to the grid, which should benefit all customers over time.
Turning AI's Energy Appetite Into A Greener Future
Generative AI is reshaping our world, offering transformative possibilities but demanding significant energy resources in return. While the challenges are clear—soaring energy demands, complex grid upgrades, and pressing decarbonization goals—there’s immense opportunity for innovation. Companies like Google, Microsoft, and Amazon are stepping up with investments in clean energy projects, showing what’s possible when sustainability and technological advancement align. Policies like clean energy tariffs and interconnection fast tracks can accelerate progress, ensuring that AI growth drives renewable energy adoption rather than environmental strain. By fostering collaboration between tech giants, utilities, and policymakers, we can turn AI’s energy appetite into a force for a greener, more sustainable future.