The unsustainable use of electricity in AI development
By Kira Donegan, Photo by Brett Sayles on Pexels
From drafting an email to online “relationships”, artificial intelligence (AI) has seemingly become the new “man’s best friend”. After all, it saves us time and brainpower. But the technology is soaking up another kind of energy.
AI is used in everything from college papers to healthcare services, and many people are concerned about just how reliant we're becoming on it, environmental scientists included. Data centers have a huge impact on the electrical grid, use a ton of water, create chemical waste, and more (often unfairly burdening vulnerable communities). Turning to ChatGPT or another AI assistant can use 10 times more electricity than a Google search.
It doesn’t have to be this way. Rethinking data center infrastructure could drastically reduce the use of energy (the capacity to do work) and power (the rate at which that energy is used). Data centers house the technology infrastructure where AI models are trained, deployed, and delivered to users. Compared with traditional data centers, AI-designated centers contain more powerful computing and IT (information technology) infrastructure, said Peter Cappers, a staff scientist in the Energy Technologies Area at Berkeley Lab.
AI is also developing at a much faster pace than other technologies. In just a couple years, videos have gone from a few glitchy seconds to hauntingly lifelike. The problem is, construction can’t keep up. Data centers alone take two to three years to complete.
This rapidly growing demand for energy and power also has a huge impact on the electricity grid. AI currently uses around 4% of the U.S.’s total electricity. That could soon spike to 12%. “There isn’t another major industry out there, a major industry center like that, that has anywhere near that percentage,” said Ian Hoffman, a technology researcher in the building technology and urban systems division at Berkeley Lab.
Think of the grid as a lake, said Hoffman. It’s only healthy when it takes in as much water as goes out the other end. For a grid, divided in the continental U.S. into the Eastern, Western, and Texas (ERCOT) interconnections, there has to be enough power coming in to meet the demand, or voltage starts dropping. And any change in voltage creates spikes in the electrical cycles, shortening the lifetime of equipment or causing blackouts for users. AI demand is extremely spiky. “If the goal is to really not inhibit the growth of data centers, then this electric issue is going to have to get figured out, and it’s not just ‘build it faster’,” said Cappers.
Improving data centers has become a charged debate among tech enthusiasts, environmentalists, and researchers alike. Hoffman is researching how to get power to chips and cool everything more efficiently. One solution is improving tokenomics: how many tokens, the basic units AI models process, can be produced for a given amount of electricity and power. Although these efficiency gains are achievable, many companies choose to spend them on producing more tokens rather than on relieving the grid. Another potential solution is better managing transients, the short bursts of electrical energy. Either approach reduces spikes and relieves the grid from supplying those sharp peaks.
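The tokenomics idea above boils down to a simple ratio. A toy sketch in Python (all numbers here are made up for illustration, not real data-center figures):

```python
def tokens_per_kwh(tokens_generated: int, energy_kwh: float) -> float:
    """Tokens produced per kilowatt-hour of electricity consumed."""
    return tokens_generated / energy_kwh

# Hypothetical before-and-after an efficiency improvement:
before = tokens_per_kwh(tokens_generated=1_000_000, energy_kwh=50.0)
after = tokens_per_kwh(tokens_generated=1_000_000, energy_kwh=25.0)
print(after / before)  # 2.0: same tokens for half the electricity
```

The tension the article describes is what happens with that 2x: a company can hold token output steady and halve its draw on the grid, or keep drawing the same power and generate twice as many tokens.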
A third solution is cranking up the heat of the water used to cool the data centers, and using outside air for cooling. “The hotter you can run things, the more energy that you generally can save,” said Hoffman. Making the water hotter, around 80 to 90 degrees Fahrenheit, means outside air will typically be cooler than the water and can serve as the cooling system. This reduces reliance on the chillers and air handlers, lowering energy use. Centers can also push their equipment, running it at the highest temperatures it can tolerate. The challenge is developing parts that can handle these temperatures.
Of course, a simpler answer would be developing more efficient code. Reducing numerical precision (fewer decimal places per calculation) to save energy and optimizing the time it takes to pass information between chips are two examples. Imagine the luxury of asking ChatGPT for something and getting a sane answer the first time. Satisfying and sustainable.
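The precision-reduction idea can be sketched in a few lines of Python (a hypothetical illustration, not code from any actual data center): storing numbers at lower precision halves the bytes that must be stored and shuttled between chips, which is a big share of where the energy goes.

```python
import array

# 'd' = 8-byte double precision; 'f' = 4-byte single precision.
values = [0.123456789] * 1_000_000
full_precision = array.array('d', values)  # 8 bytes per number
half_precision = array.array('f', values)  # 4 bytes per number

print(full_precision.itemsize * len(full_precision))  # 8000000 bytes
print(half_precision.itemsize * len(half_precision))  # 4000000 bytes
```

Real AI systems take this further, often running models at 16-bit or even 8-bit precision, trading a little accuracy for large savings in memory traffic and energy.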
The biggest hurdle is how new AI is. No one has the answers. In fact, around 25 states are grappling with how to integrate the technology into the electric grid, with nothing to go off of, said Cappers. As a start, stricter policies and renewable energy have been recommended by organizations like the United Nations Environment Programme (UNEP). The good news is, the future of AI may be uncertain, but it doesn’t have to be unsustainable.

