Understanding And Curbing Generative AI’s Energy Consumption

17:30 minutes

Listen to this story and more on Science Friday’s podcast.

A darkened hallway with data storage units lining the walls and a green glow.
Credit: Shutterstock

The explosion of AI-powered chatbots and image generators, like ChatGPT and DALL-E, over the past two years is changing the way we interact with technology. Their impressive abilities to generate lifelike images from written instructions or write an essay on the topic of your choosing can seem a bit like magic.

But that “magic” comes at a steep environmental cost, researchers are learning. The data centers used to power these models consume an enormous amount of not just electricity, but also fresh water to keep everything running smoothly. And the industry shows no signs of slowing down. It was reported earlier this month that Sam Altman, the CEO of leading AI company OpenAI, is seeking to raise about $7 trillion to reshape the global semiconductor industry for AI chip production.

Ira Flatow is joined by Dr. Jesse Dodge, research scientist at the Allen Institute for AI, to talk about why these models use so much energy, why the placement of these data centers matters, and what regulations these companies could face.

Segment Guests

Jesse Dodge

Jesse Dodge is a research scientist at the Allen Institute for AI in New York, New York.

Segment Transcript

IRA FLATOW: This is Science Friday. I’m Ira Flatow. The explosion of AI-powered chat bots and image generators, like ChatGPT and DALL-E, that explosion is changing the way we interact with technology, right? Their impressive abilities to generate lifelike images from text or to write an essay on the topic of your choosing can seem a bit like magic.

But you know what? That magic comes at a steep environmental cost, researchers are learning. The data centers used to power these models consume an enormous amount of not just electricity but fresh water to keep everything running smoothly. And the industry shows no signs of slowing down. It was reported earlier this month that Sam Altman, the CEO of OpenAI, is seeking to raise $7 trillion– yeah, that’s with a t– $7 trillion to reshape the global semiconductor industry for AI production.

Joining me to talk about how much energy and water these data centers are using and what regulations these companies could face is Dr. Jesse Dodge, research scientist at the Allen Institute for AI. He’s focusing on techniques to increase efficiency with these models, and he is located in New York. Welcome to Science Friday.

JESSE DODGE: Hi. Thanks for having me.

IRA FLATOW: You’re welcome. OK, so why do these AI models use so much energy?

JESSE DODGE: Today, AI systems are run on supercomputers. When I started doing AI research more than a decade ago, I was able to run most of the training and production of AI systems on my laptop. But what we’ve seen over the last decade is this dramatic increase in computational cost that’s led to today’s environment, where most of the AI that you’ll interact with online is run in a data center and primarily run on GPUs.

These GPUs are often very specialized hardware that are pretty energy intensive. So people don’t often think about it, but when you type a query to a chat bot or search for something online, that now has a significant energy consumption and, therefore, environmental impact.

IRA FLATOW: You’re sounding just like what we used to say and still do about cryptomining, making Bitcoin. It sounds like the sort of same thing is happening.

JESSE DODGE: Yeah, that’s right. So I think a key difference here between crypto and AI is that AI has become sort of enmeshed in everyday life. And so many people are actually using it to help them with creative writing tasks or to generate an image, let’s say. Crypto does use similar hardware. It also uses GPUs to mine cryptocurrency.

But it hasn’t seen quite the same wide adoption. And so we’ve got some measurements about the electricity consumption and the environmental impact of hardware used to mine cryptocurrency. But the rate of growth of AI, because it’s become so widely adopted, has perhaps even exceeded that of cryptocurrency.

IRA FLATOW: Wow. Wow. OK, so let’s talk about the environmental impacts of AI. What are they?

JESSE DODGE: So as I said, the hardware that’s used to power these AI systems is pretty energy intensive. And a lot of our energy, depending on what region you’re in around the world, is generated from fossil fuels. So according to some recent reports, about 2% of electricity consumption in the US in 2022 was just from data centers. And data centers power all kinds of things, but AI is certainly the fastest-growing portion of that.

And those data centers, they do exist in areas that have some renewable sources of energy. They’re often built close to, let’s say, hydroelectric power or wind farms. But it’s also the case that data centers are always going to be built next to population centers, and those are primarily going to be powered by fossil fuels.

We’ve done a little bit of work now on getting the first measurements around electricity consumption, but there’s a lot more transparency that’s still needed. And one area where we have almost no transparency today is on embodied carbon, which is a measurement of all of the emissions that come from building the hardware itself. So that’s going to be things like mining for the rare earth minerals and then shipping those minerals from one country to another, where they’re manufactured into a GPU, and then shipping that GPU from where it’s manufactured to a place where it’s going to be put into a data center.

We’ve only got measurements now of the emissions from electricity consumption and a little bit about water consumption. One of the most important areas where research can make progress in the next one to two years, I think, is getting some measurement of this embodied carbon for all of the hardware that we use, because by the time AI systems are running in the data center, there’s already been a tremendous environmental impact just to set that up.

But how much? We don’t know yet.

IRA FLATOW: I know from visiting these centers and I know from using electronic devices they get pretty hot, right? And you need a lot of cooling water to cool these things down.

JESSE DODGE: Yeah. Again, a key thing to keep in mind here is that the location of the data center really matters, and that hardware does need to be kept at a really constant temperature. All data centers need a lot of water, and some data centers are in locations that have access to lots of pure water. And some are in areas, for example, like the Southwest US, that just are already going through a drought or don’t have significant access to water. And they’re competing with other uses of fresh, clean water.

IRA FLATOW: And the companies, have they been keen to disclose all these numbers about the energy and the water they’re using?

JESSE DODGE: That’s a great question. There is some improvement in transparency in the last year or two. But again, this is really the key. Right now, there is significant consumption, as I just said, but we’re expecting this to grow. And unless we start to get improved transparency now, this is expected to get very out of hand.

And it really isn’t in the big tech companies’ interest to be transparent about some of these downsides, right? They’re trying to promote AI as this transformative technology that can really benefit humanity, and they’re glossing over the fact that there are some trade-offs here. Like, yes, AI can be this transformative technology. But powering it, and supplying the necessary water, and even building the data centers themselves, all of that has measurable impacts.

IRA FLATOW: Right. And we the public have a right to know about this, don’t we?

JESSE DODGE: Oh, absolutely. And even people that are using AI and using, let’s say, cloud computing, they have some ability to choose, let’s say, the location of the data center that they want to run their AI in. I’m an AI researcher, and so when I’m doing my research, I can choose which data center around the world I use. And vitally, I need access to the information about which data centers have more or less emissions or more or less water usage to be able to minimize that impact.


JESSE DODGE: And that transparency there, where the providers of the cloud computing, for example, they should provide that information to me so I can make that choice. And they’re just not doing that today.

IRA FLATOW: Well, that’s good that you bring this up because there is a concrete example about transparency. I’m talking about what happened in the city of The Dalles in Oregon and Google’s data centers there. Tell us about that.

JESSE DODGE: Yeah, so in The Dalles, there’s some additional data centers being built by Google, and the water consumption from those data centers, according to some recent reports, has nearly tripled in the last five years. Those data centers now consume more than a quarter of all of the water used in the city, which is kind of a wild statistic.

IRA FLATOW: Wow. That is. And isn’t it true that the city filed a lawsuit to keep Google’s energy and water use secret? They were using so much.

JESSE DODGE: Yeah, that’s right because, again, as you say, there’s just not much incentive for improved transparency along these topics. And internally, at these companies, they are certainly keeping track of, let’s say, water use, electricity consumption, CO2 emissions, and so on. But they recognize that it’s not exactly what makes them look like an environmentally friendly company, and so they are doing their best to not provide transparency.

And I can also talk a bit about some other pretty dramatic environmental impacts. So for example, there was a power station in Virginia that was planned to be shut down. It was using coal to generate electricity. And the increase in energy consumption from data centers in that region of Virginia has kept that power station online. They’re continuing to burn coal to meet that increase in demand, even after they planned to shut down that coal power plant.

IRA FLATOW: So what kinds of policies are being introduced to curb AI energy use?

JESSE DODGE: Yeah, that’s a great question. I spent last week in Washington DC talking to people in the House and the Senate about some recent legislation and research on trying to provide transparency around exactly these topics.

IRA FLATOW: Let me jump in there. I learned that, on February 1, a new bill called the Artificial Intelligence Environmental Impacts Act of 2024 was introduced, and Senator Markey was one of the sponsors.

JESSE DODGE: That’s right. And one of the key ideas in this bill is to provide some standard formatting and standard best practices around how to report a lot of this information that we’re talking about.

IRA FLATOW: So it’s like a modern environmental impact statement sort of thing.

JESSE DODGE: That’s right. That’s right. And right now, this legislation is more focused on providing some, let’s say, guidelines and best practices and encouraging transparency. But going forward, I think we do have to recognize that providing transparency needs to be done across the entire industry. And just suggesting or providing a framework for how to be transparent is the first step, yes, but we expect that a better, more impactful piece of legislation might require transparency much more broadly.

IRA FLATOW: Well, let’s look at a possible good side of this, some hopeful news. Is it possible to have these data centers powered entirely by renewable energy?

JESSE DODGE: Yeah, that’s a great question. There’s definitely some work in pairing data centers with renewable sources of energy. And again, this is a great area where the large tech companies are making some progress. Of course, the amount of electricity consumption from these data centers directly impacts their bottom line, and so they’re pretty motivated to reduce electricity consumption and, therefore, reduce overall emissions.

A key idea here is that there’s really two considerations when choosing the location for a data center. And those two considerations are, one, access to the resources they need, like renewable energy, which is often quite inexpensive, and water, and so on. That’s one consideration.

And the other consideration is that sometimes, for some applications, data centers really do need to be immediately next to population centers. So these are always going to be two tensions around where data centers are built. And for that first one, it is the case that the large tech companies are going to build data centers next to renewable energy, and maybe they will also, for example, build a wind farm. Or I’ve heard some reports that they’re working on, let’s say, a small nuclear reactor that doesn’t have emissions and could power a data center.

On the other hand, it’s always going to be the case that some data centers need to be built near every city. And we’re not going to be putting, let’s say, a bunch of windmills or a small nuclear reactor next to a large city here in the US.


JESSE DODGE: And so here, again, I’ll just reiterate that having transparency around the consumption of electricity, and water, and so on, that’s vitally important and will continue to be important, even if we do have some data centers that are powered more by renewables.

IRA FLATOW: Right. There’s always the argument every time something like this comes up– you know there’s going to be a technology breakthrough that’s going to solve our problems. Is there an efficiency breakthrough on the horizon that seems promising?

JESSE DODGE: We’ve been working on efficiency in artificial intelligence for more than a decade. This has become a standard area of research that’s published at our conferences. And what has really shown to be true and what a lot of people recognize in AI research and across industry is that progress in the last few years has at least partially been driven by increasing the size of our models, increasing the size of our data, and increasing the computational cost of running these AI systems.

And so over the last maybe seven or eight years, we’ve seen a more than 300,000-fold increase in the computational cost of expensive AI systems. That trend has not slowed down. I don’t think that we will see significant advancements that make at least the most powerful, most capable AI systems a lot more efficient.

I don’t think anyone expects that. We’d love to see that. But very consistently, instead, we’ve seen the opposite, that larger systems that are more expensive to train have been more capable and so been what we end up using.

IRA FLATOW: There’s an argument in the field that the application of AI in our society can be more impactful than the emissions, sort of like we should accept it because it’s going to be better for us. Can you explain that?

JESSE DODGE: Yeah. So I think there is some merit to this idea, that the application of the AI itself can be incredibly impactful. AI is an accelerant. People can use AI to do what they’re trying to do faster and better.

Some of those applications can mitigate some of the harms of climate change. So at the Allen Institute, for example, we’ve got some teams working on using AI to model climate change. We’ve got some teams working on using AI to track and model wildfires or endangered species. These are applications of AI that we support and we think are overall beneficial. And of course, as with any application of AI, they do use electricity and that same hardware and, therefore, have CO2 emissions.

On the other hand, there are other applications of AI that can exacerbate climate change. So if someone’s using AI for, let’s say, increasing the rate of oil extraction, that’s an application that can potentially accelerate climate change. And again, AI here is just the accelerant. It’s a general purpose tool. And so we hope that the development of AI will lead to more positive applications, but we have to recognize that the application, whatever it is, is potentially more impactful than the emissions from the AI itself.

IRA FLATOW: I say this very seriously. Could we ask AI to solve its own problem?

JESSE DODGE: AI, historically, was a very targeted tool: it would typically be trained to do a single thing, and it would do that quite well. Now we have things like chat bots which are pretty good at doing many things, but they’re not excellent. They’re not replacing people’s jobs quite yet.

So I think here these problems are complicated and pretty serious. And so our expectation is that people need to work on them. And AI, again, can be a tool that we use along the way. But it, so far, has not been successful in solving, let’s say, the problems around efficiency, or where to place a data center, and so on.

IRA FLATOW: Are you as fearful of AI as some people are?

JESSE DODGE: Do you know what? I’m not, and I have to say– so I did spend all of last week in Washington DC. And I was so thankful to hear that a lot of the people I spoke to had gone on a similar trend: a year ago, two years ago, they had heard from, let’s say, people in what’s called the effective altruist movement that AI has this risk of taking over the world or getting access to nuclear weapons.

And over the past year and a half, they had said to those people repeatedly, provide us any evidence of this, any example of AI actually doing those things. And when they couldn’t, eventually a lot of the staffers in DC, and the senators, and so on, said, we’re just not worrying about that right now, which I think is the right choice.

So to me, those are problems that might exist someday. But there are real problems today that we do need to account for, like job displacement and the environmental impact.

IRA FLATOW: Great. Great answers to our problem here. Jesse, thank you for taking time to be with us today. This is something we all have to keep an eye on, for sure.

JESSE DODGE: Yeah, absolutely.

IRA FLATOW: Dr. Jesse Dodge, research scientist at the Allen Institute for AI.

Copyright © 2023 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/

Meet the Producers and Host

About D. Peterschmidt

D. Peterschmidt is a producer, host of the podcast Universe of Art, and composes music for Science Friday’s podcasts. Their D&D character is a clumsy bard named Chip Chap Chopman.

About Ira Flatow

Ira Flatow is the host and executive producer of Science Friday. His green thumb has revived many an office plant at death’s door.