There’s an enormous buildout of data centers underway across the country to fuel the AI boom. Hundreds of billions of dollars have already been spent on data centers, with talk of spending trillions more. And these data centers use a lot of power: According to the Times-Picayune, Meta’s new data center under construction in Louisiana will require nearly three times the power that New Orleans uses in a year. Residents across the country have taken note, and rising utility rates have become an issue in some recent elections.
Casey Crownhart, senior climate reporter at MIT Technology Review, has been studying the costs and impacts of the data center boom. She joins Host Ira Flatow for an update on the latest.
Segment Guests
Casey Crownhart is a senior climate reporter for MIT Technology Review in New York, New York.
Segment Transcript
[MUSIC PLAYING] IRA FLATOW: I’m Ira Flatow, and this is Science Friday. Let’s turn to a topic that’s sucking all the air out of the room, and I mean electricity, and water, too, actually. I’m talking, of course, about the huge buildout of data center construction across the country for the AI boom. Hundreds of billions have been spent already, with talk of trillions more.
And according to the Times-Picayune– listen to this– Meta’s new data center, under construction in Louisiana, requires nearly three times the power that New Orleans uses in a year. This has not gone unnoticed. Rising utility rates nationwide have become issues in some recent elections. Casey Crownhart, a senior climate reporter at MIT Technology Review, has been studying the costs and impacts, and she’s back here to fill us in. Welcome back to SciFri, Casey.
CASEY CROWNHART: Thanks so much for having me.
IRA FLATOW: Can you give us a sense of the scale of this buildup? Because the last resource drain like this that comes to my mind was crypto mining, but this seems even bigger.
CASEY CROWNHART: Absolutely. I think the money and the electricity that we’re seeing talked about– the scale of these numbers is eye-popping. So according to the International Energy Agency, data centers accounted for about 1.5% of the world’s electricity consumption in 2024, and that’s set to double by 2030. We’re seeing so many data centers get built out.
One number that really struck me, from a recent report, was that about $580 billion was invested globally in AI data centers in 2025. That’s more than the $540 billion spent on developing the global oil supply.
IRA FLATOW: Wow.
CASEY CROWNHART: So this year, we spent more on data centers than the oil supply.
IRA FLATOW: Wow. A lot of people may not realize that when they ask a question of ChatGPT or Gemini, it actually contributes to this. It uses up electricity. We had you on in the summer to talk about some reporting you did on just how much energy these models use, and you told us the data was dodgy then. Do we have any more clarity on how much energy is being used?
CASEY CROWNHART: Yeah. So since we came up with our own estimates, working with leading researchers in this area, a couple of companies have come out with estimates of how much energy each query, or each question you ask, to one of their models will use.
And it turns out we were in the right region. So Google came out with an estimate that says the average query to its Gemini model uses about 0.24 watt-hours of electricity. I like to put things in microwave seconds, so that’s about the same as a second in the microwave. OpenAI came out with its own estimate for ChatGPT, and it’s in the same range, about 0.34 watt-hours. So basically, for the individual queries or individual questions you’re asking, it’s a not insignificant amount of energy, but it’s small.
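A quick back-of-the-envelope check of that microwave comparison, assuming a typical microwave draws roughly 1,000 watts (the wattage is an assumption, not a figure given in the segment):

\[
0.24\ \text{Wh} \times 3600\ \tfrac{\text{J}}{\text{Wh}} \approx 864\ \text{J},
\qquad
\frac{864\ \text{J}}{1000\ \text{W}} \approx 0.9\ \text{s},
\]

which puts a Gemini query at roughly one second of microwave time; OpenAI’s 0.34 watt-hour figure works out to a little over a second.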
IRA FLATOW: But they don’t say the total energy used for all of its AI activities, right?
CASEY CROWNHART: Exactly. And so that’s something that I’ve been talking about a lot this year, is that it’s great that we’re starting to get these estimates for individual queries, but that doesn’t really give us the full picture. We’re seeing billions and billions of queries from people every day. And so this is all adding up to a lot.
IRA FLATOW: Yeah. Now, we’ve been talking about electricity, but these data centers use a lot of water to cool those chips. And most of them are built in pretty dry areas like Arizona and Nevada. Why is that?
CASEY CROWNHART: Part of the reason is that in a lot of cases, these companies are looking for places with plentiful land and cheap energy. That’s what they’re really looking for when they’re trying to find a place to build a data center. So like you said, a good amount of this activity is happening in Nevada, Arizona, Texas. Two-thirds of the new data centers in development since 2022 are in water-stressed areas.
IRA FLATOW: Yeah, and how much of this is drinking water here? And what happens to that water after it’s used for cooling the servers?
CASEY CROWNHART: It depends on what the water is being used for. So when we’re talking about water consumption for AI, a lot of it is actually what’s called indirect use, so the water that’s used at the power plants that are actually running the data centers. Some estimates say that over 60% of the water consumed when we’re talking about AI is from power plants. So it depends on what segment of this you’re talking about. Some power plants are able to use treated water or something.
But when it comes to the actual data centers and the water that they’re using to keep their machines cool, a lot of them do need to use drinking-quality water because when they’re doing this evaporative cooling they want to avoid clogging their pipes, bacterial growth. It’s very sensitive equipment.
IRA FLATOW: Right. Can you give us an idea of how much water we’re talking about here?
CASEY CROWNHART: It’s moderate when you look at it overall. So one report found that data centers account for about 0.3% of the nation’s total water use. But again, that’s set to double between 2023 and 2030. It’s small when you look at the big picture, but it can very much strain local water supplies. In some cases, a single data center can use more water than all the homes in an entire county.
IRA FLATOW: Are the big companies, the big chip companies that are using the water– are they sensitive to this and looking for more efficient ways?
CASEY CROWNHART: Absolutely. There’s a lot of research and a lot of really interesting work being done in cooling, and there are different ways to cool data centers that use less water. Today, a lot of data centers are cooled with what’s called evaporative cooling: basically, you let water evaporate, and that cools down the equipment. The obvious downside is that you lose a lot of the water you’re pulling from local resources.
So there are other techniques, so something like direct liquid cooling, where you have a coolant circulating directly through the servers. There’s also immersion cooling, where servers are submerged in some sort of fluid to help keep them cool.
So there are a lot of interesting alternatives. Some of them, at least right now, tend to have some sort of downside. They’re either more expensive or, in some cases, more electricity intensive; they might use as much as 10% more energy than evaporative cooling. So it’s a trade-off. But a lot of companies are sensitive to this, especially as we’ve seen this public outcry.
IRA FLATOW: I can understand that, because doesn’t Microsoft use something like 8 million gallons of water a year in their chips?
CASEY CROWNHART: It’s absolutely bonkers. And another aspect that we haven’t really talked about, and this is a niche part of this, is you also need very, very high purity water to actually make the chips. And so, again, when we’re talking about the whole picture of water use by AI, it’s not just water being used to cool power plants, water being used to cool data centers. There’s also an element of just making the chips, and that can be up to 10% of the total water use.
IRA FLATOW: Let’s get into some of the politics of this, because the recent governors elect for New Jersey and Virginia both campaigned on lowering utility rates. Now, Virginia has the largest concentration of data centers in the world. So what kind of pressures are states facing as more tech companies are trying to build these?
CASEY CROWNHART: Yeah, it’s been a really interesting conversation, especially around the elections this year. I’m based in New Jersey, and so I saw so many commercials– one side or the other saying, oh, they’re going to raise your electricity rates.
So I think that as people continue to see prices go up across the board, I think there’s even more sensitivity to this. And so people are saying, well, you’re coming in and raising my electricity rates. And so we’re seeing a lot of projects blocked. One recent report from Data Center Watch found that from just March to June, about $93 billion worth of projects were either delayed or canceled because of community pushback.
IRA FLATOW: What happened to all those climate pledges that Google and Microsoft and others were promising a few years back?
CASEY CROWNHART: Yeah, this is a great question. So some tech companies– it’s hard to paint with a broad brush. But some tech companies have stopped reporting as much or backed off. Google does say that they’re still on track for their 2030 net zero energy goals.
But I think that, again, in this AI build out, they’re really chasing a moving target. We’re seeing, as they need so much electricity, even the companies that are doing really great work and procuring a lot of wind and solar, helping support new advanced clean energy technologies– it’s just a huge challenge when they’re seeing their energy demand increase off the charts.
IRA FLATOW: Yeah, because they were talking about using renewables for this, or possibly even nuclear. I know Microsoft agreed to buy all the power from the planned restart of Three Mile Island. So we’ll have to see how that works out.
CASEY CROWNHART: Absolutely. I think it’s going to be really interesting to see how these longer-term plays end up working out, especially, like you said, that this nuclear build out, the efforts to reopen shuttered plants– it’s such a long-term play. It takes years even just to reopen a nuclear plant. And you can build a data center much more quickly than that.
And so I think that’s one of the fundamental challenges here, is that AI has become part of the daily lives of people so quickly. And so companies are racing to keep up with that and continue this growth curve. And building energy– our energy system is complicated. It takes a while to build. And so that’s one of the fundamental challenges that I’ve been thinking about a lot this year.
IRA FLATOW: And that’s the existential challenge to me, how much people are going to depend, and already are depending, on AI. Are we going to face the same kind of trade-off we made with other energy-hungry lifestyles, like fossil fuel use, deciding whether we’re willing to trade water and the climate for a new lifestyle we can’t live without?
CASEY CROWNHART: One thing I’ve been thinking about a lot this year is that when we talk about how much energy or water ChatGPT, or whatever model you want to name, uses, the conversation often ends with: you should just not use it, you should just avoid it. And my message for people is, sure, I don’t want to say go make as many generative videos as you want.
But increasingly, it’s not really a personal choice. This is part of our digital infrastructure. You’re getting AI suggestions with your Google searches. You’re getting ads served to you that may have been generated using AI. So I think this is overall a systems conversation we need to be having, rather than a conversation about personal choices and personal use.
IRA FLATOW: Good point, Casey. You always bring good stuff to us. Thank you for taking time to be with us today.
CASEY CROWNHART: Thanks so much for having me.
IRA FLATOW: Casey Crownhart, senior climate reporter for MIT Technology Review.
[MUSIC PLAYING]
This episode was produced by Dee Peterschmidt, but a lot of people helped make the show happen this week, including–
JOHN DANKOSKY: John Dankosky.
DANIELLE JOHNSON: Danielle Johnson.
BETH RAMME: Beth Ramme.
JACI HIRSCHFELD: Jaci Hirschfeld.
IRA FLATOW: Thank you, folks. I’m Ira Flatow. Thanks for listening.
[MUSIC PLAYING]
Copyright © 2025 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Meet the Producers and Host
About Dee Peterschmidt
Dee Peterschmidt is a producer, host of the podcast Universe of Art, and composes music for Science Friday’s podcasts. Their D&D character is a clumsy bard named Chip Chap Chopman.
About Ira Flatow
Ira Flatow is the founder and host of Science Friday. His green thumb has revived many an office plant at death’s door.