Forty years ago this week, the space shuttle Challenger exploded in flight, 73 seconds after liftoff from the Kennedy Space Center. All seven crew members were killed. In the months that followed, the tragedy was traced to a failed O-ring in one of the shuttle’s rocket boosters. Now, with the Artemis II mission preparing to launch on a flight around the Moon, what have we learned about spaceflight and risk?
Former astronaut Jim Wetherbee joins Host Ira Flatow to remember the Challenger tragedy, and look ahead to the age of private spaceflight and the upcoming Artemis II mission.
Further Reading
- NASA and families of fallen astronauts mark 40th anniversary of space shuttle Challenger accident, via AP News
Segment Guests
Jim Wetherbee is a former NASA astronaut, the former head of flight crew operations for NASA, and the author of Controlling Risk: Thirty Techniques for Operating Excellence.
Segment Transcript
[MUSIC PLAYING] IRA FLATOW: I’m Ira Flatow, and you’re listening to Science Friday. Today on the podcast, thinking about space flight and risk. 40 years ago this week, the Space Shuttle Challenger exploded in flight 73 seconds after it had lifted off from the Kennedy Space Center. All seven crew members were killed.
In the months that followed, the tragedy was traced to a failed O-ring in one of the shuttle’s rocket boosters, due to the cold temperatures at launch time. Now, with the Artemis II mission preparing for a launch to circle the Moon, what have we learned about space flight and risk?
Joining me now is Jim Wetherbee. He’s thought a lot about this question. He’s a former NASA astronaut who went to space six times, commanding five shuttle missions, more than anyone else. And later, he became head of flight crew operations for NASA. He’s also the author of the book Controlling Risk, 30 Techniques for Operating Excellence. Welcome to Science Friday.
JIM WETHERBEE: Thank you very much, Ira. How are you today?
IRA FLATOW: Thank you, I’m very fine. When the Challenger disaster occurred in ’86, you had been selected as an astronaut, but you had not flown yet. Did that change how you thought about the job and the mission and the risks?
JIM WETHERBEE: Well, I was accepted in 1984, so I had been an astronaut for two years. In fact, the first launch I ever saw in person was the Challenger accident. I was down at the Kennedy Space Center relaying weather information and wind information to the Mission Control Center.
I would have to say, no, it really didn’t. I had been a naval aviator for about 10 years, I think, and I had seen and experienced, tragically, death and destruction. And I have a theory that it really affects you one time and then after that, although still tragic, you’re able to process this and think about the mission. I knew space flight was inherently dangerous, and so that did not change my opinion at all.
IRA FLATOW: I remember how surprised people were that the accident occurred. But was that part of accepting the risk?
JIM WETHERBEE: Well, I would say it quite differently, Ira. We never really want to accept risk. What we want to do is control the risk. You’ve heard that the basic way we assess risk is to weigh the risk against the benefit.
But I think of a third thing. In addition to the risk and the benefit, I always think about how much control do I have? Can I control the risk? And if I can, then I’m willing to accept the risk as long as the benefit is great.
IRA FLATOW: So how would you have controlled the risk of this launch?
JIM WETHERBEE: Well, after the tragedy, it’s clear that the managers made incorrect decisions because they were not processing, or not receiving, the information that was coming from people like Roger Boisjoly, the engineer who predicted this problem. And it’s actually quite obvious after the accident. In fact, so many accidents are kind of the same thing, where you have managers who are either ill-informed or ignoring data making incorrect decisions. So the best thing to do is to accept and evaluate all information that’s coming to you before you make a decision.
IRA FLATOW: So did the safety culture at NASA change, or how did anything change about managing the risk there?
JIM WETHERBEE: Well, it did. Specifically, new leaders came in place. The Rogers Commission was formed by President Reagan to assess the accident and make recommendations. Recommendation number two in the Rogers Commission Report, essentially was that the director of flight crew operations, who at the time was George Abbey, be elevated in the NASA hierarchy. In other words, pay more attention to what the flight crew and the operators and the mission controllers are saying. The operational people, give them more weight in their decision-making ability.
So yes, the culture did change after Challenger. But remember, it also changed after the tragic Apollo 1 fire on January 27, 1967, where three astronauts perished in the fire. New bosses were brought in. Then, George Low took over, and he mentored George Abbey. And then George Abbey mentored me for 20 years while I was at NASA.
And then we had the third accident, the Columbia accident. I was responsible, as the search director, to find and recover the human remains of the Columbia crew. And so the culture does change after accidents. But the bosses that come in who really shape the culture, either retire or are moved years later, and generationally, the culture tends to degrade if you’re not careful, if you don’t have great leaders at the top making wonderful decisions.
IRA FLATOW: You mean the corporate memory is gone.
JIM WETHERBEE: Exactly, and I believe it’s a generational thing. If you experience that kind of an accident or tragedy, you never forget, but eventually you retire and new bosses come in who had not viscerally experienced the tragedy and the devastation.
And they’re not doing things intentionally incorrectly. But in their zeal to accomplish missions, rather than accomplish missions correctly, the culture tends to degrade. The culture in organizations will always degrade unless you have leaders at the top who are constantly nurturing the culture to ensure that it always stays vibrant and healthy.
IRA FLATOW: And how do you do that? Do you make people aware of the past? What’s a method that these new managers would use?
JIM WETHERBEE: There are only two ways humans learn. One is experientially, if you go through a tragedy. We find leaders in organizations who have held somebody dying in their arms after a tragic event, and they never forget. So experientially is one way that humans learn.
But the only other way humans learn is vicariously through the experience of others. And so we tell stories. And so, as you point out correctly, keep it alive, talk about the tragedies, and make sure that people never forget.
IRA FLATOW: Are you comfortable with where managing risk at NASA is now?
JIM WETHERBEE: So I’m not involved in how risk is managed at NASA now. What I see from the periphery– I really like Jared Isaacman, the new administrator of NASA. He clearly understands. He has flown in space a couple of times and done a spacewalk.
The culture in every organization is greatly dependent on the leader at the top. So from that perspective, I’m very encouraged.
IRA FLATOW: Mm-hmm.
JIM WETHERBEE: They have a great administrator who I think is going to ensure that they make the proper decisions to carry out the mission of NASA, and so people on the Earth can benefit.
IRA FLATOW: Mm-hmm, we’ve seen a big shift from space being the realm of just huge government operations to now we’re seeing tourism, an emphasis on private spaceflight. Does having business and the profit motive in the mix change the risk equation?
JIM WETHERBEE: It doesn’t really change the risk equation. By the way, I also think that commercialization and turning over to private entities is the natural flow in transportation. We have historical precedents. In America, when we first created the railroads, they were government subsidized because it’s a very expensive proposition.
But fairly quickly, companies figured out they could make money from transporting cattle. And so private companies took over railroads. Same thing with aviation: initially, US airmail was subsidized by the government, but eventually companies figured out they could make money.
So in the space business, I think it’s a natural flow. If we can figure out how to make money in space, then it’s appropriate to have space be commercialized and turned over to private entities. NASA will always have a mission of overseeing and integrating all the various functions, providing the regulations, and really doing the dangerous exploration missions as we push out farther into the solar system.
IRA FLATOW: But do you think that the private space realm is as concerned about managing risk as the public program was, given that it was owned by the public and beholden to them?
JIM WETHERBEE: Well, it all goes back to who’s the leader at the top of the organization? If you have a leader, in a private entity or a government entity, who really understands risk and how to make the proper decisions, it doesn’t change. You really have to understand that flying in space is inherently dangerous, and so we must control the risk, make the proper management decisions.
It’s a combination of two things. You must have rules-based procedures, which come from managing the risk, which corporations and organizations typically do. But you must supplement the rules-based procedures with a suite of principles-based techniques, which the operators tend to think about.
If I’m facing the risk, I have to have a method for controlling the risk. If I’m a manager or an engineer in an office, I can manage the risk by changing the policies and the procedures or even the mission. But as operators who strap into the vehicle, we have to be able to control the risk. So we do follow the rules, but we use this suite of principles-based techniques to supplement the rules.
IRA FLATOW: Well, I imagine if you don’t control the risk well enough, you’re going to go out of business, right?
JIM WETHERBEE: Exactly.
IRA FLATOW: Do you think about that? Are you a consultant on risk?
JIM WETHERBEE: That’s what I do now. I’m passionate about helping companies that are specifically working in dangerous environments to be able to not only manage the risk properly, but to control the risk. And I love traveling around the world and talking to companies and organizations about this concept.
[MUSIC PLAYING]
IRA FLATOW: We have to take a break, but when we come back, looking ahead to modern spaceflight and the upcoming Artemis II mission to the Moon.
[MUSIC PLAYING]
IRA FLATOW: NASA is preparing for the launch of the Artemis II mission, as you know, planning to go around the Moon as early as next month. What do you think is going through the minds of the crew and the launch team for that mission?
JIM WETHERBEE: Well, I exchanged emails with Reid Wiseman, the commander, about a month ago. I spoke with the crew and Victor Glover, the pilot, about a year ago. They’re ready to go. They’ve been training for a long time. They understand the risk. They know how to control the risk. And I think they’re eager to go.
IRA FLATOW: So what are the factors that you use in controlling risk for a giant mission like this one?
JIM WETHERBEE: Oh, there are many. So the crew will attend meetings. They’ll have conversations with all levels throughout the organization, all the way up to the administrator.
Again, going back to the Rogers Commission after the Challenger accident: when that commission made its recommendation number two, to elevate the position of the flight crew operations director, there was an emphasis on making sure that managers received inputs from the operators, from the operational perspective, from the pilots, the people who are strapped into the vehicle. What are the kinds of things you think about? How can you control the situation? How can you help us? How can the Mission Control Center and the flight crew work together? It’s a huge team effort, but you really have to pay attention to the operational concerns that the crew has.
IRA FLATOW: Yeah, I remember going back to the first missions, when they were designing the capsule for the Mercury series, and the astronauts said, can you put a window in here so we can see outside? That kind of feedback, is that what you’re talking about?
JIM WETHERBEE: Well, exactly. And by the way, a window is not just for the aesthetics or to take glorious pictures, but it’s also to be able to control the risk. And so on one of the later missions, I think it was the final mission of Mercury, they had problems with the system and the pilot had to take over manually. The way he did that was by looking out the window with a reticle, where he could precisely align the vehicle with respect to the horizon.
Similar kind of thing with this mission when they come back from the Moon. They have to hit that angle of the atmosphere precisely. And if the system is having trouble, which sometimes it does, the humans need to take over. 98% of the time, the system is better than the human, but 2% of the time, and these are not accurate numbers–
IRA FLATOW: Yeah.
JIM WETHERBEE: –just rules of thumb. The human has to be ready to take over. We have better judgment. We have intuition. And essentially, we have a fear of death. The computer does not, and it will crash periodically and take the humans with it when it does.
IRA FLATOW: You’re an astronaut who had extensive Navy flying experience, hundreds of landings on aircraft carriers. Do people with that sort of background have a different perception of risk from astronauts who have been selected as engineers or scientists or some other specialization?
JIM WETHERBEE: I think, yes, and that’s why they are the commander and the pilot on the mission because we have, throughout our careers, developed an ability to really assess the situation, not let our emotions carry the day, but rather respond correctly, make the right decisions.
I learned from Chuck Yeager, I heard him, one day, talking about the system. If you really understand the system, when bad things happen, you will be able to figure out the answer, even if it’s not in the checklist or the procedures, because you really understand how the system works. And that’s the kind of mentality that operators have that engineers, as great as they are, don’t really have that in their psyche.
IRA FLATOW: So as a crew member or someone flying in the capsule, how much say do you have with the engineers? Can you tell them, I’m not sure this is a safe idea, I don’t feel comfortable with this, can you change something?
JIM WETHERBEE: It’s a constant push. The engineers will always try to– and the managers, try to automate things, essentially, take the human out of the loop, and the operators are continually trying to leave the human in the loop.
I’m a big proponent of automation and computers and systems. They’re really good. They’re accurate, they’re fast. They don’t get tired as long as you feed them electricity. But they must be programmed precisely.
The human has other advantages: judgment, intuition. For example, I can land on any runway in the world without ever seeing every runway in the world. The computer can’t. It must be programmed precisely for the landing that it’s going to make.
IRA FLATOW: Do we really need astronauts on this flight around the Moon? I mean, aren’t we at a level where you could just fly it by wire? I mean, just the computers fly the capsule, and it comes back by itself?
JIM WETHERBEE: Well, that’s what we’ve been talking about. 98% of the time, you’re probably right. 2% of the time, you need the humans. But more importantly, when it’s time to go back to the Moon and land and on to Mars, it’s really the human element that inspires people.
I’m not worried that AI is going to take over the world for a long, long time. Humans have the better judgment, the intuition. We can tell the story. We can learn much better than the computer. Computers don’t really learn. They’re programmed. Humans learn. And it’s our genetic destiny as explorers, and you always have to have the human in the equation.
IRA FLATOW: Well, Jim Wetherbee, this has been a really interesting interview. Thank you for taking time to be with us today.
JIM WETHERBEE: Ira, thank you very much.
IRA FLATOW: You’re welcome. Jim Wetherbee, former NASA shuttle astronaut, former director of flight crew operations for NASA. He’s also the author of the book Controlling Risk, 30 Techniques for Operating Excellence.
[MUSIC PLAYING]
This episode was produced by Charles Bergquist. Thanks for listening, and we welcome your feedback. Yes, rate and review us wherever you get your podcasts or give us a call and tell us what’s on your mind. 877-4-SCIFRI. I’m Ira Flatow. We’ll see you soon.
[MUSIC PLAYING]
Copyright © 2026 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Meet the Producers and Host
About Charles Bergquist
As Science Friday’s director and senior producer, Charles Bergquist channels the chaos of a live production studio into something sounding like a radio program. Favorite topics include planetary sciences, chemistry, materials, and shiny things with blinking lights.
About Ira Flatow
Ira Flatow is the founder and host of Science Friday. His green thumb has revived many an office plant at death’s door.