01/11/2019

Can An App Fight Opioid Overdoses?

11:24 minutes

[Image: a black-and-white photo of a smartphone displaying options to learn how to administer naloxone or to cancel the call.]
If a person fails to interact with the app, it will contact someone who can administer naloxone.
Credit: Mark Stone/University of Washington

In 2017, about 47,000 people in the United States died from opioid overdoses, involving both prescription opioids and synthetic drugs like fentanyl, according to the CDC.

And as the epidemic of opioid abuse continues, those looking to reduce death rates are searching for ways to keep drug users safer.

But what if your smartphone could monitor your breathing, detect early signs of an overdose, and call for help in time to save your life? Researchers writing in Science Translational Medicine this week think they have just that: smartphone software that can ‘hear’ the depressed breathing rates, apnea, and changes in body movement that might indicate a potential overdose.

University of Washington PhD candidate Rajalakshmi Nandakumar explains how the software, which uses smartphone speakers and microphones to mimic a bat’s sonar, can ‘hear’ the rise and fall of someone’s chest—and could someday even coordinate with emergency services to send help.

Further Reading

  • Read the full study on how we could potentially use smartphones to detect opioid overdoses.
  • Learn more about how Rajalakshmi Nandakumar takes inspiration from bats, using inaudible sound signals for monitoring.

Segment Guests

Rajalakshmi Nandakumar

Rajalakshmi Nandakumar is a PhD candidate in the Paul G. Allen School of Computer Science and Engineering at the University of Washington in Seattle, Washington.

Segment Transcript

IRA FLATOW: The number of deaths from opioid overdoses continues to rise in the US. The CDC estimates that 47,000 people fatally overdosed in 2017. That is up nearly 10% from 2016. And while some experts are looking for ways to reduce opioid dependence, others are looking for new ways to prevent the deaths. For example, opioid overdoses are easily reversed by administering the drug naloxone if the person overdosing is treated in time. 

Now, how might you catch an overdose in time? Well, turns out there’s an app for that. University of Washington PhD candidate Rajalakshmi Nandakumar has helped develop some software that turns a phone’s microphone into a kind of sonar that can detect when a person’s breathing has slowed or even stopped. The research was published this week in Science Translational Medicine. And she joins me now to explain the project. Welcome to Science Friday.

RAJALAKSHMI NANDAKUMAR: Hello. 

IRA FLATOW: Tell us how this works. What is the idea behind this? 

RAJALAKSHMI NANDAKUMAR: So this app works by transforming the phone into an active sonar system. So you can imagine like a bat or a dolphin communicates, we send inaudible sound signals using the phone’s speaker. And these signals get reflected off a person. And in this case, their moving chest when they’re breathing. So these reflected signals from the chest are recorded using the phone’s microphone. And when it’s processed, you can actually get the breathing signal. 
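To make that description concrete, here is a minimal, hypothetical sketch in Python (NumPy only) of the active-sonar idea: emit an inaudible chirp from the speaker, matched-filter the microphone recording against it to find the echo from the chest, and track how that echo drifts as the chest rises and falls. The chirp band, ping rate, and the simulated "recording" are illustrative assumptions, not the team's published pipeline.

```python
import numpy as np

FS = 48_000                 # typical phone sample rate (Hz)
C = 343.0                   # speed of sound (m/s)
CHIRP_LEN = 0.01            # 10 ms ping
F0, F1 = 18_000, 22_000     # the inaudible band mentioned later in the interview

def make_chirp():
    """Linear frequency sweep from F0 to F1 -- the 'ping' the speaker emits."""
    t = np.arange(int(CHIRP_LEN * FS)) / FS
    phase = 2 * np.pi * (F0 * t + (F1 - F0) * t ** 2 / (2 * CHIRP_LEN))
    return np.cos(phase)

def simulate_recording(chirp, chest_distance_m):
    """Stand-in for the microphone frame: the chirp delayed by the round trip
    to the chest (a real recording also has noise, clutter, and the direct path)."""
    delay = int(round(2 * chest_distance_m / C * FS))
    frame = np.zeros(2 * len(chirp) + delay)
    frame[delay:delay + len(chirp)] += 0.3 * chirp
    return frame

def estimate_distance(chirp, frame):
    """Matched filter: the lag of the correlation peak is the round-trip delay."""
    corr = np.correlate(frame, chirp, mode="valid")
    return int(np.argmax(np.abs(corr))) / FS * C / 2

# One ping every 100 ms for about 32 s; the chest sits roughly half a metre away
# and moves a few millimetres with each breath (0.25 Hz = 15 breaths per minute).
chirp = make_chirp()
ping_times = np.arange(0, 32, 0.1)
chest = 0.5 + 0.004 * np.sin(2 * np.pi * 0.25 * ping_times)
estimates = np.array([estimate_distance(chirp, simulate_recording(chirp, d)) for d in chest])

# The slow oscillation in the distance track is the breathing signal; a flat
# line for too long is the warning sign the app watches for.
spectrum = np.abs(np.fft.rfft(estimates - estimates.mean()))
freqs = np.fft.rfftfreq(len(estimates), d=0.1)
print(f"estimated breathing rate: {freqs[spectrum.argmax()] * 60:.0f} breaths/min")
```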

IRA FLATOW: What made you decide to tackle this opioid crisis with technology this way? 

RAJALAKSHMI NANDAKUMAR: As you mentioned, opioid overdose is a massive public health epidemic today. And people– studies have shown that 115 people die every day due to opioid overdose. So as computer scientists, we wanted to use technology to solve this problem and connect people to these lifesaving interventions, like naloxone. 

IRA FLATOW: Now let’s say, OK, so the software on the phone is sensitive enough to detect the change in my chest heaving up and down when I breathe? 

RAJALAKSHMI NANDAKUMAR: Yes, so it can detect using just a speaker and microphone. It can detect like up to a 7 millimeter motion. So we are talking about this minute breathing motion. 

IRA FLATOW: And you have tested this out? 

RAJALAKSHMI NANDAKUMAR: Yes. We tested it at Insite in Vancouver, British Columbia. It is a supervised injection facility. And we tested it with around 200 people who come to the facility and engage in this high risk behavior.

IRA FLATOW: OK, so let’s say that this software detects a possible overdose, what happens next? Does it immediately call 911? 

RAJALAKSHMI NANDAKUMAR: That’s a great question. So what we do is that when we see that the person has stopped breathing, the phone first emits an alarm, basically to stimulate and wake up the patient. If they fail to wake up and engage with the phone, we then automatically connect them to a preset emergency contact. It could be a family member or a friend with naloxone, or sometimes emergency services.
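That escalation is straightforward to express as code. Below is a hypothetical sketch of the decision logic, with made-up timing thresholds and placeholder callables standing in for the breathing check, the alarm, the dismiss action, and the outgoing call; it is not the app's actual implementation.

```python
import time

# Illustrative thresholds only -- not values from the study.
NO_BREATHING_ALARM_S = 10   # seconds without a detected breath before the alarm sounds
RESPONSE_WINDOW_S = 30      # seconds the person has to dismiss the alarm

def monitor(breathing_detected, sound_alarm, user_dismissed, call_emergency_contact):
    """Escalation loop. The four arguments are callables supplied by the app:
    a once-per-second breathing check, an audible alarm, a check for user
    interaction, and the outgoing call to the preset contact."""
    seconds_without_breath = 0
    while True:
        seconds_without_breath = 0 if breathing_detected() else seconds_without_breath + 1

        if seconds_without_breath >= NO_BREATHING_ALARM_S:
            sound_alarm()                       # step 1: try to rouse the person
            deadline = time.monotonic() + RESPONSE_WINDOW_S
            while time.monotonic() < deadline:
                if user_dismissed():            # they woke up and tapped the screen
                    seconds_without_breath = 0
                    break
                time.sleep(1)
            else:                               # no interaction within the window:
                call_emergency_contact()        # step 2: contact someone with naloxone
                return
        time.sleep(1)
```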

IRA FLATOW: You know, this sounds very much like the heart monitoring device some of the smart watches have now, where they can detect if you have a heart problem and then do the same thing.

RAJALAKSHMI NANDAKUMAR: Sure. That’s one of the great examples, I guess. 

IRA FLATOW: Yes. OK, so let’s say, you know, you’re developing the app, who do you target at? Who is the app for, someone who plans to keep using opioids or someone who doesn’t want to overdose? 

RAJALAKSHMI NANDAKUMAR: So the app is targeted for people who have an opioid use disorder today so that they can keep themselves safe while they engage in this high risk behavior. And this is kind of like a harm protection technique, where they are safe enough to actually then kind of go for the next level of treatment to solve their problem. 

IRA FLATOW: Let’s talk about the technology of how the phone is used. Do you have to place it on your body? Do you have to put it on a table? How does it detect your slow down in breathing? 

RAJALAKSHMI NANDAKUMAR: So it can detect the breathing in a contactless fashion. So we envision that, say when the person is actually engaging in the high risk behavior, when they’re setting up their equipment on a table, they just place the phone on the table and then just turn on the app. 

IRA FLATOW: Is that right? So what about people who want to kick their dependence on this, can that aid them in getting through this? 

RAJALAKSHMI NANDAKUMAR: Sure. So one of the things we are, as a next step, we’re planning is to actually connect them to inpatient or outpatient rehabilitation centers. And this app could also help them kind of monitor through this process of beating the addiction. 

IRA FLATOW: You know, it’s funny because I didn’t know my phone could do sonar. What frequency are we talking about here? We get technical with this on our show. 

RAJALAKSHMI NANDAKUMAR: So we are talking about higher frequencies, which are typically inaudible to people, kind of 18 kilohertz to 22 kilohertz and close to the ultrasonic band. 
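For context on those numbers: a phone sampling audio at the common 48 kilohertz rate can represent frequencies up to the 24 kilohertz Nyquist limit, while most adults cannot hear much above roughly 17 to 18 kilohertz, so an 18 to 22 kilohertz band is both producible on ordinary hardware and effectively silent to people. A trivial check, with the hearing ceiling as an assumed round figure:

```python
SAMPLE_RATE_HZ = 48_000             # common phone audio sample rate
NYQUIST_HZ = SAMPLE_RATE_HZ / 2     # highest frequency that rate can represent
ADULT_HEARING_CEILING_HZ = 17_500   # assumed round figure; varies with age and person

low, high = 18_000, 22_000          # the band used by the app
assert high <= NYQUIST_HZ               # ordinary phone hardware can emit and record it
assert low >= ADULT_HEARING_CEILING_HZ  # most adults will not hear it
```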

IRA FLATOW: Isn’t that the dog hearing range? 

RAJALAKSHMI NANDAKUMAR: Yes, it is the dog hearing range. 

IRA FLATOW: So are the dogs going to come? I mean, sort of semi-seriously about that. 

RAJALAKSHMI NANDAKUMAR: So we emit these signals at a very low volume. You could imagine it as something like static sound on a TV. And actually, while we were testing it at Insite, people brought their dogs and the dogs were fine.

IRA FLATOW: So why a smartphone app? A lot of people have, for example, Fitbits, those tiny little wrist devices, something that’s touching a person’s body. Wouldn’t that be more accurate or sensitive if you wore it on your wrist? 

RAJALAKSHMI NANDAKUMAR: So our app, we first targeted it on smartphones because this is a population where smartphones are ubiquitous and you don’t need any additional hardware. That said, our app can even run on Fitbits or smart watches, or pretty much any device which has a speaker and a microphone.

And since it works in a contactless fashion by looking at the reflections, it doesn’t matter if you’re wearing it or if it’s anywhere within a meter from you. 

IRA FLATOW: I’m Ira Flatow. This is Science Friday from WNYC Studios talking with Rajalakshmi Nandakumar. I also would probably think that people who are injecting opioids are not going to want to wear a wristband of some sort. 

RAJALAKSHMI NANDAKUMAR: Yeah, that is also true. One of the– so when we were conducting the study, one of the things they liked about it is that they just have to like place it on a table and it doesn’t like come in the way of them finding a vein or their usual process. 

IRA FLATOW: You know, this sounds to me like you could have all kinds of different applications for sonar like this. For example, you were talking about developing an app to help diagnose sleep apnea, because could you not detect when people are not breathing?

RAJALAKSHMI NANDAKUMAR: Sure, so our team has worked on diagnosing sleep apnea using smartphones, where we can monitor the breathing while they are sleeping, and then identify specific apnea events, like central apnea where they stop breathing, or obstructive apnea, and the other apnea events. Yes, so any condition which involves monitoring the respiration of a person can be enabled on a phone.

IRA FLATOW: I would imagine sleep apnea is probably easier because– 

RAJALAKSHMI NANDAKUMAR: Yes, it’s true. Because we do the sleep apnea in a constrained environment where the person is actually sleeping. So though people kind of toss and turn occasionally during sleep, it’s still they are not active. They are not moving, compared to an opioid overdose situation where the person is injecting, moving forward, and stuff. 

IRA FLATOW: So when are we going to see this ready for prime time? 

RAJALAKSHMI NANDAKUMAR: Our hope is to make the app available, let’s say, within a year after the proper FDA approvals. 

IRA FLATOW: You know, all this reminds me of that scene in one of the recent Batman movies where they do something very similar with the cell phones using the microphones to physically map the entire city of Gotham. Maybe kids– is that a little too far fetched at this time? Could we do that now? 

RAJALAKSHMI NANDAKUMAR: I think we can do that now. Yes. 

IRA FLATOW: Is that right? 

RAJALAKSHMI NANDAKUMAR: Yeah. I think if we put together all the smart phones, use all the microphones and speakers, I think we can do that now. 

IRA FLATOW: Anything else on your menu of things that you can do with the sonar system on the phones that you’re working on? 

RAJALAKSHMI NANDAKUMAR: Sure. Our lab has worked on developing other systems using sonar. One of the interesting projects is what we call FingerIO, which is a fine-grained finger tracking system. So basically, devices like smart watches are becoming smaller in size and their screens are too small to type anything on.

So the FingerIO system can provide an alternative means to interact with these devices. You can just use your finger to write something in the air or draw anything in the air, or anywhere around the device. It could be a table. It could be your hand. And then the device can detect it and use it as an input.
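One common way to build this kind of contactless tracking, sketched hypothetically below rather than as the published FingerIO algorithm, is to compare successive sonar echo profiles: the range bin where consecutive profiles change the most is assumed to be the moving finger, and that bin converts to a distance through the speed of sound.

```python
import numpy as np

FS = 48_000    # sample rate (Hz)
C = 343.0      # speed of sound (m/s)

def echo_profile(recording, chirp):
    """Correlate one recorded frame against the transmitted chirp; each lag is
    a 'range bin' roughly 3.6 mm deep at a 48 kHz sample rate."""
    return np.abs(np.correlate(recording, chirp, mode="valid"))

def finger_distance(prev_recording, curr_recording, chirp):
    """The bin where consecutive echo profiles differ most is taken to be the
    moving finger; static reflectors (table, walls, the device) cancel out."""
    change = np.abs(echo_profile(curr_recording, chirp) -
                    echo_profile(prev_recording, chirp))
    bin_index = int(change.argmax())
    return bin_index / FS * C / 2   # round-trip delay converted to metres
```

Tracking that distance over successive pings, and repeating the measurement from a second microphone, is roughly how a single range estimate can become a two-dimensional trace of the finger's path.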

IRA FLATOW: So the sonar will detect the movement of your finger in air and see what you’re drawing. 

RAJALAKSHMI NANDAKUMAR: Yes. 

IRA FLATOW: You don’t have to touch anything anymore. Wow. 

RAJALAKSHMI NANDAKUMAR: You don’t have to. 

IRA FLATOW: Could you draw your own name in the air and then log on somehow? 

RAJALAKSHMI NANDAKUMAR: Yeah, you could draw your name. You could possibly like, say, map your screen to another place. You could draw even a picture and it can be detected by the smart watch or any other device like a smartphone. 

IRA FLATOW: Well, you’re one smart scientist, Dr. Nandakumar. Thank you for taking time to be with us today. 

RAJALAKSHMI NANDAKUMAR: Thank you. 

IRA FLATOW: Rajalakshmi Nandakumar is a PhD candidate at the University of Washington in Seattle. That’s about all the time we have for this hour. Charles Bergquist is our director. Our senior producer, Christopher Intagliata. Our producers are Alexa Lim and Christie Taylor, and Katie Feather. We had technical and engineering help today from Rich Kim, Sarah Fishman, Kevin Wolf. And we’re active all week on all the social media, Facebook, Twitter, Instagram. 

And if you have a smart speaker, you can ask it to play Science Friday whenever you want, and it’ll play the latest episode of the show. So every day now is Science Friday. I’m Ira Flatow in New York.

Copyright © 2019 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/

Meet the Producer

About Christie Taylor

Christie Taylor was a producer for Science Friday. Her days involved diligent research, too many phone calls for an introvert, and asking scientists if they have any audio of that narwhal heartbeat.
