07/28/2017

Can You Feel Me Now? The Science Of Digitizing Touch

11:56 minutes

Close-up of a person using a smartphone. Photo by Japanexperterna.se/Flickr

For you gamers out there, you might remember when video game controllers would come with rumble packs. Whenever you drove over a pile of rocks or ran into a tree, the pack would shake to give you the sensation of that crash. That digitization of touch is called haptics. Today, haptic technology can be found in different sorts of devices. For example, smartwatches equipped with haptic technology will tap you on the wrist when you have a notification, and scientists are developing tools that can relay textures like wood and cotton.

Engineer Katherine Kuchenbecker, director of the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, talks about engineering the sense of touch, and how the technology could “smarten” robotics.

Segment Guests

Katherine J. Kuchenbecker

Katherine J. Kuchenbecker is a director for the Max Planck Institute for Intelligent Systems. She’s based in Stuttgart, Germany.

Segment Transcript

IRA FLATOW: This is Science Friday. I’m Ira Flatow. You may have played one of those video games where you rumble over a pile of rocks, you smash into a tree, and the controller shakes and buzzes to give you that feeling of a crash. Well, that 3D touch sensation is called haptics. And the video game controller was haptics 1.0.

Haptics are showing up in smart devices too. You know that little tap notification you get from your smartwatch? Well, that’s a haptic. And scientists are working on haptics for surgery, prosthetics, and all sorts of possibilities.

How do you digitize that feeling of touch? What could it be used for? Let me introduce my next guest, Katherine Kuchenbecker is director of the Max Planck Institute for Intelligent Systems in Stuttgart, Germany. Welcome to Science Friday.

KATHERINE KUCHENBECKER: Thanks so much. Happy to be here.

IRA FLATOW: We won’t be taking your calls this hour, but we’d like to hear from you on Twitter, if you like, @SciFri, and you can talk to one another about what we’re talking about. I know you’ve worked on a project where you created palettes of textures. So instead of picking a paint color, you can pick a texture that feels like cotton or wood. Tell us about it, though. What are you measuring to create textures? Is it all vibrations?

KATHERINE KUCHENBECKER: Great question. So the goal of that project was to be able to record how surfaces feel to touch with the same accuracy that we can capture them with a camera. So we actually call it haptography, haptic photography. And we do record vibrations but also other signals that your body can feel and that we can sense with engineering sensors.

So imagine you take a tool and you drag it back and forth over a piece of leather, or say a rough piece of stone. You can feel the tip of the tool bounce across the little bumps on the stone, or drag across the leather, where it might be more difficult to move. And also you can feel how hard or soft the surface is.

And so we created a little tool that we call a haptic camera, touch-based camera that can record all those sensations. And then we do a bunch of math and make a model so that when you go touch a virtual version of that piece of leather or that stone, we can create sensations that make it feel almost as though you’re touching the real thing.
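
In rough code terms, that record-then-model pipeline can be sketched as follows. This is a minimal illustration in Python, not the lab’s actual software; the choice of an autoregressive (AR) model fit with the Yule-Walker equations, and all names, are assumptions made here for concreteness:

    import numpy as np

    def fit_texture_model(accel, order=30):
        # Fit an AR model to a recorded tool-vibration signal via the
        # Yule-Walker equations; the model captures the texture's
        # characteristic "feel" as a spectral shape.
        accel = accel - accel.mean()
        r = np.correlate(accel, accel, mode="full")[len(accel) - 1:]
        r = r[:order + 1] / len(accel)
        R = np.array([[r[abs(i - j)] for j in range(order)]
                      for i in range(order)])
        a = np.linalg.solve(R, r[1:order + 1])
        noise_power = max(r[0] - a @ r[1:order + 1], 1e-12)
        return a, noise_power

    def synthesize_vibration(a, noise_power, n_samples, seed=0):
        # Drive the fitted AR filter with white noise to produce a fresh
        # vibration waveform that feels statistically like the recording.
        rng = np.random.default_rng(seed)
        order = len(a)
        out = np.zeros(n_samples + order)
        e = rng.normal(0.0, np.sqrt(noise_power), n_samples + order)
        for n in range(order, n_samples + order):
            out[n] = a @ out[n - order:n][::-1] + e[n]
        return out[order:]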

IRA FLATOW: So when I do the touching, am I touching a trackpad? What am I touching?

KATHERINE KUCHENBECKER: Well, we have a bunch of different instantiations. The first one we worked on was a Wacom tablet computer. So it’s like an iPad but with a stylus in your hand. So you can touch different places on the screen, and we could put a picture of, say, the stone on the right and the leather on the left. And then as you drag the stylus across the picture, you would feel like the tip was going over the stone or the leather, even though it’s really just going over glass. And the way we do that is we have a little motor. It’s kind of like a speaker, but it plays vibrations through your fingers. And it would vibrate the tip of the tool back and forth depending on how fast you move or how hard you push, in a very similar way to how a real tool would feel.

That was kind of the first version, 1.0 as you might say, of this haptography system. And 2.0 was using a more sophisticated haptic device where you can move in three dimensions, going closer to virtual reality. So you can move the tool around in 3D space and feel, oh, here is a sphere or a cube or a big statue. And we could portray these sensations, the vibrations, and also how hard it is to move along the surface. You could tap on it and feel, oh, this stone is really hard and the leather or foam is really soft. And you could feel all those things together with an image of the virtual object.
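
The playback side, as she describes it, adapts the vibration to how fast you move and how hard you push. A toy version of that scaling step, applied to a vibration block like the one synthesized in the sketch above, might look like this (the linear form and reference values are invented for illustration, not measured behavior):

    def scale_playback(vib, speed_mm_s, force_n,
                       ref_speed=100.0, ref_force=1.0):
        # Scale a block of synthesized texture vibration by how fast the
        # stylus is moving (mm/s) and how hard the user is pressing (N):
        # a real tool vibrates more strongly when dragged faster and
        # pressed harder. Linear scaling here is an assumption; real
        # texture models interpolate between recordings made at many
        # speeds and forces.
        return (speed_mm_s / ref_speed) * (force_n / ref_force) * vib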

IRA FLATOW: So is that really one of your goals, to use this in a virtual reality environment?

KATHERINE KUCHENBECKER: Yeah, that’s definitely one of the major goals. You’ve probably heard that VR is on the rise and pretty exciting. Lots of people see ways for it to be used, say, for medical training or online shopping, or of course gaming as well. And we believe that the more realistic sensations we can bring into such virtual environments, the more believable they’ll be. I mean, life is a lot more than just what you see and what you hear. You can’t do anything in the real world without reaching out and touching something. And you should be able to feel it.

IRA FLATOW: You mentioned using it in medicine and surgery. How would you do that? What kinds of uses could it have?

KATHERINE KUCHENBECKER: So certainly one big opportunity is to improve the kind of video games that doctors train on. There’s a big movement in medicine right now to help surgeons or dentists or other medical practitioners practice on a realistic computer simulator before they try out, say, suturing or drilling a cavity on a real human. There are a lot of ethical reasons for that. And even for the doctors themselves, they’d like to practice and know they’re good at something before they do it on a person. Sometimes those simulators are just a fake plastic tooth, and sometimes they’re computer-based, like a sophisticated computer game. So we work on improving what the surgeon or the dentist, the doctor, would feel. That’s the first use: simulation.

The second area, which we also work on a lot in my research lab, is robotic surgery. And here there’s no virtual world. There’s a surgeon controlling a robot across the room to do surgery deep inside a human patient. The company that makes robots like this is called Intuitive. They make the da Vinci robot. And they get used all around the world, every day, to let doctors operate deep inside the patient. But that commercial, FDA-approved version has absolutely no touch feedback. So the doctor can’t feel anything of what they’re doing.

IRA FLATOW: So is there a practical version using haptics on the horizon?

KATHERINE KUCHENBECKER: Yeah, so that’s another big topic my research group worked on when I was a professor at the University of Pennsylvania in Philadelphia for the last 10 years. We basically took what I described before, the haptography idea where we record all the sensations and then model them, except instead of modeling them, you can just transmit them immediately, sort of like a microphone and a speaker: we measure what the tools are feeling and play it for the surgeon’s hands right away. And so their right hand can feel what the sensors on the right tool are feeling, and their left hand can feel what the sensors on the left tool are feeling.

And there are many, many different kinds of things we could measure. And surgical instruments are already pretty complicated and expensive. And so the trick for this project is to figure out which of the signals we could measure, which of the touch-based cues, would have the most bang for your buck, the most benefit. And for this we really focused in on vibrations. They hadn’t been studied very much before. Most people thought about forces. If I’m pushing on the tissue, how hard is it pushing back? And you can actually kind of see that with your eyes, because tissue deforms. So instead we looked at the higher frequencies, things more like the sounds of the tools hitting each other or cutting tissue or colliding with something outside the view. We can measure those signals quite easily with tiny sensors that just sit up on the robot arms. And then we can play them for the surgeon’s hands to feel, using very similar motors to what we put on the original texture pad, that stylus on a tablet computer.
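
The “microphone and speaker” analogy suggests a very short signal path: sense, filter, play, with no model in between. Here is a bare-bones sketch; the actuator object and its play method are hypothetical stand-ins for the real hardware interface, and the filter choice is an assumption:

    import numpy as np

    def highpass(x, fs=2000.0, fc=20.0):
        # One-pole high-pass filter: keep the contact vibrations above
        # roughly fc Hz and strip away slow arm and tissue motion.
        alpha = 1.0 / (1.0 + 2.0 * np.pi * fc / fs)
        y = np.zeros(len(x))
        for n in range(1, len(x)):
            y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
        return y

    def relay_vibrations(accel_blocks, actuator):
        # Each block of accelerometer samples from a surgical tool is
        # filtered and played at the surgeon's hand right away, like a
        # microphone wired straight to a speaker.
        for block in accel_blocks:
            actuator.play(highpass(np.asarray(block, dtype=float)))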

IRA FLATOW: I would think a great place for this would be in the development of self-driving cars, or car steering wheels, or something where you’d feel a vibration if some car was coming up next to you or behind you, and maybe it would wake you up if you were falling asleep.

KATHERINE KUCHENBECKER: Absolutely. Haptics is a great way to get a driver’s or a user’s attention because it’s private and it’s salient. You can feel it easily when it’s designed well. And so there are lots of researchers, and car companies too, really interested in using this. And it does exist, I think, in lane-keeping systems that help you stay in your lane.

A lot of modern cars are what’s called drive by wire, meaning there’s no physical connection between your steering wheel and the tires. And so it’s kind of like that remote control surgery that I was talking about. A computer and some motors have to connect what you feel on the steering wheel to what’s happening at the wheels. And you want it to be not just that the wheels go where you point them but that you can feel it if you start bumping into the curb, or maybe the automatic lane keeping assistance could share control of the car with you and kind of communicate with you through the steering wheel, almost like a driving teacher holding a second steering wheel on the other side of the car perhaps.
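
In code terms, a steer-by-wire wheel has to compute the torque the driver feels, since no mechanical linkage carries it. A toy blend of road feel and a lane-keeping nudge might look like the following; all names and gains are invented for illustration, not taken from any real vehicle:

    def wheel_torque(road_torque, lane_error, k_road=0.8, k_lane=0.3):
        # Torque command for the steering-wheel motor: reflect the torque
        # estimated at the road wheels, so curbs and bumps can be felt,
        # plus a gentle nudge proportional to lane-departure error, like
        # a driving teacher's hands on a second wheel.
        return k_road * road_torque + k_lane * lane_error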

Did you want to hear about how we hack the vibrations to make them work even a little better?

IRA FLATOW: Oh, now you have my attention, absolutely. We like to get into the weeds on this program.

KATHERINE KUCHENBECKER: It’s a nice side story, because when we record, say, what a tool feels as it drags across that piece of stone, the tool’s actually shaking in all directions, left and right, up and down, in and out. So we can put a three-axis accelerometer, a vibration sensor, on there and measure all of those vibrations. And your hand holding the tool can feel all of them. But here’s what’s really interesting: you can’t feel which way it’s vibrating.

At higher frequencies, like above 20 cycles back and forth in a second, you don’t have a sense of the direction. You’re kind of direction blind. You can only feel the fact that it’s vibrating. And let’s say we’re trying to let the surgeon feel what their scissors are doing inside the patient. You could of course have three motors and output those up-down, left-right, and in-out vibrations. But that’s heavy and complicated and costly.

And so we came up with an algorithm to take the three axes of vibrations and turn them into just one, because your fingers can’t feel which direction the vibrations are happening in. So if we do the algorithm right, we can really find a loophole in the sense of touch and take advantage of it to engineer better, more cost effective, simpler systems.
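
Her group has published an algorithm along these lines, known as DFT321. In that spirit, a compact sketch of the reduction is below: take the spectrum of each axis, keep the combined energy at each frequency, and collapse everything into one real waveform. The details here are a simplification, not their exact implementation:

    import numpy as np

    def combine_axes(ax, ay, az):
        # Collapse x/y/z tool vibrations into one waveform whose spectrum
        # carries the total energy of all three axes. Because we are
        # "direction blind" above ~20 Hz, one well-chosen signal can feel
        # like the original three.
        X, Y, Z = np.fft.rfft(ax), np.fft.rfft(ay), np.fft.rfft(az)
        mag = np.sqrt(np.abs(X)**2 + np.abs(Y)**2 + np.abs(Z)**2)
        # Borrow the phase of the summed spectrum so the output is real
        # and roughly time-aligned with the originals.
        phase = np.angle(X + Y + Z)
        return np.fft.irfft(mag * np.exp(1j * phase), n=len(ax))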

IRA FLATOW: Do you think that this kind of stuff is underappreciated?

KATHERINE KUCHENBECKER: I certainly think the sense of touch is underappreciated. We as humans are really used to being able to feel our bodies move and the responses that come back. But you know what it’s like to close your eyes, or be in a dark room, or be in a situation where maybe you can’t hear. Your sense of touch is always there. And so we often take it for granted.

As a researcher, I also work on giving robots a good sense of touch, so autonomous robots, helping them pick up objects so they could put them away in the dishwasher or hand something to a human in a safe and really reliable way. And when you try to program robots to do tasks like that, you really start to see how important the physical contact haptic signals are for coordinating these movements in the real world.

Interpersonal connections are really powerful. That’s actually a very common request: people who’ve had an upper limb amputated, let’s say they’ve lost their hand, it’s not just that they want to be able to pick up their coffee. They would really love to be able to hold hands with their loved ones. And so the emotional side of touch and physical contact between people is really important. We have been working on hugging robots in my lab recently, trying to make a robot that delivers really good hugs.

IRA FLATOW: Wow. You know, you’re right, that is very emotional, and it would be a very useful thing to have over the miles.

KATHERINE KUCHENBECKER: It’s something we’re looking into. Give me a few more years and a PhD student, and maybe we’ll see if it might have some true benefits. Could it really uplift your mood, or help figure out how you’re feeling, or help people stay in touch long distance? Those are some of the things we’re thinking about.

IRA FLATOW: All right, we’ll have you back in a couple of years, OK? Write it down.

KATHERINE KUCHENBECKER: I’d enjoy that. Thanks so much.

IRA FLATOW: Katherine Kuchenbecker is director of the Max Planck Institute for Intelligent Systems in Stuttgart. Thank you very much for taking the time to be with us today.

KATHERINE KUCHENBECKER: My pleasure.

Copyright © 2017 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/

Meet the Producer

About Alexa Lim

Alexa Lim was a senior producer for Science Friday. Her favorite stories involve space, sound, and strange animal discoveries.