Back in December, SciFri aired a segment on DARPA's and Google's interest in robots. Indeed, these smart machines seem to be everywhere these days, cruising around Mars, flying like birds, snapping pictures of lions, and working with children with autism. Here’s a roundup of more robotics projects designed to make our lives easier and healthier, from a personal assistant to surgical devices:
ROBOMAID
You probably remember Rosie from The Jetsons, that whirling dervish of a robotic maid. Are we close to building a real-life version? While “Rosie is very far away,” according to Ashutosh Saxena, a computer science professor who runs Cornell University’s Personal Robotics Lab, his team is working hard to make a personal robotic assistant a reality.
Saxena’s lab has built a robotic prototype that predicts human movement in order to help with daily tasks. “Humans anticipate each other’s behavior all the time, whether it’s opening a door or pouring coffee,” said Saxena. “To get a robot to predict, we collect tons of human behaviors, from eating, to taking medicine, to microwaving food.” The team records those behaviors on video and shows them to the robot, which uses an algorithm to understand the human movements and objects, and how humans use those objects. The 'bot then extracts that information to store in its databases.
To assist a human, the robot tracks her movements using a Microsoft Kinect, the motion sensor built for home videogame systems. It then accesses its databases to determine what activity she’s engaged in, and what she’ll do several seconds into the future. For instance, say you’ve got a heavy pot that needs to go in the fridge—the robot would open the door for you. If you were about to take a prescription, it would fill your water glass for you.
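The anticipation step described above can be sketched very roughly as a lookup from an observed action to the most likely next action, learned from recorded examples. The activity names, object labels, and database contents below are all invented for illustration; the real system works from continuous video and sensor data, not a hand-written table.

```python
# Hypothetical sketch of anticipation-by-lookup, loosely modeled on the
# pipeline described above. All entries here are invented examples.

# Observed (action, object) pair -> likely next (action, object),
# standing in for patterns the robot learned from recorded videos.
ACTIVITY_DB = {
    ("carry", "heavy_pot"): ("open", "fridge_door"),
    ("reach", "pill_bottle"): ("fill", "water_glass"),
    ("reach", "mug"): ("pour", "coffee"),
}

def anticipate(action, obj):
    """Predict what the human will likely do next, so the robot can assist."""
    return ACTIVITY_DB.get((action, obj))

# The robot sees a person carrying a heavy pot toward the kitchen...
print(anticipate("carry", "heavy_pot"))  # -> ('open', 'fridge_door')
```

A real implementation would score many candidate futures from noisy skeletal-tracking data rather than doing an exact-match lookup, but the shape of the idea is the same: stored human behavior drives the prediction.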
Just don’t expect this robomaid in your home anytime soon. It’s expensive, the battery only lasts a few hours, and it’s large—about the size of a fridge. “The wheel base really commands the room,” says Saxena. “It’s really not practical yet for small spaces.”
For those who just can’t wait, however, Saxena has open-sourced his robot’s code, so you can tinker with its predictive powers for whatever tasks you dream up (within reason). “Giving the source code away allows others to adapt it for different purposes,” says Saxena. “It can be used in assembly lines, in nursing homes, in households where people need live-in assistance.”
SEARCH AND RESCUE
Vacuuming aside, robots are ideal for even dirtier—and downright dangerous—work, such as inspecting toxic factories or finding people in collapsed buildings. To design a nimble robot suited for such a task, Sangbae Kim, who runs MIT’s Biomimetic Robotics Lab, and his team looked to the animal kingdom for inspiration. They found the perfect model in a sleek African cat—the cheetah. “A robot like this, with this maneuverability and durability, will play a part in search and rescue operations,” says Kim. For instance, police and fire departments could send it to find people in distress and notify authorities upon locating the victim, Kim says.
Capturing the speed, dynamic movements, and sleek maneuvers of a real cheetah in robot form, however, is a considerable challenge, since most modern robots are stiff and bulky, according to Kim. To meet the challenge, Kim assembled a diverse team, including experts in biomechanics, electric motors, power systems, and mechanical design, as well as a control engineer.
Together, they reimagined what it means to be a robot. The resulting robo-cheetah has a slender frame made from durable plastic that won’t shatter when the device runs and jumps. High-torque electric motors attached to gears at the hip joints allow for lightweight legs and movements that don’t require springs. Running on a lightweight battery, the robot has a response time of 215 microseconds (much faster than an actual cheetah).
Kim hopes to make the cheetah available for police work or the Special Forces in a decade. After that, “Maybe one day we can drop the price to that of a laptop, and you can have a cheetah as a jogging partner,” he says.
Watch the cheetah run in the video above; see more videos here.
Where medicine is concerned, a company called Intuitive Surgical dominates the surgical robotics game with its da Vinci, a system that assists in laparoscopic and thoracoscopic procedures. Using the da Vinci, a surgeon's hand motions are translated into precise movements that enable the device's robotic arms to make tiny, minimally invasive incisions, helping lead to shorter hospital stays, smaller surgical scars, and fewer sentinel events. “So many infections and complications are due to the incision or the suturing, and the robot takes that away,” says Amanda Nickles Fader, a gynecological oncologist at Johns Hopkins who performs a couple hundred surgeries a year with the da Vinci. “It makes surgery much safer and much less painful so a patient can leave the same day.”
Meanwhile, leading surgical robotics research labs across the country have been experimenting with a surgical robotic system called the RAVEN (this device is not in clinical use). Created by Blake Hannaford, director of the University of Washington's Biorobotics Laboratory, in collaboration with Jacob Rosen of the Human Bionics Lab at the University of California, Santa Cruz, the RAVEN runs on open-source software. That way, users can share their modifications to the system with each other—and "sharing allows us to improve constantly,” says Hannaford. Scientists have also developed another version of the RAVEN that allows two surgeons to work together from separate locations via the Internet.
HEAD IN THE CLOUDS
While robots make our lives easier, researchers are improving bots’ lives, too, so to speak. Robots typically carry a computer and a processor with finite amounts of memory and processing power. What’s more, computer processing uses up a disproportionate amount of battery power, according to Ken Goldberg, a computer scientist at the University of California, Berkeley. His solution? Move most of the robot “brains” from on-board computers to remote servers. “We can make robots faster, smarter, and more powerful by using the cloud,” he says. With access to vast repositories of cloud-stored data, a robot “pulls what it needs to pull, when it needs to pull it,” Goldberg says. “That keeps the on-board computer light. And, the battery power is used for the robot, not the processor.”
Google’s self-driving car is an example of cloud robotics in action, says Goldberg. The car draws on maps and images from the cloud to figure out its location and route while also sending information back about road and traffic conditions.
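The division of labor Goldberg describes can be sketched as a robot that holds almost nothing on board and fetches what it needs from a remote service. The class names, message format, and stored data below are invented for illustration; a real system would use a network protocol and far larger server-side models.

```python
# Minimal sketch of the cloud-robotics idea: heavy data and computation
# live on a remote server; the on-board computer only sends small queries
# and applies the answers. All names and data here are invented examples.

import json

class CloudBrain:
    """Stands in for a remote server holding large maps and models."""
    def __init__(self, knowledge):
        self.knowledge = knowledge  # e.g., maps, object models

    def query(self, request_json):
        # Expensive lookup/planning happens server-side, not on the robot.
        request = json.loads(request_json)
        return json.dumps({"answer": self.knowledge.get(request["need"])})

class Robot:
    """On-board computer: lightweight, stateless, battery-friendly."""
    def __init__(self, brain):
        self.brain = brain  # a network connection in a real system

    def pull(self, need):
        # "Pulls what it needs, when it needs it" -- nothing big cached on board.
        reply = json.loads(self.brain.query(json.dumps({"need": need})))
        return reply["answer"]

cloud = CloudBrain({"route": ["maple_st", "oak_ave", "depot"], "speed_limit_kmh": 50})
car = Robot(cloud)
print(car.pull("route"))  # -> ['maple_st', 'oak_ave', 'depot']
```

The design trade-off is the one Goldberg names: offloading keeps the on-board computer and battery light, at the cost of depending on network connectivity.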
What other cool robots have you heard about lately? Let us know in the comments section.
*This article was updated on March 26, 2014, to reflect the following correction: In the section on "Robomaid," the quote by Ashutosh Saxena originally stated that, “Giving the license away allows others to adapt it for different purposes." It is the robot's source code that Saxena's team is sharing, not the license.
**On March 27, 2014, a more recent video of the cheetah running replaced a previous video of the cheetah's "first run." The newer video demonstrates the robot's speed and efficiency, according to Sangbae Kim.
***On April 2, 2014, the word "policy" was corrected with "police" in the following sentence, appearing in the "Search and Rescue" section: "For instance, police and fire departments could send it to find people in distress and notify authorities upon locating the victim, Kim says."