What if robotic limbs could feel texture, size and even pain? A group of scientists from Cornell University may have devised a way for a robot to feel its surroundings internally, similar to the way humans do.
Most robots can grasp objects and sense touch through motorized means, but a group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, devised a way to embed an optical sensor inside a soft robotic hand.
The study was published in the journal Science Robotics.
“Most robots today have sensors on the outside of the body that detect things from the surface,” doctoral student and lead study author Huichan Zhao said in a statement. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
Using optical waveguide technology, the researchers programmed a robotic hand to perform a variety of tasks, including grasping and probing for both shape and texture. The hand was even able to scan three tomatoes and determine which one was ripest by its softness.
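The principle behind the tomato test can be sketched in a few lines: pressing a soft waveguide lets light escape, so the power reaching a photodetector drops as the material deforms, and a softer object allows more deformation (and more light loss) under the same probing force. The function names and readings below are illustrative assumptions, not the researchers' actual code or data.

```python
def softness_from_light_loss(baseline_power, probed_power):
    """Fractional light loss while probing. Under equal probing force,
    a softer object deforms the waveguide more, so more light escapes
    and the measured loss is larger."""
    return (baseline_power - probed_power) / baseline_power

def ripest(readings):
    """readings maps a name to (baseline_power, probed_power) from a
    hypothetical photodetector; the ripest tomato is the softest one,
    i.e. the one with the greatest fractional light loss."""
    return max(readings, key=lambda name: softness_from_light_loss(*readings[name]))

# Illustrative photodetector readings in arbitrary units (assumed values):
tomatoes = {
    "tomato_a": (1.00, 0.95),  # firm: little deformation, small loss
    "tomato_b": (1.00, 0.80),  # softest: most deformation, biggest loss
    "tomato_c": (1.00, 0.90),
}
print(ripest(tomatoes))  # -> tomato_b
```

The comparison works because the waveguide signal is read internally, through the thickness of the hand, rather than from a surface sensor.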
This technology, Zhao explained, has many potential uses beyond prostheses, including bio-inspired robots, which Shepherd has been exploring in collaboration with Mason Peck, associate professor of mechanical and aerospace engineering.
“That project has no sensory feedback but if we did have sensors, we could monitor in real time the shape change during combustion [through water electrolysis] and develop better actuation sequences to make it move faster,” Shepherd said.
“Right now, it’s hard to localize where a touch is coming from,” Shepherd added.
Danielle Tarasiuk is a multimedia journalist based in Los Angeles. Her work has been published on AllDay.com, Yahoo! Sports, KCET, and NPR-affiliate stations KPCC and KCRW. She's a proud Sarah Lawrence College and USC Annenberg alumna.