Teaching Robots the Gentle Touch
One of the quiet truths about robotics is that machines are often strongest where humans are weakest—and weakest where humans excel. Robots can lift hundreds of pounds, weld steel with microscopic precision, and repeat the same motion thousands of times without fatigue. Yet ask a robot to pick up a raspberry, and suddenly the task becomes far more difficult.
Researchers at the University of Texas at Austin are working to close that gap. Their new robotic hand system, called Fragile Object Grasping with Tactile Sensing (FORTE), demonstrates how combining soft robotics with tactile sensing can allow machines to manipulate delicate objects without crushing them.
The research, published in IEEE Robotics and Automation Letters, addresses one of the most persistent challenges in robotics: enabling machines to perform fine, careful manipulation.
“Right now, robotics is starting to be able to do large motions around the house, but struggles with really fine and delicate movements,” said Siqi Shang, the study’s lead author and a doctoral student in the Cockrell School of Engineering’s Chandra Family Department of Electrical and Computer Engineering. “Robots can fold a shirt, but may struggle to carefully pick up your glasses or unpack fruit from your groceries.”
The FORTE system approaches the problem by giving robots something they have historically lacked: a sense of touch.
Borrowing from Fish Fins
The robotic fingers are inspired by the fin-ray effect, a structural principle observed in fish fins that allows them to conform naturally around objects. When pressure is applied, the fin structure bends inward rather than away, enabling a secure but gentle grip.
Using 3D printing, the researchers created flexible fingers embedded with small air channels. These channels act as tactile sensors. When the fingers wrap around an object, the internal air pressure shifts slightly depending on how the object pushes against the finger.
Tiny sensors detect these pressure changes and feed the information back to the robot in real time.
The result is a robotic hand that can detect subtle signals—such as the beginning of an object slipping—allowing the system to adjust its grip before the item falls or breaks.
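The paper's control loop isn't spelled out here, but the idea described above can be sketched in a few lines: watch the air-pressure readings, treat a sharp drop as a slip cue, and tighten the grip slightly in response. The threshold and step size below are illustrative assumptions, not values from the FORTE system.

```python
# Hypothetical sketch of pressure-based slip detection and grip
# adjustment. Thresholds and step sizes are made up for illustration;
# this is not the published FORTE algorithm.

SLIP_DROP_THRESHOLD = 0.05  # fractional pressure drop treated as a slip cue

def detect_slip(pressure_history):
    """Flag a slip when channel air pressure falls sharply between samples."""
    if len(pressure_history) < 2:
        return False
    prev, curr = pressure_history[-2], pressure_history[-1]
    return (prev - curr) / prev > SLIP_DROP_THRESHOLD

def control_step(grip_force, pressure_history, step=0.1):
    """Tighten the grip slightly when a slip is detected; otherwise hold."""
    if detect_slip(pressure_history):
        return grip_force + step
    return grip_force

# A 10% pressure drop between samples triggers a small corrective squeeze.
force = control_step(1.0, [1.0, 0.9])
```

In a real system this loop would run at the sensor's sampling rate, which is what lets the hand react before the object falls or breaks.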
Testing the Limits of Gentle Robotics
To evaluate the system, researchers tested the grippers on 31 different items ranging from extremely delicate foods to rigid objects. The test set included raspberries, potato chips, apples, jam jars, billiard balls, and soup cans.
The results were impressive.
The system achieved a 91.9 percent success rate in single-trial grasping experiments. Even more notably, the tactile sensing system detected slipping objects with 93 percent recall and 100 percent precision: it caught 93 percent of actual slips, and every slip it flagged was genuine, allowing the robot to respond without applying damaging force.
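For readers unfamiliar with the metrics, precision and recall follow their standard definitions. The counts below are hypothetical numbers chosen only to match the reported percentages, not data from the study.

```python
def precision(true_pos, false_pos):
    """Fraction of raised slip alarms that were real slips."""
    return true_pos / (true_pos + false_pos)

def recall(true_pos, false_neg):
    """Fraction of real slips the system actually caught."""
    return true_pos / (true_pos + false_neg)

# Illustrative counts consistent with the reported figures:
# every alarm was genuine (precision 1.0), and 93 of 100 slips were caught.
p = precision(93, 0)  # 1.0
r = recall(93, 7)     # 0.93
```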
“Humans pick up objects with just the right amount of force; too much and you'll crush it, but too little and it'll slip out of your hand,” said Lillian Chin, assistant professor of electrical and computer engineering at UT Austin. “This system allows robots to approach that same balance.”
Why Delicate Manipulation Matters
For decades, robotics has excelled in environments where objects are rigid and predictable. Industrial robots weld car frames and assemble electronics with remarkable accuracy.
But many industries deal with items that are far less forgiving.
Food processing is one of the clearest examples. Handling fragile fruits, vegetables, or baked goods has traditionally required human workers because conventional robotic grippers either drop items or damage them.
Healthcare presents similar challenges. Medical tools, biological samples, and laboratory materials often require careful manipulation that existing robotic systems struggle to achieve.
Even manufacturing increasingly demands delicate handling as electronics, glass components, and micro-devices become smaller and more sensitive.
Advances in tactile sensing could dramatically expand where robots can operate.
A Step Toward Dexterous Machines
The team behind the project—including UT computer science associate professor Yuke Zhu and doctoral student Mingyo Seo—has made the system’s hardware designs and algorithms publicly available. By releasing the work openly, the researchers hope to accelerate progress in robotic manipulation.
Future improvements will focus on making the sensors less sensitive to temperature changes and enhancing the system’s ability to detect and respond to slips even more quickly.
These improvements may seem incremental, but they represent an important step toward a long-standing goal in robotics: machines that can interact with the physical world with the same nuance as human hands.
Robots are getting better at navigating spaces and performing large motions. But the next frontier of automation may lie in something far more subtle.
Not strength.
Not speed.
But the ability to pick up a potato chip—without breaking it.