This robotic hand rotates objects using touch, not vision

New York (Samajweekly): Inspired by the effortless way humans handle objects without seeing them, a team of US engineers has developed a new robotic hand that can rotate objects solely through touch, without relying on vision.

The robotic hand, built by a team at the University of California San Diego, can smoothly rotate a wide array of objects, from small toys and cans to fruits and vegetables, without bruising or squishing them.

The work could aid in the development of robots that can manipulate objects in the dark, said the team, which recently presented the research at the 2023 Robotics: Science and Systems conference in Daegu, South Korea.

To build their system, the researchers attached 16 touch sensors to the palm and fingers of a four-fingered robotic hand.

Each sensor costs about $12 and serves a simple function: detect whether an object is touching it or not.

What makes this approach unique is that it relies on many low-cost, low-resolution touch sensors that use simple, binary signals — touch or no touch — to perform robotic in-hand rotation. These sensors are spread over a large area of the robotic hand.

This contrasts with a variety of other approaches that rely on a few high-cost, high-resolution touch sensors affixed to a small area of the robotic hand, primarily at the fingertips.
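As a rough, hypothetical illustration of this contrast (the count of 16 sensors comes from the article; the names and array shapes below are assumptions for clarity), a binary tactile reading can be pictured as a short on/off vector, whereas a single high-resolution fingertip sensor produces something closer to a dense pressure image:

```python
import numpy as np

# Hypothetical illustration of the two sensing styles; shapes and names
# are assumptions, not the team's actual data format.

# Binary tactile sensing: 16 on/off contact signals spread over the palm
# and fingers. Each element is 1 if that sensor touches the object, else 0.
binary_contacts = np.zeros(16, dtype=np.int8)
binary_contacts[[2, 5, 11]] = 1   # e.g. three sensors currently in contact

# High-resolution tactile sensing: a dense pressure "image" from one
# fingertip sensor, richer but costlier and harder to simulate.
fingertip_pressure_map = np.zeros((64, 64), dtype=np.float32)  # illustrative resolution

print(binary_contacts)  # [0 0 1 0 0 1 0 0 0 0 0 1 0 0 0 0]
```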

“We show that we don’t need details about an object’s texture to do this task,” said Xiaolong Wang, a professor of electrical and computer engineering at UC San Diego, who led the current study.

“We just need simple binary signals of whether the sensors have touched the object or not, and these are much easier to simulate and transfer to the real world,” Wang added.

The researchers further noted that broad coverage with binary touch sensors gives the robotic hand enough information about an object’s 3D structure and orientation to rotate it successfully without vision.

At each time step during a rotation, the system assesses which sensors on the hand the object is currently touching. It also takes in the current positions of the hand’s joints and the actions they were previously commanded to take.

Using this information, the system tells the robotic hand which joint should move where at the next time step.
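The article does not detail how the team’s controller is implemented, so the following Python sketch is only a hypothetical illustration of such a sense-and-act loop: the policy is a placeholder, and all names, joint counts and values are assumptions.

```python
import numpy as np

NUM_SENSORS = 16   # binary touch sensors on the palm and fingers (from the article)
NUM_JOINTS = 16    # hypothetical joint count for a four-fingered hand

def policy(observation: np.ndarray) -> np.ndarray:
    """Placeholder for the learned controller: maps the observation to
    target joint positions for the next time step. A real system would use
    a policy trained in simulation, not these dummy targets."""
    return np.zeros(NUM_JOINTS)

def control_step(contacts, joint_positions, previous_action):
    # The observation combines which sensors are touched (binary signals),
    # the current joint positions, and the previously commanded action.
    observation = np.concatenate([contacts, joint_positions, previous_action])
    # The controller outputs where each joint should go at the next time step.
    return policy(observation)

# One illustrative step of the loop (all values are made up):
contacts = np.zeros(NUM_SENSORS)
contacts[[2, 5, 11]] = 1.0                 # sensors currently touching the object
joint_positions = np.zeros(NUM_JOINTS)     # current joint angles
previous_action = np.zeros(NUM_JOINTS)     # last commanded targets
next_action = control_step(contacts, joint_positions, previous_action)
```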

The researchers then tested their system on a real robotic hand with objects including a tomato, a pepper, a can of peanut butter and a toy rubber duck. The hand was able to rotate all of them without stalling or losing its hold.

Wang and his team are now working on extending their approach to more complex manipulation tasks.

They are currently developing techniques to enable robotic hands to catch, throw and juggle, for example.

“In-hand manipulation is a very common skill that we humans have, but it is very complex for robots to master,” said Wang. “If we can give robots this skill, that will open the door to the kinds of tasks they can perform.”
