Humanoid robots have evolved from experimental novelties into increasingly useful tools. They can run, jump, skip and even flip, and several models already work in warehouses alongside humans. Yet their potential in workplace settings remains limited by a persistent challenge: replicating the complex functions of the human hand.
The human hand has 27 degrees of freedom and a tactile sense that allows humans to perceive the shape, size and texture of an object without seeing it, feeding vital sensory information to the brain. For decades, robotics engineers have struggled to recreate these intricate mechanics and this sensory feedback in artificial hands. Now, the robotics and AI industry believes it may be on the verge of a breakthrough. Advances in materials science, AI and biomechanics are converging to produce robotic hands with more dexterous manipulation skills and tactile perception.
Several models have been released or are expected soon, each unique but all promising far more than the simple grab-and-grasp designs of the past. For example, Meta AI collaborated with GelSight, a company based in Waltham, Mass., that specializes in tactile intelligence technology, to create Digit 360, a silicone fingertip-shaped tactile sensor equipped with more than 18 sensing features, according to GelSight.
“GelSight’s tactile sensing technology is a vision-based tactile sensing technology,” said Youssef Benmokhtar, CEO of GelSight, in an interview with Techstrong. “Essentially, it is like having a camera inside your finger and capturing what it would ‘see’ when touching objects. A camera is placed behind a gel and captures images of the gel deformation when it is in contact with surfaces. 3D reconstruction algorithms provide a way to characterize the 3D shape being touched. Our current GelSight Mini sensor and Digit can be integrated into a robotic hand. In the robotics field, Meta is our largest partner and we are very proud to be the company working closely with Meta on the commercialization of the Digit sensor and Digit 360. Mitsubishi Electric Research Laboratories has also published many papers on their incredible work on tactile manipulation using GelSight Mini. NVIDIA has also included GelSight sensors in its Isaac robotics simulation tools.”
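To make the “camera inside your finger” description concrete, here is a minimal Python sketch of the first step such a pipeline might take: comparing a frame of the gel against a reference image captured with nothing touching the sensor. This is an illustration of the principle, not GelSight’s code; the file names and threshold below are assumptions, and the 3D reconstruction step Mr. Benmokhtar describes is omitted.

```python
# Minimal sketch of vision-based tactile sensing as described above: compare a
# camera frame of the gel against a no-contact reference to find where (and
# roughly how strongly) the surface is deformed by contact. File names and the
# threshold are hypothetical; a real pipeline would go on to reconstruct the
# full 3D shape being touched.
import cv2
import numpy as np

def contact_map(reference_path: str, frame_path: str, threshold: int = 12) -> np.ndarray:
    """Return a binary mask of gel regions deformed by contact."""
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    diff = cv2.absdiff(frame, ref)            # deformation shows up as intensity change
    diff = cv2.GaussianBlur(diff, (5, 5), 0)  # suppress sensor noise
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return mask

if __name__ == "__main__":
    mask = contact_map("gel_reference.png", "gel_contact.png")
    print(f"contact covers {np.count_nonzero(mask)} pixels")
```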
The NVIDIA Isaac AI robot development platform consists of accelerated libraries, application frameworks and AI models that speed the development of AI robots such as autonomous mobile robots (AMRs), arms and manipulators, and humanoids, according to NVIDIA.
Mr. Benmokhtar added, “Our technology is also used extensively in the world of surface metrology, providing sub-micron measurement capability on any material (glass, metals, composites) in the palm of your hand or on a robotic arm. Customers like Rolls-Royce and Safran use GelSight to characterize defects on turbine blades and other jet and helicopter engine parts.”
A collaboration between Meta AI and Wonik Robotics, a robotics company based in South Korea, will utilize Digit 360 on Wonik’s Allegro Hand.
“Meta’s work with Wonik will focus on a new generation of Wonik’s Allegro Hand, a robotic hand with tactile sensors like Digit 360,” Wonik states. “Building on a platform Meta developed to integrate sensors on a single robot hand, the upcoming Allegro Hand will feature control boards that encode data from the tactile sensors onto a host computer.” The Allegro Hand will be available later this year, according to Wonik.
Researchers at Duke’s Pratt School of Engineering are relying on acoustics in developing SonicSense, which essentially listens to vibrations to identify materials.
“SonicSense features a robotic hand with four fingers, each equipped with a contact microphone embedded in the fingertip. These sensors detect and record vibrations generated when the robot taps, grasps or shakes an object,” the school stated in a press release last October.
“And because the microphones are in contact with the object, it allows the robot to tune out ambient noises. Based on the interactions and detected signals, SonicSense extracts frequency features and uses its previous knowledge, paired with recent advancements in AI, to figure out what material the object is made out of and its 3D shape. If it’s an object the system has never seen before, it might take 20 different interactions for the system to come to a conclusion. But if it’s an object already in its database, it can correctly identify it in as little as four.”
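Duke’s description suggests a classic pattern: turn each recorded tap into frequency features, then match those features against fingerprints of known objects. The sketch below is an illustrative assumption, not the SonicSense implementation; the band-energy features, the cosine-similarity matching and the synthetic “ceramic” and “wood” signals are all invented for the example.

```python
# Illustrative sketch of frequency-feature matching, in the spirit of the
# SonicSense description above (not the Duke team's code): summarize a
# contact-microphone recording as coarse band energies, then match it against
# previously stored fingerprints of known objects.
import numpy as np

def band_energies(signal: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Magnitude spectrum folded into coarse frequency bands, L2-normalized."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([band.sum() for band in bands])
    return feats / (np.linalg.norm(feats) + 1e-9)

def identify(signal: np.ndarray, database: dict) -> str:
    """Nearest-neighbor match by cosine similarity against known fingerprints."""
    feats = band_energies(signal)
    return max(database, key=lambda name: float(feats @ database[name]))

if __name__ == "__main__":
    rate = 48_000
    t = np.arange(4096) / rate
    rng = np.random.default_rng(0)
    # Hypothetical fingerprints standing in for taps on known objects: a
    # "ceramic" tap rings at a high frequency, a "wood" tap at a low one.
    db = {
        "ceramic": band_energies(np.sin(2 * np.pi * 6000 * t) + 0.1 * rng.normal(size=t.size)),
        "wood": band_energies(np.sin(2 * np.pi * 800 * t) + 0.1 * rng.normal(size=t.size)),
    }
    tap = np.sin(2 * np.pi * 820 * t) + 0.1 * rng.normal(size=t.size)
    print(identify(tap, db))  # the tap's low ring matches the "wood" fingerprint
```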
Boyuan Chen, an assistant professor of mechanical engineering and materials science and of computer science at Duke, said, “SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects.”
Mr. Chen added, “This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch. We’re excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions.”
NASA and General Motors, two organizations that have long worked with robotics, have also developed robotic hands with advanced dexterity and touch for their humanoid robot, Robonaut 2 (R2).
According to NASA, “R2’s hand and forearm assembly . . . is designed to approximate closely the capabilities of the human hand. The assembly is a completely self-contained unit featuring high dexterity, fine force control, and advanced sensing that enables the grasping and actuation of a broad array of tools. Relocation of components (e.g., motors, avionics) to the forearm makes room for increased sensing in the fingers and palm, where it is needed most.”
NASA sees the primary applications for its humanoid robot being industrial manufacturing and maintenance, space exploration, personal assistance and caregiving, emergency services and operations in hazardous environments, and repetitive task automation.
Several companies already use humanoid robots to carry out routine tasks inside warehouses. Amazon, for example, uses Digit, which moves bins from shelves to conveyors. And Mercedes-Benz is using the Apollo robot for low-skill, physically demanding manual labor, such as carrying bins. The hands on those robots can grasp and apply pressure, but they cannot “feel” the way the advanced hand sensors can, and that limits their usefulness to such basic tasks.
Mr. Benmokhtar, the GelSight CEO, said that replicating that human feedback through touch would mark a transformational moment in robotics. “Touch is available to humans in the womb, making it a fundamental sense to understand the world we live in,” he said. “GelSight has digitized touch and feel, and we are providing the only technology that, when combined with AI, will allow machines to better navigate the world we are in, just like we do.”