
As a precision multi-tool, the human hand is unmatched. It can clutch, squeeze, flick, signal, catch, punch, poke, turn, scratch, lift and even make noises, among other functions. Duplicating those skills in a humanoid robot hand is a complex undertaking, but major advances are being made on that front.
Sanctuary AI has just announced that it has integrated new tactile sensor technology into its Phoenix general purpose robots.
Sanctuary AI was founded in 2018 as a general purpose robotics and physical AI company based in Vancouver, Canada. The company is widely known for “Phoenix,” its general purpose robot that was unveiled in 2022.
“The sense of touch is a key enabler for creating human-level dexterity in robots and critical for physical AI to achieve its full potential,” said James Wells, CEO at Sanctuary AI. “Our tactile sensors enable reliable and confident fine manipulation when vision is occluded, unlocking capabilities such as blind picking, slippage detection and prevention of excessive force application, all broadening the scope and range of tasks for our general purpose robots. By equipping general purpose robots with advanced touch sensors, Sanctuary AI’s technology provides industry-leading capabilities to perform an expanded set of work tasks.”
Engineers have long taken inspiration from the human body in designing robots — neural networks mimic the brain’s decision-making process, actuators function like muscles, and now robotic hands are evolving to replicate the dexterity of their human counterparts. But one of the more elusive traits to replicate is touch.
The human hand has more than 30 muscles that work together in a highly complex way, and about a quarter of all the body's bones are found in the hands. Those muscles and the skin of the hand are supplied by three nerves (the radial, median and ulnar), according to the National Library of Medicine's National Center for Biotechnology Information, which notes: "The fingers on each hand are bent and stretched about 25 million times over the course of a lifetime. Our hands also have very sensitive antennae for receiving information from the environment: There are a total of 17,000 touch receptors and free nerve endings in the palm. These pick up sensations of pressure, movement and vibration. So it is with good reason that the sense of touch is often associated with the hands. The skin on our fingertips is especially sensitive to touch."
The ability to sense pressure, texture and temperature has been a challenge to recreate in the research lab, yet recent breakthroughs in tactile sensing technology are bringing robots closer to a human-like sense of touch. Several companies are now refining fingertip sensors that could dramatically expand what robots are capable of, from delicate assembly work to providing care in medical settings. These advances signal a future in which robots don’t just look human, they feel human too.
“Without tactile sensing, robots depend on video to interact with their environment. With video alone you don’t know you’ve touched something until well after the collision has physically caused the object to move,” said Dr. Jeremy Fishel, principal researcher at Sanctuary AI. “This reduces work efficiency and can require numerous attempts, grasping and re-grasping the same object for a secure hold. Touch solves this.”
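The advantage Fishel describes can be illustrated with a short sketch. The code below is a hypothetical, simplified example of contact and slip detection from fingertip force readings; the thresholds and the sensor interface are illustrative assumptions, not Sanctuary AI's actual implementation.

```python
# Hypothetical sketch: detecting contact and incipient slip from
# fingertip normal-force readings. Thresholds are illustrative
# assumptions, not Sanctuary AI's actual values or API.

CONTACT_THRESHOLD_N = 0.05   # minimum force (newtons) treated as contact
SLIP_DROP_RATIO = 0.5        # force falling below 50% of prior grip suggests slip

def detect_contact(normal_force_n: float) -> bool:
    """Contact is sensed the instant force exceeds the noise floor --
    before vision would ever show the object moving."""
    return normal_force_n > CONTACT_THRESHOLD_N

def detect_slip(force_history_n: list[float]) -> bool:
    """Flag incipient slip: a sharp drop in force while gripping."""
    if len(force_history_n) < 2:
        return False
    previous, current = force_history_n[-2], force_history_n[-1]
    return detect_contact(previous) and current < previous * SLIP_DROP_RATIO

# Example: steady grip at about 1 N, then the force suddenly halves.
readings = [0.0, 0.9, 1.0, 1.0, 0.4]
print(detect_contact(readings[2]))  # contact established
print(detect_slip(readings))        # slip suspected on the last reading
```

With feedback like this, a controller can tighten its grip the moment slip begins rather than waiting for the object to visibly fall, which is the efficiency gain Fishel points to.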
The company added in its announcement: "As global labor shortages continue to impact industries, many organizations are struggling to fill roles that sustain their operations. Sanctuary AI's proprietary technology is focused on addressing these growing labor challenges by developing world-leading dexterous hand technology that enables general purpose robots to fill gaps in the global workforce, fulfilling jobs across a variety of industries, including automotive, distribution, energy, logistics, retail, telecom, utilities, and more."
Sanctuary AI joins a growing list of tech companies focusing on robotic hands. Meta AI collaborated with GelSight, a company that specializes in tactile intelligence technology, to create Digit 360, a silicone fingertip-shaped tactile sensor equipped with more than 18 sensing features. GelSight CEO Youssef Benmokhtar said the technology relies on a camera embedded behind gel in each fingertip that captures the gel's deformation when it comes into contact with an object.
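The camera-behind-gel idea Benmokhtar describes can be approximated with simple image differencing: compare the current camera frame of the gel against a baseline frame, and any region that has changed indicates deformation, and therefore contact. The sketch below is a hedged illustration of that general vision-based tactile-sensing principle, not GelSight's actual pipeline; the frame sizes and threshold are assumptions.

```python
import numpy as np

def contact_mask(baseline: np.ndarray, current: np.ndarray,
                 threshold: float = 10.0) -> np.ndarray:
    """Return a boolean mask of pixels where the gel image has changed
    enough to indicate deformation (i.e., contact with an object).
    The threshold value is an illustrative assumption."""
    diff = np.abs(current.astype(float) - baseline.astype(float))
    return diff > threshold

# Toy 4x4 grayscale frames: pressing an object into the gel changes
# the brightness of a 2x2 patch in one corner of the image.
baseline = np.full((4, 4), 100, dtype=np.uint8)
pressed = baseline.copy()
pressed[0:2, 0:2] = 160

mask = contact_mask(baseline, pressed)
print(mask.sum())  # 4 pixels register contact
```

In a real sensor the deformation pattern also encodes the shape and texture of whatever is pressing into the gel, which is what makes the approach so information-rich.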
Meanwhile, researchers at Duke University's Pratt School of Engineering are taking an acoustic approach with SonicSense, which essentially listens to vibrations to identify materials.
According to the researchers: "SonicSense features a robotic hand with four fingers, each equipped with a contact microphone embedded in the fingertip. These sensors detect and record vibrations generated when the robot taps, grasps, or shakes an object."
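The quoted description, identifying materials from tap-induced vibrations, can be sketched as a frequency-domain comparison: different materials ring at different frequencies when struck. The code below is an illustrative assumption of that general approach (dominant-frequency matching), not the Duke team's actual method, and the reference frequencies are invented for the example.

```python
import numpy as np

# Hypothetical reference table: dominant ring frequency (Hz) per material.
# Real systems learn far richer acoustic signatures; these values are
# invented purely for illustration.
MATERIAL_FREQS_HZ = {"metal": 2000.0, "wood": 500.0, "plastic": 900.0}

def dominant_frequency(signal: np.ndarray, sample_rate: float) -> float:
    """Strongest frequency component of a recorded vibration."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def guess_material(signal: np.ndarray, sample_rate: float) -> str:
    """Pick the material whose reference frequency is closest."""
    f = dominant_frequency(signal, sample_rate)
    return min(MATERIAL_FREQS_HZ, key=lambda m: abs(MATERIAL_FREQS_HZ[m] - f))

# Simulate a tap on metal: a 2 kHz ring sampled at 16 kHz for 0.1 s.
sr = 16000.0
t = np.arange(0, 0.1, 1.0 / sr)
tap = np.sin(2 * np.pi * 2000.0 * t)
print(guess_material(tap, sr))  # metal
```

The appeal of the acoustic route is that a single cheap contact microphone per fingertip can reveal properties, such as material and internal structure, that a camera cannot see.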