Tactile Sensing with DIGIT
What sensory feedback can be extracted from off-the-shelf vision-based tactile sensors?
Tactile Sensing for Robotic Manipulation
Tactile sensing is a critical perceptual modality for robots. It enables them to better understand and interact with the physical world by detecting properties, such as hardness, texture, and slip, that can only be perceived through direct physical contact.
The Importance of Tactile Feedback
Humans rely on touch constantly in daily activities. The human hand’s ability to perform tasks ranging from simple grasping to intricate in-hand manipulation has played a crucial role in our species’ success. Opening jars, folding materials, inserting objects, and cutting food all depend heavily on tactile feedback. This sensory information helps us estimate contact forces, maintain a secure grasp against external disturbances, and recognize physical properties such as the center of mass and friction.
The lack of touch sensing is believed to be one of the primary reasons current robotic hands lack dexterity in manipulation tasks. Despite significant advances in vision and motion systems, most robots still operate without this critical feedback channel.
Vision-Based Tactile Sensing
We focus on vision-based tactile sensors, specifically DIGIT and GelSight, for integration into robotic hands. Highly accurate commercial force sensors exist, but they bring fragile wiring, high cost, and integration difficulties, particularly at the fingertips. In contrast, vision-based tactile sensors such as GelSight and DIGIT can be mounted on robot fingertips easily.
These sensors let robots perceive object texture, hardness, and weight, mimicking the human sense of touch. GelSight emerged from research at MIT, while DIGIT was developed by Facebook AI Research. The rising popularity of these sensors can be attributed to advances in deep learning for computer vision and to increasing computational power.


Extracting Tactile Information
We explore several key capabilities enabled by tactile sensors:
Depth Estimation
This work demonstrates methods to quantify gel deformation when objects contact the sensor:
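To make the idea concrete, here is a minimal Python/OpenCV sketch, not the exact pipeline used in this project: it approximates the deformation (depth) map by differencing the current frame against a no-contact reference frame. The function name and the intensity-difference heuristic are illustrative assumptions.

```python
import cv2
import numpy as np

def estimate_deformation(reference_bgr: np.ndarray, contact_bgr: np.ndarray) -> np.ndarray:
    """Rough per-pixel deformation map from a no-contact reference frame and a
    frame captured while an object presses into the gel (illustrative heuristic)."""
    ref = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    cur = cv2.cvtColor(contact_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Intensity change caused by the deformed gel; a larger change roughly
    # corresponds to a deeper indentation.
    diff = cv2.absdiff(cur, ref)

    # Suppress sensor noise and illumination flicker.
    diff = cv2.GaussianBlur(diff, (7, 7), 0)

    # Normalize to [0, 1] so downstream thresholding is lighting-independent.
    return cv2.normalize(diff, None, 0.0, 1.0, cv2.NORM_MINMAX)
```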

Contact Area Estimation
By applying thresholding, PCA, and OpenCV ellipse fitting to the estimated depth map, the system segments the contact region and estimates the object’s orientation angle. This capability is particularly valuable for robots that need to know an object’s 2D pose during grasping, enabling more precise manipulation.
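A hedged sketch of that step, assuming a normalized deformation map like the one produced above; the threshold value and the helper name are illustrative choices, not the project’s exact parameters.

```python
import cv2
import numpy as np

def contact_ellipse(depth: np.ndarray, thresh: float = 0.3):
    """Segment the contact region in a normalized deformation map and fit an
    ellipse; returns ((cx, cy), (major, minor), angle_deg) or None."""
    mask = (depth > thresh).astype(np.uint8) * 255

    # Keep only the largest connected contact blob.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:  # cv2.fitEllipse needs at least 5 points
        return None

    # The ellipse angle gives the in-plane orientation of the contacted object.
    return cv2.fitEllipse(largest)
```

The fitted ellipse’s axes also give a rough measure of the contact area’s extent.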

Normal Force Estimation
Normal force can be estimated directly from the tactile images. For this purpose, we trained a ConvNet-based image-to-force regressor; the ground-truth values are recorded from a force/torque (F/T) sensor. Here is the performance of the trained model against the F/T sensor:
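Below is a minimal PyTorch sketch of an image-to-force regressor of this kind; the layer sizes, learning rate, and placeholder tensors are assumptions for illustration, not the architecture or data used in the actual experiments.

```python
import torch
import torch.nn as nn

class ForceRegressor(nn.Module):
    """Small ConvNet mapping a tactile image to a scalar normal force."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# One training step: regress F/T sensor readings (ground truth) with an MSE loss.
model = ForceRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()

images = torch.randn(8, 3, 240, 320)  # batch of tactile frames (placeholder data)
forces = torch.randn(8, 1)            # matching F/T normal forces (placeholder data)

loss = criterion(model(images), forces)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```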

Force Direction Tracking
The implementation also includes a marker-based approach, in which the black dots printed on the gel surface are tracked to recover both the direction and the magnitude of the applied force:
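As a rough sketch of how such marker tracking can work, the snippet below detects the dark markers with OpenCV blob detection in a reference frame and a contact frame (both grayscale uint8 images), matches them by nearest neighbor, and returns their displacement vectors; the detector parameters and matching threshold are illustrative assumptions, and the actual implementation may track markers differently (e.g., with optical flow).

```python
import cv2
import numpy as np

def detect_markers(gray: np.ndarray) -> np.ndarray:
    """Detect the black dots printed on the gel; returns their (x, y) centers."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 0               # dark markers on a brighter gel image
    params.filterByArea = True
    params.minArea, params.maxArea = 10, 400
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32).reshape(-1, 2)

def marker_displacements(reference_gray, contact_gray, max_match_px=15.0):
    """Match each reference marker to its nearest marker in the contact frame.
    The mean displacement direction/length approximates the shear force
    direction and magnitude."""
    ref_pts = detect_markers(reference_gray)
    cur_pts = detect_markers(contact_gray)
    if len(ref_pts) == 0 or len(cur_pts) == 0:
        return np.zeros((0, 2), dtype=np.float32)

    vectors = []
    for p in ref_pts:
        dists = np.linalg.norm(cur_pts - p, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < max_match_px:    # reject spurious matches
            vectors.append(cur_pts[j] - p)
    return np.array(vectors, dtype=np.float32).reshape(-1, 2)
```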

Applications and Impact
Tactile force feedback gives robots a deeper understanding of objects’ physical properties, such as hardness, texture, and slip. This information directly informs movement decisions and enables more effective interaction with the environment. In settings where the robot’s range of motion is limited, tactile feedback can significantly improve accuracy and precision, leading to more robust manipulation.
The extraction of force information purely from tactile images addresses limitations of traditional force/torque sensors, which typically involve fragile wiring, high costs, and integration challenges. Vision-based tactile sensing represents an important step toward the long-standing goal of enhancing robot capabilities in in-hand manipulation tasks.
Implementation and Code
Detailed explanations and implementation code for all techniques described above are available in this GitHub repository.
Related Publications
Visual Tactile Sensor Based Force Estimation for Position-Force Teleoperation
@INPROCEEDINGS{10115342,
  author={Zhu, Yaonan and Nazirjonov, Shukrullo and Jiang, Bingheng and Colan, Jacinto and Aoyama, Tadayoshi and Hasegawa, Yasuhisa and Belousov, Boris and Hansel, Kay and Peters, Jan},
  booktitle={2022 IEEE International Conference on Cyborg and Bionic Systems (CBS)},
  title={Visual Tactile Sensor Based Force Estimation for Position-Force Teleoperation},
  year={2023},
  pages={49-52},
  keywords={Performance evaluation;Visualization;Parallel robots;Force;Force feedback;Tactile sensors;Estimation},
  doi={10.1109/CBS55922.2023.10115342}}