The sense of touch is an essential ability for skillfully performing a variety of tasks, as it allows objects to be searched for and manipulated without relying on visual information. Extensive research has been conducted over the years to endow robots with these human tactile abilities. In this paper, we introduce a multi-finger robot system designed to search for and manipulate objects using the sense of touch alone, without visual information. Randomly placed target objects are located using tactile sensors, and the objects are then manipulated in tasks that mimic daily life. The objective of this study is to endow robots with human-like tactile capabilities. To achieve this, binary tactile sensors are mounted on one side of the robot hand to minimize the Sim2Real gap. By training the policy with reinforcement learning in simulation and transferring the trained policy to the real environment, we demonstrate that object search and manipulation using tactile sensors is possible even in an environment without vision information. In addition, an ablation study is conducted to analyze the effect of tactile information on manipulation tasks.
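As a minimal sketch of the binary-tactile idea described above: in simulation, per-sensor contact-force magnitudes can be thresholded into 0/1 touch signals so that the observation has the same form as the output of real binary sensors. The function name, threshold value, and array shapes below are illustrative assumptions, not details taken from the paper.

import numpy as np

# Illustrative threshold in newtons; the paper's actual value is not assumed here.
CONTACT_THRESHOLD = 1.0

def binarize_tactile(contact_forces: np.ndarray,
                     threshold: float = CONTACT_THRESHOLD) -> np.ndarray:
    """Convert simulated contact-force magnitudes into binary touch signals.

    Keeping the tactile observation binary in both simulation and the real
    robot makes the observation space identical across domains, which is the
    intuition behind using binary sensors to reduce the Sim2Real gap.
    """
    return (contact_forces >= threshold).astype(np.float32)

# Example: 16 sensor readings (N) from one simulation step.
forces = np.array([0.0, 0.2, 1.5, 3.1] + [0.0] * 12)
touch_obs = binarize_tactile(forces)  # 16-dim vector of 0.0 / 1.0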
We propose DexTouch, a novel dexterous manipulation robotic system that performs three types of daily-life tasks using only tactile information. On the left, we show our manipulation tasks in simulation. The robotic system, consisting of a UR5e robotic arm and an Allegro Hand with 16 attached touch sensors, is studied in both simulation and real environments. The trained policy can be deployed directly on the real robotic arm-hand system and successfully performs the tasks.
In the experiments, our method is compared mainly with the following baselines. WO-Sensor (Without-Sensor): the policy is trained without any tactile information from the robot. LQ-Sensor (Low Quality-Sensor): the touch-detection threshold of the tactile sensors is set to 0.3 N. DA-Sensor (Deactivation-Sensor): the tactile information is disabled during evaluation. A sketch of how these variants might modify the tactile observation is shown below.
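The following sketch illustrates one way the three baseline variants could be realized as modifications of the tactile part of the observation. The function name, variant labels, and the nominal threshold used for the full method are illustrative assumptions; only the 0.3 N value for LQ-Sensor and the behavior of each variant come from the text above.

import numpy as np

def tactile_observation(contact_forces: np.ndarray,
                        variant: str,
                        training: bool) -> np.ndarray:
    """Build the tactile part of the observation for each baseline variant.

    'ours'      : binary touch signals using the nominal threshold
    'wo_sensor' : no tactile channel at all (policy trained without touch)
    'lq_sensor' : touch-detection threshold set to 0.3 N
    'da_sensor' : tactile information zeroed out during evaluation
    """
    if variant == "wo_sensor":
        # Tactile input removed entirely from the observation.
        return np.zeros(0, dtype=np.float32)
    if variant == "lq_sensor":
        return (contact_forces >= 0.3).astype(np.float32)
    # Nominal threshold below is a placeholder; the paper's value is not assumed.
    touch = (contact_forces >= 1.0).astype(np.float32)
    if variant == "da_sensor" and not training:
        touch[:] = 0.0  # sensors deactivated at test time
    return touch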