
Fingertip Tactile Sensors

Mar. 2018 – Present

Collaborators: Tapomayukh Bhattacharjee, Shaoxiong Wang, Branden Romero, Connor Geiman


 

Publication

[C-1] H. Song, T. Bhattacharjee, S. S. Srinivasa, “Sensing Shear Forces During Food Manipulation: Resolving the Trade-Off Between Range and Sensitivity,” 2019 IEEE International Conference on Robotics and Automation. (Accepted) [pdf][video]


Award

T. Bhattacharjee, D. Gallenberger, D. Dubois, L. L'Écuyer-Lapiere, Y. Kim, A. Vamsikrishna, R. Scalise, R. Qu, H. Song, E. Gordon, and S. S. Srinivasa, "Autonomous robot feeding for upper-extremity mobility impaired people: Integrating sensing, perception, learning, motion planning, and robot control", NeurIPS, Montreal, Canada, 2018. Best Demo Award.


Demo Video

Demo video presenting the robotic feeding system equipped with fingertip tactile sensors


Media

  1. Sarah McQuate, “How to train your robot (to feed you dinner),” Online article, UW News, 11 March 2019. [link]


Motivation

When a robotic arm manipulates food with human tools such as a fork, haptic feedback is essential for two reasons: the view is occluded when contact is imminent, and food is deformable. However, mounting a force/torque (F/T) sensor on the tool is expensive and restricts the manipulator’s motion because of the wire connected to the sensor. This problem motivated me to estimate the 6-DOF force/torque exerted on the tool not from an F/T sensor, but from fingertip tactile sensors.

Another motivation is that tactile sensing technology for robotics is not yet mature. As Mason (2018) observes, the most critical need for high-resolution data is when contact is imminent, at which point occlusion is inevitable; tactile sensing could mitigate that shortcoming to some extent, but neither the devices nor the methods are ready to fill the gap.

Mason, Matthew T. 2018. “Toward Robotic Manipulation.” Annual Review of Control, Robotics, and Autonomous Systems 1 (1): 1–28.


Purpose

The purpose of this project is 1) to develop and improve haptic devices and 2) to find methods for utilizing the high-resolution spatial and force data these devices generate.


Approach

The current system is composed of the ADA and the Forque

View from the wrist camera

Forque

The current robotic feeding system uses the Assistive Dexterous Arm (ADA) and the Forque. The ADA consists of a Kinova JACO arm, equipped with a wireless video transmitter and an RGBD camera, mounted on a wheelchair. The main issues with this system come from the Forque: the wire connecting its F/T sensor restricts the manipulator’s motion, and equipping every utensil with an F/T sensor would be extremely costly. For now, a human keeps the wire out of the camera’s view and the manipulator’s path. Instead of putting an F/T sensor on the fork, I added fingertip tactile sensors to the gripper to gather similar data. I selected the Fingertip GelSight sensor and the FingerVision sensor because they can track shear force and cost less than 100 dollars each, whereas the Nano25 F/T sensor costs about 7,000 dollars.

Design and integration of the FingerVision sensors on the Kinova 2-finger gripper

Design and integration of the Fingertip GelSight sensor on the Kinova 2-finger gripper

I integrated these two tactile sensors with the Kinova gripper by customizing their design and fabrication process and modifying their software. While calibrating the sensors, I found a trade-off between the sensitivity and the sensing range of the shear-force signal that depends on the gripping force. I then created a control policy that exploits this trade-off: the bite-acquisition phase requires a wide sensing range, while the bite-prediction phase, which predicts whether acquisition will succeed, requires high sensitivity.
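The sketch below is only meant to illustrate the idea of switching the gripping force per manipulation phase to trade sensing range against sensitivity; it is not the controller from [C-1]. The numbers, the linear displacement-to-force model, and the helper set_gripping_force() are illustrative assumptions.

```python
# Hypothetical calibration per gripping-force setting:
#   n_per_px  - shear gain [N per pixel of marker displacement]
#   sat_px    - marker displacement at which the signal saturates
# A smaller gain resolves smaller forces (higher sensitivity); a larger
# saturation displacement covers larger forces (wider range).
CALIBRATION = {
    "high_range":       {"grip_cmd": 0.8, "n_per_px": 0.20, "sat_px": 60.0},
    "high_sensitivity": {"grip_cmd": 0.3, "n_per_px": 0.05, "sat_px": 20.0},
}

def set_gripping_force(cmd):
    """Placeholder for the real gripper interface (e.g., a Kinova gripper command)."""
    print(f"gripping force command: {cmd:.2f}")

def shear_from_displacement(disp_px, cal):
    """Linear calibration: clip the marker displacement at saturation, then scale to newtons."""
    return min(abs(disp_px), cal["sat_px"]) * cal["n_per_px"]

def configure_for_phase(phase):
    """Bite acquisition needs a wide sensing range; bite prediction needs fine sensitivity."""
    key = "high_range" if phase == "acquisition" else "high_sensitivity"
    cal = CALIBRATION[key]
    set_gripping_force(cal["grip_cmd"])
    return cal

# Example: switch the calibration with the manipulation phase.
cal = configure_for_phase("acquisition")
print(shear_from_displacement(35.0, cal))   # shear estimate in newtons
```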

 
 
GelSight (left) and FingerVision (right) sensors on Kinova 2-finger gripper holding a non-instrumented fork

This video shows the shear-force calibration, some example force trajectories, and the control policy of the tactile sensors.

More details of the results can be found in the paper [C-1].

 
 

Future Work

There are three main issues with the tactile sensors: 1) hysteresis, 2) precision, and 3) size.

1) Hysteresis: Hysteresis originates from residual forces in the elastomer and from defects in the marker-tracking algorithm. I am working closely with Shaoxiong at MIT to improve it.

2) Precision: The transparency of the gel is beneficial when proximity vision is used, but it reduces precision when shear force is sensed through marker tracking (a minimal sketch of marker-based shear sensing follows this list). For this application, I am trying to cover the surface of the gel to increase the sensor’s precision.

3) Size: To grip the tool stably, I need to put mechanical references on either the tools or the fingers. However, having mechanical references on the fingers, as shown in the demo video, interferes with grasping other objects. I am therefore trying to put the mechanical references on the tools and to design slimmer fingertip tactile sensors instead.
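For context on item 2, here is a minimal, hypothetical sketch of marker-based shear sensing of the kind used by GelSight/FingerVision-style sensors: dark markers on the gel are detected in each camera frame, and their mean displacement from a no-load reference frame serves as the shear signal. The blob-detector parameters and the simple nearest-neighbor matching are illustrative assumptions, not the sensors’ actual pipelines.

```python
import cv2
import numpy as np

def detect_markers(gray):
    """Detect the dark gel markers as blobs; returns an (N, 2) array of pixel centers."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 10          # illustrative: tune to the marker size in pixels
    params.maxArea = 400
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)

def mean_marker_displacement(ref_pts, cur_pts):
    """Match each current marker to its nearest reference marker and return
    the mean 2-D displacement (a proxy for tangential/shear loading)."""
    if len(ref_pts) == 0 or len(cur_pts) == 0:
        return np.zeros(2, dtype=np.float32)
    # Brute-force nearest-neighbor matching; adequate for ~100 markers.
    dists = np.linalg.norm(cur_pts[:, None, :] - ref_pts[None, :, :], axis=2)
    nearest = ref_pts[dists.argmin(axis=1)]
    return (cur_pts - nearest).mean(axis=0)

# Usage: capture a no-load reference frame once, then track displacement per frame.
# ref = detect_markers(cv2.cvtColor(ref_frame, cv2.COLOR_BGR2GRAY))
# cur = detect_markers(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
# dx, dy = mean_marker_displacement(ref, cur)
```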