T. Bhattacharjee, D. Gallenberger, D. Dubois, L. L'Écuyer-Lapiere, Y. Kim, A. Vamsikrishna, R. Scalise, R. Qu, H. Song, E. Gordon, and S. S. Srinivasa, "Autonomous robot feeding for upper-extremity mobility impaired people: Integrating sensing, perception, learning, motion planning, and robot control", NeurIPS, Montreal, Canada, 2018. Best Demo Award.
Sarah McQuate, “How to train your robot (to feed you dinner),” Online article, UW News, 11 March 2019. [link]
When a robotic arm manipulates food with human tools such as a fork, haptic feedback is essential for two reasons: visual occlusion when contact is imminent, and the deformability of food. However, mounting a force/torque (F/T) sensor on the tool is expensive and restricts the manipulator's motion because of the sensor's wiring. This motivated me to estimate the 6-DOF force/torque exerted on the tool not from an F/T sensor, but from the fingertip tactile sensors.
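The mapping from fingertip tactile readings to the 6-DOF wrench at the tool is not specified here; as a minimal sketch, one simple baseline is to calibrate a linear least-squares regressor against ground-truth F/T labels. All names, shapes, and the synthetic data below are assumptions for illustration, not the actual sensor layout or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: 500 calibration samples of 2 fingertips x 16 taxels each,
# paired with ground-truth 6-DOF force/torque labels from an F/T sensor.
n_samples, n_taxels, n_wrench = 500, 32, 6
true_W = rng.normal(size=(n_taxels, n_wrench))    # hidden linear map (synthetic)
tactile = rng.normal(size=(n_samples, n_taxels))  # fingertip taxel readings
wrench = tactile @ true_W + 0.01 * rng.normal(size=(n_samples, n_wrench))

# Fit: least-squares estimate of the tactile-to-wrench map.
W_hat, *_ = np.linalg.lstsq(tactile, wrench, rcond=None)

# Predict the wrench for a new tactile frame: [Fx, Fy, Fz, Tx, Ty, Tz].
new_frame = rng.normal(size=(1, n_taxels))
pred = new_frame @ W_hat
err = np.abs(W_hat - true_W).max()
```

In practice the tactile-to-wrench relation is nonlinear (contact geometry, food deformability), so a learned nonlinear model would replace the linear fit; the calibration-against-F/T-labels structure stays the same.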
Another motivation is that tactile sensing technology has not yet matured enough for practical use in robotics.
The most critical need for high-resolution data is when contact is imminent, at which point occlusion is inevitable. That shortcoming would be mitigated to some extent by tactile sensing, but neither the devices nor the methods are ready to fill the gap.
M. T. Mason, "Toward Robotic Manipulation," Annual Review of Control, Robotics, and Autonomous Systems, vol. 1, no. 1, pp. 1–28, 2018.