Multi-purpose tactile perception based on deep learning in a new tendon-driven optical tactile sensor

Abstract

In this paper, we present MechTac, a new tendon-driven multi-functional optical tactile sensor that perceives objects within its field of view (TacTip) and localizes touch points in the visually blind area (TacSide). In a multi-point touch task, the signals from the TacSide and the TacTip overlap and jointly affect the distribution of papillae pins on the TacTip. Because the effects of the TacSide are far less pronounced than those on the TacTip, we design a perceiving out-of-view neural network (O$^2$VNet) to separate this mixed information with unequal influence. To reduce the dependence of the O$^2$VNet on the grayscale information of the image, we introduce a new binarized convolutional (BConv) layer in front of the backbone of the O$^2$VNet. The O$^2$VNet not only achieves real-time temporal sequence prediction (34 ms per image) but also attains an average classification accuracy of 99.06%. Experimental results show that the O$^2$VNet maintains high classification accuracy even under image contrast changes.
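
To illustrate the idea of placing a binarized convolution ahead of a CNN backbone, the sketch below shows one possible PyTorch realization; it is not the authors' implementation, and the class names (`BConv`, `O2VNetSketch`), layer sizes, and the straight-through-estimator binarization are assumptions made for the example only.

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator in the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients only where the input lies in [-1, 1].
        return grad_output * (x.abs() <= 1).float()


class BConv(nn.Module):
    """Hypothetical binarized convolution: the grayscale input is centered and
    binarized before convolving, so downstream layers see contrast-invariant
    binary pin patterns rather than raw intensity values."""

    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)

    def forward(self, x):
        x = BinarizeSTE.apply(x - x.mean(dim=(2, 3), keepdim=True))
        return self.conv(x)


class O2VNetSketch(nn.Module):
    """Illustrative network: a BConv stem in front of a small placeholder backbone."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.stem = BConv(1, 16)  # binarized stem reduces grayscale dependence
        self.backbone = nn.Sequential(  # placeholder backbone, not the paper's
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.head(self.backbone(self.stem(x)))


# Example: one 256x256 grayscale tactile frame (size chosen for illustration).
logits = O2VNetSketch()(torch.rand(1, 1, 256, 256))
```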