Online Social Touch Pattern Recognition with Multi-modal-sensing Modular Tactile Interface

The capability to recognize various social touch patterns is necessary for robots engaged in touch-based social interaction, which is effective in many robot applications. Prior literature has focused on the novelty of the recognition system or on improving classification accuracy on publicly available datasets. In this paper, we propose an integrated framework for implementing a social touch recognition system on various robots, built on three complementary principles: 1) multi-modal tactile sensing, 2) a modular design, and 3) a social touch pattern classifier capable of learning temporal features. The approach is evaluated on an implemented Multi-modal-sensing Modular Tactile Interface prototype; for the classifiers, three learning methods (HMM, LSTM, and 3D-CNN) were tested. The trained classifiers, which can run online on a robot's embedded system, predict 18 classes of social touch patterns. Results of the online validation test show that all three methods are promising, with a best accuracy of 88.86%. In particular, the stable performance of the 3D-CNN indicates that learning 'spatiotemporal' features from tactile data is more effective. Through this validation process, we confirmed that our framework can be easily adopted and delivers robust performance for social touch pattern recognition.
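To illustrate the abstract's point that a 3D-CNN captures 'spatiotemporal' features — its kernel spans time as well as the taxel grid — here is a minimal pure-Python sketch of a single 3D-convolution step over a sequence of tactile pressure frames. This is not the authors' implementation: the frame size, kernel, and toy "stroke" data are assumptions for illustration only.

```python
# Hypothetical sketch (not the paper's code): one valid-mode 3D convolution
# over a T x H x W sequence of tactile pressure frames. Because the kernel
# extends along the time axis, a single filter can respond to motion
# (e.g. a stroke moving across the taxel grid), which a per-frame 2D
# convolution cannot do.

def conv3d(frames, kernel):
    """Valid-mode 3D convolution of a T x H x W volume (lists of lists)."""
    T, H, W = len(frames), len(frames[0]), len(frames[0][0])
    kt, kh, kw = len(kernel), len(kernel[0]), len(kernel[0][0])
    out = []
    for t in range(T - kt + 1):
        plane = []
        for i in range(H - kh + 1):
            row = []
            for j in range(W - kw + 1):
                s = 0.0
                for dt in range(kt):
                    for di in range(kh):
                        for dj in range(kw):
                            s += frames[t + dt][i + di][j + dj] * kernel[dt][di][dj]
                row.append(s)
            plane.append(row)
        out.append(plane)
    return out

# Toy data: 3 frames of a 3x3 taxel grid in which a pressed column
# ("stroke") moves left to right, one column per frame.
frames = [
    [[1 if c == t else 0 for c in range(3)] for _ in range(3)]
    for t in range(3)
]

# A 2x1x2 kernel that fires when pressure at column j in frame t is
# followed by pressure at column j+1 in frame t+1, i.e. rightward motion.
kernel = [[[1.0, 0.0]], [[0.0, 1.0]]]

features = conv3d(frames, kernel)  # shape 2 x 3 x 2
```

In the toy output, the filter response peaks (2.0) exactly where the stroke advances one column between consecutive frames and is zero elsewhere — the kind of motion cue a real 3D-CNN would learn, rather than hand-design, from labeled touch sequences.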
Publisher
IEEE
Issue Date
2019-06
Language
English
Citation
16th International Conference on Ubiquitous Robots (UR), pp.271 - 277
ISSN
2325-033X
DOI
10.1109/URAI.2019.8768706
URI
http://hdl.handle.net/10203/274978
Appears in Collection
RIMS Conference Papers