EVENT-DRIVEN VISUAL-TACTILE SENSING AND LEARNING FOR ROBOTS

NATIONAL UNIVERSITY OF SINGAPORE

Tasbolat Taunyazov, Weicong Sng, Hian Hian See, Brian Lim, Jethro Kuan, Abdul Fatir Ansari, Benjamin C. K. Tee, Harold Soh

ABSTRACT

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using NeuTouch and a Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.
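The abstract describes a single spiking network that fuses two event streams: tactile spikes from NeuTouch and visual spikes from the event camera. The sketch below illustrates that fusion pattern in plain PyTorch, using leaky integrate-and-fire (LIF) neurons with a surrogate gradient. It is a minimal sketch only: the paper's own VT-SNN is built with a dedicated spike-learning framework (SLAYER), so the class names, layer sizes, time constants, and input shapes here are illustrative assumptions rather than the authors' configuration.

# Minimal sketch of multi-modal spiking fusion in the spirit of VT-SNN.
# All dimensions and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * v.abs()) ** 2


class LIFLayer(nn.Module):
    """Linear projection followed by LIF dynamics unrolled over time."""

    def __init__(self, in_features, out_features, decay=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay, self.threshold = decay, threshold

    def forward(self, x):  # x: (batch, time, in_features) binary spike trains
        batch, steps, _ = x.shape
        v = x.new_zeros(batch, self.fc.out_features)  # membrane potential
        spikes = []
        for t in range(steps):
            v = self.decay * v + self.fc(x[:, t])      # leaky integration
            s = SurrogateSpike.apply(v - self.threshold)
            v = v - s * self.threshold                  # soft reset on spike
            spikes.append(s)
        return torch.stack(spikes, dim=1)


class VTSNNSketch(nn.Module):
    """Encode tactile and visual spike trains separately, fuse by concatenation,
    and classify from output spike counts."""

    def __init__(self, tact_dim=78, vis_dim=1024, hidden=64, classes=20):
        # tact_dim, vis_dim, hidden, classes are all assumed, not the paper's values.
        super().__init__()
        self.tact_enc = LIFLayer(tact_dim, hidden)
        self.vis_enc = LIFLayer(vis_dim, hidden)
        self.head = LIFLayer(2 * hidden, classes)

    def forward(self, tact, vis):
        fused = torch.cat([self.tact_enc(tact), self.vis_enc(vis)], dim=-1)
        return self.head(fused).sum(dim=1)  # spike count per class


if __name__ == "__main__":
    model = VTSNNSketch()
    tact = (torch.rand(4, 100, 78) < 0.05).float()    # toy tactile spike trains
    vis = (torch.rand(4, 100, 1024) < 0.05).float()   # toy visual spike trains
    print(model(tact, vis).shape)                     # torch.Size([4, 20])

Training such a model would follow an ordinary PyTorch loop, for example applying a cross-entropy loss to the per-class spike counts the model returns; the surrogate gradient is what makes the spike nonlinearity differentiable end to end.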

Source: arXiv
