This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our biologically-inspired fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using NeuTouch and the Prophesee event camera) on two robot tasks: container classification and rotational slip detection. For more details, please visit our official webpage.
This project requires a fork of the SLAYER framework to train Spiking Neural Networks (SNNs), which we have included here as a git submodule. To obtain the full set of dependencies, clone this repository recursively:
git clone https://github.com/clear-nus/VT_SNN/ --recursive
The requirements for this project that can be installed from PyPI are found in
requirements.txt. To install the requirements, run:
pip install -r requirements.txt
Next, install the SLAYER fork (the slayerPytorch submodule) needed to train the SNN models:
cd slayerPytorch
python setup.py install
This repository has been tested with the declared set of dependencies on Python 3.6.10.
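For a quick sanity check that the dependencies are importable, you can run a short Python snippet like the one below. The module name slayerSNN for the SLAYER fork is an assumption; adjust it if your installation exposes a different name.
# Sanity check: confirm PyTorch and the SLAYER fork can be imported.
# The module name slayerSNN is assumed; adjust if your install differs.
import torch
import slayerSNN
print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())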
The datasets are hosted on Google Drive.
We also provide helper scripts for headless fetching of the required data. For the slip detection data, run:
./fetch_slip.sh
Data preprocessed with the parameters specified in the paper can also be downloaded:
./fetch_slip.sh preprocess
We provide the scripts for preprocessing the raw event data, and training the
models in the vtsnn folder. We provide code for the 3 models presented in our
paper:
- VT-SNN (Using SLAYER)
- ANN (MLP-GRU) (an illustrative sketch follows this list)
- CNN3D
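As a rough illustration of the ANN (MLP-GRU) baseline listed above (an MLP encoder per modality, a GRU over time, and a linear classifier), here is a minimal PyTorch sketch. All dimensions, names, and fusion details are placeholders for illustration only and do not reproduce the repository's implementation; see the vtsnn folder for the actual models.
# Illustrative sketch only: not the repository's MLP-GRU implementation.
import torch
import torch.nn as nn

class MlpGruSketch(nn.Module):
    def __init__(self, tact_dim=100, vis_dim=500, hidden=64, num_classes=20):
        super().__init__()
        # One small MLP encoder per modality (sizes are placeholders).
        self.tact_enc = nn.Sequential(nn.Linear(tact_dim, hidden), nn.ReLU())
        self.vis_enc = nn.Sequential(nn.Linear(vis_dim, hidden), nn.ReLU())
        # GRU over the concatenated per-timestep features.
        self.gru = nn.GRU(2 * hidden, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, tact, vis):
        # tact: (batch, time, tact_dim); vis: (batch, time, vis_dim)
        fused = torch.cat([self.tact_enc(tact), self.vis_enc(vis)], dim=-1)
        _, h_n = self.gru(fused)          # h_n: (1, batch, hidden)
        return self.classifier(h_n[-1])   # class logits

model = MlpGruSketch()
logits = model(torch.randn(2, 50, 100), torch.randn(2, 50, 500))
print(logits.shape)  # torch.Size([2, 20])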
The repository has been carefully crafted to use guild.ai to track experiment runs, and its use is encouraged. However, each script also includes instructions for running it with both guild and vanilla Python.
To see all possible operations, run:
guild operations
For example, to run our VT-SNN tactile-only model on the Container-Weight classification task, run:
guild run vtsnn:train-cw mode=tact data_dir=/path/to/data
See the vtsnn/train_*.py files for instructions on running them with vanilla Python.
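Since experiments are tracked with guild.ai, its standard commands can be used to inspect and compare finished runs, for example:
guild runs
guild compare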
To cite this work, please use:
@inproceedings{taunyazov20event,
  title     = {Event-Driven Visual-Tactile Sensing and Learning for Robots},
  author    = {Tasbolat Taunyazov and Weicong Sng and Hian Hian See and Brian Lim and Jethro Kuan and Abdul Fatir Ansari and Benjamin Tee and Harold Soh},
  booktitle = {Proceedings of Robotics: Science and Systems},
  year      = {2020},
  month     = {July}
}
If your scripts cannot find the vtsnn module, run the following from the repository root:
export PYTHONPATH=.
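Alternatively, you can set the path for a single invocation instead of exporting it, for example by prefixing the command shown earlier:
PYTHONPATH=. guild run vtsnn:train-cw mode=tact data_dir=/path/to/data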
