Code and README taken from the tflite-micro project (see the magic_wand example) and adapted to work with custom data captured with an ESP32 and an MPU6050.
The scripts in this directory can be used to train a TensorFlow model that classifies gestures based on accelerometer data. The code uses Python 3.7 and TensorFlow 2.0. The resulting model is less than 20KB in size.
This document contains instructions on using the scripts to train a model and on capturing your own training data.
This project was inspired by the Gesture Recognition Magic Wand project by Jennifer Wang.
The same three magic gestures as in the original example were used, and data from one person was collected. In addition, some negative data was captured.
The sample dataset is included in this repository and can be downloaded at the following link: https://github.com/stefan-spiss/MagicWand-TFLite-ESP32-MPU6050/data/data.zip
A Google Colaboratory notebook demonstrates how to train the model; it's the easiest way to get started.
If you'd prefer to run the scripts locally, use the following instructions.
Use the following command to install the required dependencies:
```
pip install -r requirements.txt
```
There are two ways to train the model:
- Random data split, which mixes different people's data together and randomly splits them into training, validation, and test sets
- Person data split, which splits the data by person
Using a random split results in higher training accuracy than a person split, but inferior performance on new data.
```
$ python data_prepare.py
$ python data_split.py
$ python train.py --model CNN --person false
```
Using a person data split results in lower training accuracy but better performance on new data.
```
$ python data_prepare.py
$ python data_split_person.py
$ python train.py --model CNN --person true
```
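To make the difference between the two strategies concrete, they can be sketched roughly as follows. This is a simplified, hypothetical illustration rather than the actual logic of `data_split.py` or `data_split_person.py`; the sample structure (a list of dicts with a "name" field) and the second person name are assumptions.

```python
import random

# Hypothetical samples: each recording is tagged with the person who made it.
samples = [
    {"name": "stefan", "gesture": "wing"},
    {"name": "stefan", "gesture": "ring"},
    {"name": "other_person", "gesture": "slope"},     # placeholder second person
    {"name": "other_person", "gesture": "negative"},
]

def random_split(data, train_frac=0.6, valid_frac=0.2):
    """Mix everyone's data together, then cut into train/validation/test."""
    shuffled = data[:]
    random.shuffle(shuffled)
    n = len(shuffled)
    i = int(n * train_frac)
    j = int(n * (train_frac + valid_frac))
    return shuffled[:i], shuffled[i:j], shuffled[j:]

def person_split(data, train_names, valid_names, test_names):
    """Assign whole people to one set, so the test set only contains unseen people."""
    train = [s for s in data if s["name"] in train_names]
    valid = [s for s in data if s["name"] in valid_names]
    test = [s for s in data if s["name"] in test_names]
    return train, valid, test
```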
In the `--model` argument, you can provide `CNN` or `LSTM`. The CNN model has a smaller size and lower latency.
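For example, to train the LSTM variant instead, the commands shown above can be reused with only the model flag changed (shown here for the person data split):

```
$ python train.py --model LSTM --person true
```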
To obtain new training data, use the Arduino script in the gesture_capture folder in the root of this repository (see readme.md in the root folder).
Edit the following files to include your new gesture names (replacing "wing", "ring", and "slope"); a sketch of this kind of edit follows the list:

- `data_load.py`
- `data_prepare.py`
- `data_split.py`
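The snippet below is only a hedged illustration of what such an edit might look like; the variable names `folders` and `label2id` are assumptions and may not match the actual code in these files, so check each script for where the gesture strings appear.

```python
# Hypothetical excerpt (variable names assumed, not taken from the repo):
# replace the old gesture names with your new ones wherever they appear.
folders = ["wing", "ring", "slope"]                            # gesture classes with recorded data
label2id = {"wing": 0, "ring": 1, "slope": 2, "negative": 3}   # label-to-index mapping
```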
Edit the following files to include your new person names (replacing "stefan" or adding new persons); again, a sketch follows the list:

- `data_prepare.py`
- `data_split_person.py`
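As before, this is only a hedged sketch; the variable name `names` is an assumption, and the real lists in the scripts may be structured differently.

```python
# Hypothetical excerpt (variable name assumed): list the people who recorded
# data so the scripts can tag and split the recordings by person.
names = ["stefan", "new_person"]   # "new_person" is a placeholder
```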
Finally, run the commands described earlier to train a new model.