- Gesture Recognition Magic Wand Training Scripts
- The scripts in this directory train a TensorFlow model that classifies gestures from accelerometer data. The code targets TensorFlow 2.0, and the resulting model is under 20 KB. This project was inspired by Jennifer Wang's Gesture Recognition Magic Wand project.
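To make the "small model under 20 KB" claim concrete, here is a minimal sketch of a gesture classifier in the spirit of this project: a tiny 1-D CNN over windows of 3-axis accelerometer data. The layer sizes, window length, and class count are illustrative assumptions, not the exact architecture used by the training scripts.

```python
import numpy as np
import tensorflow as tf

SEQ_LEN = 128      # samples per gesture window (assumed)
NUM_CLASSES = 4    # e.g. wing, ring, slope, negative (assumed)

# A deliberately small network: two Conv1D blocks plus a dense head,
# which stays well under the ~20 KB budget mentioned above.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, 3)),
    tf.keras.layers.Conv1D(8, 4, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling1D(3),
    tf.keras.layers.Conv1D(16, 4, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One forward pass on dummy data to confirm the shapes line up.
probs = model(np.zeros((1, SEQ_LEN, 3), dtype=np.float32))
```

A network of this size has on the order of a thousand parameters, which is what makes the sub-20 KB TFLite output achievable after quantization.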
- If you haven't installed NuEdgeWise, please follow these steps to set up the Python virtual environment and choose `NuEdgeWise_env`. Skip this step if you have already done so.
- The `magic_wand_start.ipynb` notebook will help you prepare the data, train the model, and finally convert it to TFLite and C++ files.
- The dataset consists of accelerometer readings in three dimensions: x, y, and z, collected from various gestures.
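A raw recording is a stream of x/y/z readings, so the usual first step is to slice it into fixed-length windows for training. The sketch below shows one way to do that; the window length and stride are assumptions for illustration, and the notebook handles the real preprocessing.

```python
import numpy as np

def make_windows(samples, window=128, stride=64):
    """Slice a (N, 3) array of x, y, z readings into overlapping windows."""
    windows = [samples[i:i + window]
               for i in range(0, len(samples) - window + 1, stride)]
    return np.stack(windows) if windows else np.empty((0, window, 3))

stream = np.random.randn(400, 3)  # fake x/y/z accelerometer stream
batch = make_windows(stream)      # shape (5, 128, 3)
```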
- Users can collect their own data by running the data-collection firmware (`tflu_magicwand_sensor_collect`) on the M467 EVB; `magic_wand_start.ipynb` will assist you in collecting and preparing the data.
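When pulling readings off the board, each logged line has to be parsed into numeric x/y/z values. The comma-separated format below is an assumption for illustration; check the actual output format of the `tflu_magicwand_sensor_collect` firmware.

```python
def parse_line(line):
    """Parse one assumed 'x,y,z' CSV line into a tuple of floats."""
    x, y, z = (float(v) for v in line.strip().split(","))
    return x, y, z

sample = parse_line("0.12,-0.98,9.81")
```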
- Use `magic_wand_start.ipynb` to train the model locally.
- Use `magic_wand_colab.ipynb` to upload the dataset and train on Google Colab.
- Use `magic_wand_start.ipynb` to convert the trained model to TFLite and C++ files.
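The "C++ file" part of the conversion step typically means embedding the quantized `.tflite` bytes as a C array that the firmware compiles in. A minimal sketch of that packaging step is below; the array name and formatting are assumptions, and the notebook's own converter may differ.

```python
def tflite_to_c_array(model_bytes, name="g_magic_wand_model_data"):
    """Render raw .tflite bytes as a C source snippet (xxd-style)."""
    body = ",".join(f"0x{b:02x}" for b in model_bytes)
    return (f"const unsigned char {name}[] = {{{body}}};\n"
            f"const unsigned int {name}_len = {len(model_bytes)};\n")

# Tiny dummy payload standing in for a real quantized model file.
source = tflite_to_c_array(b"\x1c\x00\x00\x00TFL3")
```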
- If your device supports an Arm NPU, such as the Ethos-U55, please use `vela` to convert the `*quantized.tflite` model.
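A typical Vela invocation looks like the sketch below. The accelerator configuration and output directory are assumptions; check `vela --help` for the options matching your installed version and NPU configuration.

```shell
# Compile the quantized model for an Ethos-U55 NPU (assumed config).
vela magic_wand_quantized.tflite \
    --accelerator-config ethos-u55-256 \
    --output-dir ./vela_out
```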
- The ML_SampleCode repositories are private; please contact Nuvoton to request access to the sample code. Link
- ML_M460_SampleCode (private repo)
  - `tflu_magicwand`: real-time inference firmware
  - `tflu_magicwand_sensor_collect`: data collection firmware