TensorFlow Lite Micro

I started by running the pre-trained micro_speech inference example, which uses a neural network on the Arduino board to recognize the simple voice commands “yes” and “no”. For this example I used an Arduino Nano 33 BLE Sense, which supports TensorFlow Lite Micro. I have worked with other microcontrollers, but I found the Nano 33 BLE Sense very impressive for training models and running them directly on the board, thanks to its variety of onboard sensors: microphone, motion, environmental and light.

  • micro_speech – speech recognition using the onboard microphone

In this example, I’m using TensorFlow Lite Micro to recognize voice keywords, and the Arduino board flashes its LED either green or red depending on the result. The model has a simple vocabulary of “yes” and “no.”
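The decision logic behind the LED colours can be sketched in plain Python. This is not the Arduino C++ sketch itself, just the idea: the model emits one score per label (the micro_speech example uses four labels), the highest score is compared against a detection threshold, and the matching keyword picks the colour. The label list and threshold value here are assumptions for illustration.

```python
# Hypothetical sketch of the keyword-to-LED mapping used on the board.
# Label set and threshold are assumed, not taken from the real sketch.
LABELS = ["silence", "unknown", "yes", "no"]
THRESHOLD = 0.8  # assumed minimum score for a confident detection

def led_for_scores(scores):
    """Return the LED colour for one set of per-label model scores."""
    best = max(range(len(LABELS)), key=lambda i: scores[i])
    if scores[best] < THRESHOLD:
        return "off"       # no confident detection this window
    label = LABELS[best]
    if label == "yes":
        return "green"     # "yes" flashes the LED green
    if label == "no":
        return "red"       # "no" flashes the LED red
    return "off"           # silence / unknown: leave the LED off

print(led_for_scores([0.05, 0.05, 0.85, 0.05]))  # green
```

On the board the same comparison happens in the example’s command-responder code after each inference, so the LED reacts in near real time.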

Demo:

Next, I tried capturing sensor data with the microcontroller and used ML to enable the Arduino board to recognize gestures. I captured motion data from the Arduino Nano 33 BLE Sense, imported it into TensorFlow to train a model, and deployed the resulting classifier back onto the board.

  • magic_wand – gesture recognition using the onboard IMU
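The deploy step deserves a closer look: TensorFlow Lite Micro has no filesystem to load a .tflite file from, so after training in Colab the model’s bytes are compiled into the Arduino sketch as a C array (this is what `xxd -i model.tflite` produces). A minimal Python sketch of that conversion, with the variable name and formatting chosen here for illustration:

```python
# Minimal sketch of the model-to-C-array conversion used to deploy a
# trained .tflite model into an Arduino sketch. The variable name
# "g_model" and the 12-bytes-per-line layout are assumptions.

def tflite_to_c_array(model_bytes, var_name="g_model"):
    """Render raw model bytes as a C source snippet."""
    lines = [f"const unsigned char {var_name}[] = {{"]
    for i in range(0, len(model_bytes), 12):
        chunk = model_bytes[i:i + 12]
        lines.append("  " + ", ".join(f"0x{b:02x}" for b in chunk) + ",")
    lines.append("};")
    lines.append(f"const int {var_name}_len = {len(model_bytes)};")
    return "\n".join(lines)

# Example with dummy bytes standing in for a real .tflite flatbuffer:
print(tflite_to_c_array(b"\x1c\x00\x00\x00TFL3"))
```

The generated snippet is pasted into (or included by) the sketch, and the TensorFlow Lite Micro interpreter is pointed at the array at startup.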

Demo:

Sensor data: flex and punch gesture movements

TinyML_epochs

TinyML gesture model_google colab