Being able to include animated lighting or imagery that is synchronized with dancers is crucial to many performances that rely on visual and musical effects to create a spectacle. Eduardo Padron set out to achieve precisely this by monitoring a dancer’s movements with an accelerometer and then triggering the appropriate audiovisual experience based on the identified motion.
Padron’s system is built around a Raspberry Pi 4 running an MQTT server, which handles communication with the other IoT boards. Movement data was collected with an Arduino Nano 33 BLE Sense and its onboard accelerometer, then sent to a Google Colab environment for training. The model was trained for 600 epochs and achieved an accuracy of about 91 percent. Once deployed to the Arduino, the model outputs the detected gesture over USB to a Python script. When a gesture is received, the MQTT server broadcasts the message to every client device, such as an ESP8266 that controls the lighting and plays the associated audio or video.
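To illustrate how that serial-to-MQTT bridge might look, here is a minimal Python sketch that reads gesture labels from the Arduino over USB and publishes them to the broker on the Pi. The serial port, broker address, and topic name are assumptions for illustration, not details taken from Padron’s actual script.

```python
import serial                      # pyserial, for reading the Arduino's USB output
import paho.mqtt.publish as publish

SERIAL_PORT = "/dev/ttyACM0"       # port the Nano 33 BLE Sense enumerates as (assumed)
BROKER_HOST = "raspberrypi.local"  # Raspberry Pi 4 running the MQTT broker (assumed)
TOPIC = "dance/gesture"            # topic the ESP8266 clients subscribe to (assumed)

with serial.Serial(SERIAL_PORT, 115200, timeout=1) as ser:
    while True:
        # The deployed model prints one gesture label per line after each inference
        label = ser.readline().decode("utf-8", errors="ignore").strip()
        if label:
            # Broadcast the gesture to every subscribed client via the broker
            publish.single(TOPIC, payload=label, hostname=BROKER_HOST)
            print(f"Published gesture: {label}")
```

On the receiving end, each ESP8266 client would subscribe to the same topic and map the incoming label to its own lighting or media cue.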