This project aims to bridge music creation, physical movement, and its graphic visualization by using the machine learning model PoseNet to control music software through gestures and body motion.
Depending on the positions of the nose, arms, and wrists, as well as their distances relative to each other, the application sends MIDI signals to Ableton Live that trigger the corresponding chords and drum patterns and control parameters such as velocity, volume, and filters.
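A minimal sketch of this kind of mapping is shown below, assuming a browser setup with `@tensorflow-models/posenet`, the Web MIDI API, and a virtual MIDI port routed to Ableton Live. The specific mappings (CC 74 for a filter, wrist distance scaling velocity) and helper names are illustrative assumptions, not the project's actual code.

```typescript
// Sketch: map PoseNet keypoints to MIDI messages (illustrative only).
// Assumes a webcam <video> element and a browser with Web MIDI support
// whose output is routed to Ableton Live via a virtual MIDI port.
import * as posenet from '@tensorflow-models/posenet';

async function run(video: HTMLVideoElement) {
  const net = await posenet.load();                     // load the PoseNet model
  const midi = await navigator.requestMIDIAccess();     // request Web MIDI access
  const output = [...midi.outputs.values()][0];         // first available MIDI output

  async function tick() {
    const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
    const get = (part: string) => pose.keypoints.find(k => k.part === part);

    const nose = get('nose');
    const leftWrist = get('leftWrist');
    const rightWrist = get('rightWrist');

    if (nose && output) {
      // Hypothetical mapping: nose height controls a filter via MIDI CC 74.
      const cc = Math.round(127 * (1 - nose.position.y / video.height));
      output.send([0xb0, 74, Math.max(0, Math.min(127, cc))]);
    }

    if (leftWrist && rightWrist && output) {
      // Hypothetical mapping: distance between the wrists scales note velocity.
      const dx = leftWrist.position.x - rightWrist.position.x;
      const dy = leftWrist.position.y - rightWrist.position.y;
      const velocity = Math.max(1, Math.min(127, Math.round(Math.hypot(dx, dy) / 4)));
      output.send([0x90, 60, velocity]);                               // note on: middle C
      output.send([0x80, 60, 0], window.performance.now() + 200);      // note off after 200 ms
    }

    requestAnimationFrame(tick);                        // estimate the next frame
  }
  tick();
}
```

In Ableton Live, such messages would be picked up by enabling the virtual port under MIDI preferences and MIDI-mapping the incoming notes and CC values to clips, racks, or device parameters.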