Ceci Capstone GUI
TouchDesigner, p5.js, ml5.js, Google MediaPipe, OSC, WebSockets, Node.js
→ GitHub
For Ceci Sun's senior thesis dance performance, I created a GUI/debugging interface that sends OSC messages to TouchDesigner.
Background
In Golan Levin's Fall 2023 Creative Coding course, I used p5.js to create a gesture expander that rendered in real time. The project tracked pose keypoints via the ml5.js BodyPose library. Ceci Sun, a friend and dancer at Johns Hopkins University, performed choreography to one of my favorite songs at the time, "Motion Picture Soundtrack" by Radiohead.
When Ceci and I caught up in Winter 2025, she asked if I would be interested
in creating a new version of the project for her senior thesis performance.
Given that it had been two years since we last collaborated, and newer
technologies had become available, I was really excited to create an improved
version.

Web Render vs. OSC
I had two options for connecting MediaPipe to TouchDesigner. TouchDesigner's Web Render node can display a webpage as a texture, which would show the MediaPipe visualization directly; with this approach, I would receive an image as input. The alternative was using OSC to send the numerical pose data. I decided to use OSC because it has lower latency, and working with variables felt more intuitive to me than encoding and decoding an image. Additionally, I could create effects using placeholder values in TouchDesigner and then replace them with the OSC values later.
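The OSC route boils down to flattening each landmark into an address/value pair. Here is a minimal sketch of that idea; the `poseToOscMessages` helper and the `/pose/...` address scheme are my own illustrative choices, not necessarily what the project used.

```javascript
// Flatten MediaPipe-style pose landmarks into OSC-style
// address/value pairs. The address scheme is illustrative.
function poseToOscMessages(landmarks) {
  const messages = [];
  landmarks.forEach((lm, i) => {
    // MediaPipe returns normalized coordinates in [0, 1]
    messages.push({ address: `/pose/${i}/x`, value: lm.x });
    messages.push({ address: `/pose/${i}/y`, value: lm.y });
  });
  return messages;
}

// With a Node.js OSC library such as node-osc (an assumption, not
// confirmed by the project), each pair could then be sent to a
// TouchDesigner OSC In CHOP, e.g.:
//   const { Client } = require('node-osc');
//   const client = new Client('127.0.0.1', 7000);
//   messages.forEach(m => client.send(m.address, m.value));
```

In TouchDesigner, each address then shows up as its own channel, which is what makes the variable-based workflow feel so direct.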

MediaPipe is really powerful because it can retrieve a lot of information very quickly. The framework provides three parameters for each of the 33 joints, and deciding which ones to use so that the mapping would look intuitive to the audience was an interesting challenge. When connecting the data to TouchDesigner, I observed that all of the data had a natural jitter, so it turned out that only one or two parameters were enough to communicate the overall movement of the choreography. I decided to use the distance between two fixed points to drive the animation. For example, the distance between the left hand and the left shoulder is easy for the dancer to control. A slider in the p5 GUI controlled how sensitive the mapping was.
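The mapping above can be sketched in a few small functions. The landmark indices follow MediaPipe Pose conventions (11 = left shoulder, 15 = left wrist), but the sensitivity scale and smoothing factor are illustrative values, not the ones tuned for the performance.

```javascript
// Euclidean distance between two normalized landmarks
function jointDistance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Map a raw distance to a 0..1 control value; `sensitivity`
// is what the p5 slider would drive.
function mapDistance(dist, sensitivity) {
  return Math.min(1, Math.max(0, dist * sensitivity));
}

// Simple exponential smoothing to tame the natural jitter
function smooth(prev, next, alpha = 0.2) {
  return prev + alpha * (next - prev);
}
```

Chaining these per frame (distance, then map, then smooth) yields one stable channel that TouchDesigner can plug straight into an effect parameter.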

Limitations
Since I only needed a few parameters to communicate the movement, it was not necessary to use all 33 joints. Unfortunately, the MediaPipe library does not support turning off individual joints. If I were to scale up the project, I would need to train a custom model that tracks only the necessary joints to improve overall performance. Also, MediaPipe is best trained for waist-up poses filmed on a webcam, and it tracks at most one person at a time. When improved models become available, a lot of new possibilities will open up in terms of the types of performances that can be created.
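Since the model always emits all 33 landmarks, the filtering has to happen application-side before anything is sent over OSC. A sketch of that idea, where the index set is illustrative (11 = left shoulder, 15 = left wrist in MediaPipe Pose):

```javascript
// Only the joints the mapping actually needs (illustrative set)
const NEEDED_JOINTS = [11, 15];

// Keep just the needed landmarks, tagging each with its index
// so the OSC addresses stay stable.
function filterLandmarks(landmarks, needed = NEEDED_JOINTS) {
  return needed.map(i => ({ index: i, ...landmarks[i] }));
}
```

This trims the OSC traffic, but the inference cost of the full 33-joint model remains, which is why a custom model would still be the real fix at scale.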
Inspiration
I was heavily inspired by discrete figures by Daito Manabe's Rhizomatiks Research group. For time-based visuals, I often find it easier to figure out the music first. Ceci sent me some placeholder music to work with, and I had a lot of fun letting the sound guide what the visuals should look like.

A challenging aspect of this project was actually making the generated visuals tell a story. One piece of advice that Golan gave me that helped a lot was to think of particles as a substance that can be molded to mimic natural phenomena, such as clouds, snow, or sand. To me, it felt more intuitive to use p5.js to fine-tune the parameters to get the desired effect.
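The "particles as substance" idea can be sketched as a tiny per-frame update: a snow-like particle with a constant downward drift, whose horizontal motion is nudged by the pose-driven control value in [0, 1]. The field names and constants here are illustrative, not from the actual sketch.

```javascript
// One frame of a snow-like particle. `control` is the 0..1
// pose-driven value; 0.5 means no wind, and the dancer steers
// the drift by moving away from that midpoint.
function updateParticle(p, control, dt = 1 / 60) {
  const gravity = 0.3;              // constant downward drift
  const wind = (control - 0.5) * 2; // -1..1 horizontal push
  return {
    x: p.x + wind * dt,
    y: p.y + gravity * dt,
  };
}
```

Running this over a few thousand particles, with the substance-specific constants retuned, is roughly how snow becomes sand or clouds.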

🚧 Check back soon for the final performance 🚧