δΈ (yi) / oneness
TouchDesigner, p5.js, ml5.js, Google MediaPipe, OSC, WebSockets, Node.js
→ GitHub
For Ceci Sun's senior thesis, δΈ (yi) / oneness, I created visuals to accompany her live performance and a GUI/debugging interface that sends OSC messages to TouchDesigner.
Background
In Golan Levin's Fall 2023 Creative Coding course, I used p5.js to create a gesture expander that rendered in real time. The project tracked body movement
via the ml5.js BodyPose keypoints library. Ceci Sun, a friend and
dancer at Johns Hopkins University, performed choreography to one of my
favorite songs at the time, "Motion Picture Soundtrack" by Radiohead.
When Ceci and I caught up in Winter 2025, she asked if I would be interested
in creating a new version of the project for her senior thesis performance.
Given that it had been two years since we last collaborated, and newer
technologies had become available, I was really excited to create an improved
version.

Thematic Underpinnings
Much of Ceci's creative practice is informed by mind-body connections through qigong principles. Her work combines Eastern and Western philosophical perspectives: Eastern traditions emphasize balance and the flow of energy, while Western contemporary dance practices explore emotional expression and psychological experience. As an American-born Chinese, I have also grown up with a mix of Eastern and Western philosophies. Topics such as meditation and traditional Chinese medicine are deeply ingrained in my personal life, which I was able to draw on when working on this project.

Inspiration
I was heavily inspired by discrete figures, a work by Daito Manabe's Rhizomatiks Research group, as well as Lingdong Huang's {Shan, Shui}*, which Golan had previously shown us and which undeniably influenced how we approached the mountain visuals. For time-based visuals, I often find it easier to figure out the music first, and Ceci sent me some placeholder tracks that helped guide what the visuals should look like.

Web Render vs. OSC
I had two options for connecting MediaPipe to TouchDesigner. TouchDesigner's
Web Render node can display a webpage as a texture, which would show the
MediaPipe visualization directly. With this approach, I would receive an
image as an input. The alternative was using OSC to send the numerical pose
data. I decided to use OSC because it has lower latency, and working with
variables, as opposed to encoding and decoding an image, felt more intuitive
to me. Additionally, I could build effects using placeholder values in
TouchDesigner and then swap in the OSC values later.
When it came to deciding what equipment to use for the live performance,
the OSC approach also turned out to be more reliable. Transmitting full
images is far more expensive than sending a handful of numbers, and
depending on the size of the venue, longer cable runs can introduce
unpredictable signal issues as well.
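To make the bandwidth argument concrete, here is a minimal sketch of how an OSC message is laid out on the wire. The address pattern `/pose/leftWrist` and the function name `encodeOscMessage` are illustrative only; in practice a library such as osc.js handles the encoding.

```javascript
// Illustrative sketch of OSC message layout (Node.js). The address and
// function name are hypothetical, not the project's actual code.
function encodeOscMessage(address, floats) {
  // OSC strings are null-terminated and padded to a multiple of 4 bytes.
  const pad4 = (buf) => {
    const padded = Buffer.alloc(Math.ceil((buf.length + 1) / 4) * 4);
    buf.copy(padded);
    return padded;
  };
  const addr = pad4(Buffer.from(address, "ascii"));
  // Type tag string: a comma followed by one "f" per float argument.
  const typeTags = pad4(Buffer.from("," + "f".repeat(floats.length), "ascii"));
  // Arguments are 32-bit big-endian floats.
  const args = Buffer.alloc(4 * floats.length);
  floats.forEach((f, i) => args.writeFloatBE(f, i * 4));
  return Buffer.concat([addr, typeTags, args]);
}

const msg = encodeOscMessage("/pose/leftWrist", [0.42, 0.87, -0.1]);
console.log(msg.length); // → 36 bytes, versus kilobytes for a video frame
```

A full pose update is a few dozen bytes per joint, which is why streaming numbers over OSC stays responsive where streaming rendered frames would not.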

MediaPipe is really powerful because it can retrieve a lot of information
very quickly. The framework provides three parameters for each of its 33
joints, and deciding which ones to use so that the mapping would look
intuitive to the audience was an interesting challenge. When connecting the
data to TouchDesigner, I observed that all of the data had a natural jitter,
so it turned out that only one or two parameters were enough to communicate
the overall movement of the choreography.
I decided to use the distance between two fixed points to drive the animation.
For example, the distance between the left hand and the left shoulder is
easy for the dancer to control.
A slider in the p5.js GUI controlled how sensitive the mapping was.
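A hedged sketch of that mapping in plain JavaScript, assuming normalized keypoint coordinates. The names `makePoseControl`, `sensitivity`, and `smoothing` are illustrative, not the project's actual code; the exponential smoothing step is one common way to tame the natural jitter in the tracking data.

```javascript
// Sketch: distance between two fixed keypoints drives a single 0..1 control
// value, scaled by a sensitivity slider. Names are hypothetical.
function makePoseControl(sensitivity = 1.0, smoothing = 0.8) {
  let smoothed = 0;
  return function update(a, b) {
    // Euclidean distance between two keypoints in normalized coordinates.
    const dist = Math.hypot(a.x - b.x, a.y - b.y);
    // Scale by the slider value and clamp, so TouchDesigner sees a stable range.
    const raw = Math.min(1, Math.max(0, dist * sensitivity));
    // Exponential smoothing averages out frame-to-frame jitter.
    smoothed = smoothing * smoothed + (1 - smoothing) * raw;
    return smoothed;
  };
}

// Usage: feed keypoints each frame; the result would be sent over OSC.
const control = makePoseControl(2.0, 0.8);
const leftWrist = { x: 0.3, y: 0.5 };
const leftShoulder = { x: 0.6, y: 0.5 };
console.log(control(leftWrist, leftShoulder).toFixed(2)); // first frame → "0.12"
```

Raising `sensitivity` makes small gestures produce large visual changes; raising `smoothing` trades responsiveness for stability.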

Limitations and Challenges
Since I only needed a few parameters to communicate the movement, it was not necessary to use all 33 joints. Unfortunately, the MediaPipe library does not support turning off certain joints. If I were to scale up the project, I would need to create a custom model that only uses the necessary joints to improve overall performance. Also, MediaPipe is best trained for waist-up poses captured by a webcam, and it tracks at most one person at a time. When an improved model is created in the future, a lot of new possibilities will open up in terms of the types of performances that can be created.
A challenging aspect of this project was actually making the generated visuals tell a story. One piece of advice that Golan gave me that helped a lot was to think of particles as a substance that can be molded to mimic natural phenomena, such as clouds, snow, or sand. To me, it felt more intuitive to use p5.js to fine-tune the parameters to get the desired effect.

More Experiments


🚧 Check back soon for the final performance 🚧