
Interactive Installations —
Embodied Systems of Light and Motion

This collection of works explores the dialogue between human motion and algorithmic space through real-time sensing technologies including Azure Kinect, Intel RealSense, Leap Motion, and MediaPipe. Each installation transforms gesture, posture, and proximity into evolving fields of light, particles, and sound—where the body becomes both performer and instrument.
By merging computer vision, generative graphics, and spatial audio, these systems investigate how presence can be visualized, fragmented, and re-composed across digital environments.

01

Ephemeral Echoes: Deconstructing Marine Presence

A four-window interactive installation that visualizes the dissolution of bodily presence into the marine environment. Through real-time depth tracking and point-cloud particle dispersion, the work transforms human silhouettes into fluid constellations of data, echoing the impermanence of marine existence. Each window reflects a different temporal layer—presence, trace, memory, and disappearance—creating a spatial dialogue between fragmentation and continuity.
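
A rough sketch of the point-cloud step, in Python: a depth frame is back-projected into 3D particles and scattered by a per-window dispersion amount. The camera intrinsics, the synthetic depth frame, and the four dispersion values are illustrative assumptions; sensor capture and rendering are omitted.

```python
# Minimal sketch: depth frame -> point cloud -> per-window dispersion.
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0):
    """Back-project a depth image (meters) into a flat array of 3D points.
    fx/fy are placeholder intrinsics, not a specific sensor's values."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    x = (u.ravel() - w / 2) * z / fx
    y = (v.ravel() - h / 2) * z / fy
    return np.stack([x, y, z], axis=1)

def disperse(points, amount, rng):
    """Scatter silhouette points outward; `amount` grows per window
    (presence -> trace -> memory -> disappearance)."""
    return points + rng.normal(scale=amount, size=points.shape)

rng = np.random.default_rng(0)
depth = rng.uniform(0.5, 3.0, size=(48, 64))   # stand-in depth frame
cloud = depth_to_points(depth)
windows = [disperse(cloud, a, rng) for a in (0.0, 0.02, 0.08, 0.2)]
```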

02

Algorithmic Gestures

Algorithmic Gestures explores how human motion can act as a living algorithm. Using MediaPipe hand-tracking, each gesture becomes a rule-based command that scales, rotates, and transforms geometric structures in real time. The system bridges embodied movement and computational logic, turning gestures into both input and design language.
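
A minimal sketch of this kind of rule-based mapping, assuming a webcam and MediaPipe’s Python API; the landmark choices, the scale factor, and the `apply_transform` renderer hook are illustrative stand-ins, not the installation’s actual rules.

```python
import math

import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Thumb tip (4) to index tip (8) pinch distance drives scale;
        # wrist (0) to index direction drives rotation of the geometry.
        scale = math.dist((lm[4].x, lm[4].y), (lm[8].x, lm[8].y)) * 5.0
        angle = math.atan2(lm[8].y - lm[0].y, lm[8].x - lm[0].x)
        # apply_transform(scale, angle)  # hypothetical renderer hook
    if cv2.waitKey(1) == 27:  # Esc exits
        break
cap.release()
```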

Influenced by Bauhaus modular minimalism and Deconstructivist geometry, the visuals shift between order and disruption—precise yet fluid, structured yet expressive. Through this interplay, the work examines how digital form can reflect the body’s intuitive rhythms within algorithmic systems.

Drawing on the chromatic energy of Fauvism, the project contrasts rational computation with emotional color and gesture. The result is a hybrid aesthetic space—where geometry, color, and movement merge into an evolving choreography of human-machine co-creation.


03

Air Guitar: Embodied Performance in Real Time

Air Guitar is an interactive performance system that fuses MediaPipe hand-tracking, Max/MSP + Jitter, and Unity 3D visualization to translate gesture into sound, movement, and virtual embodiment. Each motion of the performer’s hands generates MIDI signals that modulate both sound synthesis and real-time visual effects—transforming invisible gestures into a living audiovisual instrument.
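
A minimal sketch of the gesture-to-MIDI leg, assuming hand positions already arrive from a MediaPipe loop and that mido routes to a port Max/MSP listens on; the CC number, note, and mapping ranges are illustrative assumptions.

```python
import mido

out = mido.open_output()  # default MIDI port, routed into Max/MSP

def send_gesture(hand_y, strummed):
    """hand_y is a normalized 0..1 vertical position; a detected strum
    fires a note. CC 1 (mod wheel) is an illustrative choice."""
    value = int(max(0.0, min(1.0, hand_y)) * 127)
    out.send(mido.Message('control_change', control=1, value=value))
    if strummed:
        out.send(mido.Message('note_on', note=52, velocity=100))  # low E
```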

In this extended version, a 3D avatar mirrors the performer’s gestures through live camera-based tracking, blending physical and digital presence into a single expressive continuum. The project explores how motion data can function as both a compositional and performative medium, where gesture becomes code, sound becomes motion, and embodiment becomes networked simulation.

Air Guitar highlights the convergence of body, sound, and algorithmic space, creating an immersive feedback loop between performer and system—a choreography of human and machine improvisation rendered in real time.

04

Embodied Fields: Real-Time Performance System

Embodied Fields is a live audiovisual performance that transforms the performer’s body into a dynamic interface for sound and motion.
Using MediaPipe full-body tracking, Max/MSP + Jitter, and Unity 3D, the system captures skeletal motion in real time and maps it to both audio synthesis and a 3D digital avatar.
TouchOSC provides additional accelerometer data for rhythmic modulation, allowing subtle gestures and physical intensity to shape both sound and spatial light.
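
A minimal sketch of the TouchOSC leg using the python-osc package, assuming the app streams its accelerometer to this machine over OSC (TouchOSC publishes it on the /accxyz address); the port number and the modulation-depth formula are illustrative.

```python
# Receive TouchOSC accelerometer data and fold it into a 0..1 depth.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_accel(address, x, y, z):
    depth = min(1.0, (x * x + y * y + z * z) ** 0.5 / 3.0)
    print(address, round(depth, 3))  # forward to synthesis/lighting here

dispatcher = Dispatcher()
dispatcher.map("/accxyz", on_accel)
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```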

During the performance, the physical and virtual bodies move in synchrony—each gesture producing sonic and visual responses in an evolving feedback loop.
Embodied Fields explores the blurred boundary between embodiment and simulation, questioning how digital systems can extend human presence into new performative realities.


05

Air Keyboard: Gesture-Controlled Musical Interface

Air Keyboard is a real-time performance system that transforms hand motion into musical expression through MediaPipe hand-tracking and Max/MSP + Jitter.
Each hand movement is analyzed to control different virtual instruments—piano, flute, percussion, and choir—allowing performers to compose and improvise without physical contact.
Gestures modulate pitch, rhythm, and dynamics, turning empty space into a responsive, invisible keyboard.
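
The zone logic might look like the sketch below, assuming normalized fingertip coordinates from the hand tracker; the four horizontal instrument zones and the pentatonic pitch set are illustrative stand-ins for the actual mapping.

```python
# Quantize a fingertip position into an instrument zone and a pitch.
INSTRUMENTS = ["piano", "flute", "percussion", "choir"]
SCALE = [60, 62, 64, 67, 69]  # C-major pentatonic MIDI notes

def map_fingertip(x, y):
    instrument = INSTRUMENTS[min(3, int(x * 4))]  # four horizontal zones
    note = SCALE[min(4, int((1.0 - y) * 5))]      # higher hand, higher pitch
    return instrument, note

print(map_fingertip(0.1, 0.2))  # -> ('piano', 69)
```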

The system provides real-time visual feedback using Jitter-based reactive graphics, creating an audiovisual environment where motion, sound, and light merge in performance.

06

Dancing Lines

Dancing Lines transforms the body into a living brush that paints with movement.
Using Azure Kinect, the system captures real-time body motion and translates it into flowing trails of light, tracing every gesture through space like a digital choreography.
Each performance becomes a unique drawing in motion—an ephemeral dialogue between physical rhythm and computational fluidity.
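
A minimal sketch of the trail mechanic, assuming per-frame joint positions from the Azure Kinect body tracker; the trail length, fade rule, and `draw_point` renderer call are illustrative assumptions.

```python
# Keep a fixed-length history of joint positions and fade older samples.
from collections import deque

TRAIL_LEN = 90  # roughly 3 seconds at 30 fps
trail = deque(maxlen=TRAIL_LEN)

def update(joint_xy):
    trail.append(joint_xy)
    for i, (x, y) in enumerate(trail):
        alpha = (i + 1) / len(trail)  # newest samples brightest
        # draw_point(x, y, alpha)     # hypothetical renderer call
```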

Blending the aesthetics of motion capture, procedural animation, and gesture-based visual art, the piece explores how the act of dancing can become both expression and data, revealing the beauty of transient embodiment in digital space.

07

Liminal Tide

Liminal Tide is a four-window generative installation that visualizes the emotional and ecological cycles of the ocean through real-time gesture control. Using Leap Motion for hand tracking, the piece translates subtle movements into fluid simulations—guiding currents, dispersing light, and stirring the living particles of the sea. Each scene—Sea’s First Breath, Echoes Slowly Fade, Tide Weeps Deep, Drift in Gray—reflects a shifting state between vitality and dissolution.

08

Bloom in Fragments

A generative composition where a digital bloom emerges from fragments of light and motion. Using Leap Motion for real-time hand tracking, the piece translates gesture into the birth and dissolution of a pixelated flower, suspended between nature’s organic rhythm and algorithmic abstraction.

09

Abyssal Motion

Abyssal Motion is an interactive GLSL-based installation in which jellyfish drift through an infinite ocean, responding to human movement. Using Kinect depth tracking, hand gestures generate underwater currents that guide the jellyfish swarm in real time. Each motion alters flow velocity and color diffusion, creating a living choreography between human gesture and marine motion.
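
A rough sketch of the current-stirring step, on a NumPy stand-in for the GLSL velocity field; the grid size, Gaussian falloff, and dissipation factor are illustrative, and the hand position and velocity are assumed to come from the Kinect tracking.

```python
# Push a 2D flow field around the hand in the direction of its motion.
import numpy as np

W = H = 64
flow = np.zeros((H, W, 2))
ys, xs = np.mgrid[0:H, 0:W]

def stir(hand_xy, hand_vel, radius=10.0):
    d2 = (xs - hand_xy[0]) ** 2 + (ys - hand_xy[1]) ** 2
    falloff = np.exp(-d2 / (2 * radius ** 2))
    flow[..., 0] += falloff * hand_vel[0]
    flow[..., 1] += falloff * hand_vel[1]
    flow[:] *= 0.98  # dissipation so currents settle

stir((32, 32), (1.5, -0.5))
```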

The work explores embodied connection within digital ecologies — where the ocean becomes a reactive field, and light, body, and code intertwine beneath the surface.

10

Body as Brush

Body as Brush explores how motion can embody the act of painting. The participant’s silhouette, captured by Azure Kinect, becomes an instrument of ink and light. Each movement draws ephemeral traces that fade like breath on water—a dialogue between tradition and technology, control and dissolution.

11

Bloom Algorithm

Bloom Algorithm transforms body motion into a choreography of generative blossoms. Using Azure Kinect for real-time depth tracking, the installation maps the performer’s movement into a three-dimensional particle field that blooms, disperses, and reconstitutes with each gesture.

The work bridges organic growth and digital computation — a system where human energy becomes algorithmic nature. Each bloom is ephemeral yet algorithmically precise, evoking the tension between vitality and impermanence within interactive form.

12

Vortex

Vortex is a hand-controlled water tunnel built with Kinect depth tracking and a GLSL shader field. Gestures modulate vortex radius, rotation, and turbulence in real time, bending a procedural flow into a navigable tunnel of liquid light. The piece turns the body into a force field—where motion sculpts current, and current sculpts space.
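
The parameter mapping might look like the sketch below, assuming a normalized hand position and depth from the Kinect; the uniform names and ranges are illustrative, and the actual upload to the GLSL shader depends on the host framework.

```python
# Map a tracked hand to vortex shader parameters.
def vortex_uniforms(hand_x, hand_y, hand_depth):
    return {
        "u_radius":     0.2 + 0.6 * hand_depth,       # nearer hand, tighter tunnel
        "u_rotation":   (hand_x - 0.5) * 6.28318,     # lateral sweep maps to spin
        "u_turbulence": max(0.0, min(1.0, 1.0 - hand_y)),  # raised hand, rougher flow
    }

print(vortex_uniforms(0.75, 0.3, 0.5))
```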

13

Gesture Syntax

Gesture Syntax is a dual-hand interactive system that transforms movement into visual language. Using MediaPipe for real-time hand tracking, the work assigns distinct functions to each hand—one switches between visual layers (from abstract composition to typographic space), while the other controls scale and rhythm. Together, they compose a living syntax of gesture.
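
A minimal sketch of the dual-hand split using MediaPipe’s two-hand mode; the finger-count layer selector and pinch-to-scale rule are illustrative stand-ins for the work’s actual gesture grammar.

```python
import math

import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
# results = hands.process(rgb_frame) inside the capture loop

def interpret(results):
    layer, scale = None, None
    if not results.multi_hand_landmarks:
        return layer, scale
    for lms, handed in zip(results.multi_hand_landmarks,
                           results.multi_handedness):
        lm = lms.landmark
        if handed.classification[0].label == "Left":
            # Fingertips raised above the wrist select the visual layer.
            layer = sum(lm[t].y < lm[0].y for t in (8, 12, 16, 20))
        else:
            # Right-hand pinch distance drives scale and rhythm.
            scale = math.dist((lm[4].x, lm[4].y), (lm[8].x, lm[8].y))
    return layer, scale
```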

14

Celestial Field

Celestial Field transforms the human body into a constellation of motion. Using Kinect and RealSense sensors, each gesture disperses or gathers stars across an infinite field of light. The body becomes both the observer and the cosmic force, shaping the galaxy through movement.

15

絮 — Drift

Drift transforms mouse interaction into a flowing simulation of digital fabric. This piece reacts to cursor movement as if stirred by invisible wind—streams of color weave and unravel in perpetual motion. The fine-threaded turbulence resembles silk or smoke, where each gesture reshapes the texture of space.

16

Cosmic Gate

Cosmic Gate is an interactive installation that transforms body movement into a cosmic event. Using Kinect’s real-time depth mapping, four spatial trigger points act as portals—each initiating a dynamic transformation of a generative tunnel rendered through GLSL displacement and particle motion. As participants move through these invisible points, the tunnel morphs—its geometry twisting, expanding, and collapsing in reaction to human presence. The result evokes a sense of traversing a living space-time continuum, where motion generates both architecture and atmosphere.
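
A minimal sketch of the trigger logic, assuming a tracked body position in normalized floor coordinates from the Kinect; the portal layout and activation radius are illustrative assumptions.

```python
# Report which of the four portal trigger points the body is inside.
import math

PORTALS = [(0.2, 0.2), (0.8, 0.2), (0.2, 0.8), (0.8, 0.8)]
RADIUS = 0.12

def active_portals(body_xy):
    return [i for i, p in enumerate(PORTALS)
            if math.dist(body_xy, p) < RADIUS]

print(active_portals((0.25, 0.22)))  # -> [0]
```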

17

Rainspace

Rainspace transforms human motion into a living rainfall of light. Using Azure Kinect and TouchDesigner, the system captures the performer’s body and visualizes it as cascading streams of particles that respond in real time to gesture and movement. Each drop traces a fleeting contour of the body—sometimes dispersing into air, sometimes converging into form—blurring the line between presence and dissolution.

