A multidisciplinary exploration of embodied mixed-reality storytelling
A collaborative team blending robotics, immersive media, and speculative design
Two programmers building real-time robot control, networking, and AR/VR integration
Three artists developing character design, environment art, and AR visual layers
One producer coordinating technical pipeline, narrative direction, and project vision
Exploring human–robot co-presence through physical and virtual fusion
Integrating Unitree G1 with Apple Vision Pro via real-time bidirectional control
Creating a digital twin that bridges physical embodiment and AR identity
Designing layered robot “skins” that shift perception and emotional tone
Investigating agency, autonomy, and ambiguity in human–machine interaction
Building a short immersive experience that questions what it means to share space with a machine
We developed a real-time Unity–Python pipeline connecting Apple Vision Pro and the Unitree G1 through UDP communication. A synchronized digital twin mirrored the robot’s movements, enabling asymmetric interaction between two players. Iterative testing focused on latency, safety, and expressive autonomy, transforming technical constraints into meaningful behavioral design elements.
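The pose-mirroring half of that pipeline can be sketched as a small UDP codec on the Python side: the controller serializes a timestamped joint-angle vector and sends it to the Unity listener that drives the digital twin. This is a minimal illustration, not the project's actual protocol — the packet layout, port number, and function names here are assumptions.

```python
import socket
import struct

# Assumed address of the Unity digital-twin listener (illustrative only).
TWIN_ADDR = ("127.0.0.1", 9870)

# Packet header: little-endian float64 timestamp + uint32 joint count,
# followed by one float32 per joint angle.
_HEADER = "<dI"
_HEADER_SIZE = struct.calcsize(_HEADER)

def pack_pose(timestamp: float, joints: list) -> bytes:
    """Serialize a timestamped joint-angle vector into one UDP datagram."""
    return struct.pack(f"{_HEADER}{len(joints)}f", timestamp, len(joints), *joints)

def unpack_pose(data: bytes):
    """Inverse of pack_pose: recover (timestamp, joint angles)."""
    timestamp, n = struct.unpack_from(_HEADER, data)
    joints = list(struct.unpack_from(f"<{n}f", data, _HEADER_SIZE))
    return timestamp, joints

def send_pose(sock: socket.socket, timestamp: float, joints: list) -> None:
    """Fire-and-forget: push the latest robot pose toward the twin."""
    sock.sendto(pack_pose(timestamp, joints), TWIN_ADDR)
```

UDP fits this design because the twin only ever needs the newest pose: a dropped datagram is simply superseded by the next one, which keeps worst-case latency low without retransmission stalls.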