KAIQING HUANG



CrossReality - Interfusion


AR Experience & Live Music
June 2024



Visual Artist: Kaiqing Huang, Boxiong Zhao

Technologist: Chu Zhang, Boxiong Zhao

Concept: Kaiqing Huang, Wangyu Ping

Music: Chu Zhang

Credit: Botao Hu, Holokit

INTRODUCTION

This iteration of CrossReality was honored to be showcased as an experiential work at ICLC 2024 (International Conference on Live Coding), held at NYU Shanghai. The work explores the dynamic interplay between humans and their environment through mixed reality, blurring the boundaries between distinct contexts. The environment is fluid: it is reshaped by human interaction while simultaneously influencing perception and movement. In a shared space, participants affect and are affected by each other and by the environment, creating a continuous cycle of mutual influence.


Artwork video [please select at least 1080p quality]
SETUP
For this occasion, we incorporated multiple pedestals to craft a tangible and curated environment, enhancing the sensory engagement of the audience. Using an immersive projection, we presented visual elements from all angles, enriching the spatial experience and enveloping the participants. Additionally, we created a three-dimensional AR field that amplified the ambient qualities of the virtual world. This setup gave participants a direct and profound connection as they interacted with both the unseen virtual and the tangible physical environment.


 Basic setup  


 Complete presentation with AR  

AR simulation in Unity  


MECHANISM

At the core of our piece is a dynamic virtual sphere that moves between two players, shaping the virtual field with each motion. This interaction invites the audience to engage via mobile devices or, for deeper immersion, AR headsets, offering a lens into mixed realities and creating a shared, interactive space.

By blending the physical and virtual worlds, the interaction mechanism lets the audience shape the experience: as the virtual sphere moves, it redraws the digital landscape and strengthens collective participation, making the audience an integral part of the unfolding narrative.
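The piece's actual Unity implementation is not shown here; as a minimal sketch, the core idea of a sphere travelling between two tracked players and shaping a field around itself could look like the following. All function names, the interpolation parameter, and the falloff rule are illustrative assumptions, not the work's code.

```python
import math

def sphere_position(player_a, player_b, t):
    """Interpolate the virtual sphere between two player positions.
    t in [0, 1] sweeps the sphere from player A to player B (assumed scheme)."""
    return tuple(a + (b - a) * t for a, b in zip(player_a, player_b))

def field_intensity(point, sphere, radius=1.5):
    """Toy shaping rule: the virtual field is strongest at the sphere
    and fades linearly to zero at `radius` metres (hypothetical falloff)."""
    d = math.dist(point, sphere)
    return max(0.0, 1.0 - d / radius)

# Sphere halfway between two players standing 2 m apart
pos = sphere_position((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 0.5)
print(pos)                                     # (1.0, 0.0, 0.0)
print(field_intensity((1.0, 0.0, 0.0), pos))   # 1.0 at the sphere's centre
```

In the installation this logic would run per frame against live tracking data, with the field values driving the AR visuals each participant sees.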

Real physical scene


Mixed world with the mobile phone
Mixed world with the AR headset


REAL-TIME MUSIC

For the real-time music component, we preset two distinct tracks in Max, each dynamically triggered by the players' movements; for example, crouching activates specific sounds. Additionally, we incorporated sound effects whose frequencies fluctuate with the players' real-time positions, their proximity to one another driving the auditory shifts.
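The mappings themselves live in a Max patch that is not reproduced here; as a hedged sketch, the two behaviours described above, a crouch threshold triggering a track and inter-player distance shifting a frequency, could be expressed as follows. The threshold, frequency range, and linear mapping are all illustrative assumptions.

```python
import math

CROUCH_HEIGHT = 1.0  # metres; illustrative head-height threshold, not the patch's value

def crouch_triggered(head_height):
    """Fire a preset track when a player's head drops below the threshold."""
    return head_height < CROUCH_HEIGHT

def proximity_frequency(pos_a, pos_b, f_near=880.0, f_far=220.0, max_dist=4.0):
    """Map inter-player distance to an effect frequency: closer players
    push the sound higher (assumed linear mapping for simplicity)."""
    d = min(math.dist(pos_a, pos_b), max_dist)
    return f_near + (f_far - f_near) * (d / max_dist)

print(crouch_triggered(0.8))                       # True: player is crouching
print(proximity_frequency((0, 0, 0), (4, 0, 0)))   # 220.0 at maximum distance
```

In practice these values would be streamed into Max (e.g. over OSC) and used to drive the preset tracks and effect parameters.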




A live controller, positioned to observe and respond to player actions, manipulates the music in real time, crafting a seamless and fluid auditory landscape that evolves with the players' interactions. This relationship between the live controller and the participants creates a deeply engaging environment where sound becomes an extension of the players' physical presence, shifting with every movement.

Live controller manipulates the music based on the movements of the two players
Participants immerse themselves in the dynamic context

PROTOTYPE

Projection preliminary tests
Initial pedestals

CONFERENCE

Our work CrossReality was privileged to be presented at the 2024 International Conference on Live Coding, where it ran as one of the workshops from May 30 to June 1 at NYU Shanghai.