Extended Reality and the Transformation of Live Performance: Where Are We Now?


By Francesco Mariotti, Toscana Produzione Musica – NEXT STAGE Pilot Leader

 

Rethinking “Liveness” in the Age of Immersion

The transformation of live performance by Extended Reality (XR) is now an established fact. The more pertinent inquiry concerns the extent to which XR is changing conceptions of presence, embodiment, and collective experience.
Over the past two decades, performance scholars have examined the status of “liveness” within technologically mediated culture. Philip Auslander (2008) contends that live performance has consistently been defined in relation to recording and broadcast technologies. XR introduces not only mediation but also immersion, repositioning the spectator within constructed spatial environments.
Erika Fischer-Lichte (2008) describes performance as an autopoietic feedback loop between performers and spectators, characterized by an active exchange of perception and response that produces the uniqueness of each event. Immersive systems complicate this feedback loop. When audiences occupy virtual vantage points or experience spatialized sound fields, the stage transforms from a fixed front-facing structure into a distributed perceptual field.
This transformation prompts a critical question: can immersion sustain relational intensity, or does it risk fragmenting collective attention?

From Spectacle to Integration

Large-scale virtual concerts, such as Travis Scott’s “Astronomical” in Fortnite, demonstrated unprecedented reach, with millions attending simultaneously. These events confirmed the scalability of digital environments but frequently prioritized showmanship over reciprocity. In contrast, productions like ABBA Voyage sought to preserve ritual co-presence by merging physical musicianship with digital avatars. Recent mixed-reality works, such as Kagami, reinterpret the music and presence of Ryuichi Sakamoto by employing volumetric capture and immersive audio to create contemplative spaces that emphasize intimacy rather than spectacle.
These divergent approaches indicate a key distinction in the current state of the art. XR can either amplify performance into hyper-mediated spectacle or serve as an extension of artistic language. The distinction depends more on dramaturgical logic than on technological hardware.

Research on immersive environments suggests that presence is not purely a visual effect but a cognitive and emotional condition (Slater & Sanchez-Vives, 2016). High-resolution graphics do not automatically produce effective participation. Perceptual coherence, responsiveness, and shared temporality remain decisive factors.

Time, Latency and the Equilibrium of Musical Dialogue

While immersion transforms spatial experience, latency alters temporal dynamics.
In musical performance, even minimal delay can destabilize ensemble cohesion. Studies on networked music performance demonstrate that delays exceeding approximately 30 milliseconds substantially disrupt rhythmic synchronization and expressive nuance (Rottondi et al., 2020). Timing accuracy is therefore a relational necessity rather than a technical luxury.
The development of systems such as LoLa (Low Latency Audio/Video Streaming System) at the Conservatorio Tartini in Trieste illustrates this principle. By lowering latency below perceptual thresholds, LoLa enables geographically distant musicians to rehearse and perform together while maintaining rhythmic and interpretative coherence. In this context, the main challenge is synchronization rather than visual immersion.
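A back-of-envelope calculation makes clear why geographic distance constrains such systems. As an illustrative sketch (not drawn from the cited studies), the snippet below estimates one-way propagation delay over optical fibre, assuming an effective signal speed of roughly 200,000 km/s (about two-thirds the speed of light in a vacuum) and ignoring routing, buffering, and codec overhead, all of which add further delay in practice. The city pairs and distances are hypothetical examples.

```python
# Illustrative physics-only estimate of network latency for distributed
# performance. Real-world delays are higher once routing, buffering, and
# audio processing are included.

FIBRE_SPEED_KM_PER_S = 200_000  # assumed effective signal speed in fibre
ENSEMBLE_THRESHOLD_MS = 30      # approximate disruption threshold (Rottondi et al., 2020)

def propagation_delay_ms(distance_km: float) -> float:
    """One-way delay in milliseconds along an idealized straight fibre path."""
    return distance_km / FIBRE_SPEED_KM_PER_S * 1000

for city_pair, km in [("Trieste-Milan", 330), ("Trieste-New York", 6900)]:
    delay = propagation_delay_ms(km)
    verdict = "within" if delay < ENSEMBLE_THRESHOLD_MS else "exceeds"
    print(f"{city_pair}: ~{delay:.1f} ms one-way, {verdict} the ~{ENSEMBLE_THRESHOLD_MS} ms budget")
```

Even under these idealized assumptions, intercontinental distances alone consume the entire perceptual budget, which is why low-latency systems such as LoLa are most effective at regional and continental scales.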
This insight reframes the debate around XR live performance. Without shared time, immersion risks becoming an illusion rather than interaction. The sustainability of distributed liveness depends on temporal integrity as much as spatial depth.

Immersion, Attention and Collective Experience

Another unresolved dimension pertains to the management of audience attention.

Traditional concert settings organize spectatorship around a shared focal point. XR environments decentralize this structure. Spectators may navigate perspectives, explore sonic layers, or occupy personalized vantage points.

Christopher Small’s concept of musicking (1998) reminds us that performance is a relational activity involving all participants. When perception becomes individually navigable, the question emerges: Does personalized immersion deepen engagement, or does it weaken collective synchronization?

Recent research on mixed-reality interfaces indicates that immersion enhances experience only when usability and perceptual clarity are carefully balanced (Boem, Tomasetti & Turchet, 2025). Excessively complex environments may generate mental overload rather than facilitate deeper listening. Current evidence suggests that artistic restraint, rather than maximal technological density, often produces the most meaningful immersive outcomes.

A Plural Landscape in Formation

XR live performance today does not follow a single trajectory. Instead, it unfolds across multiple parallel developments: augmented physical concerts, hybrid avatar-based productions, fully virtual large-scale events, and low-latency distributed performances. These formats coexist within an evolving ecosystem rather than replacing one another.
As Steve Dixon (2007) observed in his analysis of digital performance, new media do not eliminate embodiment but instead reconfigure it. XR perpetuates this process of reconfiguration. The live event is not vanishing; rather, it is being redistributed across layered spatial and temporal conditions.
The need for shared time, perceptual coherence, and relational exchange remains constant. In the absence of these elements, immersion risks devolving into isolation.

Conclusion: AMPLIFY as a Space for Critical Experimentation

Within this transforming landscape, initiatives such as AMPLIFY function less as showcases of technology and more as observatories of transformation. By supporting experimentation across XR, spatial audio, and distributed performance, AMPLIFY fosters broader reflection on how live art can evolve without losing its relational core.
Rather than proposing definitive models, such projects articulate the open questions that define the current moment: how to sustain shared time across distance, how to balance immersion with attention, and how to integrate technological possibility with artistic intention.
Current developments reveal that the future of live performance will remain plural, hybrid, and negotiated. XR does not resolve the question of liveness; instead, it reopens it, inviting artists, audiences, and researchers to collaboratively rethink what it means to be present.

 

Selected References

Auslander, P. (2008). Liveness: Performance in a Mediatized Culture (2nd ed.). Routledge.
Boem, A., Tomasetti, M., & Turchet, L. (2025). Between Immersion and Usability: Mixed Reality vs 2D Interfaces for Remote Music Making. International Journal of Human-Computer Studies.
Dixon, S. (2007). Digital Performance. MIT Press.
Fischer-Lichte, E. (2008). The Transformative Power of Performance. Routledge.
Rottondi, C. et al. (2020). An Overview on Networked Music Performance Technologies. IEEE Access.
Slater, M., & Sanchez-Vives, M. V. (2016). Enhancing Our Lives with Immersive Virtual Reality. Frontiers in Robotics and AI.
Small, C. (1998). Musicking. Wesleyan University Press.
