Conference
As part of the ATEM Biennial, the ATEM Conference will take place on November 30!
In a series of talks and discussions, international experts will share their insights into the field of the musical metaverse and their artistic practices within this medium.
The ATEM Conference will take place in the foyer at the KreativeInstitute.OWL in Detmold (Germany) – doors open at 10:30. The event will also be available as a stream on this website.
After the conference
From 17:00 onwards, participants of the ArtLabs will present their work in the HyperLab.
Sunday, 30.11.2025, 11:00 – 19:00 (CET)
11:00 – 12:30 (CET) – Musical Metaverse
| Time | Speaker and Talk |
|---|---|
| 11:00 | Matthias Nowakowski (Universität Paderborn/DE): *Actio ex vacuo: Forget your musical skills*. Presenting Stackbeat L.O.V.E. as a new approach to live coding in the metaverse, this talk examines how digital performance can begin from a deliberate state of artistic emptiness. It explores the opportunities and the necessity of a performative tabula rasa as a strategy for rethinking musical creation in virtual environments. By focusing on a mode of music-making that resists familiar patterns and disrupts established musical intuition, the presentation demonstrates how medium-appropriate musical expression can emerge. |
| 11:20 | Prof. Dr. Alberto Boem (University of Trento/IT): *From Point to Sphere to World: Spatial Organization in 20th-Century Music and Its Application to Virtual World Design*. I present ongoing research aimed at defining design guidelines for creating immersive musical worlds by examining how 20th- and 21st-century composers have employed space as a fundamental compositional element. Unlike spatial audio techniques or site-specific installations, the works examined here demonstrate approaches where spatial organization itself carries musical meaning and structural significance. Drawing on theoretical frameworks such as Kendall’s taxonomy of spatial music organization and Barrett’s five meanings of space in sound art, this research seeks to identify recurring strategies through which composers have transformed space from a neutral container into an active compositional parameter. These include: the dissolution of the frontal stage, distributed ensemble configurations, mobile sound sources, listener positioning and agency, temporal-spatial relationships, and the architectural reimagining of performance environments. By categorizing and analyzing these historical practices, the presentation extracts principles applicable to the design of navigable, interactive musical metaverses. The resulting guidelines address how spatial organization can structure user experience, create musical coherence in virtual environments, and enable new forms of participatory engagement with musical works. This research bridges musicological analysis with contemporary design practice, demonstrating how compositional thinking about space can inform the design of immersive digital worlds for music. |
| 11:40 | Prof. Dr. Michel Buffa (University Côte d’Azur/FR): *WAM Jam Party: A Web-Based Playground for Collaborative Immersive Music Creation*. The “musical metaverse” is no longer science fiction: it is here, running in your browser! This talk introduces WAM Jam Party, a collaborative platform where musicians, creators, and curious minds can step inside a virtual studio, assemble instruments and effects in 3D, and make music together in real time. Unlike traditional VR music apps locked to proprietary ecosystems, WAM Jam Party is built entirely on open web standards (WebXR, Web Audio, and WebAssembly), allowing anyone with a VR headset or laptop to join a shared immersive session instantly, with no installation required. Participants can browse a 3D “plugin shop,” grab virtual synths and audio effects, connect them in space, and perform or remix together using spatial audio and gesture-based interaction. Behind the scenes, a new generation of web audio plugins (WAMs) powers realistic instruments, advanced effects, and music note generators. A built-in 3D GUI editor lets creators design their own instruments visually, turning the metaverse into a creative playground for sonic experimentation and education. This presentation will explore how open web technologies are democratizing immersive music creation, highlight the design challenges of making 3D musical interfaces intuitive and expressive, and discuss what a truly collaborative musical metaverse might look and sound like in the near future. |
| 12:00 | Prof. Dr. Luca Turchet (University of Trento/IT): *The MUSMET project*. This talk will present the vision and the initial results of the project MUSMET, “Musical Metaverse made in Europe: an innovation lab for musicians and audiences of the future” (https://musmet.eu/), which has been funded by the European Innovation Council under the Pathfinder Open scheme. |
| 12:20 | Break |
| Chair | Prof. Dr.-Ing. Axel Berndt |
13:00 – 14:30 (CET) – Immersive and Networked Music
| Time | Speaker and Talk |
|---|---|
| 13:00 | Prof. Dr. Sarah Rose Weaver (New York University/US): *Conducting Across Realities*. This talk will discuss Weaver’s conducting work over the past 25 years in music and multidisciplinary projects in local, hybrid, and virtual settings. Topics will include strategies for conducting through video conference, data gloves, haptics, avatars, and more. |
| 13:20 | Dr. Matthew D. Gantt (Rensselaer Polytechnic Institute/US): *Sound Becomes Site: Music of the Metaverse*. This presentation will trace a line from the creative concerns of avant-garde composition and electronic music in the twentieth century to twenty-first-century practices with virtual environments, real-time simulation, and networked digital space. Connections between the early development of the Buchla synthesizer and contemporary experimental practices exploring sound in digital space will be drawn, highlighting novel conceptual approaches to organizing sonic material in emerging mediums. These connections will be illustrated through examples from my own creative practice using virtual environments, game engines, networked ‘metaverse’ platforms and similar, then contextualized through a survey of recent developments surrounding creative sound and music in virtual space. |
| 13:40 | Matthias Erdmann (University of Applied Sciences Düsseldorf/DE): *Development and evaluation of a mixed reality music visualization for a live performance*. Live concerts increasingly integrate multimedia, with a growing interest in Extended Reality to enhance the audience experience. While many studies focus on technical aspects, the perceptual and psychological effects of visual media in live music contexts remain underexplored. This presentation covers a study investigating the impact of a real-time Mixed Reality (MR) music visualization during a live performance. In a lab-based experiment, participants experienced the same music performance with and without the MR music visualization. Differences in aesthetic experience, musical absorption, social interaction, and mind wandering are examined, providing new insights into how MR shapes the live music experience. Furthermore, technical implementations and future directions are discussed. |
| 14:00 | Dr. Harald Muenz (Avatar Orchestra Metaverse/DE): *Second Life as an Artistic Medium: Musical and Sound Installative Practices in a Virtual World*. This presentation examines Second Life as one of the earliest and most enduring user-built metaverses. Since its inception in 2003, the platform has evolved into a dynamic environment for experimental, participatory, and intermedia art. This talk focuses on its comparatively underexplored potential as a medium for interactive musical and sound installation practices. Drawing on my experience of creating interactive installations and participating in the Avatar Orchestra Metaverse (AOM), the presentation explores how the concepts of composition, performance, and listening are reconfigured through embodied interaction, distributed agency, and collective creation. Second Life’s immersive yet socially and technically mediated nature reveals both expanded aesthetic possibilities and persisting structural limitations. |
| 14:20 | Break |
| Chair | Prof. Dr. Aristotelis Hadjakos |
14:30 – 15:30 (CET) – Metaverse Live Coding
| Time | Speaker and Talk |
|---|---|
| 14:30 | Prof. Malitzin Cortés (Centro/MX): *Live Coding Across Virtual Worlds: From Pandemic Metaverses to Live Immersive Coding*. This talk reflects on the evolution of shared live coding practices across physical, virtual, and hybrid performance contexts. Starting from audiovisual programming prior to the pandemic, we seek to share and analyze the shift toward metaversal environments, understood as three-dimensional or two-dimensional spaces that emulate a place, as well as their expansions into mixed realities. The internet, understood as a “space” or “metaspace,” is the guiding thread for this reflection, in which live coders have woven their communities, tools, and, of course, performative spaces, such as PatchXR concerts for MUTEK and experiments in Mozilla Hubs, where live coding expanded toward metaversal and global performance. The talk addresses the emergence of immersive live coding, including recent works connecting TidalCycles with Unreal Engine, Wwise, and other game development tools. Through several pieces and 3D virtual environments, we will discuss how these spaces became programmable performative environments. |
| 14:50 | Flor de Fuego (AR/DE): *Live Coding Virtual Scenography and Staging with A-Frame*. This talk explores various projects created with A-Frame, a web framework for XR, in this case for performances and installations, and examines the possibilities for live coding in 3D worlds. The focus will be primarily on the visual and interactive aspects of these environments. The talk offers reflections on both the technical and conceptual aspects of this practice, addressing key questions: How can we control 3D spaces in real time? What does this experience bring to a live performance? What are its limitations? And why is there a need to expand performance into the virtual world? |
| 15:10 | Break |
| Chair | Dr. Matthew D. Gantt |
15:30 – 16:30 (CET) – Panel Discussion
with Matthias Erdmann, Flor de Fuego, Dr. Matthew D. Gantt & Matthias Nowakowski
Moderation: Damian T. Dziwis
17:00 – 19:00 (CET) – Live Coding Performances
CLOSING EVENT
Participants from ArtLabs in Detmold and Düsseldorf showcase the projects they developed during the workshops.
