Interfaces for real-time interaction and control of spatialized sound mixing – LAPSO



John Sullivan


To bring greater fluency to the control of sound spatialization parameters, and thereby enable genuinely physical and artistic interaction, this research aims to develop and/or adapt three types of devices:

The first two, non-conscious in character, will be worn by the performers on stage. The first device will contain sensors that record the performer's trajectory on stage, imprinting its spatial characteristics onto the sound scene: azimuth, elevation and distance. The second will register the directionality of the sound source as linked to head movement: yaw and pitch.
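The mapping described above, from a tracked stage position to the azimuth, elevation and distance of a sound source relative to the listener, can be sketched as a Cartesian-to-spherical conversion. The function below is a minimal illustration, not the project's actual implementation; the function name and the axis convention (x = right, y = front, z = up, listener at the origin, angles in degrees) are assumptions for the example.

```python
import math

def position_to_aed(x, y, z, listener=(0.0, 0.0, 0.0)):
    """Convert a tracked stage position (metres) into azimuth (deg),
    elevation (deg) and distance (m) relative to a listener position.

    Assumed axis convention: x = right, y = front, z = up.
    Azimuth is 0 degrees straight ahead, positive to the right;
    elevation is 0 degrees at ear level, positive upward.
    """
    dx = x - listener[0]
    dy = y - listener[1]
    dz = z - listener[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    azimuth = math.degrees(math.atan2(dx, dy))
    horizontal = math.hypot(dx, dy)          # projection onto the stage floor
    elevation = math.degrees(math.atan2(dz, horizontal))
    return azimuth, elevation, distance
```

For example, a performer one metre to the right and one metre in front of the listener, at ear height, yields an azimuth of 45 degrees, an elevation of 0 degrees, and a distance of about 1.41 m.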

The third device, whose interaction is conscious in character, aims to control these same parameters not through the actors on stage but through the musician-mixer. This musician will thus be able to interact live with other sound sources, such as the multi-channel instrumental track, offstage voices, and other sound interventions.

This need arose, among other works and performances, in the theatrical production Lapso. Recorded in 2021, this play with two actors on stage explores the poetic and aesthetic possibilities offered by binaural audio. In this project, sound was approached as a narrative instrument, adding a layer normally ignored in this type of performance. In Lapso, sound is present almost as a third character, directly interfering in the construction of the scenes.

IDMIL Participants:

Research Areas: