What is the future of music performance? What role does visibility play in experiencing music? The role of “liveness,” the physical presence and appeal of watching music being created, has been called into question since long before the pandemic and the increasing relocation of events to the digital space.
Music produced on the computer has broken the centuries-old codes of the concert. Laptop performance established a format in which the audience stares at people staring at screens. For more than 20 years, the joke has circulated that you can’t tell whether electronic musicians are simply checking their email. Computer software as a musical instrument lacks materiality; in the concert experience there is no visible connection between cause (gesture) and effect (sound). These difficult prerequisites for the performance of computer music have prompted a number of new approaches: from the projected screens of algoraves to sensor-based musical instruments and virtual reality shows, visual stimuli are being reintroduced into electronic music performance on various levels, while others have abandoned the format of the stage-based spectacle altogether.
Rebecca Fiebrink (Creative Computing Institute, University of the Arts London / Computing, Goldsmiths, University of London) researches and develops machine learning technology for the invention of digital, gesture-controlled musical instruments. Lars TCF Holdhus (musician, artist) has used AI to develop performance software intended to replace him on tour in the future. Olivia Jack (artist, coder) is active as a programmer and artist in the algorave scene and explores the possibilities of live collaboration over the Internet. Isabel Lewis (artist, DJ, dancer, philosopher) experiments with event formats in which sound and smell displace the primacy of the visual and the premises of the music performance are dissolved entirely. Moderated by Peter Kirn (musician, producer, DJ, journalist).