3rd International Conference

Digital Culture & AudioVisual Challenges

Interdisciplinary Creativity in Arts and Technology

Online, May 28-29, 2021

Merging symbolic, physical and virtual spaces: Augmented reality for Iannis Xenakis’ Evryali for piano solo
Date and Time: 28/05/2021 (19:30-21:00)
Pavlos Antoniadis, Aurélien Duval, Jean-François Jégo, Makis Solomos, Frédéric Bevilacqua

This paper presents interactive systems for the visualisation and optimisation of extreme score-based piano performance. The systems are founded on an ecological theory of embodied interaction with complex piano notation, termed embodied navigation (Antoniadis, 2018a; Antoniadis and Chemero, 2020). The theory has materialised in a modular, sensor-based environment for the analysis, processing and real-time control of notation through multimodal recordings, called GesTCom (Antoniadis, 2018b; Antoniadis and Bevilacqua, 2016). The motion capture modelling is based on a one-shot-learning Hidden Markov Model developed at IRCAM, called Gesture Follower (Bevilacqua et al., 2010). At a later stage, mixed reality applications were developed on the basis of existing visualisation methodologies for motion capture (Jégo, Meyrueis and Boutet, 2019), with the aim of creating a virtual concert environment.

Drawing on music performance analysis, embodied cognition, movement modelling and augmented reality, we consider the concert experience as the embodied navigation of performers and listeners in a hybrid environment. This environment capitalises on the isomorphisms and decouplings of physical, virtual and symbolic spaces, which merge in static and dynamic relationships: the performer’s gesture shapes the music notation, the notation becomes an integral part of the concert space, a virtual avatar of the performer allows the audience to experience multimodal aspects of the performance which usually remain private, and so on.

The main focus of this presentation is a recent performance of Iannis Xenakis’ solo piano work Evryali employing live motion capture and augmented reality [1]. This particular work problematises usual notions of virtuosity and performability, carries extra-musical references and is encoded in a unique graphic design. These features justify the task’s characterisation as extreme and demand a rethinking of technology-enhanced performance that combines sensorimotor learning, symbolic interpretation and multimodal feedback in novel ways.
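The Gesture Follower referenced above is an IRCAM development (Bevilacqua et al., 2010); the sketch below is not that implementation, only a minimal Python illustration of the underlying idea of one-shot, template-based gesture following: one HMM state per frame of a single recorded template gesture, a left-to-right transition structure, Gaussian observation likelihoods, and an online forward pass whose posterior estimates how far a live gesture has progressed through the template. All class names, parameters and data are illustrative assumptions.

```python
import numpy as np

class GestureFollower:
    """Minimal one-shot, template-based gesture follower (conceptual sketch).

    One HMM state per frame of a single recorded template; left-to-right
    transitions (stay / advance / skip) and Gaussian observation likelihoods.
    An online forward pass yields a posterior over states, whose weighted
    mean index estimates the live gesture's progress through the template.
    """

    def __init__(self, template, sigma=0.2, trans=(0.1, 0.8, 0.1)):
        self.template = np.asarray(template, dtype=float)  # (n_frames, n_dims)
        self.n = len(self.template)
        self.sigma = sigma                                  # observation noise scale
        self.p_stay, self.p_next, self.p_skip = trans       # transition probabilities
        self.alpha = np.zeros(self.n)                       # forward probabilities
        self.alpha[0] = 1.0                                 # assume start at frame 0

    def step(self, obs):
        """Process one live observation; return (progress in [0, 1], likelihood)."""
        obs = np.asarray(obs, dtype=float)
        # Transition: propagate forward probabilities along the left-to-right chain.
        new_alpha = self.p_stay * self.alpha
        new_alpha[1:] += self.p_next * self.alpha[:-1]
        new_alpha[2:] += self.p_skip * self.alpha[:-2]
        # Observation: Gaussian likelihood of obs around each template frame.
        dist2 = np.sum((self.template - obs) ** 2, axis=1)
        new_alpha *= np.exp(-0.5 * dist2 / self.sigma ** 2)
        likelihood = float(new_alpha.sum())
        self.alpha = new_alpha / likelihood if likelihood > 0 else new_alpha
        # Progress estimate: posterior-weighted mean of frame indices, normalised.
        progress = float(np.dot(self.alpha, np.arange(self.n))) / max(self.n - 1, 1)
        return progress, likelihood


# Usage: learn from one recorded template, then follow a live stream frame by frame.
template = np.cumsum(np.random.randn(100, 3) * 0.05, axis=0)  # stand-in for mocap data
follower = GestureFollower(template)
for frame in template + np.random.randn(100, 3) * 0.02:       # noisy "live" replay
    progress, lik = follower.step(frame)                      # progress drives visuals
```

Following Bevilacqua et al. (2010), recognition across several learned gestures would run one such follower per template and compare their running likelihoods; that extension, and the real-time engineering of the actual system, are omitted from this sketch.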

[1] Preliminary documentation of this project may be found at the following link: https://youtu.be/D-vhOX88NfM

References:
Antoniadis, Pavlos (2018a). Embodied Navigation of Complex Piano Notation: Rethinking Musical Interaction From A Performer’s Perspective. PhD thesis. Strasbourg: Université de Strasbourg – IRCAM, 2018. http://theses.unistra.fr/ori-oai-search/notice/view/2018STRAC007, accessed 14.04.2021


Antoniadis, Pavlos and Chemero, Anthony (2020). “Playing without mental representations: embodied navigation and the GesTCom as a case study for radical embodied cognition in piano performance”. Journal of Interdisciplinary Music Studies, special issue “Embodiment in Music” following the CIM19 conference in Graz, Austria (eds. Andrea Schiavio and Nikki Moran), 2020, vol. 10, art. #20101207, pp. 126-174. http://musicstudies.org/wp-content/uploads/2021/01/10Antoniadis_Chemero.pdf, accessed 14.04.2021

Antoniadis, Pavlos (2018b). “GesTCom: A sensor-based environment for the analysis, processing and real-time control of complex piano notation through multimodal recordings”. Invited talk at Séminaires Recherche et Technologie, IRCAM, 15.10.2018. https://medias.ircam.fr/x2253e1, accessed 14.04.2021

Antoniadis, Pavlos and Bevilacqua, Frédéric (2016). “Processing of symbolic music notation via multimodal performance data: Ferneyhough’s Lemma-Icon-Epigram for solo piano, phase 1”. Proceedings of the TENOR 2016 Conference, pp. 127-136. Cambridge: Anglia Ruskin University, 2016. http://tenor2016.tenor-conference.org/TENOR2016-Proceedings.pdf, accessed 14.04.2021

Bevilacqua, F., Zamborlin, B., Sypniewski, A., Schnell, N., Guedy, F., and Rasamimanana, N. (2010). “Continuous realtime gesture following and recognition”. In Lecture Notes in Computer Science, Gesture Workshop, pp. 73-84. Springer.

Jégo, Jean-François, Meyrueis, Vincent, and Boutet, Dominique (2019). “A Workflow for Real-time Visualization and Data Analysis of Gesture using Motion Capture”. In: Proceedings of the 6th International Conference on Movement and Computing (MOCO ’19). ACM.

Pavlos Antoniadis

Dr Pavlos Antoniadis (PhD, University of Strasbourg – IRCAM; MA, University of California, San Diego; MA, University of Athens) is a pianist, musicologist and technologist from Korydallos, Athens, Greece, currently based in France. He performs complex contemporary and experimental music, studies embodied cognition and develops tools for technology-enhanced learning and performance. He is currently a researcher at EUR ArTeC, Université Paris 8, and will continue his research at the Berlin Institute of Technology (TU Berlin, Audiokommunikation) as a Humboldt Stiftung scholar. He collaborates with the Interaction-Son-Musique-Mouvement team at IRCAM and is a member of the Laboratory of Excellence GREAM, Université de Strasbourg, where he also taught seminars on computer music and contemporary performance practice.


