

In this paper we report on efforts to expand an existing framework for sensor-enabled networked live coding of sound into an integrated environment that includes 2D and 3D animated graphics. Additionally, we describe extensions of this framework to interface with a camera-based motion capture system built on a Raspberry Pi equipped with a high-performance extension for local AI capabilities. For animated graphics, we use the Godot open-source game engine, which includes its own scripting language, GDScript. As a starting point, we introduce an interface for data exchange between SuperCollider and Godot/GDScript based on the Open Sound Control (OSC) protocol. Furthermore, we discuss the structure and scripting mechanism of GDScript and give examples of its operation when building interactive scenes. We discuss the live coding capabilities of GDScript, especially regarding on-the-fly modification of animation environments. We also give examples of animating figures from OSC data, using the above-mentioned camera-based motion capture system. We evaluate the capabilities, requirements and limitations of such a system and outline further work towards tighter integration between sound and graphics through code.
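To illustrate the kind of data exchange the abstract describes, the following is a minimal sketch of how an OSC message carrying motion-capture data could be encoded and sent over UDP. It is written in Python for self-containment; the OSC address `/mocap/joint`, the port `9000`, and the coordinate values are hypothetical, not taken from the paper's actual implementation.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-pad to a multiple of 4 bytes, as the OSC spec requires for strings."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_pad(address.encode())                    # address pattern, null-terminated, 4-byte aligned
    msg += osc_pad(("," + "f" * len(args)).encode())   # type tag string, e.g. ",ff" for two floats
    for a in args:
        msg += struct.pack(">f", a)                    # arguments as big-endian float32
    return msg

# Send hypothetical joint coordinates to an OSC listener (e.g. a Godot scene) on port 9000
packet = osc_message("/mocap/joint", 0.5, 1.25)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
sock.close()
```

On the SuperCollider side, the equivalent would be a one-liner such as `NetAddr("127.0.0.1", 9000).sendMsg("/mocap/joint", 0.5, 1.25)`; the receiving GDScript node would parse the same byte layout from a UDP socket.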
Iannis Zannos has a background in music composition, ethnomusicology and interactive performance. He has worked as Director of the Music Technology and Documentation section at the State Institute for Music Research (S.I.M.) in Berlin, Germany, and as Research Director at the Center for Research in Electronic Art Technology (CREATE) at the University of California, Santa Barbara. He teaches audio and interactive media arts at the Department of Audiovisual Arts of the Ionian University, Corfu. His publications include "Ichos und Makam" (1994) and "Music and Signs". Artistic collaborations include programming of interactive sound with Martin Carlé (2000) for Eric Sleichim / Bl!ndman Quartet and for Ulrike and David Gabriel; Cosmos-X, a multimedia installation with multiple audio and video projections based on the work of Iannis Xenakis, with Efi Xirou (2005-2006); and real-time sound programming with Jean-Pierre Hébert for the installation series on "Sand" (2004-2005). Recently he has performed regularly using live coding techniques with SuperCollider at national and international conferences and festivals. Further projects include collaborations with Jean-Pierre Hébert (2007-2011) and Iannis Chrysides (2014), and presentations at the Athens Biennale (2016). Since 2018 he has focused on telematic dance performance, with a series of works between Greece, Japan and other countries, and collaborations with composers Haruka Hirayama and Takumi Ikeda and dancers Jun Takahashi, Asayo Hisai, Mary Rantou, Justine Goussot and others.