In this paper, we report on efforts to expand an existing framework for sensor-enabled networked live coding of sound into an integrated environment that includes 2D and 3D animated graphics. Additionally, we describe extensions of this framework that interface with a camera-based motion capture system built on a Raspberry Pi equipped with a high-performance extension for local AI processing. For animated graphics, we use the open-source Godot game engine, which includes its own scripting language, GDScript. As a starting point, we introduce an interface for data exchange between SuperCollider and Godot/GDScript based on the Open Sound Control (OSC) protocol. Furthermore, we discuss the structure and scripting mechanism of GDScript and give examples of its operation when building interactive scenes. We examine the live coding capabilities of GDScript, especially with regard to on-the-fly modification of animation environments. We also give examples of animating figures from OSC data using the above-mentioned camera-based motion capture system. Finally, we evaluate the capabilities, requirements, and limitations of such a system and outline further work toward tighter integration between sound and graphics through code.
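To give a concrete sense of the OSC-based data exchange described above, the following sketch encodes a minimal OSC message by hand, per the OSC 1.0 binary format (null-terminated, 4-byte-padded address and type-tag strings, followed by big-endian float32 arguments). It is written in Python for self-containment rather than in SuperCollider or GDScript; the address `/position` and its single float argument are hypothetical examples, not part of the interface described in the paper.

```python
import struct

def _osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (big-endian)."""
    msg = _osc_pad(address.encode("ascii"))          # address pattern
    msg += _osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for x in floats:
        msg += struct.pack(">f", x)                   # float32 arguments
    return msg

# A packet like this, sent over UDP (e.g. via socket.sendto), is what a
# SuperCollider NetAddr.sendMsg call produces and what an OSC server in
# Godot would decode on the receiving side.
packet = osc_message("/position", 1.0)
```

In practice one would use an existing OSC library on both ends; the sketch only shows the wire format that the SuperCollider-to-Godot interface relies on.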





