This paper introduces an integrated computational framework that establishes a novel methodology for the evolution of performance art. Our central thesis is that fundamentally new technologies demand fundamentally new forms of artistic expression, rather than mere adaptations of existing ones. To this end, we present a real-time bio-responsive system that couples electroencephalography (EEG) data and generative AI in a dynamic relationship with the performer. Diverging from standard data-visualization approaches, our methodology uses language as the primary structural interface, drawing on Jacques Lacan's axiom that "the unconscious is structured like a language." We detail the system's technical architecture, analyze the language-centric methodology, and present preliminary results from an experimental implementation based on a low-cost EEG headset, StreamDiffusion, and TouchDesigner. Our work aims to define a new ontology of human-machine co-creativity in the context of contemporary performance.
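As a minimal sketch of the language-centric mapping the abstract describes, the fragment below translates relative EEG band powers into a text prompt for a generative image model. The function name, mood vocabulary, and thresholds are our own illustration, not the system's actual code:

```python
# Hypothetical sketch: EEG band powers -> text prompt, illustrating
# "language as the structural interface" between performer and model.
# The vocabulary and threshold below are illustrative assumptions.

def band_powers_to_prompt(powers: dict) -> str:
    """Translate relative EEG band powers into a descriptive prompt.

    `powers` maps band names to relative power (values summing to ~1.0);
    the dominant band selects a mood word, its strength an intensity word.
    """
    moods = {
        "delta": "dreamlike",
        "theta": "meditative",
        "alpha": "calm",
        "beta": "focused",
        "gamma": "electric",
    }
    band, value = max(powers.items(), key=lambda kv: kv[1])
    intensity = "subtle" if value < 0.4 else "vivid"
    return f"{intensity} {moods[band]} abstract landscape"

# Example: a performer in a relaxed, alpha-dominant state.
prompt = band_powers_to_prompt({"alpha": 0.5, "beta": 0.3, "theta": 0.2})
# The resulting string would then be fed to a streaming diffusion pipeline.
```

In an actual implementation, the prompt string would be handed to a real-time pipeline such as StreamDiffusion and rendered inside TouchDesigner; here it simply demonstrates the signal-to-language step.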





