6th International Conference

Digital Culture & AudioVisual Challenges

Interdisciplinary Creativity in Arts and Technology

Hybrid - Corfu/Online, May 24-25, 2024

Investigating emerging forms of narratives at the intersection of Art, EEG technology and eXtended Reality. The “Live Art Brainstorm” project
Date and Time: 24/05/2024 (09:00-10:00)
Location: Ionian Academy
Konstantina Vetsiou, Manthos Santorineos, Stavroula Zoi

Although related research dates back to the 1960s, recent years have seen tremendous activity focused on exploring and leveraging different aspects of the relationship between brain activity and visual narratives. One of the most discussed approaches, producing images from text prompts, is based on simulating brain function through deep neural networks, thus enabling an interaction with the machine that is mediated by language. In contrast, the technology of Electroencephalography (EEG) establishes immediate, real-time channels of communication between the brain activity of the viewer and the digital narrative content, enabling both the manipulation of existing image flows and the creation of new ones.

This paper discusses the utilization of EEG technology for exploring new forms of narratives, leveraging the direct connection between the subject's brain signals and the generated image flows. We analyze the issues arising from the creator's physical narration, from the EEG-driven narration, and from the perspectives that they jointly create. Addressing these issues has led to the development of a cohesive methodology at the intersection of Art, EEG technology and the contemporary conceptual framework of eXtended Reality. This methodology unfolds along the following axes:

- Study of EEG-based Brain–Computer Interface (BCI) systems, focusing on their specific characteristics and their potential to engender artistic experiences. This study encompasses both recent scholarly literature and notable case studies from the history of contemporary Art. Emphasis is placed on EEG-based BCI systems that are now accessible to the artistic community (e.g., low-cost interfaces from the gaming industry, or DIY solutions such as those from OpenBCI). These systems are evaluated comparatively with regard to their aptitude for facilitating experimental endeavors and their suitability for real-time artistic presentation to a diverse audience.

- Identification of and differentiation between findings targeting scientific inquiry (e.g., therapeutic applications) and those constituting elements of artistic creation. The distinctions as well as the convergences between them are analyzed.

- Specialized processes from the perspective of the creators, such as the algorithmic forms and conventions constituting the narrative structure of the artwork (referred to as the poly-scenario); an illustrative sketch of such a structure follows this list.

- Design and implementation of communication mechanisms with the visitor – viewer. Different modalities of interaction, levels of immersion, and seamless transitions between the physical and the digital space are incorporated, based on the conceptual framework of eXtended Reality.
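
To make the notion of a poly-scenario more concrete, the following Python sketch shows one possible way to model such a branching narrative structure: a graph of scenes whose transitions are gated by a normalized attention value. The scene names, thresholds, and selection rule are illustrative assumptions, not the project's actual implementation.

```python
# Illustrative sketch (not the authors' implementation) of a "poly-scenario":
# a branching narrative graph whose transitions are gated by a normalized
# attention value in [0, 1]. Scene names and thresholds are hypothetical.

from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Scene:
    name: str
    # (minimum attention required, name of the next scene)
    branches: list[tuple[float, str]] = field(default_factory=list)

    def next_scene(self, attention: float) -> str:
        """Pick the branch with the highest threshold the viewer clears."""
        eligible = [b for b in self.branches if attention >= b[0]]
        if not eligible:
            return self.name  # no branch reachable: stay in the current scene
        return max(eligible, key=lambda b: b[0])[1]

# A toy poly-scenario: basic colors -> AI-generated faces -> virtual landscape.
scenario = {
    "colors":    Scene("colors",    [(0.0, "colors"), (0.6, "ai_faces")]),
    "ai_faces":  Scene("ai_faces",  [(0.0, "colors"), (0.5, "landscape")]),
    "landscape": Scene("landscape", [(0.4, "landscape"), (0.0, "colors")]),
}

current = "colors"
for attention in (0.2, 0.7, 0.65, 0.3):   # simulated attention readings
    current = scenario[current].next_scene(attention)
    print(attention, "->", current)
```

In this toy run the narrative advances only when the simulated attention clears a scene's threshold, and otherwise loops or falls back to the opening imagery, which is the kind of convention the poly-scenario is meant to encode.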

In the context of the above methodology, a set of experiments was conducted under laboratory conditions, recording brain responses to RGB color stimuli. The results of a case study with a sample of 15 participants were analyzed, and observations were drawn on how such an experience could be transferred to a real-time event.
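
As a rough illustration of the kind of offline analysis such a color-stimulus study might involve, the Python sketch below epochs a single EEG channel around stimulus onsets and compares alpha-band power across the three color conditions. The sampling rate, epoch length, frequency band, and function names are assumptions made for the example, not the protocol actually used in the study.

```python
# Hedged sketch of an offline analysis for color-stimulus EEG data: epoch the
# signal around each stimulus onset and compare alpha-band (8-12 Hz) power
# across RGB conditions. All parameters below are illustrative assumptions.

import numpy as np

FS = 250              # sampling rate in Hz (assumed)
EPOCH_SEC = 2.0       # seconds of EEG kept after each stimulus onset

def band_power(epoch: np.ndarray, fs: int, low: float, high: float) -> float:
    """Mean spectral power of a 1-D epoch within [low, high] Hz (periodogram)."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / epoch.size
    mask = (freqs >= low) & (freqs <= high)
    return float(psd[mask].mean())

def color_band_powers(eeg: np.ndarray, onsets_sec: dict) -> dict:
    """Average alpha power per color, given one EEG channel and onset times (s)."""
    n = int(EPOCH_SEC * FS)
    result = {}
    for color, onsets in onsets_sec.items():
        powers = []
        for t in onsets:
            start = int(t * FS)
            epoch = eeg[start:start + n]
            if epoch.size == n:
                powers.append(band_power(epoch, FS, 8.0, 12.0))
        result[color] = float(np.mean(powers)) if powers else float("nan")
    return result

# Example with synthetic data: 60 s of noise and onset times for each color.
rng = np.random.default_rng(0)
eeg = rng.standard_normal(60 * FS)
onsets = {"red": [2.0, 12.0], "green": [22.0, 32.0], "blue": [42.0, 52.0]}
print(color_band_powers(eeg, onsets))
```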

Based on the above experimental results, we analyze the case of the Live Art Brainstorm project, which was designed and developed to deliver a real-time event experience.

The participant enters a multisensory hybrid environment of a cinema hall together with other visitors. Wearing a widely available EEG headset, they are immersed in a virtual world of narrative imagery and stimuli, which they are called to influence solely through their attention, as measured in real time by the device. The projected images range from basic colors that evoke specific stimuli to images that contemporary individuals now routinely encounter, generated using advanced technological methods (e.g., non-existent faces produced through artificial intelligence, or Metaverse-like virtual landscapes that the participant can navigate from different perspectives). The results are projected onto the screen of the cinema hall, while the stimuli unfold in the surrounding physical space through special lighting, sound, and the presence and participation of the other visitors, feeding back into the mental state of the interacting subject.
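
A minimal sketch of such a real-time mapping layer might look like the following, assuming the headset exposes a normalized attention score (many consumer devices report a 0-100 value): the raw score is smoothed with an exponential moving average and converted into a crossfade weight between two image streams. read_attention() is a placeholder for whatever SDK, OSC, or LSL stream the actual device provides, not a real API.

```python
# Hedged sketch of an attention-to-visuals mapping loop. The headset interface
# is a placeholder; the smoothing constant and update rate are assumptions.

import random
import time

def read_attention() -> float:
    """Placeholder for the headset SDK; returns a raw attention score 0-100."""
    return random.uniform(0.0, 100.0)

def attention_loop(alpha: float = 0.2, steps: int = 10) -> None:
    smoothed = 50.0
    for _ in range(steps):
        raw = read_attention()
        smoothed = alpha * raw + (1.0 - alpha) * smoothed   # EMA smoothing
        crossfade = smoothed / 100.0                        # 0 = scene A, 1 = scene B
        # In the installation this weight would drive the projection engine;
        # here we simply print it.
        print(f"raw={raw:5.1f}  smoothed={smoothed:5.1f}  crossfade={crossfade:.2f}")
        time.sleep(0.1)

if __name__ == "__main__":
    attention_loop()
```

Smoothing of this kind is one way to keep the projected imagery from flickering with every momentary fluctuation of the measured attention, though the actual installation may handle this differently.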

The above presentation of the Live Art Brainstorm project under real-time conditions took place as part of the event Μέσα στην Εικόνα – Dans l'image – Inside the Image (Reenacting cinema through extended media), held at the old Alphaville cinema in Athens, 7-16 December 2023. It was organized by the Fournos Multivalent Communication Network in collaboration with the École Nationale Supérieure des Arts Décoratifs (Paris).

The proposed approach aims to establish an intricately interconnected ecosystem of technology, physiology, and artistic methodology. In this way, it could become valuable not only for producing functional and compelling outcomes, but also as a tool for interdisciplinary research in the cognitive sciences and in contemporary fields between art and psychology (e.g., empirical aesthetics).

