The application of Virtual Reality (VR) in theater has advanced considerably in recent years, enabling new interaction paradigms and experiences for actors, directors, and audiences. Despite these technological improvements, when actors appear as 3D virtual avatars, part of the liveness of the theatrical performance is lost, making it harder for the audience to feel immersed and present in the same imaginary space as the play. In this study we explore an innovative setting that establishes a hybrid virtual/physical stage, enabling simultaneous interaction between actors on different physical stages, live direction, and audience participation.
In December 2023, the play "Brave New World," based on Aldous Huxley's novel, opened simultaneously in four theaters worldwide. The play took place in both a virtual and a physical space, across four physical hubs: the Juliusz Osterwa Theater in Gorzów Wielkopolski, Poland; the Royal District Theater in Tbilisi, Georgia; the Lithographeion Theater in Patras, Greece; and the Regional Academic Theater in Ivano-Frankivsk, Ukraine. This paper focuses on the Greek contribution to director Krzysztof Garbaczewski's visionary concept, specifically on the technical advancements and Human-Computer Interaction (HCI) dynamics within this innovative platform, which aims to push hybrid virtual theater forward.
Each theater hub comprised one or more actors, two or more projection systems, and VR equipment. Through a multiplayer online platform, the actors from each hub shared a common virtual layer, which was materialized locally as video streams (both into and out of VR) rendered by multiple projectors, creating a digital set design and adding a virtual layer onto the physical stage. In the Greek hub, a single performer wore a VR headset throughout the entire performance, accompanied by a live-lab team on stage. The live-lab members actively participated in the play, integrating virtual and physical elements to enrich the immersive experience.

This stage system supported several capabilities: live direction; remote collaboration; orchestration of streams for holographic projections on the physical stage through a combination of HMDs; live footage projected into the VR scenes; cameras and green screens within VR; and cameras on stage. A shared virtual space was essential to the project: actors connected through VR headsets to several virtual worlds and interacted via avatars controlled with handheld controllers or hand gestures tracked by the HMDs. A virtual camera, operated by someone inside the virtual space, broadcast the events live into the physical space of each theater. The audience watched this combined digital feed on the stage projections, which seamlessly merged the virtual and real worlds. The music, sound effects, and the voice of the main actor were reproduced through speakers strategically positioned on the stage, some close to the audience and others placed high and far away; the sound was coordinated by the live-lab team on stage using a gaming platform. This setup allowed the audience to become fully engrossed in the unfolding drama, experiencing it alongside the live actors.
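To make the shared virtual layer concrete, the sketch below illustrates the kind of avatar state-synchronization message a multiplayer platform of this sort might exchange between hubs. All names here (AvatarState, SharedScene, the actor and hub identifiers) are illustrative assumptions for this sketch, not the actual platform's API.

```python
# Hypothetical sketch: syncing an avatar's pose across theater hubs.
# Each hub keeps a local mirror of every avatar in the shared virtual layer
# and updates it from serialized messages received over the network.
import json
from dataclasses import dataclass, asdict

@dataclass
class AvatarState:
    actor_id: str   # which performer this avatar belongs to (assumed id scheme)
    hub: str        # physical hub the performer is located in
    position: tuple # (x, y, z) in the shared virtual scene
    rotation: tuple # head orientation as (yaw, pitch, roll), degrees

    def to_message(self) -> str:
        # Serialize to JSON so any hub can consume the update.
        return json.dumps(asdict(self))

class SharedScene:
    """A hub's local mirror of the shared virtual layer."""
    def __init__(self):
        self.avatars = {}

    def apply(self, message: str):
        # Update (or create) the local copy of the sender's avatar.
        d = json.loads(message)
        d["position"] = tuple(d["position"])  # JSON turns tuples into lists
        d["rotation"] = tuple(d["rotation"])
        self.avatars[d["actor_id"]] = AvatarState(**d)

# One pose update from the Patras performer, mirrored at another hub:
patras = AvatarState("actor-gr", "Patras", (0.0, 1.7, 2.0), (90.0, 0.0, 0.0))
tbilisi_scene = SharedScene()
tbilisi_scene.apply(patras.to_message())
```

In practice a production platform would also interpolate between updates and carry controller/hand-tracking data, but the core idea is the same: each hub renders its projections from a continuously updated local copy of the shared scene state.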
Through the seamless integration of virtual worlds, real-time interaction between actors, and dedicated control mechanisms, this work exemplifies the transformative power of merging art and technology to redefine the possibilities of performing arts and live entertainment. In conclusion, the practical essence of any human-based evolution or expansion, whether in technology or philosophy, comes from the desire to reinvent the human process of performing, of being. The performing arts have always been revolutionary in adopting new means that could transcend our existence, or challenge it into becoming what each present stage desires. Decisions of seemingly low value in everyday life become tremendously important when we are on stage: Where do I stand? Who am I talking to? Where is the place in which we stand? Only through deep and thorough research combining technology with real, basic performing needs can the advancements in Human-Computer Interaction in a VR theater be free, as humans are.
Dr. Theodoropoulos Anastasios serves as an assistant professor at the Department of Performing and Digital Arts, University of the Peloponnese, teaching Game Design & Development, Human-Computer Interaction, Virtual Reality, and Character Animation. He is also a scientific associate-researcher in the field of games and immersive technologies at the HCI-VR Laboratory. He has extensive experience teaching in the Computer Science domain (since 2006), across a wide variety of courses, to students from 6 years old to mature adults, and in multicultural settings. He also serves as a national Ambassador for the European Code Week initiative, where he promotes games (game design and gameplay) as a means of developing basic Computational Thinking skills. His research focuses on Game Studies, Player Experience, Character Animation, Game Education, Immersive Technologies, and Human-Machine Interaction. He has co-authored several manuscripts published in peer-reviewed journals and conferences, and he serves as an editor for highly recognized journals. In addition, he is an active member of the ACM, SIGCHI, SIGGRAPH, SIGCSE, and IGDA organizations. He researches user/player experience modeling, technologies to enhance gaming experiences, and algorithms that dynamically adapt to player behavior, creating immersive and personalized gameplay scenarios. Lately, he has been involved in projects that combine interactive/immersive gaming technologies with arts and culture.
I am studying computer science and telecommunications at the University of the Peloponnese, and I am interested in developing my skills in both sectors of my field. I also have experience in customer service and inventory management from our family business. More info can be found at: https://users.uop.gr/~dit19166/