Interactive storytelling is a complex challenge when you have to support real-time content in both virtual reality and 360-degree video at the same
time. You have to reinvent how to convey your story and how to interact with the audience while providing equivalent interactivity and immersion in
both contexts. The most important thing from the artistic point of view is to offer two compelling experiences of the same interactive creation.
Our objective here is to make sure that the spectator sees and interacts with the artistic content as the author intended.
To achieve this goal we first had to determine whether a 360-degree projection could reproduce the same immersion and interactive involvement as a VR environment. We then had to conceive a 3D live show relying on similar narration triggers to tell the same story on both media.
When considering the 360-degree experience, we established that a full dome was the best approach to start with. The observer is surrounded by the creative content, which unfolds inside a half-sphere and offers convincing immersion.
An observer at the center of the dome, head free to rotate and look around to follow the story, has a point of view very similar to that of an equivalent rotation-only (3-DoF) VR experience. The same observer equipped with a VR headset discovers the creative content with very similar viewing angles.
For the interactive part of the storytelling, a VR spectator still has an edge: the freedom to move through 3D space (6-DoF). To provide a counterpart in our spherical dome, we used a motion-sensing camera (Microsoft Kinect 2) to capture hand gestures and control the projection. Waving a hand in one direction moves the live camera accordingly and yields a full yaw rotation of the dome view, while moving the hands upward and downward provides a basic pitch action.
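A minimal sketch of such a gesture-to-camera mapping is shown below. It assumes the depth camera delivers hand positions in metres each frame; the gain values, dead zone, and function names are illustrative assumptions, not the production code.

```python
# Sketch: map tracked hand motion to yaw/pitch deltas for the dome camera.
# Hand positions are assumed to arrive as (x, y) coordinates in metres.
from dataclasses import dataclass

@dataclass
class CameraOrientation:
    yaw: float = 0.0    # degrees, rotation around the vertical axis
    pitch: float = 0.0  # degrees, rotation around the lateral axis

YAW_GAIN = 90.0     # degrees of yaw per metre of horizontal hand travel (illustrative)
PITCH_GAIN = 45.0   # degrees of pitch per metre of vertical hand travel (illustrative)
DEAD_ZONE = 0.02    # ignore jitter below 2 cm between frames

def update_camera(cam: CameraOrientation,
                  prev_hand: tuple[float, float],
                  curr_hand: tuple[float, float]) -> CameraOrientation:
    """Apply yaw from horizontal hand motion and pitch from vertical motion."""
    dx = curr_hand[0] - prev_hand[0]
    dy = curr_hand[1] - prev_hand[1]
    if abs(dx) > DEAD_ZONE:
        cam.yaw = (cam.yaw + dx * YAW_GAIN) % 360.0
    if abs(dy) > DEAD_ZONE:
        # Clamp pitch so the dome view never tilts past a comfortable range.
        cam.pitch = max(-30.0, min(30.0, cam.pitch + dy * PITCH_GAIN))
    return cam

# Example: a 10 cm wave to the right yields roughly 9 degrees of yaw.
cam = CameraOrientation()
cam = update_camera(cam, prev_hand=(0.00, 1.20), curr_hand=(0.10, 1.20))
print(cam)
```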
To drive the story forward as the artist intended, we implemented a navigation mesh in VR, a simple rectangular box in our case, to encourage the spectator to move toward that area. For the 360-degree dome, we instead force the camera to travel forward to this very same position. Reaching the new area triggers the next narration sequence, and so on.
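The sketch below illustrates how such a trigger volume could advance the narration; the class and function names, dimensions, and sequence labels are assumptions made for the example.

```python
# Sketch: an axis-aligned trigger volume that advances the narration once the
# camera position (VR movement or forced dome travel) enters it.
from dataclasses import dataclass

@dataclass
class TriggerBox:
    min_corner: tuple[float, float, float]
    max_corner: tuple[float, float, float]
    fired: bool = False

    def contains(self, p: tuple[float, float, float]) -> bool:
        """True when point p lies inside the box on all three axes."""
        return all(lo <= v <= hi
                   for lo, v, hi in zip(self.min_corner, p, self.max_corner))

def advance_story(box: TriggerBox, camera_pos, sequences, current_index):
    """Return the index of the narration sequence to play; fire only once."""
    if not box.fired and box.contains(camera_pos):
        box.fired = True
        return min(current_index + 1, len(sequences) - 1)
    return current_index

# Example: the camera reaches the trigger area and the next sequence starts.
sequences = ["intro", "forest", "finale"]
gate = TriggerBox(min_corner=(-1.0, 0.0, 4.0), max_corner=(1.0, 2.5, 6.0))
idx = advance_story(gate, camera_pos=(0.2, 1.6, 5.0), sequences=sequences, current_index=0)
print(sequences[idx])  # "forest"
```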
Another way to increase immersion is to let the spectator interact with elements of the virtual surroundings. In VR this is achieved through standard means: glowing cubes signal objects that can be grabbed. Under the dome, we rely instead on real luminescent cubes that can be tracked. Moving a cube, in VR or in the physical world, lets the spectator place virtual objects in the scene and receive real-time feedback on those actions, for example a particle emitter colliding with the environment.
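A rough sketch of that idea follows: both input paths, a grabbed VR cube and a tracked physical cube under the dome, drive the same virtual object, which then produces shared feedback. All names, the scale/offset mapping, and the collision rule are invented for illustration.

```python
# Sketch: one virtual cube updated from either input path, with shared feedback.
from dataclasses import dataclass

@dataclass
class VirtualCube:
    position: tuple[float, float, float] = (0.0, 0.0, 0.0)

def from_vr_grab(cube: VirtualCube, controller_pos):
    """VR path: the grabbed glowing cube follows the controller directly."""
    cube.position = controller_pos

def from_dome_tracking(cube: VirtualCube, tracked_pos, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Dome path: the tracked luminescent cube is mapped into scene coordinates."""
    cube.position = tuple(p * scale + o for p, o in zip(tracked_pos, offset))

def on_update(cube: VirtualCube, floor_height=0.0):
    """Shared feedback: emit particles when the cube touches the virtual floor."""
    if cube.position[1] <= floor_height:
        print(f"emit particles at {cube.position}")

# Example: the physical cube is tracked at floor level, so the particle
# feedback appears for both the dome audience and the VR spectator.
cube = VirtualCube()
from_dome_tracking(cube, tracked_pos=(0.4, 0.0, 1.1))
on_update(cube)
```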
In conclusion, we managed to deliver an interactive story both in real life and in VR, while keeping the narration anchored in a shared virtual space, and we achieved very similar immersion and live interaction on the two media. Understanding these challenges, and relying heavily on the 3D scene to merge the virtual and real worlds into mixed-reality stories, will help future creations become accessible to different kinds of spectators. It will ease the setup and coordination of live art events that span multiple media for cross experiences while guaranteeing the same enjoyable immersion, and it opens many paths toward a new generation of live content creation.