Abstract
Rapid technological progress is transforming many aspects of our lives, notably communication, artmaking, and education. At the same time, growing interest in three-dimensional environments and first-person-perspective simulations is driving the pursuit of highly immersive experiences through a wide variety of modalities, such as virtual reality headsets and navigation and haptic controllers.
In this paper I present an experimental scheme for composing and experiencing digital soundwalks through a first-person-perspective videogame. The scheme involves two participants: a videogame player and a live coder. My aim is to create a synthetic environment, in the form of a computer game, in which the player must rely on hearing to find and collect sound objects in an unlit three-dimensional space, and in doing so to create digital soundwalks dynamically, on the fly.
The player controls the game with a low-cost prototype wearable controller, offered as an alternative to conventional input devices such as the mouse for navigation and interaction.
The glove houses a built-in microcomputer with a 3-axis accelerometer-gyroscope sensor and push buttons. For this game I have built an algorithm that receives data from the wearable controller and translates them into input-axis values, enabling the player to move, turn, and look around. Using the buttons embedded in the glove, the player can also interact with objects and trigger sound events.
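The paper does not give the details of the sensor-to-input translation, but the idea can be sketched as follows. This is a hypothetical Python illustration: the sensor range, deadzone value, axis names, and axis-to-gesture assignments are all assumptions for the sake of the example, not taken from the described implementation.

```python
# Hypothetical sketch: translate raw 3-axis accelerometer/gyroscope
# readings from the glove into normalized input-axis values in [-1, 1].

RAW_MAX = 32768.0   # assumed 16-bit signed sensor range
DEADZONE = 0.08     # assumed deadzone to suppress hand jitter

def to_axis(raw: int) -> float:
    """Normalize a raw reading, clamp it, and apply the deadzone."""
    value = max(-1.0, min(1.0, raw / RAW_MAX))
    return 0.0 if abs(value) < DEADZONE else value

def glove_to_input(gyro, accel):
    """Translate one sensor sample into game input axes.

    gyro, accel: (x, y, z) raw readings.
    Returns a dict of assumed axis names to values in [-1, 1].
    """
    return {
        "MoveForward": to_axis(accel[1]),  # tilt hand forward/back
        "MoveRight":   to_axis(accel[0]),  # tilt hand left/right
        "LookUp":      to_axis(gyro[0]),   # wrist pitch
        "Turn":        to_axis(gyro[2]),   # wrist yaw
    }
```

In an engine such as Unreal Engine 4, values like these would be fed into the engine's input-axis bindings each tick, so that a half tilt of the hand produces half-speed movement.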
Furthermore, I have built an algorithm in Unreal Engine 4 which, during play, sends the game's spatial information to other software and programming environments, such as SuperCollider and Pure Data, via OSC (Open Sound Control). This information comprises the player-character's transform (location and rotation) relative to the objects in the environment, and is used to create 3D audio experiences. In SuperCollider, I have created an algorithm that receives this data from the videogame, with which I experimented with First-Order Ambisonics (FOA) and sound manipulation on the fly.
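To make the OSC link concrete, the sketch below hand-encodes a minimal OSC 1.0 message of the kind the game might send each tick. The address pattern `/player/transform` and the six-float payload (location x, y, z and rotation pitch, yaw, roll) are illustrative assumptions; only the wire format (null-terminated, 4-byte-padded strings, a type-tag string, big-endian float32 arguments) follows the OSC specification.

```python
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, null-terminated, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args: float) -> bytes:
    """Build an OSC 1.0 message whose arguments are all float32."""
    msg = osc_string(address)
    msg += osc_string("," + "f" * len(args))   # type-tag string
    for value in args:
        msg += struct.pack(">f", value)        # big-endian float32
    return msg

# Assumed message layout: location (x, y, z) then rotation (pitch, yaw, roll).
packet = osc_message("/player/transform", 120.0, -45.5, 300.0, 0.0, 90.0, 0.0)
```

In practice the engine side would send such packets over UDP, and a SuperCollider `OSCdef` (or a Pure Data `netreceive`/OSC object chain) listening on the matching address would unpack the six floats.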
This project suggests an experimental way of composing sound for videogames and digital soundwalks that draws on the actions of both the live coder and the player.
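The FOA experimentation mentioned above rests on standard first-order B-format panning equations. In SuperCollider this would typically be done with dedicated ambisonic UGens, but as a language-agnostic sketch, the encoding gains for a source at a given direction relative to the listener are shown below; the FuMa-style W weighting of 1/√2 and the helper for deriving azimuth from the transmitted player transform are assumptions for illustration.

```python
import math

def relative_azimuth(player_loc, player_yaw_deg, obj_loc):
    """Azimuth (degrees) of an object relative to the player's facing,
    computed on the horizontal plane from the transmitted transform."""
    dx = obj_loc[0] - player_loc[0]
    dy = obj_loc[1] - player_loc[1]
    return math.degrees(math.atan2(dy, dx)) - player_yaw_deg

def foa_encode_gains(azimuth_deg, elevation_deg):
    """First-Order Ambisonics (B-format) panning gains W, X, Y, Z
    for a source at the given azimuth/elevation (FuMa W weighting)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = 1.0 / math.sqrt(2.0)           # omnidirectional component
    x = math.cos(az) * math.cos(el)    # front/back figure-of-eight
    y = math.sin(az) * math.cos(el)    # left/right figure-of-eight
    z = math.sin(el)                   # up/down figure-of-eight
    return w, x, y, z
```

Multiplying a mono sound-object signal by these four gains yields a B-format stream that can then be decoded for headphones or a loudspeaker array; as the OSC-transmitted transform changes, the gains are recomputed and the object appears to move around the listener.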
“Reflections: Bridges between Technology and Culture, Physical and Virtual”