On Soundscape: Listening as Interaction
Research + Design - Darren Woodland Jr.
Software + Hardware - Unity, Autodesk Maya, Adobe Creative Suite (Ae, Ps, Ai), Audiokinetic Wwise,
AnimVR, Zoom H3-VR, Oculus Quest
On Soundscape: Listening as Interaction is a research project conducted through the design and development of a virtual reality experience. It seeks to build on the foundations of sound in emerging extended reality (augmented, virtual, and mixed realities) in order to explore presence and emotion through themes of oppression and mental illness. The final experience presented in On Soundscape references sound design techniques from cinema, video games, and music, translating and transforming them into forms better suited to spatial media creation platforms than the approaches more often used in immersive three-dimensional experiences.
How can sound be utilized to create engagement and presence within virtual, augmented, and mixed realities?
The sound of any space or environment shapes the occupant’s perception of that place. That is to say, sonic elements interact with the body and the mind in spontaneous and unexpected ways, influencing the cognitive processes of anyone present within, or fully engrossed in, the sonic environment. Understanding the role sound plays in how we interact with new and emerging media and technologies becomes all the more important when those technologies are designed to deliver immersive, narrative experiences. To begin to understand this role, an examination of the multifaceted nature of sound is required. Throughout the process of researching, formulating, and building, On Soundscape has been guided by five research domains:
Sound as a Form of Interaction
The Nature of Sound in Digital Media
An Ecological Approach to Sound and Listening
Cognition, Perception, and Sensation from Aural Stimuli
Engagement and Presence through Audition and Embodiment
On Soundscape: Listening as Interaction takes the form of a virtual reality experience (more precisely, a mixed reality experience presented on an Oculus Quest headset) that utilizes spatialized audio, ambisonic recordings of soundscapes, sound effects, and spoken word poetry. It was created using the Unity 3D development platform, Wwise audio middleware, the Reaper DAW, and the Zoom H3-VR field recorder.
The Zoom H3-VR was the primary recording device for all sound effects and ambisonic soundscape recordings.
Maya was used to create a 1:1 scale model of my apartment, placed in Unity for the bulk of the mixed/virtual reality experience.
C# Scripting through Unity was used to create many of the interactions presented in the project.
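One such interaction pattern can be sketched in a short Unity C# script. This is a minimal illustration under stated assumptions, not the project's actual code: the real experience routes its audio through Wwise, while this sketch uses Unity's built-in AudioSource for self-containment, and the component name, serialized fields, and trigger radius are hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch: plays a fully spatialized looping sound when the
// listener (e.g. the XR camera rig) comes within a set radius of this object.
[RequireComponent(typeof(AudioSource))]
public class ProximitySoundTrigger : MonoBehaviour
{
    [SerializeField] private Transform listener;      // assumed: the headset/camera transform
    [SerializeField] private float triggerRadius = 2f; // assumed distance, in meters

    private AudioSource source;

    private void Awake()
    {
        source = GetComponent<AudioSource>();
        source.spatialBlend = 1f; // fully 3D: attenuation and panning follow listener position
        source.loop = true;
    }

    private void Update()
    {
        bool inRange = Vector3.Distance(listener.position, transform.position) < triggerRadius;
        if (inRange && !source.isPlaying)
            source.Play();
        else if (!inRange && source.isPlaying)
            source.Stop();
    }
}
```

In a Wwise-based project, the Play/Stop calls would instead post Wwise events on the game object, but the proximity logic stays the same.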