The Project: Explained
Updated: Mar 16
You know, I thought it was high time I explained in a bit more detail what exactly it is that I am trying to make in UE5.
The current project idea, as explained in post #1, is to make a prototype experience in Unreal Engine 5.1 that uses body-tracked movements from a Kinect V2 sensor and maps those movements to sound parameters created in a MetaSounds source.
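To make the mapping idea concrete, here is a minimal sketch of the kind of remapping that would sit between the Kinect data and a MetaSounds parameter: a tracked joint coordinate (say, hand height in metres from a Kinect V2 skeleton frame) gets normalized into a 0..1 value that a MetaSounds float input can consume. The function name and the input range are illustrative assumptions, not the project's actual code.

```cpp
#include <algorithm>

// Hypothetical mapping sketch: remap a tracked joint coordinate (e.g. hand
// height in metres from a Kinect V2 skeleton frame) into a normalized 0..1
// value suitable for driving a MetaSounds float parameter.
float MapJointToParam(float JointValue, float InMin, float InMax)
{
    if (InMax <= InMin)
    {
        return 0.0f; // guard against a degenerate calibration range
    }
    float T = (JointValue - InMin) / (InMax - InMin);
    return std::clamp(T, 0.0f, 1.0f); // clamp so out-of-range poses stay valid
}
```

In UE5 a value like this would typically be pushed to the running MetaSound each frame via the audio component's float-parameter setter, with the parameter name matching an input pin on the MetaSounds source.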
What has not yet been explained is how the concept of a dancer-participant feedback loop works in generating a soundscape and manipulating that soundscape in a collaborative manner.
Dancer-Participant-Soundscape Feedback Loop
The above diagram is an early version of what the feedback loop could look like. Before the loop can be explained it is important to take a look at the connectors in the graph: Generates, Manipulates, and Influences.
Generates: Created in real-time based on a system of rules or procedures.
Influences: Having an impact on the interpretation of data/information.
Manipulates: A real-time adjustment to data/parameters.
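As a small illustration of "Generates" in this sense, here is a sketch of one possible rule: quantizing a continuous 0..1 parameter onto a pentatonic scale, so whatever the incoming movement data looks like, the generated output stays musical. The choice of scale and base note is an assumption for the example, not the project's actual procedure.

```cpp
#include <array>

// Sketch of a generative rule: quantize a continuous 0..1 parameter onto a
// pentatonic scale so the generated output is always a valid scale note.
// The scale and base note (middle C) are illustrative choices.
int ParamToMidiNote(float Param01)
{
    static const std::array<int, 5> Pentatonic = {0, 2, 4, 7, 9};
    const int BaseNote = 60; // MIDI middle C
    int Step = static_cast<int>(Param01 * 4.999f); // spread 0..1 across 5 steps
    if (Step < 0) { Step = 0; }
    if (Step > 4) { Step = 4; }
    return BaseNote + Pentatonic[Step];
}
```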
The main idea here is that everything currently starts with the dancer. The dancer's movements manipulate sound parameters that, along with movement languages like Laban and Dance Motif, create algorithms/procedures. These procedures are used to generate sound/music in real-time. The sound/music (soundscape) then influences the dancer's movements as they listen to and interpret the sounds, which, in turn, affects the sound parameters. This creates the first feedback loop.
It is also important to note that the dancer has an understanding of the movement languages used in the creation of the algorithms/procedures of sound generation, as the ones used are commonly taught and practiced in dance. So those movement languages influence the dancer as well.
The second feedback loop exists between the soundscape and the participant of the experience. As the participant listens to the soundscape, they are also given the opportunity to control similar (or the same) sound parameters as the dancer through a secondary screen interface. In doing so, they are also changing the soundscape generation in real time.
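Since the dancer and the participant can end up steering the same parameter, one open design question is how their inputs combine. A simple possibility, sketched below as an assumption rather than the project's decided approach, is a weighted blend so neither side simply overwrites the other:

```cpp
#include <algorithm>

// Sketch of shared control over one MetaSounds parameter: blend the value
// derived from the dancer's movement with the value set on the participant's
// secondary-screen interface. The 50/50 default weight is an assumption.
float BlendControl(float DancerValue, float ParticipantValue,
                   float ParticipantWeight = 0.5f)
{
    float W = std::clamp(ParticipantWeight, 0.0f, 1.0f);
    return DancerValue * (1.0f - W) + ParticipantValue * W;
}
```

With a weight of 0 the dancer has full control, with 1 the participant does, and anything in between keeps both feedback loops active on the same parameter at once.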
The two feedback loops influence and intertwine with one another, creating the larger and more complex Dancer-Participant-Soundscape feedback loop. This loop is one of the main components of the project. The Dancer-Soundscape feedback loop is the focus of the Unreal prototype at the moment.