Darren Woodland

Body Tracking w/ MetaSounds

Updated: Mar 16, 2023


NeoKinect body tracking with Pivot Arrows attached

Some more tangible (and sonic) progress has been made, so let's discuss.


I got body tracking with the NeoKinect plugin working inside a new project, which lets me focus on development with minimal bloat from other plugins. I had previously been testing body tracking using the Unreal Engine 5.0 demo projects for NeoKinect, but they contained a lot that I simply did not need.


After spending more time with the demo and taking a closer look at the Blueprint networks, I settled on the Pivot Arrows demo as the best solution for body tracking at the moment. It requires neither an animation Blueprint (AnimBP) nor the UE Mannequin.


After failing to get a custom C++ patch of Pure Data (Pd) working in UE5.1, I decided to recreate the Pd patches using MetaSounds (MS). There is no 1:1 mapping from Pd to MS, but I am trying to get things working as closely as possible to the original patches, with as few nodes as possible.


I was able to get the Pd for Airports patch, discussed in a previous post, working as an MS Source in UE5.1. So, I guess it is called MetaSounds for Airports now. There are some quirks in MetaSounds that made the network more complex, but all in all, it works basically the same as the Pd original. One such quirk: there is no support for sending an entire array of MIDI notes (as a chord) to the audio stack to be rendered at once. Another major difference is that MS, by default, only wants to work with frequency data (as Floats) or audio stream data as an output, not MIDI.
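Since MS wants frequencies rather than note numbers, each MIDI note has to be converted by hand. Here is a minimal sketch of the standard equal-temperament conversion (plain C++; the function name is mine):

```cpp
#include <cmath>
#include <cstdint>

// Equal-temperament MIDI-to-frequency conversion: A4 (MIDI note 69)
// is 440 Hz, and each semitone scales frequency by a factor of 2^(1/12).
float MidiNoteToFrequency(int32_t MidiNote)
{
    return 440.0f * std::pow(2.0f, (MidiNote - 69) / 12.0f);
}
```

A chord then gets unrolled into separate Float inputs on the MS graph rather than arriving as a single array.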


That's not to say it doesn't support MIDI data at all. You can send MIDI signals out of the engine through a BP that reads data from an MS source (theoretically; I have not tested this yet, but it should work).


Anyways...


Have a look at the node networks in the videos below.


Node Network Breakdowns
Node breakdown for basic body tracking with NeoKinect (UE5.1 Blueprints)


Node breakdown for MetaSounds for Airports MS Source

The final step was to connect the two and control the playback of the MS with the body tracking BP.


This was a relatively simple process. I started by creating a custom BP event to execute the processes. Then I took the joint location of the left hand and ran it through a Calculate Velocity From Position History node to get that hand's velocity. The idea here was to trigger audio only when motion is detected. A Compare Float node (Math) checks whether the velocity is above a certain threshold, since the Kinect has some jitter in joint location/velocity. When the detected velocity is above the threshold, a sound source with the MS is spawned and played.
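For readers who prefer code to node graphs, here is a hypothetical C++ approximation of that Blueprint logic. The class, property names, and threshold value are all mine, and the position history is reduced to a single frame for brevity:

```cpp
#include "GameFramework/Actor.h"
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "HandAudioActor.generated.h"

UCLASS()
class AHandAudioActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere) USoundBase* AirportsMetaSound = nullptr; // the MS Source asset
    UPROPERTY(EditAnywhere) float JitterThreshold = 25.0f;          // cm/s; tuned to ignore Kinect jitter

    // Called every frame with the tracked left-hand joint position.
    void UpdateHandAudio(const FVector& LeftHandPos, float DeltaTime)
    {
        // Calculate Velocity From Position History, with one frame of history.
        const float Speed = FVector::Dist(LeftHandPos, PrevHandPos) /
                            FMath::Max(DeltaTime, KINDA_SMALL_NUMBER);
        PrevHandPos = LeftHandPos;

        // Compare Float: only spawn the MetaSound once motion clears the jitter floor.
        if (Speed > JitterThreshold && SpawnedAudio == nullptr)
        {
            SpawnedAudio = UGameplayStatics::SpawnSoundAtLocation(this, AirportsMetaSound, LeftHandPos);
        }

        StopIfIdle(Speed);
    }

    void StopIfIdle(float Speed); // the stop branch, sketched below

private:
    FVector PrevHandPos = FVector::ZeroVector;
    UPROPERTY() UAudioComponent* SpawnedAudio = nullptr;
};
```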


I used a similar process to stop the sound when the velocity fell below the threshold, using the same Compare Float node and a custom Stop trigger parameter defined in the MS.
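The matching stop branch, under the same assumptions ("Stop" being the custom trigger parameter defined inside the MS Source):

```cpp
void AHandAudioActor::StopIfIdle(float Speed)
{
    // Below the threshold: fire the custom Stop trigger on the MetaSound
    // rather than killing the component, so the MS graph can end gracefully.
    if (Speed <= JitterThreshold && SpawnedAudio != nullptr)
    {
        SpawnedAudio->SetTriggerParameter(TEXT("Stop"));
        SpawnedAudio = nullptr; // allow a fresh spawn on the next motion
    }
}
```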


Two major issues arose while implementing the velocity-based BP-to-MS workflow. First, the Play and Stop trigger parameters were firing on event tick (i.e., once every frame), as is the default behavior for custom events driven by tick. The fix was to use a Do Once node, a solution found on the developer forum. The other issue, which is another MetaSounds quirk, is that setting parameters on an MS source needs to be executed on the same execution stack as the Spawn Audio node. Sometimes...
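The Do Once node boils down to a simple latch. If you were doing this in C++ instead, a minimal sketch of the equivalent (my own naming) would be:

```cpp
// Equivalent of Blueprint's Do Once node: the action runs once, then is
// blocked until Reset() re-arms it (the node's Reset pin).
struct FDoOnce
{
    bool bDone = false;

    template <typename FuncT>
    void Execute(FuncT&& Action)
    {
        if (!bDone)
        {
            Action();
            bDone = true;
        }
    }

    void Reset() { bDone = false; }
};
```

With one latch per trigger, the tick-driven event can keep firing every frame while Play and Stop each reach the MS only once per motion cycle.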


In a previous test, I did not get an error when calling Set Boolean Parameter on a different execution stack, but this time I did. Doing so threw an "Accessed None" type error for the parameter I tried to set. The solution was to move everything to a single execution stack and create a feedback loop to properly start and stop the audio. I am pretty sure that is not an optimal solution, but it worked. See the loop at 1:50 - 2:00 in the first video posted above.
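In code, the same-stack rule is easy to honor, since the spawn call hands back the audio component directly. A sketch under the same assumptions as above (the parameter name is made up):

```cpp
// A member of the actor sketched earlier: spawn and parameter-set happen
// on one execution path, mirroring the single Blueprint execution stack.
void AHandAudioActor::SpawnAndConfigure()
{
    const FVector SpawnLoc = GetActorLocation();

    UAudioComponent* Audio = UGameplayStatics::SpawnSoundAtLocation(this, AirportsMetaSound, SpawnLoc);
    if (Audio != nullptr)
    {
        Audio->SetBoolParameter(TEXT("bMotionActive"), true); // hypothetical MS input
    }
}
```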


Take a look at the results in the video below.


MetaSounds for Airports v1
Body tracking MetaSounds for Airports
