What do you get when you mix a choreographer, a musician and a mocap studio?
A unique, collaborative, multi-disciplinary project produced for the Mercury Theatre's Digital Arts Festival in 2020. We aimed to create a visual treat by combining contemporary movement, motion capture, visual effects and a 'banging' soundtrack.
We began with the audio: an upbeat, electronic soundtrack created by talented musician Jamie Pascoe. We handed the tracks over to our choreographer, Amber Jarman Crainey, and tasked her with creating a movement sequence to accompany the audio. In the meantime, our Creative Director, Dave Norton, began getting his head around creating a visual representation of the music. He was keen for the audio itself to drive the visual effects, rather than hand-animating each moment. We used a very helpful feature of the Unreal Engine which allowed us to represent the input audio as a spectrum (the louder the volume, the bigger the number). By feeding in each track of the audio separately (drumline, vocals, risers, bass), we had an array of numbers reflecting the peaks, troughs and impacts of the music.
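To give a flavour of the louder-volume-equals-bigger-number idea, here is a minimal Python sketch of turning an audio track into per-window loudness values that could drive a visual parameter. This is purely illustrative: the project itself used Unreal Engine's built-in audio analysis, and the function names, window size and normalisation here are our own assumptions, not the actual setup.

```python
import math

def amplitude_envelope(samples, window_size=1024):
    """Split an audio track into windows and return one loudness value
    (RMS amplitude) per window: the louder the audio in that window,
    the bigger the number."""
    envelope = []
    for start in range(0, len(samples), window_size):
        window = samples[start:start + window_size]
        rms = math.sqrt(sum(s * s for s in window) / len(window))
        envelope.append(rms)
    return envelope

def drive_brightness(envelope, peak):
    """Normalise the envelope to the 0..1 range so it can drive a
    visual effect parameter such as brightness or particle rate."""
    return [min(level / peak, 1.0) for level in envelope]

# A loud passage followed by a near-silent one: two windows, two numbers.
samples = [0.9, -0.9] * 512 + [0.01, -0.01] * 512
env = amplitude_envelope(samples)
brightness = drive_brightness(env, peak=1.0)
```

Running each stem (drums, vocals, risers, bass) through an analysis like this separately is what gives independent streams of numbers, so different effects can react to different parts of the mix.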
We then created visual effects such as colour change, brightness, noise and particle effects which could be driven directly by the audio, and applied these effects to the virtual avatar and its environment. With all the planning and preparation completed, it was time to capture the movement sequence and bring all the separate elements together. A tense moment, as at the time we had no idea if everything was going to work as intended. With a limited budget and a tight deadline, we only had one chance to get it right.
We arrived at the studio and began setting up the trackers and calibrating everything; within around 30 minutes we were ready for our first take. A nail-biting moment, as this was our first time road-testing the VR-based tracking solution we had been working with. As the music began and Amber started her movement routine, there was a sigh of relief as the screen came to life with colour, motion and effects. It was working! We ended up doing three separate takes and editing the results together into the final performance.
This was a great moment for the whole company, as it proved our technology worked. We were really blown away by how little latency there was (the delay between the real-life movement and the on-screen movement) and how little mocap clean-up, normally a very time-consuming process, was needed with this solution. We're hoping to be even more ambitious with our next project by integrating face tracking on top of the movement.
You can see the finished results here:
Looking for something similar for your own performance projects? Get in touch today:
A huge thank you to everyone that made this project possible.
Jamie Pascoe - Musician (https://soundcloud.com/jamie-pascoe-256127610)
Amber Jarman Crainey - Choreography/Performer (https://www.instagram.com/amberjarmancrainey/)
Jamie Weston (Signals Media) - Producer (https://www.signals.org.uk/)
Mercury Theatre - Venue (https://www.mercurytheatre.co.uk/)