We’ve just wrapped up a video edit of our trip to Australia and India. It was a great experience to showcase our mocap solutions alongside Xsens, Manus VR, and our new resellers Tracklab and Galore System. We had prepared a demo featuring a full-body motion capture pipeline and our two different real-time facial motion capture technologies.
Sandy De La Perdrix, a Canadian TV reporter, is looking for Superbot, a superhero robot, to get him on stage for an interview on her famous TV show.
Sandy De La Perdrix was played by an actor, and Superbot by random visitors picked from the crowd.
The technical prep
- Capturing body, face, and hand motion from an actor
The actor wore an Xsens mocap suit, Manus VR gloves, and a Dynamixyz tethered Single-View Head-Mounted Camera system running the Live Pro software solution. Live Pro works with an actor-specific tracking technology, meaning the software has to be trained on the actor’s dedicated set of features and learns from those examples.
A range of motions had been shot before the event with a Dynamixyz Head-Mounted Camera, and around 50 frames of the actor’s expressions were annotated to build his tracking profile in the Performer software. Those expressions were then retargeted to the 3D character Sandy De La Perdrix in Maya using our Bridge plug-in, which connects Performer and Maya.
The profile was then exported for use in Unreal Engine. We first tested it by streaming recorded videos of the actor in the Grabber software with the Live Pro tracker loaded.
- Capturing faces from random visitors
Random visitors were tracked using our Live Instant solution. Unlike Live Pro, Live Instant works with a generic tracking model, already pre-trained on millions of faces, which lets it track any face.
This solution only requires a retargeting profile for the 3D character, created beforehand.
A retargeting profile for the 3D character Superbot was created with basic FACS expressions in Performer, and the Live Instant tracker was loaded in the Grabber software before tracking different people.
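The details of Dynamixyz’s retargeting profiles are proprietary, but the underlying idea, a pre-built mapping from tracked expression coefficients to the character’s blendshape weights, can be sketched in a few lines (all names below are hypothetical, not the actual profile format):

```python
# Illustrative sketch of expression retargeting. The profile says how much
# each tracked FACS-style expression drives each of Superbot's blendshapes.
# Names are invented for the example.
PROFILE = {
    "jawOpen":   {"Superbot_mouthOpen": 1.0},
    "browRaise": {"Superbot_browUp": 0.8, "Superbot_eyeWide": 0.3},
    "smileLeft": {"Superbot_smile_L": 1.0},
}

def retarget(tracked: dict) -> dict:
    """Convert tracked expression coefficients (0..1) into
    blendshape weights for the 3D character."""
    weights = {}
    for expr, value in tracked.items():
        for shape, gain in PROFILE.get(expr, {}).items():
            weights[shape] = weights.get(shape, 0.0) + gain * value
    # Clamp to the valid blendshape range.
    return {s: min(1.0, max(0.0, w)) for s, w in weights.items()}

frame = {"jawOpen": 0.5, "browRaise": 1.0}
print(retarget(frame))
```

Because the profile is per-character rather than per-person, the same mapping serves any visitor the generic tracker picks up.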
Body animations of Superbot were pre-recorded before the event with an Xsens suit.
People were tracked with a prototype “freehead” steadicam that visitors could hold.
Streaming all the systems together in Unreal
That part was certainly the most challenging one. To get the highest frame rates, we had to use three different computers:
- one to stream the Live Pro animation coefficients,
- one to stream the Live Instant animation coefficients,
- and one to run Unreal Engine and locally stream the Xsens MVN body animation coefficients.
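The Live Link plugins handle the actual wire protocols internally. Purely to illustrate the multi-machine layout, here is a hypothetical sketch of one machine streaming per-frame animation coefficients to the Unreal machine as JSON over UDP (the address, port, and packet format are assumptions for the example, not the real Live Link protocol):

```python
# Hypothetical sketch: one streaming machine pushing per-frame facial
# animation coefficients to the Unreal Engine machine over UDP.
import json
import socket

UNREAL_HOST = ("127.0.0.1", 54321)  # invented address/port for the example

def send_frame(sock, addr, subject, coefficients, frame_idx):
    """Serialize one frame of animation coefficients and send it to addr."""
    packet = json.dumps({
        "subject": subject,          # e.g. "SandyFace" or "SuperbotFace"
        "frame": frame_idx,
        "coefficients": coefficients,
    }).encode("utf-8")
    sock.sendto(packet, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sock, UNREAL_HOST, "SandyFace", {"jawOpen": 0.42}, frame_idx=0)
```

Splitting the trackers across machines keeps each stream at full rate, with the Unreal machine only decoding frames and driving the characters.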
The Unreal Engine scene used the Dynamixyz Live Link plugin to connect to the face mocap software, and the Xsens MVN Live Link plugin to connect to the body mocap software.
Turning the demo into a real show
Our CG artists prepared some triggers in the scene to launch different animations:
- Sandy waiting for Superbot to show up.
- Superbot getting teleported from the crowd on the virtual stage.
- Superbot sitting comfortably on the couch.
- Superbot waving goodbye to the cameras.
The result gave the feeling of watching a 3D TV news show live!
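In the demo these triggers lived inside the Unreal scene; as a rough illustration, the trigger logic amounts to stepping through a fixed sequence of pre-recorded animations (all names below are invented for the example):

```python
# Minimal sketch of show-trigger logic: each trigger advances to the next
# pre-recorded animation. In the real demo this was wired up inside the
# Unreal scene; the animation names here are hypothetical.
SHOW_SEQUENCE = [
    "Sandy_Waiting",
    "Superbot_Teleport",
    "Superbot_SitOnCouch",
    "Superbot_WaveGoodbye",
]

class ShowController:
    def __init__(self, sequence):
        self.sequence = sequence
        self.step = -1

    def trigger_next(self):
        """Advance to the next animation, or return None when the show is over."""
        if self.step + 1 < len(self.sequence):
            self.step += 1
            return self.sequence[self.step]
        return None

show = ShowController(SHOW_SEQUENCE)
print(show.trigger_next())  # Sandy_Waiting
```

Keeping the triggers as a simple ordered sequence meant the operator could run the whole segment with a single button, show after show.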