Xmas release made your feature requests come true

The first great news of this end of year is that the Grabber and Performer versions now match, and we’re talking about “version 2.8” for both software products.

A new version of Live Instant is embedded in the latest Performer and Grabber releases, enabling better generic tracking. It includes:

  • reshaped 3D model templates and a revised algorithm for more accurate reconstruction
  • a more evenly distributed set of landmarks
  • faster tracking (higher frame rate)
  • a refined post-processing (retargeting) step

We tested it ourselves during Siggraph Asia and were happy with the results: check them out!

 

Coming with PERFORMER 2.8:

  • Our favorite Xmas Performer feature: managing multiple profiles of an actor during batch processing in one Performer project

Performer 2.8 now lets you handle multiple profile settings during batch processing. That means you can refine your profile with new frames, take after take, build a dedicated set of poses for each scene, and then process multiple takes with different settings.

That can be helpful when you get shots with a specific set of expressions (sadness, anger…) for a character, or with a specific illumination type. Your tracking gets more precise by taking those different settings into account across multiple videos within a single Performer project.
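To make the idea concrete, here is a minimal, purely illustrative Python sketch of the concept: each take is associated with the profile refined for its scene, and the whole set is processed in one batch. The names used (Take, PROFILES, process_take) are hypothetical; Performer manages this through its own project settings and UI, not through code.

```python
# Conceptual sketch only: Performer manages profiles through its own project
# settings and UI, not through code. The names below (Take, PROFILES,
# process_take) are hypothetical and simply illustrate the idea of mapping
# each scene or take to the profile refined for it, inside one batch.
from dataclasses import dataclass

@dataclass
class Take:
    video: str   # recorded video file for this take
    scene: str   # which context/profile this take belongs to

# One profile per context: each holds the poses/frames refined for that setting.
PROFILES = {
    "angry_scene": "actor_A_angry.profile",
    "night_shots": "actor_A_lowlight.profile",
}

def process_take(take: Take) -> None:
    profile = PROFILES[take.scene]   # pick the profile matching this take's context
    print(f"tracking {take.video} with {profile}")

# Batch over takes shot in different conditions, each using its own profile.
for take in [Take("take_012.avi", "angry_scene"), Take("take_013.avi", "night_shots")]:
    process_take(take)
```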

Read more

Live Pro and Live Instant demo during Siggraph Asia

We’ve just wrapped a video edit of our trip to Australia and India. It was a great experience to showcase our mocap solutions alongside Xsens and Manus VR and our new resellers Tracklab and Galore Systems. We had prepared a demo showcasing a full-body motion capture pipeline and our two different real-time facial motion capture technologies.

The synopsis

Sandy De La Perdrix, a Canadian TV reporter, is looking for Superbot, a superhero robot, to get him on stage for an interview on her famous TV show.

Sandy De La Perdrix was played by an actor and Superbot by random visitors picked out from the crowd.

The technical prep

  • Capturing body, face and hand motion from an actor

The actor wore an Xsens mocap suit, gloves from Manus VR and a Dynamixyz tethered Single-View Head Mounted Camera system running the Live Pro software solution. Live Pro works with actor-specific tracking technology, meaning the software has to be trained on the actor’s dedicated set of facial features and learns from those examples.

A range of motion had been shot before the event with a Dynamixyz Head Mounted Camera, and around 50 frames of the actor’s expressions were annotated to build his tracking profile in the Performer software. Those expressions were then retargeted to the 3D character Sandy De La Perdrix in Maya thanks to our Bridge plug-in connecting Performer and Maya.
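As a rough illustration of what “learning from a few dozen annotated frames” can mean in general (this is not Dynamixyz’s actual tracking algorithm), the sketch below fits a plain least-squares map from per-frame image descriptors to 2D landmark positions using about 50 labeled examples; all sizes and data here are made up.

```python
# Generic illustration only, NOT Dynamixyz's actual tracker: it shows the
# simplest possible form of "learning from ~50 annotated frames", a
# least-squares map from per-frame image descriptors to 2D landmark positions.
import numpy as np

rng = np.random.default_rng(0)

n_frames, n_features, n_landmarks = 50, 128, 40    # ~50 annotated frames, made-up sizes
X = rng.normal(size=(n_frames, n_features))        # stand-in per-frame descriptors
Y = rng.normal(size=(n_frames, 2 * n_landmarks))   # annotated (x, y) landmark coordinates

# Fit a linear map W so that X @ W approximates Y: the "profile" learned from examples.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

new_frame = rng.normal(size=(1, n_features))       # descriptor of an unseen frame
predicted = (new_frame @ W).reshape(n_landmarks, 2)
print(predicted.shape)                              # (40, 2) predicted landmark positions
```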

The profile was then exported for use in Unreal Engine. We first tested it by streaming recorded videos of the actor in the Grabber software, loading the Live Pro tracker.

Video of Australian actor Myles Tamble tracked with the Live Pro tracker in Grabber and retargeted to the Sandy character in UE4

Read more

Dynamixyz Asian tour in November

We’ve welcomed two resellers into the Dynamixyz Asian network: Tracklab in Australia and Galore Systems in India. This is a good opportunity to join forces and send part of the Dynamixyz team to attend two major events happening in Asia this month.

Both events will showcase Live Pro, our real-time facial mocap system dedicated to a specific actor’s set of facial features, and Live Instant, which tracks any face in real time.

Visitors will discover a virtual TV news stage designed in Unreal by the Dynamixyz CG team and attend a live show in which TV reporter Sandy De La Perdrix interviews the live-mocap superhero Superbot.

The Sandy De La Perdrix character will be animated live by an actor/actress using Live Pro for the face and Xsens MVN for the body, while Superbot will be played by random visitors using Live Instant. Superbot’s body will be animated with pre-recorded Xsens footage.

Performer: the machine-learning software behind Framestore’s Smart Hulk facial animation

We were thrilled to learn our facial mocap technology had been used by Framestore in the latest Marvel film, Avengers: Endgame.
Along with ILM, Framestore was responsible for creating Smart Hulk. The studio delivered 300 VFX shots of this never-seen-before Bruce Banner/Hulk hybrid.
Building on ILM’s first developments, Framestore worked on their own muscle system, added it to the face shapes, and built their own face rig. Hulk’s face rig was one of the most complex rigs Framestore had to create: Smart Hulk needed to be able to express a wide range of emotions, from comedy to seriousness.

Read more

Update and track in the blink of an eye

Here comes August with a hot new release of Performer 2.7.1, including:

  • retargeting profile creation for Live Instant,
  • blink detection and interpolation of eye movements,
  • options to import links from your 3D scene,
  • and a few GUI improvements.

Grabber3 2.1 revamped its main menu to integrate Live Instant and new tools for video playback.

Read more

Scan-based realistic facial animation

Together with Pixelgun Studio, a high-end 3D scanning agency, we’ve just released a Proof of Concept demonstrating facial tracking trained directly from scans, along with a direct solve onto a rig in Maya, delivering high-fidelity raw results.
This technological breakthrough lets you track and solve to an actor’s digital double with a very high level of precision, thanks to a process that automatically extracts key poses from scanned data and solves them directly onto a rig, taking advantage of its intrinsic logic.

Read more

Dynamixyz line-up at Siggraph 2019

 

Featuring:

  • Our new scan-based tracking and solving technology,
  • Live Pro Multi-View (real-time tracking system), streaming recorded video data live into Unreal Engine,
  • the latest Performer and Grabber software additions.

 


Other happenings “on the floor”:

➡  Dynamixyz is a partner of the Birds of a Feather session Virtual Beings World, held at the Siggraph Convention Centre on Monday, July 29, 2019, from 4:30 pm.
Emilie Guy, R&D engineer at Dynamixyz, will present the range of facial motion capture technologies available for virtual beings projects during the Virtual Beings, Fast Forward session.
Dynamixyz is providing an HMC and a one-year Performer Factory license (including real-time) to the winner of the V-Tuber contest.

➡  Additional partner/customer demos:

ICVR – Booth 647: ICVR will unveil their digital human project, a photorealistic 3D human being created from hundreds of 3D scans in partnership with The Scan Truck. One of the most impressive applications being shown off is the ability for an actor to drive the model in real time in Unreal Engine through a combination of Dynamixyz facial tracking software and an Xsens motion-capture suit. ICVR chose Dynamixyz as its facial capture solution due to the unrivalled level of detail and customization it provides, allowing the facial capture data output to be tailored to the specific face and blendshapes of ICVR’s digital human.
ICVR is a full-service, Los Angeles-based software development studio that specializes in AR, VR and other emerging technologies, with a specific focus on Unreal Engine and Unity.

Mocap Now – Booth 748: Mocap Now will showcase real-time playback of full-body (OptiTrack), face (Dynamixyz) and finger (StretchSense) capture, live in Unreal.

Mocap Now is a full-service virtual production/mocap studio in Seattle, WA, USA.

Standard Deviation – Booth number to be confirmed: Standard Deviation will present several innovative new head gears along with a real-time facial motion capture performance, with Dynamixyz specific trackers embedded directly in the body-worn recorder.

Standard Deviation provides custom mocap gear for challenging captures. Their team has been perfecting real-time capture of face, body, and hands since the early 1990s.

Veeso – Booth number to be confirmed: Veeso will showcase Dynamixyz generic tracker technology directly in a VR gear environment, tracking and retargeting lip sync for VR applications.

Veeso provides VR face tracking technology and is located in the UK.

Read more

Behind the Scenes of Eight VR

Eight VR is a mixed reality project by Dutch composer and director Michel van der Aa, a trailblazing fusion of musical theatre and VR. Eight tells the story of a woman’s life in reverse chronological order. Visitors, equipped with VR goggles and headphones, move through a physical installation. They can manipulate physical and virtual objects and meet the woman at various crucial moments of her life. The woman’s memories are spread out across the flexible walls in the form of interactive hieroglyphs, ready to be activated. Physical hallways merge with virtual ones in an incredible fashion, creating an infinite space.

Read more

Performer Multi-View used for facial animation on Love Death + Robots – Beyond the Aquila Rift

We were thrilled to visit our client Unit Image in Paris, France, last January to get their feedback after they included our Multi-View Performer software in their facial animation production pipeline for one of the LOVE DEATH + ROBOTS short 3D animated films: Beyond the Aquila Rift.

The whole story started more than one year ago, when Unit Image was contacted by Blur Studio and Netflix to produce a 17-minute episode of the LOVE DEATH + ROBOTS series, produced by David Fincher and Tim Miller and released on Netflix on March 15, 2019.

Read more

Meet our new Smart Sync Box

Our R&D team has been working hard for a few months prototyping a unique facial mocap shooting assistant.

This clever little electronic device comes with a user-friendly interface that:

  • Handles effective lighting by strobing LED ring(s) synchronized with the shutter of the camera(s).
    Strobing LED rings ensure more consistent illumination conditions and less interference with optical body mocap systems. Also, the actor can easily switch the LED ring(s) on and off when shooting starts and stops, for maximum comfort (a rough timing sketch follows this list).
  • Synchronises several cameras when recording with stereo systems.
  • Lets you monitor and adjust recording settings such as frames per second, exposure time, gain, HMC IDs… directly on the Head Mounted System.
  • Receives information and commands wirelessly from a server over an FM band, avoiding the issues seen with traditional wireless systems. A server can handle up to 8 Smart Sync Boxes, each with a unique ID.
  • Exchanges information with Dynamixyz video recording and monitoring software, Grabber.
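For the curious, here is a small, purely illustrative Python sketch of the timing relationship described above (one LED pulse per frame, covering the camera’s exposure window); it is not the Smart Sync Box firmware or API, and the function name and parameters are hypothetical.

```python
# Illustrative sketch only, not the Smart Sync Box firmware or API: it simply
# shows one LED pulse per frame, covering the camera's exposure window so the
# face is lit only while the shutter is open.
def strobe_schedule(fps: float, exposure_ms: float, n_frames: int = 5):
    """Return (pulse_start_ms, pulse_end_ms) for each of the first n_frames."""
    period_ms = 1000.0 / fps
    if exposure_ms > period_ms:
        raise ValueError("exposure cannot exceed the frame period")
    return [(i * period_ms, i * period_ms + exposure_ms) for i in range(n_frames)]

# Example: 60 fps with a 4 ms exposure means the LEDs are lit only ~24% of the
# time, which keeps illumination consistent while limiting heat and glare.
for start, end in strobe_schedule(fps=60, exposure_ms=4):
    print(f"LED on from {start:.2f} ms to {end:.2f} ms")
```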

The Smart Sync Box will be commercially available as a standalone component and embedded in each Dynamixyz stereo Head Mounted system from the beginning of July 2019.

Prices are available upon request.