R&D Developer – M/F – Long-Term Internships


About Dynamixyz:

Dynamixyz is a company of around fifteen people specialising in high-quality facial motion capture systems for production work (video games, films, broadcast, VR experiences). It has contributed to numerous games (Red Dead Redemption 2, Resident Evil 7, NBA 2K, Assassin’s Creed, …) as well as films and series (Avengers: Endgame; Love, Death and Robots; …). Founded in 2010 by researchers, the company remains focused on research and innovation, and is regarded as a leader in its industry. Dynamixyz sits at the crossroads of technology and computer graphics. It is currently organised into three teams: R&D, Experience, and Marketing & Communication.

The R&D team is in charge of our research activities as well as the design and development of our software suite. The Experience team makes sure our solutions translate into concrete, successful experiences for our users and our teams.

 


Xmas release makes your feature requests come true

The first great news of this year’s end is that the Grabber and Performer version numbers now match: we are talking about “version 2.8” for both software packages.

A new version of Live Instant is embedded in the latest Performer and Grabber releases, enabling better generic tracking. It includes:

  • reshaped 3D model templates and a revised algorithm for more accurate reconstruction
  • a more evenly distributed set of landmarks
  • faster tracking (higher frames per second)
  • a refined post-processing (retargeting) step

We tested it ourselves during SIGGRAPH Asia and were happy with the results: check them out!

 

Coming with PERFORMER 2.8:

  • Our favorite Xmas Performer feature: managing multiple profiles of an actor during batch processing within one Performer project

Performer 2.8 now lets you handle multiple profile settings during batch processing. That means you can refine your profile with new frames, take after take, build a dedicated set of poses for each scene, and then batch-process multiple takes with different settings.

This can be helpful when you get shots with a specific set of expressions (sadness, anger, …) for a character, or with a specific type of illumination. Tracking becomes more precise by taking those different settings into account across multiple videos within a single Performer project.


Live Pro and Live Instant demo during Siggraph Asia

We’ve just wrapped a video edit of our trip to Australia and India. It was a great experience to showcase our mocap solutions alongside Xsens, Manus VR, and our new resellers Tracklab and Galore System. We had prepared a demo showcasing a full-body motion capture pipeline and our two different real-time facial motion capture technologies.

The synopsis

Sandy De La Perdrix, a Canadian TV reporter, is looking for Superbot, a superhero robot, to get him on stage for an interview on her famous TV show.

Sandy De La Perdrix was played by an actor and Superbot by random visitors picked out from the crowd.

The technical prep

  • Capturing body, face, and hand motion from an actor

The actor was wearing an Xsens mocap suit, Manus VR gloves, and a Dynamixyz tethered single-view head-mounted camera system running our Live Pro software solution. Live Pro relies on an actor-specific tracking technology, meaning the software has to be trained on the actor’s dedicated set of features and learns from those examples.

A range of motions had been shot before the event with a Dynamixyz head-mounted camera, and around 50 frames of the actor’s expressions were annotated to build his tracking profile in the Performer software. Those expressions were then retargeted to the 3D character Sandy De La Perdrix in Maya thanks to our Bridge plug-in connecting Performer and Maya.
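The idea of building an actor-specific tracker from a small set of annotated frames can be illustrated with a generic supervised-regression sketch. Dynamixyz’s actual Live Pro technology is proprietary and not described here; the feature dimensions, landmark count, and the ridge-regularised linear model below are purely illustrative assumptions.

```python
import numpy as np

# Hypothetical training set: ~50 annotated frames, each with an image
# feature vector and a set of 2D landmark coordinates. Random data
# stands in for real annotations in this sketch.
rng = np.random.default_rng(0)
n_frames, n_features, n_landmarks = 50, 128, 60

X = rng.normal(size=(n_frames, n_features))        # image features per frame
Y = rng.normal(size=(n_frames, n_landmarks * 2))   # annotated 2D landmarks

# Fit a ridge-regularised linear map from image features to landmarks.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# "Tracking" a new frame then amounts to applying the learned map
# to that frame's features.
new_features = rng.normal(size=(1, n_features))
landmarks = (new_features @ W).reshape(n_landmarks, 2)
print(landmarks.shape)  # (60, 2)
```

With so few training frames, a person-specific model like this trades generality for precision on one actor’s face, which matches the trade-off the post describes between Live Pro and the generic Live Instant tracker.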

The profile was then exported for use in Unreal Engine. We first tested it by streaming recorded videos of the artist into the Grabber software with the Live Pro tracker loaded.

Australian actor Myles Tamble’s video, tracked with the Live Pro tracker in the Grabber software and retargeted to the Sandy character in UE4


Dynamixyz Asian tour in November

We have welcomed two resellers into the Dynamixyz Asian network: Tracklab in Australia and Galore Systems in India. This is a good opportunity to join forces and send part of the Dynamixyz team to two major events happening in Asia this month.

Both events will showcase Live Pro, our real-time facial mocap system dedicated to a specific set of facial features of an actor, and Live Instant, which tracks any face in real time.

Visitors will discover a virtual TV news stage designed in Unreal by the Dynamixyz CG team and attend live TV reporter Sandy De La Perdrix’s show, in which she interviews the live-mocap superhero Superbot.

The Sandy De La Perdrix character will be animated live by an actor or actress, with Live Pro for the face and Xsens MVN for the body, while Superbot will be played by random visitors using Live Instant. Superbot’s body will be animated with footage pre-recorded with Xsens.

Performer: the machine-learning software behind Framestore’s Smart Hulk facial animation

We were thrilled to learn our facial mocap technology had been used by Framestore in the latest Marvel film, Avengers: Endgame.
Along with ILM, Framestore was responsible for creating Smart Hulk. The studio delivered 300 VFX shots of this never-seen-before Bruce Banner/Hulk hybrid.
Starting from ILM’s first developments, Framestore worked on their own muscle system, added it to the face shapes, and built their own face rig. Hulk’s face rig was one of the most complex rigs Framestore had ever created: Smart Hulk needed to be able to express a wide range of emotions, from comedy to seriousness.


Scan-based realistic facial animation

Together with Pixelgun Studio, a high-end 3D scanning agency, we have just released a proof of concept demonstrating facial tracking trained directly on scans, along with a direct solve onto a rig in Maya, delivering high-fidelity raw results.
This technological breakthrough lets you track and solve onto an actor’s digital double with a very high level of precision, thanks to a process that automatically extracts key poses from the scanned data and solves them directly onto a rig, taking advantage of its intrinsic logic.
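One generic way to “automatically extract key poses from scanned data” is farthest-point sampling over the scan frames: repeatedly pick the frame farthest from everything selected so far, so the chosen poses cover the expression space. The actual Pixelgun/Dynamixyz pipeline is not public; this is a minimal sketch under that assumption, with scan frames modelled as flattened vertex arrays of random data.

```python
import numpy as np

# Stand-in scan sequence: one row per scanned frame, each a flattened
# array of vertex positions (random data for illustration only).
rng = np.random.default_rng(1)
n_scans, n_verts = 200, 300
scans = rng.normal(size=(n_scans, n_verts * 3))

def select_key_poses(frames, k):
    """Pick k frames that are maximally spread out in vertex space."""
    chosen = [0]                                   # start from the first scan
    dists = np.linalg.norm(frames - frames[0], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dists))                # farthest from those chosen
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(frames - frames[nxt], axis=1))
    return chosen

key_ids = select_key_poses(scans, 10)
print(len(key_ids))  # 10 well-spread key poses
```

Selecting extreme, well-separated poses is what makes a small key-pose set usable as solver targets: redundant near-duplicate frames add nothing, while the extremes span the rig’s expression range.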
