Want to create believable digital humans? Unlock the secrets to animating realistic facial performances in this second wave of Unreal Online Learning courses, created in partnership with Faceware. Faceware's facial motion capture software helps animators accurately capture facial performances and create believable facial animation quickly and reliably.

Simon Habib, Lead Cinematic Animator at Nesting Games, will take you through Faceware Analyzer and Retargeter in these two partnered courses. You'll explore the challenges of working with different styles of performance capture—pre-recorded footage, webcams, or head-mounted cameras—and how lighting, framing, and frame rate can affect tracking. Then, you'll dive headfirst into some of the features of Analyzer and explore how to batch process multiple performances.

Using Retargeter, you'll import a MetaHuman character into Autodesk Maya and apply a tracked performance. You'll learn how to clean up the tracked animation before transferring it to Unreal Engine and rendering a sequence of the retargeted performance.

Source: Unreal Engine Blog