With that step complete, the MetaHuman Identity is used to interpret the performance by tracking and solving the positions of the MetaHuman facial rig. The result is that every subtle expression is accurately recreated on your MetaHuman target character, regardless of the differences between the actor and the MetaHuman’s features. In Unreal Engine, you can visually confirm the performance is coming through and compare the animated MetaHuman Identity with the footage of the actor, frame by frame.

Another benefit is that the animation data is clean. The control curves are semantically correct: they sit on the facial rig controls exactly where you would expect, just as they would if a human animator had created them, which makes them easy to adjust for artistic purposes if required.

Since we’ve used a MetaHuman Identity from the outset, you can rest assured that the animation you create today will continue to work if you make changes to your MetaHuman in the future, or if you wish to reuse it on any other MetaHuman.

MetaHuman Animator’s processing capabilities will be part of the MetaHuman Plugin for Unreal Engine, which, like the engine itself, is free to download. If you want to use an iPhone—MetaHuman Animator will work with an iPhone 11 or later—you’ll also need the free Live Link Face app for iOS, which will be updated with some additional capture modes to support this workflow.

Can’t wait to get started? Make sure you’re signed up for the newsletter below, and we’ll let you know when MetaHuman Animator is available.

Source: Unreal Engine Blog