(Cartoon Brew) Epic Games has unveiled a new module of its MetaHuman platform called MetaHuman Animator, which represents one of the most significant advances we’ve seen in automated lip sync and facial animation.
The new tech eliminates the need for bespoke facial motion capture hardware, making it possible to capture a performance with only an iPhone. The unedited animation output in Unreal Engine is nuanced and polished, and it provides a framework that animators can further tweak and refine. For games and series, which require large volumes of CG animation, this tech could very well be a game changer.
The software was announced at the Game Developers Conference currently taking place in San Francisco. MetaHuman Animator is planned for release in the next few months and will be part of the MetaHuman plugin for Unreal Engine.
Here’s a look at what MetaHuman Animator generates:
And here’s the five-minute real-time demo from the State of Unreal keynote at GDC that explains how it all works:
Some key highlights of the tech: