The company used a head-mounted camera and a seated 4D capture system in place of traditional methods


DI4D has produced a short CG film, completing its facial capture without using a traditional facial animation rig.

Using its Pure4D 2.0 facial animation pipeline, released earlier this year, the company captured actor Neil Newbon’s performance in the short CG film Double with performance data from a head-mounted camera and a seated 4D capture system. Proprietary machine learning technology learns the actor’s facial expressions, aiming to reduce subjective manual clean-up and increase consistency and efficiency.


The company hopes to reduce overall production costs and time by removing the hours spent adjusting traditional animation rigs, redirecting those resources to other areas. Pure4D 2.0 can also be used within a traditional workflow.

Double celebrates the historic art form of acting, from its most ancient roots to its most modern digital form, and you can watch it below.

Colin Urquhart, CEO and co-founder of DI4D said: “Double proves that it’s possible to produce high-end facial animation within a constrained budget. Acting lies at the heart of PURE4D 2.0’s approach, allowing studios to allocate more of their resources to talented performers, like Neil, who captivate audiences with their abilities. It’s this unique aspect that sets performance-driven animation apart from traditional pipelines.”

Douglas Green, COO and co-founder of DI4D added: “Developing an entire film from conception to the final render has been an incredible opportunity for the DI4D team. Not only have we demonstrated the level of quality possible with a small team and budget, but we now have an even greater understanding of our client’s processes which can only help to improve our services further.”