Facial Performance Capture

Nick Cobby created this video for electronica and techno music producer Max Cooper’s deconstruction/reconstruction of Michael Nyman & David McAlmont’s classic recording, Secrets, Accusations & Charges. Cooper’s work as a sound artist developed in parallel with his work as a scientist: he holds a Ph.D. in computational biology from the University of Nottingham, where his research centered on modeling the evolution of gene regulatory networks, with a particular focus on feed-forward loops. Motion graphics designer and animator Nick Cobby also studied in Nottingham, receiving a BA in Graphic Design from Nottingham Trent University.

For this video Cobby wanted to add a human element to his work, which had previously been in the realm of absolute animation. He put out a call for Max Cooper fans to come down to his London studio to be filmed. Each participant was filmed individually, concentrating on their facial expressions and head movements. Depth data captured by a Microsoft Kinect was used to build a 3D point cloud of each performance, and this point-cloud data was then processed with generative code to create abstracted, animated representations of the performers’ gestural forms.
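The Kinect step described above — turning a depth image into a 3D point cloud — amounts to back-projecting each depth pixel through the pinhole camera model. A minimal sketch of that idea (the intrinsics values and function name are illustrative assumptions, not Cobby’s actual pipeline):

```python
import numpy as np

# Hypothetical Kinect-style depth intrinsics (assumed values for
# illustration, not taken from the video's actual pipeline).
FX, FY = 594.2, 591.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point in pixels

def depth_to_point_cloud(depth_mm: np.ndarray) -> np.ndarray:
    """Back-project an (H, W) depth image in millimetres to an (N, 3)
    point cloud in metres using the pinhole camera model."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm / 1000.0                  # mm -> m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]              # drop pixels with no depth reading

# Example: a synthetic 480x640 frame, a flat surface 1 m from the sensor
frame = np.full((480, 640), 1000, dtype=np.float32)
cloud = depth_to_point_cloud(frame)
print(cloud.shape)  # (307200, 3)
```

From a point cloud like this, generative code can drive particles, lines, or meshes per frame to produce the abstracted forms seen in the video.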


[Image: Goertz manipulator arm]
One thread in the history of facial performance capture can be traced to the development of a series of electro-mechanical manipulator arms, such as those used by the Atomic Energy Commission at the Argonne National Laboratory in the early 1950s. These devices were nicknamed “Waldoes” after the remote manipulators invented by Waldo Farthingwaite-Jones, the eponymous protagonist of Robert Heinlein’s popular 1942 science fiction story Waldo.


In 1991 Dr. Dave Warner of Loma Linda University Medical Center collaborated with Steve Tice of SimGraphics and Rick Lazzarini of The Character Shop to create a real-time computer-graphic cartoon puppet that could interact with hospitalized children in a teaching and rehabilitation psychiatry study. The Character Shop developed a prototype facial animation controller, the “Face Waldo,” for driving the CG puppet in real time.

This work led to a commercial product used by Nintendo of America to perform real-time animation of their game character Mario. I had a chance to see Mario’s performance at the 1992 SIGGRAPH Conference and was suitably impressed by the power of the interactive audience experience. The video above shows the Face Waldo in use at the 1992 Summer Consumer Electronics Show.


[Image: Lance Williams, Performance Driven Facial Animation, 1990]

Just a couple of years earlier, computer graphics pioneer Lance Williams had delivered a brilliant paper, Performance Driven Facial Animation, at the 1990 SIGGRAPH Conference, which clearly laid out the territory to be explored. In the abstract he states:

“As computer graphics technique rises to the challenge of rendering lifelike performers, more lifelike performance is required. The techniques used to animate robots, arthropods, and suits of armor, have been extended to flexible surfaces of fur and flesh. Physical models of muscle and skin have been devised. But more complex databases and sophisticated physical modeling do not directly address the performance problem. The gestures and expressions of a human actor are not the solution to a dynamic system. This paper describes a means of acquiring the expressions of real faces, and applying them to computer-generated faces. Such an “electronic mask” offers a means for the traditional talents of actors to be flexibly incorporated in digital animations. Efforts in a similar spirit have resulted in servo-controlled “animatrons,” high-technology puppets, and CG puppetry. The manner in which the skills of actors and puppeteers as well as animators are accommodated in such systems may point the way for a more general incorporation of human nuance into our emerging computer media. The ensuing description is divided into two major subjects: the construction of a highly-resolved human head model with photographic texture mapping, and the concept demonstration of a system to animate this model by tracking and applying the expressions of a human performer.”
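Williams tracked fiducial points on a performer’s face and applied their motion to a texture-mapped head model. A common modern descendant of his “electronic mask” idea is to solve for blendshape weights from tracked landmarks; a minimal least-squares sketch of that formulation (the data, dimensions, and function name are illustrative assumptions, not Williams’s original method):

```python
import numpy as np

def solve_blendshape_weights(neutral, blendshapes, tracked):
    """Given neutral landmark positions (L, 3), K blendshape deltas
    (K, L, 3), and tracked landmarks (L, 3), solve least-squares for
    the K weights that best reproduce the tracked expression."""
    K = blendshapes.shape[0]
    A = blendshapes.reshape(K, -1).T          # (3L, K) basis matrix
    b = (tracked - neutral).reshape(-1)       # (3L,) observed offsets
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(w, 0.0, 1.0)               # keep weights in a valid range

# Toy example: 4 landmarks, 2 blendshapes, synthetic "tracked" frame
rng = np.random.default_rng(0)
neutral = rng.standard_normal((4, 3))
shapes = rng.standard_normal((2, 4, 3))
true_w = np.array([0.3, 0.7])
tracked = neutral + np.tensordot(true_w, shapes, axes=1)
w = solve_blendshape_weights(neutral, shapes, tracked)
print(np.round(w, 3))  # recovers approximately [0.3, 0.7]
```

Solving this per frame retargets the performer’s expressions onto the model, which is essentially the pipeline Williams’s paper anticipated.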


Fast forward to 2015 and the evolution of facial performance capture is continuing at a rapid pace. From photorealism to highly stylized cartoon characters to full abstraction, newly developed techniques in both real-time and post-processed facial performance capture provide a wide range of options. Accelerating computer processing power and efficient code structures, coupled with increasingly refined capture methodologies, are opening up new opportunities for artistic expression.

The two videos below give some sense of the progress being made in the game industry, where motion capture for in-game play and cutscenes is approaching the verisimilitude of photorealistic digital-character performances in motion pictures. In his promotional interview for the 2014 production Call of Duty: Advanced Warfare, Kevin Spacey addresses his experience with motion capture and how it relates to his work in television, film, and stage.


More to come…

