FVEA 419/619: Motion Capture for Artists 2021


“Progress is not possible without deviation” –Frank Zappa

Evolving Class Schedule for 2021:

September 13
First class meeting. Depending on installation issues, the class may open with a live demonstration of Jasper Brekelmans' Brekel Pro Body v2 mocap software for the Microsoft Kinect v2. We will have a look at the first and last required reading of the semester: Maureen Furniss' seminal and comprehensive overview paper, Motion Capture. This will be followed by a general discussion of various motion capture technologies and the nature of the systems we have available to us here at CalArts. Since we have lost access to the passive marker OptiTrack mocap facility located in E58, we will be looking at other options that might be available to us, including markerless single-camera AI-based technology such as that offered by DeepMotion. Another markerless AI-based set of tools is available from Facebook Research's FrankMocap project.

We will explore the various ways that the course might work for the diverse group of students making up the class. The specifics of the week-to-week course structure will be determined in large part by the individual interests of the group. We will consider several approaches to motion capture as a performative art, with a particular emphasis on figurative dance and abstraction. As mentioned earlier, the course was intended to focus on the theoretical and practical use of the 12-camera OptiTrack passive marker mocap system installed on a truss system in E58; however, due to circumstances beyond our control, E58 is undergoing renovation and will not be available to us this semester. It is hoped that it will be available during Winter Break, but that is not yet certain. In the meantime we will learn key aspects of the theory and application of marker-based mocap ahead of actual hands-on experience.

Today we will view the Producers Show opening video from 2007, featuring a brilliantly cynical warning about the type of mocap abuse we hope to avoid. That will be followed by examples of several mocap-based works created by students in previous years' classes. Most of the following videos include scenes created via the PhaseSpace active marker mocap system, which was our initial mocap system at CalArts.

.

The California Institute of Motion Capture, CalArts Producers Show Intro 2007
We can do this the easy way or the hard way –or not do it at all.
Let’s push the boundaries of irony and do the dance of the mushroom cloud anyway!

.


Ke Jiang made Taxi while he was a student at CalArts. He used the PhaseSpace mocap system to create a quirky performance by taking advantage of the artifacts that occur at the edge of the capture volume:

.


Visiting Artist Max Hattler conducted a workshop during the Program in Experimental Animation interim sessions in 2011.  The goal was to produce one or more short works using abstracted motion capture.  Forms I (Taekwondo) is one of those works:

.


Shimbe used the PhaseSpace motion capture system in a unique way for the making of his film, The Wonder Hospital. He rigged a Bunraku puppet with active markers and directed Danielle Ash as the puppeteer. The natural floppiness of the puppet gave an extraordinary quality to the targeted motion. You can see some still photos of the process in my photo album of the initial PhaseSpace tests at CalArts.

“The Wonder Hospital, a 3D & puppet animated film, is a surreal journey of oddity and empty illusion. In a mysterious hospital, modification of physical beauty is not what you would expect. A girl’s desire for superficial beauty leads her to chase after the luring ‘After’ images on a path of advertisements throughout the hospital. But in the end she finds something unimaginable and irreversible.”

.


A Maya playblast from 18 March 2010 of Sara Pocock's little girl character animated via a simple mocap T-pose test. The T-pose test was performed in class by Justin Leon in order to double-check that we had set up the MotionBuilder marker mapping process correctly before moving on to a directed capture session. We came close to doing a brief capture session but ran out of time and had to postpone it until the following class. The realtime data from the T-pose test is all that we used in this test; no clean-up, filtering, retargeting, or other adjustments were done. Justin's simple casual movements gave the character an unintended sense of presence. In subsequent class meetings Justin and Sara worked on directed performance tests in order to gain more experience with that form of mocap –even though Sara's goal was to keyframe all of the animation in the final film.

.


For her MFA thesis project, A Scenic View of the End of the World, Ivy Flores chose to collaborate with choreographer Daniel Charon and composer Alex Wand in an iterative process wherein each participant would base what they were doing on the work the others had done in an earlier version. This enfolding process modified the work with each iteration until Ivy reached the form she found most interesting. The motion capture of the dancers was recorded with the PhaseSpace Impulse system mounted in the computer animation lab in room F105 and processed to extract essential movement. The final installation was presented in the Black and White Studio Gallery A404 on April 4-5, 2013.

.


Ivy Flores' documentation of the process of creating "A Scenic View of the End of the World".

.


Prior to her work with the PhaseSpace Impulse motion capture system, Ivy created this performance animation piece using two orthogonally placed video cameras and Adobe After Effects motion tracking.

.


Rachel Ho collaborated with Julian Petschek and Rob Gordon in the creation of the live motion capture performance, Mo Cap Mo Problems, staged in the Black and White Studio Gallery A404 as part of the Motion Capture for Artists course exhibition in the Spring of 2013.

“Mo Cap Mo Problems is a 15 minute performance and video installation that employs live motion-capture in the engagement of virtual characters and spaces. The performance deals with issues of identity and technology in the service of pop culture, as explored through role-playing and the form of music gigs/concerts.”

.

Following the critical success of Mo Cap Mo Problems, Rachel Ho developed SLEIGHTING. This video features footage from five shows performed on April 3rd, 2014 in the CalArts B&W Studio A404, and was constructed to promote performances as part of the LAX 2014 Festival on September 20th, 2014, at the Bootleg Theater, Los Angeles.

“SLEIGHTING is an unprecedented approach to multimedia performance using real-time motion capture and pre-visualization tools to enable a new breed of performer. It is about showmanship, hype and the future of entertainment.

Through the ability to pilot avatars before a live audience, SLEIGHTING creates a new type of superstar who is no longer confined to being one personality, but is able to be anyone and everyone. Like the superstar DJ or sportsman who commands arenas full of fans, SLEIGHTING presents itself as a future art and sport, and an event that people descend upon to witness and partake in. In this case, the arena is now the site of reinvention for the event film, and the director is now conductor and performer, empowered through technological extensions of the self.

The show has a running time of around 20 minutes and mainly consists of three sketches in which the spectacle of interfacing with virtual realities drives both narrative and design. Real-time motion capture and pre-visualization tools, typically used in the film and gaming industries, are now used to DJ virtual camera moves and special effects for this live event.”

“Penelope the penalized elephant has found himself in prison. Little does he know the true, sinister purpose of the prison.”

OCTOFORNIA is an experimental project created by Gordon Straub and Cole Mercier in their first year at CalArts. They boldly decided to challenge themselves to make a film using 3D and mocap software that they had not previously used –and were just in the initial process of learning. They adapted their project ideas to incorporate and reveal fundamental aspects and “flaws” of the technologies they were working with as a method of enhancing the unsettling aspects of their dystopian story.

.


Jorge Ravelo has experimented with using the Kinect v2 and Jasper Brekelmans' Brekel Pro Body v2 in the creation of both real-time performance and recorded work. In this film, Skip Jumbo, he worked with rapid montage, layering mocap video recordings from MotionBuilder with live-action 16mm film shot with a Bolex camera.

“Skip smiles // Skip jumps // Skip eats cereal // Skip looks at the sky and the news all at once.”

.


Narae Kim, Celine Tien, and Julian Soros of the Pippa's Pan team at the AT&T Developer Summit, where they were declared third-place winners of the AT&T VR/AR Challenge.

Pippa's Pan is a reactive VR short film that takes you, our Agent, through the forest of Pippa's mind to help re-capture fragments of her forgotten memories. Experimenting with techniques in animation, world building, motion-capture, 3D spatial audio, and even light field technology, Pippa's Pan is set to be one of virtual reality's first hybrid live-action short films. A literal forest woven from the sinews of this team's ideas and youthful naiveté, the group of young dreamers will deconstruct concepts of storytelling to re-invent the relationship between audience and narrative.

.

September 20
Second class meeting. After a preliminary discussion we will move into a brief demonstration of Brekel Pro Body v2, Jasper Brekelmans' mocap software for the Microsoft Kinect v2. Some of the features and limitations of single depth camera mocap will be described, and a demonstration of key aspects of Brekel Pro Body v2's real-time performance capture and display will be conducted. Later in the semester we will look at the integration between Brekel Pro Body v2 and Autodesk MotionBuilder. The process of loading the Pro Body v2 plug-in for MotionBuilder into the scene by dragging and dropping it from the Devices folder into the Producer Perspective window will be covered. Loading a stock "characterized" polygonal mesh model already set up to deform via the integral bone system of the character will be demonstrated. The procedure for setting the character up to be driven by the Kinect skeleton system motion data will be shown, as well as the use of multiple models animated by a single real-time performer.

We will also look at the Brekel Pro Face v2 markerless facial capture software and discuss its advantages and disadvantages in comparison to more expensive professional facial capture solutions such as Dynamixyz and Faceware Technologies (Faceware provided a live demonstration to the class in 2016). We will also learn of the sad loss of public access to the brilliantly designed markerless facial capture system, Faceshift, which was purchased by Apple in 2015. Instead of continuing to make it available, Apple removed it from the market, unfortunately depriving artists of a great tool. Much research continues into facial mocap, as facial expressions are a key modality of nonverbal communication.
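Since we will return to characterization later in the semester, here is a minimal sketch of what that bone mapping step amounts to in MotionBuilder's Python API (pyfbsdk). The character name and joint names are illustrative assumptions, not the names Brekel or a given rig will necessarily use:

    # Minimal sketch: defining (characterizing) a character in MotionBuilder
    # via pyfbsdk so it can be driven by mocap data.
    # Assumes the scene already contains a skeleton; all names are illustrative.
    from pyfbsdk import FBCharacter, FBFindModelByLabelName

    character = FBCharacter("DemoCharacter")

    # Each HumanIK mapping slot is exposed as a property named "<Slot>Link".
    # Map a few scene joints to their slots (a real map covers the whole body).
    slot_to_joint = {
        "Hips": "Hips",
        "LeftUpLeg": "LeftUpLeg",
        "LeftLeg": "LeftLeg",
        "RightUpLeg": "RightUpLeg",
        "RightLeg": "RightLeg",
    }

    for slot, joint_name in slot_to_joint.items():
        joint = FBFindModelByLabelName(joint_name)
        link = character.PropertyList.Find(slot + "Link")
        if joint and link:
            link.append(joint)

    # Validate the mapping and switch characterization on (True = biped).
    character.SetCharacterizeOn(True)

Once a model is characterized this way it can take another character (such as the Kinect-driven skeleton) as its input source, which is what allows several models to be animated by a single real-time performer.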

In past years students would log into the online Autodesk Character Generator program, which was free to students and educators using a valid school domain email address. As of August 7, 2021, Autodesk is no longer providing Character Generator as a standalone application; it will still be available to subscribers of certain Autodesk products, including Maya LT. We will have to see how that will impact ease of access for CalArts students. Autodesk's online help provided a good overview of the Character Generator workflow; however, that link is now broken. At the end of the creation process it was possible to download the generated character for use in several programs. The goal was to obtain an FBX file suitable for use in MotionBuilder, which has long been the standard commercially available product for editing motion capture data at studios that cannot afford to create proprietary editing software. There used to be some confusion about the paid options actually being free with an education account: the process indicated that you must pay to download premium features, however as a student or educator you were granted unlimited "credits", which sufficed to "purchase" and download the character. Hopefully we will still be able to work with the easy-to-understand Character Generator. There are other character generation programs available, and we will be taking a look at those. One of these versatile but complex high-end packages is Character Creator from Reallusion, which is capable of generating photoreal characters as well as "cartoony" characters –however, it has a steep learning curve. The complex MetaHuman Creator photoreal character creation program is being developed by Epic Games to work with their powerful Unreal game engine (to which it is limited), and is now available in an Early Access program.

Jasper Brekelmans showcasing some of the key features of Brekel Pro Body v2

.

September 27
Third class meeting. As previously planned we met via Zoom. The original intention was to take a look at the chapter on Joan Staveley in Robert Russett's book, Hyperanimation: Digital Images and Virtual Worlds, in preparation for viewing her seminal work Wanting for Bridge –as well as her performance in Chris Landreth's the end– but this was thwarted as a scan of that chapter had not been prepared in advance (the print edition of Hyperanimation is available in the CalArts Library stacks). After viewing Wanting for Bridge and the end, a discussion of those works was held. One tangent of particular interest was around Chris Landreth's use of the devices of metafiction in the end (Chris has a long history of experimenting with modes of fiction and representation dating back to his film Data Driven: The Story of Franz K.). Pepi Eirew brought up a Chuck Jones Merrie Melodies cartoon, Duck Amuck, as one example of the historic use of metafiction in animation. The discussion moved on to a description of Joan Staveley's participation, along with Sandy Stone, Rob Tow, and Brenda Laurel, in the brilliant 1995 SIGGRAPH Conference Panel organized by Greg Garvey to address questions raised at SIGGRAPH the previous year regarding sexist and non-sexist readings of inherent gender differences: Grids, Guys and Gals: Are you oppressed by the Cartesian Coordinate System?

In past years we would log into the online Autodesk Character Generator program; at the end of the creation process it was possible to download the generated character for use in several programs, and our goal was to obtain an FBX file suitable for use in MotionBuilder. We will be exploring our options for this year, which may include the use of the single point of view camera app Animate 3D from DeepMotion, which incorporates a combination of physics simulation, computer vision, and machine learning to create mocap files. These can be used with rigged models from a wide range of character generation programs, from the cartoon-like avatar styles of Ready Player Me to the hyper-real character styles of MetaHuman Creator. DeepMotion Animate 3D allows for direct export and import of the derived motion capture data into a wide variety of programs, and video tutorials are available for many of them, such as the Unreal Engine, Unity 3D, Maya, and more.
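As a concrete sketch of that FBX hand-off, importing an Animate 3D export into Maya looks roughly like this with the maya.cmds Python module (the file path and namespace are placeholder assumptions):

    # Minimal sketch: importing a (hypothetical) Animate 3D FBX export into Maya.
    import maya.cmds as cmds

    # Make sure the FBX plug-in is loaded before importing.
    if not cmds.pluginInfo("fbxmaya", query=True, loaded=True):
        cmds.loadPlugin("fbxmaya")

    cmds.file(
        "/path/to/animate3d_take.fbx",
        i=True,             # "i" is the import flag ("import" is a reserved word)
        type="FBX",
        ignoreVersion=True,
        namespace="mocap",  # keep the imported skeleton in its own namespace
    )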


.

October 4
Fourth class meeting. In past years motion capture expert John Brennan served as a guest artist. In the photo below you can see him meeting with the class in the E58 motion capture facility in order to explore basic principles for marker placement and the foundational anatomical understanding of locomotion that underpins theories of optical marker-based performance capture. Since we do not have access to E58 this semester we will explore these principles in theory –instead of a full practical demonstration– during our next in-person class meeting (which takes place the week after Indigenous Peoples Day). During the October 18 class meeting I will go over those principles using MotionBuilder 2022, which was recently installed on the VR/Mocap workstation in F105. This week we will be meeting via Zoom and experimenting with the use of DeepMotion's Animate 3D through a series of individual student experiments using the "Freemium" version of the app. We will use the general Video Capture Guidelines provided by DeepMotion. Since the "Freemium" account limits video recordings to 720p, recorded takes can be edited to the desired running time, then converted to 720p via Adobe Premiere (or another NLE) for upload to the "Freemium" account; I recommend experimenting with 10-second clips to start with. Exporting the generated mocap files from Animate 3D in the FBX file format will allow them to be viewed in a wide range of programs. One very useful tool for quick viewing is Autodesk's FBX Review, which is free to download on a variety of devices.
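For anyone working without Premiere, the same trim-and-downscale pass can be scripted. Here is a minimal sketch that shells out to ffmpeg from Python; the filenames are placeholders, and ffmpeg must be installed and on your PATH:

    # Minimal sketch: trim a take to 10 seconds and downscale it to 720p
    # before uploading to a DeepMotion "Freemium" account.
    import subprocess

    subprocess.run(
        [
            "ffmpeg",
            "-i", "raw_take.mp4",   # source recording
            "-t", "10",             # keep only the first 10 seconds
            "-vf", "scale=-2:720",  # height 720, width follows the aspect ratio
            "-c:a", "copy",         # pass the audio through unchanged
            "clip_720p.mp4",
        ],
        check=True,                 # raise an error if ffmpeg fails
    )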

John Brennan explaining best practices in marker placement based upon a working knowledge of human bone and joint structures. This included a description of methods for adapting that understanding to the specific simplifications required for motion capture.

.

October 11
INDIGENOUS PEOPLES DAY

Still of indigenous language groups from the interactive map located at: https://native-land.ca/

“We strive to map Indigenous lands in a way that changes, challenges, and improves the way people see the history of their countries and peoples. We hope to strengthen the spiritual bonds that people have with the land, its people, and its meaning.

We strive to map Indigenous territories, treaties, and languages across the world in a way that goes beyond colonial ways of thinking in order to better represent how Indigenous people want to see themselves.

We provide educational resources to correct the way that people speak about colonialism and indigeneity, and to encourage territory awareness in everyday speech and action.”

.

October 18
Sixth class meeting. Due to a heavier than expected reaction to a Pfizer-BioNTech booster this class was cancelled, and some of the planned material for the day is being postponed to November 1: we will explore basic principles for marker placement and the foundational anatomical understanding of locomotion that underpins theories of optical marker-based performance capture. Since we do not have access to E58 this semester we will explore these principles in theory (instead of a full practical demonstration). We will also go over the differences between Forward Kinematics (FK) and Inverse Kinematics (IK) in working with CGI character rigs. We will move on to experimenting with pose estimation FBX files generated with Animate 3D imported into Autodesk Maya. We will explore the principles inherent in character definition and the requisite re-assigning of bone names (a small scripted sketch of this step follows the photo below), as well as a general overview of the retargeting process. The photo below from 2019 shows Program in Character Animation faculty member John Yoon discussing various aspects of working with rigged characters, using as examples characters that he had created for students in his Maya rigging course. We may have a visit from John this year to discuss the latest developments in working with rigs in Maya and other applications.

vlcsnap-00119

John Yoon recording Bleu Cremers testing an Inverse Kinematic rig for John Brennan
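To give a flavor of the bone re-naming step ahead of time, here is a minimal sketch using Maya's Python module. The left-hand names are hypothetical examples of what a pose-estimation export might contain; the right-hand names follow the common HumanIK-style convention:

    # Minimal sketch: rename imported mocap joints to the names a
    # character definition / retargeting tool expects.
    import maya.cmds as cmds

    RENAME_MAP = {
        "pelvis": "Hips",
        "spine_01": "Spine",
        "upperarm_l": "LeftArm",
        "lowerarm_l": "LeftForeArm",
        "hand_l": "LeftHand",
        # ...the rest of the skeleton follows the same pattern.
    }

    for old_name, new_name in RENAME_MAP.items():
        if cmds.objExists(old_name):
            cmds.rename(old_name, new_name)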

.

October 25
Seventh class meeting. A presentation and discussion of material from the Drawing in Space and Chronophotography class pages. We will look at a wide range of approaches to capturing and displaying motion over time. If pandemic protocols permit, students may be able to have a direct experience of actively drawing in the immersive three-dimensional space of room-scale VR using the HTC Vive and the seminal VR drawing program Tilt Brush.

“Google has been working closely with more than 60 artists to help them explore their style in virtual reality as part of the Tilt Brush Artist in Residence program (AiR). Coming from a wide range of disciplines, these graffiti artists, painters, illustrators, graphic designers, dancers, concept artists, creative technologists and cartoonists have all brought their passion and talent to create some amazing art with Tilt Brush.”

.

November 1
Eighth class meeting. We will cover much of the material originally scheduled for October 18. We will explore basic principles for marker placement and the foundational anatomical understanding of locomotion that underpins theories of optical marker-based performance capture. Since we do not have access to E58 this semester we will explore these principles in theory (instead of a full practical demonstration). We will also go over the differences between Forward Kinematics (FK) and Inverse Kinematics (IK) in working with CGI character rigs (see the sketch below). We will move on to experimenting with pose estimation FBX files generated with Animate 3D imported into Autodesk Maya (and, by extension, Autodesk MotionBuilder). We will explore the principles inherent in character definition and the requisite re-assigning of bone names as well as a general overview of the retargeting process.
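As a quick illustration of the FK/IK difference, here is a minimal sketch on a throwaway three-joint chain in Maya's Python module (the joint names and positions are arbitrary):

    # Minimal sketch: Forward vs. Inverse Kinematics on a simple joint chain.
    import maya.cmds as cmds

    # Build a small "leg": hip -> knee -> ankle.
    cmds.select(clear=True)
    hip = cmds.joint(position=(0, 10, 0), name="demo_hip")
    knee = cmds.joint(position=(0, 5, 1), name="demo_knee")
    ankle = cmds.joint(position=(0, 0, 0), name="demo_ankle")

    # FK: pose by rotating joints directly; the ankle lands wherever
    # the accumulated parent rotations put it.
    cmds.setAttr(hip + ".rotateX", 30)
    cmds.setAttr(knee + ".rotateX", -45)

    # IK: place a handle at the ankle and let the solver work out the
    # hip and knee rotations needed to reach a goal position.
    handle, effector = cmds.ikHandle(
        startJoint=hip,
        endEffector=ankle,
        solver="ikRPsolver",  # Maya's standard rotate-plane solver
    )
    cmds.move(2, 1, 3, handle)  # drag the goal; the chain follows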

In 2013 John Brennan took over the class in my absence while I was away conducting a workshop at the ENSAV of the Université de Toulouse-Le Mirail. John demonstrated a way to perform multiple characters live with the PhaseSpace motion capture system and Autodesk MotionBuilder, which inspired Rachel Ho to work with Julian Petschek and Rob Gordon to workshop and produce the innovative live performance, Mo Cap Mo Problems, in the space of only two weeks. Subsequently John worked with Rachel, Julian, et al. on the piece SLEIGHTING, which was initially staged at CalArts and later at the Bootleg Theater in Los Angeles as part of the 2014 LAX Festival.

Mo Cap Mo Problems is a multimedia performance utilizing real-time motion capture with a playful irreverence that straddles the gap between high tech and low tech. It is self-referential in its engagement with the technology and tools of the trade, while involving improvisation and audience participation more suited to a club-type setting. The 15-minute show consists of a few musical segments including characters such as a trigger-happy sci-fi warrior, Aragor, and a “magic trick” involving beer. Issues of identity and the relationship of technology to pop culture are implicated through role-play involving clip art characters native to Autodesk’s motion capture software.

This video features excerpts from five shows performed April 25th, 2013 at CalArts Black & White Studio Gallery A404.

.

November 8
Ninth class meeting. Several years ago mocap class alumnx Narae Kim presented the latest build of her collaborative VR project Pippa's Pan. Students had a chance to don the HTC Vive and use the handheld controllers to experience the interactive immersive VR story of two lovers reliving memories across the gap of encroaching Alzheimer's Disease. She and her partners have since created a start-up company, Flowly, dedicated to using biofeedback and VR to assist people in managing chronic pain.

.

November 15
Tenth class meeting. A presentation on Chronophotography, beginning with its history as an initial form of motion capture, followed by a review presentation of Drawing in Space; if pandemic protocols permit, we will have one-on-one sessions providing a direct experience of the latest version of my long-standing VR project, Anaphorium. The video below is from the early 3D drawing program SANDDE, shown as part of the Drawing in Space presentation.

.

November 22
Eleventh class meeting. Many years ago we had a presentation and demo by a team from Faceware Technologies. Academic Program Manager Max Murray provided an overview of the company's history in facial performance capture while the other members of the team set up for a demo of their real-time head-mounted facial mocap system. Since Brian Lecaroz had done some tests with Faceware's free demo software (and had been planning on using that for his installation in the course exhibition until a licensing failure arose that could not be rectified in time for that show), Brian was selected as the person to wear the HD Pro Headcam for the initial demo session (and perhaps come away with some files that he could use to continue work on his AI interview project). The team also brought along their lower cost GoPro-based solution. This year we will take a look at the history of facial motion capture and the range of approaches that have been taken. As noted earlier, one of the best markerless systems was Faceshift, which Apple purchased in 2015 and removed from the market, unfortunately depriving artists of a great tool.

This year we may have a presentation on virtual camera systems.

.

November 29
Twelfth class meeting. Two years ago John Brennan continued covering the setup and use of the Unreal Virtual Camera, and also conducted a workshop on using the OptiTrack system with the Oculus Rift and the Unreal game engine in order to provide two different options inside Unreal. The first option is to have the user wearing the Rift tracked with the OptiTrack so that they appear embodied. The second option is to have the user wearing the Rift see and interact with a visual embodiment of a second person being tracked via the OptiTrack system.

In previous years the course was held in the Spring and students developed projects for a semester-end show. With the introduction of CalArts Expo we combined the year-end show with the Expo. As an example, for CalArts Expo 2018 we developed a show entitled Ampersand: Experiments in Kinesthesia, which was held in the B&W Gallery A404 on Thursday, May 3. This year I have reserved the B&W Gallery A404 from December 1-3 so that it might be possible to mount a combined exhibition of the Motion Capture for Artists and Absolute Animation Workshop courses, to open at 7:00 pm, December 2, 2021 (and be struck the following day, December 3).

In 2018 Naomi Cornman and Darby Johnston from Oculus demonstrated the use of their innovative VR sculpting program Medium, providing students with direct hands-on experience via a set of one-on-one sessions. Following is a list of links that Naomi provided us.

Monthly Artist Spotlight series:

Goro Fujita of Oculus Story Studio’s live stream.

Landis Fields of ILMxLab’s live stream.

Krystal Sae Eua of The Mill’s live stream.

Brandon Gillam of Carbon Games’ live stream.

Gio Nakpil of the Oculus REX team.

Steve Lord, amazing traditional and Zbrush sculptor. (This is the only live stream we have with the updated UI and features of the current build).

Two Facebook groups for viewing and sharing work:

Virtual Sculpting

Oculus Medium Artists

ZBrush educator and animator, Glen Southern’s  tutorials.

.

https://www.facebook.com/oculusmedium/videos/657258727789200/

Goro Fujita live stream sculpting session with an earlier version of Medium back in September 2016.

.

December 6
Thirteenth class meeting. TBD. As was noted in the schedule description for the November 29 meeting, in previous years the course was held in the Spring and students developed projects for a semester-end show. With the introduction of CalArts Expo we combined the year-end show with the Expo. As an example, for CalArts Expo 2018 we developed a show entitled Ampersand: Experiments in Kinesthesia, which was held in the B&W Gallery A404 on Thursday, May 3. This year I reserved the B&W Gallery A404 from December 1-3 so that it might be possible to mount a combined exhibition of the Motion Capture for Artists and Absolute Animation Workshop courses, opening at 7:00 pm, December 2, 2021 (and being struck the following day, December 3). Since that will have happened the week before, this will allow us to do something else during this penultimate class meeting.

.

December 13
Fourteenth class meeting. Semester wrap up.

.