FVEA 419/619: Motion Capture for Artists 2017


“Progress is not possible without deviation” –Frank Zappa

Evolving Class Schedule for 2017:

January 9
First class meeting. The first and last required reading of the semester:  Maureen Furniss’s comprehensive overview paper, Motion Capture.  We will hold a general discussion of various motion capture technologies and the nature of the systems we have at CalArts. During the discussion we will explore the ways in which the course might work for the diverse group of students making up the class.  Narae Kim will show and discuss the work she has been doing on the project Pippa’s Pan. The specifics of the week-to-week course structure will be determined in large part by the individual interests of the group.  Following the discussions we will view several approaches to motion capture –starting with a brilliantly cynical warning about the type of mocap abuse we hope to avoid.  Most of the following videos include scenes created via the PhaseSpace mocap system at CalArts.

.

The California Institute of Motion Capture, CalArts Producers Show Intro 2007
We can do this the easy way or the hard way –or not do it at all.
Let’s push the boundaries of irony and do the dance of the mushroom cloud anyway!

.


Ke Jiang made Taxi while he was a student at CalArts.  He used the PhaseSpace mocap system to create a quirky performance by taking advantage of the artifacts that occur at the edge of the capture volume:

.


Visiting Artist Max Hattler conducted a workshop during the Program in Experimental Animation interim sessions in 2011.  The goal was to produce one or more short works using abstracted motion capture.  Forms I (Taekwondo) is one of those works:

.


Shimbe used the PhaseSpace motion capture system in a unique way for the making of this film. He rigged a Bunraku puppet with active markers and directed Danielle Ash as the puppeteer. The natural floppiness of the puppet provided an extraordinary quality to the targeted motion. You can see some still photos of the process in my photo album of the initial PhaseSpace tests at CalArts.

“The Wonder Hospital, a 3D & puppet animated film, is a surreal journey of oddity and empty illusion. In a mysterious hospital, modification of physical beauty is not what you would expect. A girl’s desire for superficial beauty leads her to chase after the luring ‘After’ images on a path of advertisements throughout the hospital. But in the end she finds something unimaginable and irreversible.”

.


A Maya playblast from 18 March 2010 of Sara Pocock’s little girl character animated via a simple mocap T-pose test. The T-pose test was performed in class by Justin Leon in order to double-check that we had set up the MotionBuilder marker mapping process correctly before moving on to a directed capture session. We came close to doing a brief capture session but ran out of time and had to postpone it until the following class. Only the realtime data from the T-pose test was used; no clean-up, filtering, retargeting, or other adjustments were done. Justin’s simple casual movements gave the character an unintended sense of presence. In subsequent class meetings Justin and Sara worked on directed performance tests in order to gain more experience with that form of mocap –even though Sara’s goal was to keyframe all of the animation in the final film.

.


For her MFA thesis project, A Scenic View of the End of the World, Ivy Flores chose to collaborate with choreographer Daniel Charon and composer Alex Wand in an iterative process wherein each participant would base what they were doing on the work the others had done in an earlier version.  This enfolding process modified the work with each iteration until Ivy reached the form she found most interesting.  The motion capture of the dancers was recorded with the PhaseSpace Impulse system mounted in the computer animation lab in room F105 and processed to extract essential movement.  The final installation was presented in the Black and White Studio Gallery A404 on April 4–5, 2013.

.


Ivy Flores’ documentation of the process of creating “A Scenic View of the End of the World”.

.


Prior to her work with the PhaseSpace Impulse motion capture system, Ivy created this performance animation piece using two orthogonally placed video cameras and Adobe After Effects motion tracking.
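The idea behind the two-camera setup generalizes nicely: the front camera’s 2D track supplies the horizontal and vertical axes, while the side camera’s horizontal axis supplies depth, so the two tracks can be merged into an approximate 3D path. Here is a minimal Python sketch of that merge; the frame alignment and shared scale are simplifying assumptions (a real setup would calibrate the two views first):

```python
# Merge 2D motion tracks from two orthogonally placed cameras into an
# approximate 3D path. Assumes both tracks are frame-aligned and share
# a common scale; a simplification for illustration only.
def merge_orthogonal_tracks(front_track, side_track):
    """front_track, side_track: lists of per-frame (u, v) positions."""
    points_3d = []
    for (fu, fv), (su, sv) in zip(front_track, side_track):
        x = fu                # horizontal position, from the front view
        y = (fv + sv) / 2.0   # both cameras see height; average them
        z = su                # depth comes from the side view's horizontal axis
        points_3d.append((x, y, z))
    return points_3d
```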

.


Rachel Ho collaborated with Julian Petschek and Rob Gordon in the creation of the live motion capture performance, Mo Cap Mo Problems, staged in the Black and White Studio Gallery A404 as part of the Motion Capture for Artists course exhibition in the Spring of 2013.

“Mo Cap Mo Problems is a 15 minute performance and video installation that employs live motion-capture in the engagement of virtual characters and spaces. The performance deals with issues of identity and technology in the service of pop culture, as explored through role-playing and the form of music gigs/concerts.”

.

Following the critical success of Mo Cap Mo Problems, Rachel Ho developed SLEIGHTING. This video features footage from 5 shows performed on April 3rd, 2014 in the CalArts B&W Studio A404 and was constructed to promote performances as part of the LAX 2014 Festival on September 20th, 2014, at the Bootleg Theater, Los Angeles.

“SLEIGHTING is an unprecedented approach to multimedia performance using real-time motion capture and pre-visualization tools to enable a new breed of performer. It is about showmanship, hype and the future of entertainment.

Through the ability to pilot avatars before a live audience, SLEIGHTING creates a new type of superstar who is no longer confined to being one personality, but is able to be anyone and everyone. Like the superstar DJ or sportsman who commands arenas full of fans, SLEIGHTING presents itself as a future art and sport, and an event that people descend upon to witness and partake in. In this case, the arena is now the site of reinvention for the event film, and the director is now conductor and performer, empowered through technological extensions of the self.

The show has a running time of around 20 minutes and mainly consists of three sketches in which the spectacle of interfacing with virtual realities drives both narrative and design. Real-time motion capture and pre-visualization tools, typically used in the film and gaming industries, are now used to DJ virtual camera moves and special effects for this live event.”

“Penelope the penalized elephant has found himself in prison. Little does he know the true, sinister purpose of the prison.”

OCTOFORNIA is an experimental project created by Gordon Straub and Cole Mercier in their first year at CalArts.  They boldly decided to challenge themselves to make a film using 3D and mocap software that they had not previously used –and were just in the initial process of learning.  They adapted their project ideas to incorporate and reveal fundamental aspects and “flaws” of the technologies they were working with as a method of enhancing the unsettling aspects of their dystopian story.

.


Jorge Ravelo has experimented with using the Kinect v2 and Jasper Brekelman’s Brekel Pro Body v2 in the creation of real-time performance and recorded work. In this work-in-progress fragment he is experimenting with layering mocap recordings from MotionBuilder with live-action 16mm film shot with a Bolex camera.

.


Narae Kim, Celine Tien, and Julian Soros of the Pippa’s Pan team at the AT&T Developer Summit, where they were declared third-place winners of the AT&T VR/AR Challenge.

Pippa’s Pan is a reactive VR short film that takes you, our Agent, through the forest of Pippa’s mind to help re-capture fragments of her forgotten memories. Experimenting with techniques in animation, world building, motion-capture, 3D spatial audio, and even light field technology, Pippa’s Pan is set to be one of virtual reality’s first hybrid live-action short films. A literal forest woven from the sinews of this team’s ideas and youthful naiveté, the group of young dreamers will deconstruct concepts of storytelling to re-invent the relationship between audience and narrative.

.

January 16
Martin Luther King Holiday

.

January 23
Second class meeting.  Last year visiting artist John Brennan discussed his work with motion capture for theatrical motion pictures, virtual reality, and more.  We suited up class member Tristan Kilmer and learned about the placement of markers on the suit relative to the body, examined the other components of the PhaseSpace Impulse system, and explored several ways the live data may be viewed in the PhaseSpace Impulse software.  We started up Autodesk MotionBuilder and covered loading the PhaseSpace OWL plug-in into a scene, loading in an “actor” element, assigning markers to the “actor”, dropping a “character” onto the “actor”, and recording some performances.  We added additional “characters” to demonstrate multiples driven by a single “actor”.  I had intended to do those demos on my own this year; however, I am postponing them to the following week due to illness.


John Brennan pointing out details in the legacy PhaseSpace LED marker placement diagram compared to contemporary optimizations of marker pair placement at the ends of major limb bones (e.g. the distal end of the humerus) in order to better isolate motion relative to the joints.
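One way to see the benefit of those pairs: with a marker on either side of a joint, the joint center can be approximated as the midpoint of the pair, largely independent of how the surrounding flesh and suit shift. A toy Python sketch of that idea (all coordinates invented for illustration):

```python
# Approximate a joint center as the midpoint of a marker pair that
# straddles the joint. Coordinates below are invented for illustration.
def joint_center(marker_a, marker_b):
    """Midpoint of two 3D marker positions."""
    return tuple((a + b) / 2.0 for a, b in zip(marker_a, marker_b))

# Two hypothetical markers on either side of an elbow (cm):
print(joint_center((10.0, 95.2, 4.1), (13.6, 95.0, 4.3)))  # roughly (11.8, 95.1, 4.2)
```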

.

January 30
Third class meeting. We suited up class TA Narae Kim and learned about the placement of markers on the suit relative to the body, examined the other components of the PhaseSpace Impulse system, and explored several ways the live data may be viewed in the PhaseSpace Impulse software.  We started up Autodesk MotionBuilder and covered loading the PhaseSpace OWL plug-in into a scene, observing the markers as Narae moved around the scene, and recording a T-pose, some simple ROM tests, and walking around.  We loaded an “actor” element into the scene, aligned the “actor” proportions to match the marker positions, created a marker set, dropped markers into the appropriate “balls” on the MarkerSet, and created rigid bodies.  We then dropped a “character” onto the “actor” and saved .fbx and .c3d files of the project.  After the break students ran MotionBuilder under the Parallels virtual machine on their individual workstations and followed along with a demo of the process covered earlier.  Each student loaded a .c3d file from the first part of the afternoon and had the direct experience of mapping markers to an “actor” and making rigid body groups.  We ran out of time and planned to pick up the work in the following week’s class.
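For anyone who would rather script the Actor-to-Character hookup than click through it, here is a minimal sketch using MotionBuilder’s Python API (pyfbsdk). It assumes the markers have already been mapped to the Actor in the UI as described above; the file path and character name are hypothetical:

```python
# Minimal pyfbsdk sketch: drive a characterized model from an Actor.
from pyfbsdk import FBApplication, FBActor, FBCharacterInputType, FBSystem

FBApplication().FileOpen(r"C:\mocap\tpose_test.fbx")  # hypothetical session file

scene = FBSystem().Scene
# Reuse the scene's Actor if one exists, otherwise create one.
actor = scene.Actors[0] if len(scene.Actors) else FBActor("Actor_1")

# Find a previously characterized model by (hypothetical) name.
character = None
for ch in scene.Characters:
    if ch.Name == "MyCharacter":
        character = ch
        break

if character:
    # Retarget: the character now follows the actor in real time.
    character.InputType = FBCharacterInputType.kFBCharacterInputActor
    character.InputActor = actor
    character.ActiveInput = True
```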

February 6
Fourth class meeting. We continued the MotionBuilder under Parallels lesson from the previous week.  This process was successful, and after the break we held a brief discussion on the direction to be taken in coming weeks.  This was followed by viewing several videos on the use of performance capture in feature film production, along with contextualizing explanations and discussion of some of the details raised in the videos.

One of the promos we viewed illustrating advanced approaches in performance capture.
.

Another promo showing an earlier state-of-the-art approach to facial capture.

.

February 13
Fifth class meeting. We will continue with the PhaseSpace system, with attention to the logic of marker placement and range of motion (ROM) tests.  This will be followed by an introduction to the Kinect-based Brekel Pro Body 2 system.
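As a concrete illustration of what a ROM test measures, here is a small Python sketch that estimates a joint’s range of motion from three marker positions per frame, e.g. shoulder, elbow, and wrist markers for an elbow test. The frame data layout is a hypothetical stand-in for whatever the capture session produces:

```python
# Estimate a joint's range of motion (ROM) from per-frame marker triples.
import math

def angle_at(b, a, c):
    """Angle at point b (degrees) between segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(v1[i] * v2[i] for i in range(3))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp float error
    return math.degrees(math.acos(cos))

def range_of_motion(frames):
    """frames: list of (shoulder, elbow, wrist) 3D marker positions."""
    angles = [angle_at(elbow, shoulder, wrist)
              for shoulder, elbow, wrist in frames]
    return min(angles), max(angles)
```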

.

February 20
Presidents’ Day Holiday.

.

February 27
Sixth class meeting.  A demo on the use of Autodesk Character Generator to create a customized character by selecting from and blending pre-existing templates.  The completed character model can be downloaded in several formats, including a generic .fbx file which can then be loaded into Autodesk MotionBuilder as a fully rigged deformable model.  Each student was able to create a character (although there was some difficulty with advanced mode character creation, which allowed for the inadvertent creation of characters that could not be downloaded without paying a fee).  Our goal had been for each student to load their generated character into MotionBuilder and then set it up to be driven by a previously recorded motion file created in class with the PhaseSpace mocap system.  All the workstations in the lab had been updated over the weekend to run Microsoft Windows 10 under the Parallels Virtual Machine; however, something had broken and less than half the class was able to open Windows and run MotionBuilder.  We decided to put aside the MotionBuilder exercise until the following week and instead view and discuss video examples of the early and recent pioneering developments in the art of facial performance capture.
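Once the Windows issue is sorted out, the loading step itself can also be scripted rather than done through the menus. A minimal sketch, assuming a Character Generator export saved locally (the path is hypothetical); after the merge, the new character can be driven from an Actor as in the earlier sketch:

```python
# Merge a Character Generator .fbx into the current MotionBuilder scene.
from pyfbsdk import FBApplication, FBSystem

FBApplication().FileMerge(r"C:\chargen\my_character.fbx")  # hypothetical path

# The merged character should now appear in the scene's character list.
for ch in FBSystem().Scene.Characters:
    print(ch.Name)
```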

 .

March 6
Seventh class meeting. We tested the login to Windows 10 via Parallels (which IT had tested as working just before class) and found that there were still difficulties.  Matt and Scott looked into it and came up with a workaround.  We proceeded to have each student use previously recorded PhaseSpace mocap data in MotionBuilder to work through the process of creating marker sets on the Actor module, assigning rigid bodies, and then driving their custom Character from the Actor.  We then took a look at the use of the Parent/Child flavor of constraint to attach Primitive shapes and Lights to various markers and/or Rigid Bodies.
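That last step can be scripted as well. A hedged pyfbsdk sketch: create a Parent/Child constraint, then make a primitive follow a rigid body (the rigid body name here is hypothetical):

```python
# Attach a cube to a mocap rigid body with a Parent/Child constraint.
from pyfbsdk import FBConstraintManager, FBFindModelByLabelName, FBModelCube

# Locate the Parent/Child constraint type and create an instance of it.
mgr = FBConstraintManager()
constraint = None
for i in range(mgr.TypeGetCount()):
    if mgr.TypeGetName(i) == "Parent/Child":
        constraint = mgr.TypeCreateConstraint(i)
        break

cube = FBModelCube("FollowerCube")
cube.Show = True
parent = FBFindModelByLabelName("RigidBody_Head")  # hypothetical rigid body name

if constraint and parent:
    constraint.ReferenceAdd(0, cube)    # group 0: constrained object (child)
    constraint.ReferenceAdd(1, parent)  # group 1: source (parent)
    constraint.Snap()                   # activate, preserving the current offset
```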

.

March 13
Eighth class meeting. In researching methods for realizing some of the areas of interest expressed by students in the class, I reviewed one of Vimeo’s Pick of the Year winners: Method Studios’ Director’s Cut of the 2016 AICP Sponsor Reel.  The performance capture used for the series was done at House of Moves using their Vicon mocap system and then processed in Houdini to create a wide range of abstractions of the dancers’ bodies.  Nick Campbell of Greyscalegorilla used this work as an inspiration for showing how Cinema 4D could be used to create similar motion graphics from stock mocap in Adobe’s Mixamo libraries.  Mixamo’s models and animation are available at no additional cost to Adobe Creative Cloud users.  Visiting artist and EA faculty member Theo Vaillant will demonstrate some of the principles covered in Greyscalegorilla’s 8 tutorials on Motion Capture in C4D.
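For those who want to try the Mixamo-to-motion-graphics idea themselves, Cinema 4D’s Python API can sample a mocap joint’s world position frame by frame; the resulting points can then drive trails, splines, or cloner effectors. A minimal sketch (the joint name follows Mixamo’s standard rig naming, but check your own hierarchy):

```python
# Sample a Mixamo joint's world-space path in Cinema 4D (c4d Python).
# Run from the Script Manager with a Mixamo-rigged character in the scene.
import c4d

doc = c4d.documents.GetActiveDocument()
joint = doc.SearchObject("mixamorig:Hips")        # standard Mixamo hip joint
fps = doc.GetFps()

positions = []
for frame in range(0, 90):                        # sample the first 90 frames
    doc.SetTime(c4d.BaseTime(frame, fps))
    doc.ExecutePasses(None, True, True, True, 0)  # evaluate animation/expressions
    positions.append(joint.GetMg().off)           # world-space joint position

print(positions[:5])  # these points can now drive motion graphics setups
```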

.

March 20
Ninth class meeting.  Last year we held a review of student works-in-progress in preparation for the upcoming course exhibition.  We discussed hardware requirements for the show, including options for multiple VR projects potentially employing two HTC Vive Pre headsets, two Oculus Rift DK2s, and perhaps one Oculus Rift CV1.  We understood that if we were to utilize all these VR systems (and support any Kinect v2 realtime projects) simultaneously, we would need to locate more high-end Windows PCs to run them.

.

March 27
Spring Break

.

April 3
Tenth class meeting. Narae Kim presented the latest build of her collaborative VR project Pippa’s Pan.  Students had a chance to don the HTC Vive and use the handheld controllers to experience the interactive immersive VR story of two lovers reliving memories across the gap of encroaching Alzheimer’s Disease.

“Pippa’s Pan is a reactive VR short film. It takes you, our Agent, through the forest of Pippa’s mind to help re-capture fragments of her forgotten memories. Experimenting with techniques in animation, world building, motion-capture, 3D spatial audio, and even light field technology, Pippa’s Pan is set to be one of virtual reality’s first hybrid live-action short films. A literal forest woven from the sinews of our team’s ideas and youthful naiveté, our group of young dreamers will deconstruct concepts of storytelling to re-invent the relationship between audience and narrative.”

After people had a chance to experience Pippa’s Pan, we set up an additional 5 PhaseSpace mocap cameras in a ring at waist level to complement the 8 ceiling-mounted cameras. Gordon Straub suited up in the recently repaired large PhaseSpace suit in order to demonstrate the higher fidelity of the mocap data and to prepare for a performance capture session scheduled for the period immediately after the end of class.

.

April 10
Eleventh class meeting. A presentation on chronophotography, beginning with its history as an initial form of motion capture. This was followed by a review presentation of Drawing in Space, and one-on-one sessions providing a direct experience of the latest version of my long-standing VR project, Anaphorium.

.

April 17
Twelfth class meeting.  Last year we had a presentation and demo by a team from Faceware Technologies.  Academic Program Manager Max Murray provided an overview of the company’s history in facial performance capture while the other members of the team set up for a demo of their real-time head-mounted facial mocap system.  Since Brian Lecaroz had done some tests with Faceware’s free demo software (and had been planning on using it for his installation in the course exhibition until there was a license collapse that could not be rectified in time for Thursday’s show), Brian was selected as the person to wear the HD Pro Headcam for the initial demo session (and perhaps come away with some files that he could use to continue work on his AI interview project).  The team also brought along their lower-cost GoPro-based solution.  This year we will take a look at the history of facial motion capture and a range of approaches that have been taken.  One of the best markerless systems was Faceshift, which was purchased by Apple in 2015. Instead of continuing to make it available, Apple removed it from the market, unfortunately depriving artists of a great tool.  Amanda Vincelli has asked to learn more about the practical use of facial capture for her projects, so we will be exploring how to work with the Brekel Pro Face 2 software for the Kinect v2 and MotionBuilder as one option.

.

April 24
Thirteenth class meeting. Naomi Cornman and Darby Johnston from Oculus will demonstrate the use of their innovative VR sculpting program Medium, providing students with direct hands-on experience via a set of one-on-one sessions.  Oculus is supplying us with an Oculus Touch workstation under their NextGen program so that students and faculty will be able to explore the potential of Medium more deeply and have the option of using it for projects.  Following is a list of links that Naomi sent us.

Monthly Artist Spotlight series:

Goro Fujita of Oculus Story Studio’s live stream.

Landis Fields of ILMxLab’s live stream.

Krystal Sae Eua of The Mill’s live stream.

Brandon Gillam of Carbon Games’ live stream.

Gio Nakpil of the Oculus REX team.

Steve Lord, amazing traditional and ZBrush sculptor. (This is the only live stream we have with the updated UI and features of the current build.)

Two Facebook groups for viewing and sharing work:

Virtual Sculpting

Oculus Medium Artists

ZBrush educator and animator Glen Southern’s tutorials.

.


Goro Fujita’s live stream sculpting session with an earlier version of Medium back in September 2016.

.

May 1
Fourteenth class meeting. We will have already set up on Sunday, April 30, for the end-of-semester Mocap for Artists exhibition in the B&W Studio Gallery A404 as a part of the Digital Arts Expo. Our exhibition is scheduled to open at 2:00pm on Thursday, May 4. I will be attending the Unity Vision VR/AR Summit 2017 in Hollywood on Monday; however, students wishing to do more installation work in A404 can contact me about continued access during the days leading up to the exhibition.  In lieu of the Monday class meeting we will gather at the class exhibition on Thursday for our final meeting of the semester.

.

May 8
The School of Film/Video Bijou Festival.

.

