FVEA 419/619: Motion Capture for Artists 2016


“Progress is not possible without deviation” –Frank Zappa

Evolving Class Schedule for 2016:

January 11
First class meeting. The first and last required reading of the semester:  Maureen Furniss's comprehensive overview paper, Motion Capture.  We will hold a general discussion of various motion capture technologies and the nature of the systems we have at CalArts.  During the discussion we will explore the ways in which the course might work for the diverse group of students making up the class.  The specifics of the week-to-week course structure will be determined in large part by the individual interests of the group.  Following the discussions we will view several approaches to motion capture –starting with a brilliantly cynical warning about the type of mocap abuse we hope to avoid.  Most of the following videos include scenes created via the PhaseSpace mocap system at CalArts. The final video is a look at the innovative early approach to motion capture for computer graphics used in the pioneering television commercial Brilliance created by Robert Abel and Associates.  All of these projects serve as useful examples for contemplating the possibilities of what we might do in this semester’s course.

.

The California Institute of Motion Capture, CalArts Producers Show Intro 2007
We can do this the easy way or the hard way –or not do it at all.
Let’s push the boundaries of irony and do the dance of the mushroom cloud anyway!

.


Ke Jiang made Taxi while he was a student at CalArts.  He used the PhaseSpace mocap system to create a quirky performance by taking advantage of the artifacts that occur at the edge of the capture volume:

.


Visiting Artist Max Hattler conducted a workshop during the Program in Experimental Animation interim sessions in 2011.  The goal was to produce one or more short works using abstracted motion capture.  Forms I (Taekwondo) is one of those works:

.


Shimbe used the PhaseSpace motion capture system in a unique way for the making of this film. He rigged a Bunraku puppet with active markers and directed Danielle Ash as the puppeteer. The natural floppiness of the puppet provided an extraordinary quality to the targeted motion. You can see some still photos of the process in my photo album of the initial PhaseSpace tests at CalArts.

“The Wonder Hospital, a 3D & puppet animated film, is a surreal journey of oddity and empty illusion. In a mysterious hospital, modification of physical beauty is not what you would expect. A girl’s desire for superficial beauty leads her to chase after the luring ‘After’ images on a path of advertisements throughout the hospital. But in the end she finds something unimaginable and irreversible.”

.


A Maya playblast from 18 March 2010 of Sara Pocock’s little girl character animated via a simple mocap T-pose test. The T-pose test was performed in class by Justin Leon in order to double-check that we had set up the MotionBuilder marker mapping process correctly before moving on to a directed capture session. We came close to doing a brief capture session but ran out of time and had to postpone the session until the upcoming class. The realtime data from the T-pose test is all that we used in this test. No clean-up, filtering, retargeting, or other adjustments were done. Justin’s simple casual movements gave the character an unintended sense of presence. In subsequent class meetings Justin and Sara worked on directed performance tests in order to gain more experience with that form of mocap –even though Sara’s goal was to keyframe all of the animation in the final film.
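The point of a T-pose pass like this is simply to confirm that the marker-to-actor mapping is sane before recording anything. A minimal sketch of that sanity check, in plain Python with hypothetical marker names and tolerances (not the actual PhaseSpace or MotionBuilder identifiers):

```python
def check_tpose(markers, tol=0.10):
    """markers: dict of name -> (x, y, z) in meters, y-up.
    Returns a list of human-readable problems (empty if the pose passes)."""
    problems = []
    for side in ("left", "right"):
        shoulder = markers[f"{side}_shoulder"]
        wrist = markers[f"{side}_wrist"]
        # In a T-pose the wrist should sit near shoulder height...
        if abs(wrist[1] - shoulder[1]) > tol:
            problems.append(f"{side} arm not horizontal")
        # ...and well outboard of the shoulder along x.
        if abs(wrist[0]) <= abs(shoulder[0]):
            problems.append(f"{side} arm not extended")
    return problems

# Synthetic frame: arms out and level, so no problems are reported.
tpose = {
    "left_shoulder": (-0.20, 1.45, 0.0), "left_wrist": (-0.75, 1.47, 0.0),
    "right_shoulder": (0.20, 1.45, 0.0), "right_wrist": (0.75, 1.43, 0.0),
}
print(check_tpose(tpose))  # -> []
```

A swapped marker assignment shows up immediately as a reported problem, which is far cheaper to catch here than after a directed capture session.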

.


For her MFA thesis project, A Scenic View of the End of the World, Ivy Flores chose to collaborate with choreographer Daniel Charon and composer Alex Wand in an iterative process wherein each participant would base what they were doing on the work the others had done in an earlier version.  This enfolding process modified the work with each iteration until Ivy reached the form she found most interesting.  The motion capture of the dancers was recorded with the PhaseSpace Impulse system mounted in the computer animation lab located in room F105 and processed to extract essential movement.  The final installation was presented in the Black and White Studio Gallery A404 from April 4-5, 2013.

.


Ivy Flores’ documentation of the process of creating “A Scenic View of the End of the World”.

.


Prior to her work with the PhaseSpace Impulse motion capture system, Ivy created this performance animation piece using two orthogonally placed video cameras and Adobe After Effects motion tracking.
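The two-orthogonal-camera trick can be sketched in a few lines: a front camera's 2D track supplies (x, y) and a side camera's supplies (z, y), so merging the per-frame tracks yields a 3D path. The function and axis conventions below are assumptions for illustration, not Ivy's actual pipeline:

```python
def merge_orthogonal_tracks(front, side):
    """front: list of (x, y) points from the front camera;
    side: list of (z, y) points from the side camera.
    Returns a list of (x, y, z), averaging the shared vertical
    coordinate seen by both views."""
    return [(fx, (fy + sy) / 2.0, sz)
            for (fx, fy), (sz, sy) in zip(front, side)]

# Two frames of tracked motion from each camera.
front_track = [(0.0, 1.0), (0.1, 1.5)]
side_track = [(0.5, 1.0), (0.4, 2.0)]
print(merge_orthogonal_tracks(front_track, side_track))
# -> [(0.0, 1.0, 0.5), (0.1, 1.75, 0.4)]
```

Averaging the y coordinate papers over small tracking disagreements between the two views; a real setup would also need the cameras synchronized and scaled to a common unit.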

.


Rachel Ho collaborated with Julian Petschek and Rob Gordon in the creation of the live motion capture performance, Mo Cap Mo Problems, staged in the Black and White Studio Gallery A404 as part of the Motion Capture for Artists course exhibition in the Spring of 2013.

“Mo Cap Mo Problems is a 15 minute performance and video installation that employs live motion-capture in the engagement of virtual characters and spaces. The performance deals with issues of identity and technology in the service of pop culture, as explored through role-playing and the form of music gigs/concerts.”

.

Following upon the critical success of Mo Cap Mo Problems, Rachel Ho developed SLEIGHTING. This video features footage from 5 shows performed on April 3rd, 2014 in the CalArts B&W Studio A404 and was constructed to promote performances as part of the LAX 2014 Festival on September 20th, 2014, at the Bootleg Theater, Los Angeles.

“SLEIGHTING is an unprecedented approach to multimedia performance using real-time motion capture and pre-visualization tools to enable a new breed of performer. It is about showmanship, hype and the future of entertainment.

Through the ability to pilot avatars before a live audience, SLEIGHTING creates a new type of superstar who is no longer confined to being one personality, but is able to be anyone and everyone. Like the superstar DJ or sportsman who commands arenas full of fans, SLEIGHTING presents itself as a future art and sport, and an event that people descend upon to witness and partake in. In this case, the arena is now the site of reinvention for the event film, and the director is now conductor and performer, empowered through technological extensions of the self.

The show has a running time of around 20 minutes and mainly consists of three sketches in which the spectacle of interfacing with virtual realities drives both narrative and design. Real-time motion capture and pre-visualization tools, typically used in the film and gaming industries, are now used to DJ virtual camera moves and special effects for this live event.”

“Penelope the penalized elephant has found himself in prison. Little does he know the true, sinister purpose of the prison.”

OCTOFORNIA is an experimental project created by Gordon Straub and Cole Mercier in their first year at CalArts.  They boldly decided to challenge themselves to make a film using 3D and mocap software that they had not previously used –and were just in the initial process of learning.  They adapted their project ideas to incorporate and reveal fundamental aspects and “flaws” of the technologies they were working with as a method of enhancing the unsettling aspects of their dystopian story.

.


Jorge Ravelo has recently been working with the Kinect for Windows v2 and Jasper Brekelmans’ Brekel Pro Body v2 in the creation of real-time performance and recorded work.  In this work-in-progress fragment he is experimenting with layering mocap recordings from MotionBuilder with live action 16mm film shot with a Bolex camera.

.


This promo documents several aspects of an early marker based motion capture system. Markers were hand tracked from a 2D screen and then plotted into the 3D character’s motion channels. The software used in this project grew out of the code used to drive motion control cameras. That code was soon incorporated into the first off-the-shelf 3D CG package created and marketed by Wavefront, Inc. Robert Abel and Associates created the groundbreaking Brilliance commercial for a single airing during Super Bowl XIX. Notice the subtle and not so subtle interplay of hype and confidence building between studio, agency, and client.

.

January 18
Martin Luther King Holiday

.

January 25
Second class meeting.  Visiting artist John Brennan will discuss his work with motion capture for theatrical motion pictures, virtual reality, and more.  We will suit up class member Tristan Kilmer and learn about the placement of markers on the suit relative to the body, examine the other components of the PhaseSpace Impulse system, and explore several ways the live data may be viewed in the PhaseSpace Impulse software.  We will start up Autodesk MotionBuilder and go over loading the PhaseSpace OWL plug-in into a scene, loading in an “actor” element, assigning markers to the “actor”, dropping a “character” onto the “actor” and recording some performances.  We will add additional “characters” to demonstrate multiples driven by a single “actor”.  If time permits we will look at the use of PhaseSpace data within Unity 4.


John Brennan pointing out details in the legacy PhaseSpace LED marker placement diagram compared to contemporary optimizations of marker pair placement at the end of major limb bones (e.g. the distal end of the humerus) in order to better isolate motion relative to the joints.

.

February 1
Third class meeting.  Continuing work with the PhaseSpace system and MotionBuilder under a Parallels virtual machine on individual student workstations had to be put on hold due to technical problems with the PhaseSpace OWL plug-in streaming data into MotionBuilder.  Plan B involved obtaining an overview of how motion capture has been used to create a broad range of projects.

.

February 8
Fourth class meeting. Gordon Straub conducted a follow along demo on the use of the Autodesk Character Generator program to create personalized characters from templates. These characters were then exported as “characterized” models that could be imported into MotionBuilder for animating via mocap.  As students completed the lessons of the demo they were invited one-by-one to put on the Oculus Rift DK2 and get a taste of using the PhaseSpace pen to draw out forms in a broken version of Elijah Kleeman’s BFA thesis project Traces.  The project was broken by our upgrade to Unity 5, which is missing several key features –such as the ability to disengage the native inertial tracking of the DK2 in order to track the DK2 via our custom gnomon rig constructed with PhaseSpace active markers.  Working with both tracking systems operating results in double transformations, so that a 90º rotation of the user’s head results in a 180º rotation of the virtual image.  It is hoped that a way to disable the native tracking of the DK2 will be found soon.  Unity developers are working on that issue. The last half of the class was spent in suiting up Tristan Kilmer, adjusting the LED markers, and running a casual motion capture session. The resulting motion data was exported as an FBX file to the class directory where students could download and import it into MotionBuilder for use with the custom characters built during the first half of the class.
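The double-transformation problem described above can be reproduced with plain rotation arithmetic: if both the DK2's inertial tracker and the external marker rig independently apply the same measured head rotation, the virtual camera turns twice as far. A yaw-only sketch (function and flag names are illustrative, angles in degrees):

```python
def apply_tracking(head_yaw, inertial_enabled, external_enabled):
    """Return the yaw applied to the virtual camera when each enabled
    tracking source independently adds the measured head rotation."""
    yaw = 0.0
    if inertial_enabled:
        yaw += head_yaw   # DK2's built-in inertial tracking
    if external_enabled:
        yaw += head_yaw   # external PhaseSpace marker-rig tracking
    return yaw

print(apply_tracking(90, True, True))   # both sources on: 180 (the bug)
print(apply_tracking(90, False, True))  # inertial disabled: 90 (correct)
```

This is why disabling the headset's native tracking matters: with both sources active the composition of the two identical rotations doubles every head movement.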


Tristan Kilmer wearing the Oculus Rift DK2 (with room scale head tracking implemented via the PhaseSpace LED marker gnomon) and inscribing space with looping 3D traces extruded from the tip of the PhaseSpace marker based stylus.

.

February 15
President’s Day Holiday.

.

February 22
Fifth class meeting. Visiting artist John Brennan was scheduled to assist students in directing a performance capture session and optimizing the data for use in a personal project.  Unfortunately John had to cancel due to illness and we proceeded as best we could without his expert advice.

.

February 29
Sixth class meeting.  Students explore the newly acquired HTC Vive Pre 360º room-scale VR system through the SteamVR introductory tutorial and a direct experience of the animated immersive VR environment theBlu: Encounter from Wevr.  For these in-class demos we did not employ much of the advice in The Art of the Demo for VR installations that we will be employing in our course exhibition in the B&W Studio Gallery A404 on Thursday, April 14.

.

March 7
Seventh class meeting. Students continue exploring the HTC Vive Pre headset and controllers with the latest version of the Tilt Brush 3D drawing demo.  Tilt Brush was originally developed by Patrick Hackett and Drew Skillman who took home an award for Best GUI at the Proto Awards in 2014.  Subsequently their fledgling company, Skillman & Hackett, was acquired by Google and Tilt Brush was awarded Best VR Experience at the Unity Awards in 2015.  A few students explore a beta version of the 3D sculpting program sculptVR, being developed by Nathan Rowe.

.

March 14
Eighth class meeting. Introduction to Jasper Brekelmans’ Brekel PointCloud Pro V2 for the Kinect V2 depth sensor.  Students continued researching and developing project ideas, while individual meetings were held with students working on particular project plans.
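Underneath a tool like Brekel PointCloud is a standard idea: each depth pixel is back-projected through the pinhole camera model into a 3D point. A minimal sketch of that mapping (the intrinsics below are made-up illustration values, not the Kinect v2's actual calibration):

```python
def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Map a depth pixel (u, v) with depth in meters to a 3D point
    in the camera's coordinate frame, using pinhole intrinsics:
    focal lengths (fx, fy) and principal point (cx, cy) in pixels."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# The principal-point pixel projects straight down the optical axis.
print(backproject(256, 212, 2.0, fx=365.0, fy=365.0, cx=256.0, cy=212.0))
# -> (0.0, 0.0, 2.0)
```

Running this over every valid pixel in a depth frame yields the point cloud; real tools then add color mapping, filtering, and mesh export on top.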

.

March 22
Spring Break.

.

March 28
Ninth class meeting. Review of student works-in-progress in preparation for the upcoming course exhibition.  Discussions of hardware requirements for the show, including options for multiple VR projects potentially employing two HTC Vive Pre’s, two Oculus Rift DK2’s, and perhaps one Oculus Rift CV1. If we are to utilize all these VR systems (and support any Kinect v2 realtime projects) simultaneously, we will need to locate more high-end Windows 10 PC’s to run them.  Exhibition planning should be far enough along by the end of class to prepare advance PR for the show.

.

April 4
Tenth class meeting. TBD

.

April 11
Eleventh class meeting. Plan the physical setup for the course exhibition, Trajectories, to be held in the B&W Studio Gallery A404, Thursday, April 14. Load in equipment and begin the setup of two separate HTC Vive Lighthouse base units mounted in opposite corners of the studio.  Prepare for shooting Luca Cioci’s 360º mocap VR recording with Tilt Brush using the Vive headset as a camera mounted on the Fisher Dolly rolling on a 360º path around Luca as he traces the body in 3D space. Installation work will continue throughout the week and up until the opening of the exhibition on Thursday evening. Special thanks to William Chen of Bell Technologies for supporting the exhibition with the loan of three VR workstations.

.

April 18
Twelfth class meeting.  A presentation and demo by a team from Faceware Technologies.  Academic Program Manager Max Murray will provide an overview of the company’s history in facial performance capture while the other members of the team set up for a demo of their real-time head mounted facial mocap system.  Since Brian Lecaroz has done some tests with Faceware’s free demo software (and had been planning on using that for his installation in the course exhibition until there was a license collapse that could not be rectified in time for Thursday’s show) Brian has been selected as the person to wear the HD Pro Headcam for the initial demo session (and perhaps come away with some files that he can use to continue work on his AI interview project).  If time permits we may be able to have others perform as well.  The team will also be bringing along their lower cost GoPro based solution. Students fill out release forms for the following week’s class field trip to Radiant Images.

 

.

April 25
Thirteenth class meeting. A voluntary field trip to Radiant Images for a demonstration of the Nokia OZO real-time 360º spherical camera by Art Christe and his team at Radiant. The demo is scheduled to be held from 2:00pm to 3:00pm.  Students will arrange to carpool to and from CalArts and Radiant Images between the class hours of 1:00pm and 3:50pm.

.

May 2
Fourteenth class meeting. Final class session and semester wrap up.

.

May 9
The School of Film/Video Bijou Festival.

.

.