FVEA 419/619: Motion Capture for Artists 2018

[Image: Étienne-Jules Marey chronophotograph]

“Progress is not possible without deviation” –Frank Zappa

Evolving Class Schedule for 2018:

January 8
First class meeting. The first and last required reading of the semester:  Maureen Furniss’s comprehensive overview paper, Motion Capture.  We will hold a general discussion of various motion capture technologies and the nature of the systems we have at CalArts. During the discussion we will explore the ways in which the course might work for the diverse group of students making up the class.  The specifics of the week-to-week course structure will be determined in large part by the individual interests of the group.  Following the discussions we will view several approaches to motion capture –starting with a brilliantly cynical warning about the type of mocap abuse we hope to avoid.  Most of the following videos include scenes created via the PhaseSpace mocap system at CalArts.

.

The California Institute of Motion Capture, CalArts Producers Show Intro 2007
We can do this the easy way or the hard way –or not do it at all.
Let’s push the boundaries of irony and do the dance of the mushroom cloud anyway!

.


Ke Jiang made Taxi while he was a student at CalArts.  He used the PhaseSpace mocap system to create a quirky performance by taking advantage of the artifacts that occur at the edge of the capture volume:

.


Visiting Artist Max Hattler conducted a workshop during the Program in Experimental Animation interim sessions in 2011.  The goal was to produce one or more short works using abstracted motion capture.  Forms I (Taekwondo) is one of those works:

.


Shimbe used the PhaseSpace motion capture system in a unique way for the making of his film, The Wonder Hospital. He rigged a Bunraku puppet with active markers and directed Danielle Ash as the puppeteer. The natural floppiness of the puppet lent an extraordinary quality to the targeted motion. You can see some still photos of the process in my photo album of the initial PhaseSpace tests at CalArts.

“The Wonder Hospital, a 3D & puppet animated film, is a surreal journey of oddity and empty illusion. In a mysterious hospital, modification of physical beauty is not what you would expect. A girl’s desire for superficial beauty leads her to chase after the luring ‘After’ images on a path of advertisements throughout the hospital. But in the end she finds something unimaginable and irreversible.”

.


A Maya playblast from 18 March 2010 of Sara Pocock‘s little girl character animated via a simple mocap T-pose test. The T-pose test was performed in class by Justin Leon in order to double-check that we had set up the MotionBuilder marker mapping process correctly before moving on to a directed capture session. We came close to doing a brief capture session but ran out of time and had to postpone it until the next class. The real-time data from the T-pose test is all that we used here. No clean-up, filtering, retargeting, or other adjustments were done. Justin’s simple casual movements gave the character an unintended sense of presence. In subsequent class meetings Justin and Sara worked on directed performance tests in order to gain more experience with that form of mocap –even though Sara’s goal was to keyframe all of the animation in the final film.

.


For her MFA thesis project, A Scenic View of the End of the World, Ivy Flores chose to collaborate with choreographer Daniel Charon and composer Alex Wand in an iterative process wherein each participant would base what they were doing on the work the others had done in an earlier version.  This enfolding process modified the work with each iteration until Ivy reached the form she found most interesting.  The motion capture of the dancers was recorded with the PhaseSpace Impulse system mounted in the computer animation lab located in room F105 and processed to extract essential movement.  The final installation was presented in the Black and White Studio Gallery A404 on April 4-5, 2013.

.


Ivy Flores’ documentation of the process of creating  “A Scenic View of the End of the World”.

.


Prior to her work with the PhaseSpace Impulse motion capture system, Ivy created this performance animation piece using two orthogonally placed video cameras and Adobe After Effects motion tracking.
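The geometry behind that two-camera setup is easy to sketch: the front camera resolves the horizontal and vertical axes, the side camera resolves depth and vertical, and the shared vertical axis ties the two tracks together. Here is a minimal Python sketch of the idea, not Ivy’s actual pipeline; the track data and function names are illustrative, and real footage would first need the cameras aligned and the tracks scaled into a common space:

```python
# Illustrative sketch of merging 2D tracks from two orthogonally placed
# cameras: the front camera contributes (x, y), the side camera (z, y).
# This is NOT Ivy Flores's actual pipeline, just the underlying idea.

def merge_orthogonal_tracks(front_track, side_track):
    """Combine per-frame 2D tracks from a front camera (x, y) and a
    side camera (z, y) into rough 3D positions.

    Each track is a list of (u, v) pixel pairs, one per frame, exported
    from a 2D tracker such as the After Effects motion tracker.
    """
    points_3d = []
    for (fx, fy), (sz, sy) in zip(front_track, side_track):
        # Average the two independent estimates of the vertical axis;
        # any disagreement hints at tracking error or camera misalignment.
        y = (fy + sy) / 2.0
        points_3d.append((fx, y, sz))
    return points_3d

# Example: two three-frame tracks of a hand marker.
front = [(320, 240), (322, 238), (325, 236)]
side  = [(410, 242), (408, 239), (405, 235)]
print(merge_orthogonal_tracks(front, side))
```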

.


Rachel Ho collaborated with Julian Petschek and Rob Gordon in the creation of the live motion capture performance, Mo Cap Mo Problems, staged in the Black and White Studio Gallery A404 as part of the Motion Capture for Artists course exhibition in the Spring of 2013.

“Mo Cap Mo Problems is a 15-minute performance and video installation that employs live motion-capture in the engagement of virtual characters and spaces. The performance deals with issues of identity and technology in the service of pop culture, as explored through role-playing and the form of music gigs/concerts.”

.

Following the critical success of Mo Cap Mo Problems, Rachel Ho developed SLEIGHTING. This video features footage from 5 shows performed on April 3rd, 2014 in the CalArts B&W Studio A404 and was constructed to promote performances as part of the LAX 2014 Festival on September 20th, 2014, at the Bootleg Theater, Los Angeles.

“SLEIGHTING is an unprecedented approach to multimedia performance using real-time motion capture and pre-visualization tools to enable a new breed of performer. It is about showmanship, hype and the future of entertainment.

Through the ability to pilot avatars before a live audience, SLEIGHTING creates a new type of superstar who is no longer confined to being one personality, but is able to be anyone and everyone. Like the superstar DJ or sportsman who commands arenas full of fans, SLEIGHTING presents itself as a future art and sport, and an event that people descend upon to witness and partake in. In this case, the arena is now the site of reinvention for the event film, and the director is now conductor and performer, empowered through technological extensions of the self.

The show has a running time of around 20 minutes and mainly consists of three sketches in which the spectacle of interfacing with virtual realities drives both narrative and design. Real-time motion capture and pre-visualization tools, typically used in the film and gaming industries, are now used to DJ virtual camera moves and special effects for this live event.”

“Penelope the penalized elephant has found himself in prison. Little does he know the true, sinister purpose of the prison.”

OCTOFORNIA is an experimental project created by Gordon Straub and Cole Mercier in their first year at CalArts.  They boldly decided to challenge themselves to make a film using 3D and mocap software that they had not previously used –and were just in the initial process of learning.  They adapted their project ideas to incorporate and reveal fundamental aspects and “flaws” of the technologies they were working with as a method of enhancing the unsettling aspects of their dystopian story.

.


Jorge Ravelo has experimented with using the Kinect v2 and Jasper Brekelmans’ Brekel Pro Body v2 in the creation of both real-time performance and recorded work. In this film, Skip Jumbo, he worked with rapid montage layering of mocap video recordings from MotionBuilder with live-action 16mm film shot with a Bolex camera.

“Skip smiles // Skip jumps // Skip eats cereal // Skip looks at the sky and the news all at once.”

.


Narae Kim, Celine Tien, and Julian Soros of the Pippa’s Pan team at the AT&T Developer Summit, where they were declared third-place winners of the AT&T VR/AR Challenge.

Pippa’s Pan is a reactive VR short film that takes you, our Agent, through the forest of Pippa’s mind to help re-capture fragments of her forgotten memories. Experimenting with techniques in animation, world building, motion-capture, 3D spatial audio, and even light field technology, Pippa’s Pan is set to be one of virtual reality’s first hybrid live-action short films. A literal forest woven from the sinews of this team’s ideas and youthful naiveté, the group of young dreamers will deconstruct concepts of storytelling to re-invent the relationship between audience and narrative.

.

January 15
Martin Luther King Holiday

.

January 22
Second class meeting.  After a preliminary discussion we moved into a demonstration of Brekel Pro Body v2, Jasper Brekelmans’ mocap software for the Microsoft Kinect v2 (soon to be surpassed by the Intel RealSense Depth Camera D435).  Some of the features and limitations of single depth camera mocap were described and a demonstration of key aspects of Brekel Pro Body v2’s real-time performance capture and display was conducted.  We then moved into a look at the integration between Brekel Pro Body v2 and Autodesk MotionBuilder.  The process of loading the Pro Body v2 plug-in for MotionBuilder into the scene by dragging and dropping it from the Devices folder into the Producer Perspective window was covered.  Loading a stock “characterized” polygonal mesh model already set up to deform via the integral bone system of the character was demonstrated.  The procedure for setting the character up to be driven by the Kinect skeleton motion data was shown, as was the use of multiple models animated by a single real-time performer.  We also looked at the Brekel Pro Face v2 markerless facial capture software and discussed its advantages and disadvantages in comparison to more expensive facial capture solutions. The unfortunate loss of public access to the brilliantly designed markerless facial capture system, Faceshift, was mentioned, as was the availability of alternative solutions such as that from Faceware Technologies, which provided a live demonstration to the class in 2016.

We also took a look at Joan Stavely’s chapter in Robert Russett’s book, Hyperanimation: Digital Images and Virtual Worlds, in preparation for viewing and discussing her seminal work.
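For those who prefer scripting to drag and drop, the same device-loading step can be done from MotionBuilder’s Python console. Below is a minimal pyfbsdk sketch, not the class procedure verbatim; the entry name “Brekel Pro Body v2” is an assumption and should be checked against what the installed plug-in actually registers in the Devices folder:

```python
# Minimal pyfbsdk sketch of adding a device to the scene, the scripted
# equivalent of dragging it from the Asset Browser's Devices folder.
# The entry name "Brekel Pro Body v2" is an assumption; check the
# Devices folder for the exact name your plug-in version registers.
from pyfbsdk import FBCreateObject, FBSystem

device = FBCreateObject("Browsing/Templates/Devices",
                        "Brekel Pro Body v2",  # entry in the Devices folder
                        "BrekelBody")          # name for the scene object
if device:
    FBSystem().Scene.Devices.append(device)
    device.Online = True   # start talking to the sensor
    device.Live = True     # stream data live into the scene
```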

Jasper Brekelmans showcasing some of the key features of Brekel Pro Body v2

.

January 29
Third class meeting. Students visited the Autodesk Education Community website to log in to their account if they already had one, or to set one up if not. This provided access to the Autodesk Character Generator program, which is free to students and educators using a valid school domain email address.  Autodesk provides a good workflow model online. At the end of the creation process it is possible to download the generated character for use in several programs.  Our goal was to obtain an FBX file suitable for use in MotionBuilder.  There was some confusion about the paid options actually being free with an education account.  The process indicates you must pay to download premium features; however, it turns out that as a student or educator you are granted unlimited “credits”, which suffices to “purchase” and download the character.  Apparently there can be a problem with newly set up accounts, which may not be fully active for up to 4 hours after they are created.  Everyone should be able to download and use their characters during class next week.

We suited up class TA Vincent De La Torre and learned about the placement of markers on the suit relative to the body, examined the other components of the PhaseSpace Impulse system, and explored several ways the live data may be viewed in the PhaseSpace Impulse software.  We started up Autodesk MotionBuilder 2017 and covered loading the PhaseSpace OWL plug-in into a scene, observing the markers as Vincent moved around the scene, and recording a T-pose. Since we had ambitiously reached past the point in the process prepared for the class demo, we put off further work until the coming week. In next week’s class we will load an “actor” element into the scene, align the “actor” proportions to match the marker positions, create a MarkerSet, drop markers into the appropriate “balls” on the MarkerSet, and create Rigid Bodies to enable better quality motion data.  We will drop a “character” onto the “actor”, and save a .fbx and .c3d file of the project.   After the break students will run MotionBuilder under the Parallels virtual machine on their individual workstations and follow along with a demo of the process covered earlier. Each student will load a .c3d file from the first part of the afternoon and have the direct experience of mapping markers to an “actor”, making rigid body groups, loading the “actor” onto the “character”, and recording the performance to file.

We looked at two interrelated films from incorrect source locations and will view them next week from the correct ones: Wanting for Bridge by Joan Stavely, and the end by Chris Landreth. Then we will further discuss that work and its relevance to Joan Stavely’s participation in the 1995 SIGGRAPH Conference Panel: Grids, Guys and Gals: Are you oppressed by the Cartesian Coordinate System?
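The record-and-save portion of that workflow can also be driven from MotionBuilder’s Python console. The sketch below is a hedged outline of just those steps; the paths are placeholders, and it assumes the PhaseSpace OWL device is already loaded, online, and streaming markers into the scene:

```python
# Hedged pyfbsdk sketch of the capture-and-save steps described above.
# Assumes the PhaseSpace OWL device is already online and live;
# paths are placeholders.
from pyfbsdk import FBApplication, FBPlayerControl

player = FBPlayerControl()
player.Record()   # arm recording on live devices
player.Play()     # roll; the performer holds a T-pose, then moves
# ... capture happens here ...
player.Stop()

app = FBApplication()
app.FileSave("/path/to/tpose_session.fbx")    # full scene with the recorded take
app.FileExport("/path/to/tpose_session.c3d")  # motion data only, chosen by extension
```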


Actual education benefit: Unlimited credits – Account (Shared) 999999999999.

.

February 5
Fourth class meeting. The two interrelated films we looked at the previous week, which suffered from poor image quality due to incorrect links, Wanting for Bridge by Joan Stavely and the end by Chris Landreth, were viewed from proper sources this week. We continued the discussion of the unique aspects of those works, followed by a discussion of Joan Stavely’s participation in the 1995 SIGGRAPH Conference Panel dealing with sexist and non-sexist readings of inherent gender differences: Grids, Guys and Gals: Are you oppressed by the Cartesian Coordinate System?

We then returned to doing a follow-along demo covering the techniques involved in importing PhaseSpace motion capture marker data from a previously recorded .c3d file into MotionBuilder, creating a marker set, and mapping the markers onto the preferred points on an “actor” figure (that process is shown in the rapid-fire silent video PhaseSpace Mapping into MotionBuilder, created with a now older version of MotionBuilder).  Unfortunately the demo hit a wall just before class was over, when clicking on the “Optical” branch in the “Navigator” window and opening the “OWL:optical” branch by clicking on it failed to activate the markers so that they could be assigned to Rigid Body groups.  It turns out that it is necessary to double click on the “OWL:Optical” branch to enable the selection of sets of markers. This enables the creation of rigid body groups by selecting the appropriate markers and then pressing the “b” key to create the rigid body.  Next week we will pick up where we left off with the demo.
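Scripting can sidestep that selection quirk: markers can be gathered and selected by name pattern before the “b”-key step. A minimal pyfbsdk sketch follows; the “OWL:Marker” naming pattern is an assumption and should be matched to the names actually shown under the OWL:optical branch:

```python
# Hedged pyfbsdk sketch: select a group of optical markers by name so a
# rigid body can be created from them (press "b" in the viewer afterward).
# The marker naming pattern is an assumption; match it to the names that
# appear under the OWL:optical branch in the Navigator.
from pyfbsdk import FBComponentList, FBFindObjectsByName

def select_markers(pattern):
    found = FBComponentList()
    # include namespaces in the match, restrict the search to models
    FBFindObjectsByName(pattern, found, True, True)
    for comp in found:
        comp.Selected = True
    return [comp.Name for comp in found]

# e.g. grab the head markers, then press "b" to build the rigid body:
print(select_markers("OWL:Marker_0?"))
```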

“‘the end’ is a short film I did in 1995, while I was with Alias/Wavefront (now Autodesk). I did this to test out new facial animation software AW was developing at the time, but it turned into much more than that. ‘the end’ was nominated for an Oscar the following year, and a smattering of other awards around the world.”

–Chris Landreth

.


One method of marker assignment that John Brennan recommended for real-time performance capture during live shows such as Rachel Ho’s productions, Mo Cap Mo Problems, and SLEIGHTING.

.

February 12
Fifth class meeting. We will pick up where we left off with last week’s follow-along demo by opening the last saved MotionBuilder file.  In the “Navigator” window we will click on the “Optical” branch to twirl open the “OWL:optical” branch and double click on that branch to make it possible to create Rigid Body groups by selecting the appropriate marker sets and then pressing the “b” key to create the rigid body.  Once that process is complete we will move on to importing a character (such as the one each student created and saved in the class covering Autodesk Character Generator).  The saved character FBX file can be imported into the MotionBuilder scene by right clicking inside the “Asset Browser” tabbed window located inside the “Resources” window and choosing “Add favorite path”, followed by navigating to the folder on the server where the character file is saved.  It is then possible to select the generated character’s .fbx file and drag and drop it into the Producer’s Perspective window.  The next step is to assign the “Actor” as the source for driving the “Character” and play the animation via the “Transport Controls” play bar.
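That last step can also be expressed in a few lines of pyfbsdk. The sketch below is an assumption-laden outline rather than the class procedure verbatim: the file path is a placeholder, and it presumes the scene holds exactly one Actor (already mapped to the markers) and one Character:

```python
# Hedged pyfbsdk sketch: merge the generated character file into the
# scene and drive its Character with the Actor mapped to the markers.
# The path is a placeholder; assumes one Actor and one Character.
from pyfbsdk import (FBApplication, FBSystem,
                     FBCharacterInputType, FBPlayerControl)

app = FBApplication()
app.FileMerge("/path/to/generated_character.fbx")  # placeholder path

scene = FBSystem().Scene
actor = scene.Actors[0]          # Actor already mapped to the PhaseSpace markers
character = scene.Characters[0]  # Character from the merged FBX

character.InputActor = actor     # source of motion
character.InputType = FBCharacterInputType.kFBCharacterInputActor
character.ActiveInput = True     # the character now follows the actor

FBPlayerControl().Play()         # same as pressing play on the Transport Controls
```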



Optimum marker placement and rigid body assignment for live performance.

NOTE: in this diagram the superscript numbers refer to the actual strings, which in this instance are from CalArts’ anomalous Medium suit, where instead of the head markers being on String 1 they are on String 6, and so on.
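When scripting against this suit, the anomaly can be isolated in one small lookup table. The sketch below is hypothetical; only the head-string entry comes from the note above, and the rest would be filled in from the diagram:

```python
# Hypothetical remap for the anomalous Medium suit: the standard layout
# expects the head markers on String 1, but on this suit they are on
# String 6. Only that entry is shown; fill in the rest from the diagram.
STANDARD_TO_MEDIUM_SUIT = {
    1: 6,   # head markers live on String 6 instead of String 1
    # ... remaining strings per the diagram ...
}

def actual_string(standard_string):
    """Return the Medium suit's physical string for a standard-layout string."""
    return STANDARD_TO_MEDIUM_SUIT.get(standard_string, standard_string)

print(actual_string(1))  # -> 6 on the Medium suit
```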

.

February 19
President’s Day Holiday.

.

February 26
Sixth class meeting. TBD

.

March 5
Seventh class meeting. I injured my back Monday morning and could not come in.  However, Theotime Vaillant was able to accept my last-minute request to teach the class in my absence and cover the material that I had planned on presenting from the Drawing in Space page.  Students were able to experience the mocap tracking of head and hands in immersive room-scale VR, working with Tilt Brush to draw in space.

“Google has been working closely with more than 60 artists to help them explore their style in virtual reality as part of the Tilt Brush Artist in Residence program (AiR). Coming from a wide range of disciplines, these graffiti artists, painters, illustrators, graphic designers, dancers, concept artists, creative technologists and cartoonists have all brought their passion and talent to create some amazing art with Tilt Brush.”

.

March 12
Eighth class meeting. Hand motion capture.  Class member Benjamin Scott has been experimenting with hand capture using the Leap Motion depth camera and MotionBuilder and has volunteered to show and discuss some of his tests with that system.  We will also look at other techniques for motion capture of hands using PhaseSpace and Manus VR gloves, as well as a look back at the early and later history of the VPL DataGlove and more.

“Leap Motion’s mission is to remove the barriers between people and technology. Our unprecedented hand tracking lets you reach into virtual and augmented reality to interact with new worlds. We’re currently partnering with major VR manufacturers to embed Leap Motion technology into mobile VR/AR headsets.”

.

March 19
Ninth class meeting. We will follow up on last week’s demo of Jasper Brekelmans’ Brekel Pro Hands plugin for MotionBuilder (which leverages the power of the Leap Motion depth camera system) with a viewing and discussion of the video document, “Sweetie”, a humanitarian project which Brekelmans worked on.  We will then continue work on projects for the end-of-semester class exhibition, which will be held in the Black & White Studio Gallery A404 on Thursday, April 19 from 8:00 pm to 11:00 pm.  We will be able to lay out the space and begin loading into the B&W on Monday, April 16, run the show on Thursday night, and then whatever we do not strike on Thursday night will be struck on Friday so that the space is available for the next group of A404 users, who are scheduled to arrive on Saturday.

Jasper Brekelmans’ work developing motion capture software such as Brekel Pro Face v2, designed for use with Microsoft’s latest Kinect sensor, has been of great importance to the community of artists working with commodity tools. In addition to this work Jasper is employed in the animation industry, working at companies such as Motek Entertainment.  Jasper recently lent his skills to the Sweetie campaign for Terre des Hommes.

.

March 26
Spring Break

.

April 2
Tenth class meeting. Last year Narae Kim presented the latest build of her collaborative VR project Pippa’s Pan.  Students had a chance to don the HTC Vive and use the handheld controllers to experience the interactive immersive VR story of two lovers reliving memories across the gap of encroaching Alzheimer’s Disease.

This year we set up an additional 5 PhaseSpace mocap cameras in a ring at waist level to complement the 8 ceiling mounted cameras.  The use of the additional cameras will demonstrate the higher fidelity of the mocap data available with more complete coverage. Vincent and I recalibrated the PhaseSpace system for the 13 cameras before class in order to allow for more performance time.  Either Vincent will suit up in the large PhaseSpace suit, which is already populated with LED markers, or students who want to capture their own performance will do so.  If anyone wants to do a personal performance but does not fit the large suit, we will repopulate one of the other suits.

.

April 9
Eleventh class meeting. Last year we had a presentation on Chronophotography, beginning with its history as an initial form of motion capture. This was followed by a review presentation of Drawing in Space, and one-on-one sessions providing a direct experience of the latest version of my long-standing VR project, Anaphorium.

This year we will be setting up the lower course of PhaseSpace cameras in F105 and running capture sessions using the medium mocap suit for those students wanting to capture performances for use in their projects.  Those who do not need to create mocap data today will continue working on their projects or engage in other forms of self-directed study.  Next week we will meet in F105 and then head up to the B&W Studio Gallery A404 to lay out and load in this year’s end-of-semester class exhibition.

.

April 16
Twelfth class meeting. We will meet in F105 as usual and then go up to the Black & White Studio Gallery A404 to lay out and begin loading in for this year’s end-of-semester class exhibition.

Two years ago we had a presentation and demo by a team from Faceware Technologies.  Academic Program Manager Max Murray provided an overview of the company’s history in facial performance capture while the other members of the team set up for a demo of their real-time head mounted facial mocap system.  Since Brian Lecaroz had done some tests with Faceware’s free demo software (and had been planning on using that for his installation in the course exhibition until there was a license collapse that could not be rectified in time for Thursday’s show), Brian was selected as the person to wear the HD Pro Headcam for the initial demo session (and perhaps come away with some files that he could use to continue work on his AI interview project).   The team also brought along their lower cost GoPro based solution.

This year we will take a look at the history of facial motion capture and a range of approaches that have been taken.   One of the best markerless systems was Faceshift, which was purchased by Apple in 2015. Instead of continuing to make it available, they removed it from the market, unfortunately depriving artists of a great tool. Amanda Vincelli has asked to learn more about the practical use of facial capture for her projects, so we will be exploring how to work with the Brekel Pro Face v2 software for the Kinect v2 and MotionBuilder as one option.

.

April 23
Thirteenth class meeting. Last year Naomi Cornman and Darby Johnston from Oculus demonstrated the use of their innovative VR sculpting program Medium, providing students with direct hands-on experience via a set of one-on-one sessions. Oculus is supplying us with an Oculus Touch workstation under their NextGen program so that students and faculty will be able to explore the potential of Medium more deeply and have the option of using it for projects.  Following is a list of links that Naomi sent us.

Monthly Artist Spotlight series:

Goro Fujita of Oculus Story Studio’s live stream.

Landis Fields of ILMxLab’s live stream.

Krystal Sae Eua of The Mill’s live stream.

Brandon Gillam of Carbon Games’ live stream.

Gio Nakpil of the Oculus REX team.

Steve Lord, amazing traditional and Zbrush sculptor. (This is the only live stream we have with the updated UI and features of the current build).

Two Facebook groups for viewing and sharing work:

Virtual Sculpting

Oculus Medium Artists

ZBrush educator and animator, Glen Southern’s  tutorials.

.


Goro Fujita live stream sculpting session with an earlier version of Medium back in September 2016.

.

April 30
Fourteenth and final class meeting. Last year we had already set up on Sunday, April 30, for the end-of-semester Mocap for Artists exhibition in the B&W Studio Gallery A404 as a part of the Digital Arts Expo. Our exhibition opened at 2:00 pm on Thursday, May 4 and was particularly successful.   I was attending the Unity Vision VR/AR Summit 2017 in Hollywood on Monday, but arranged for students wishing to do more installation work in A404 to contact me about continued access during the days leading up to the exhibition.  In lieu of the Monday class meeting we gathered at the class exhibition on Thursday for our final meeting of the semester.

.

May 7
The School of Film/Video Bijou Festival.

.

.