Projects that I have been associated with used motion capture for facial animation, dialogue, and gross body movement.
-Recore (Armature Studio)-
Facial motion capture in Recore was retargeted by Faceware using their proprietary analysis methods and baked onto the facial controls. The animation data was editable in Maya through animation layers on the facial controls if further refinement was needed. Final animation was baked out to FBX for playback in the Unity game engine.
-PTSD Simulation (Total Immersion / Intific)-
Facial animation in the PTSD simulation for Total Immersion was captured by House of Moves onto a joint hierarchy and retargeted onto the facial controls of the character rig. These controls were also editable if the performance needed further refinement.
-HAPE Simulation (Total Immersion / Intific)-
Motion capture for gross body movement was used on the project HAPE (High-Altitude Pulmonary Edema) for Total Immersion / Intific. This simulator allowed Navy Corpsmen to study the physical effects of high altitude on Marines. I was responsible for character rigging, direction of motion capture actors, and implementation. Motion capture was shot at House of Moves.
The video shows the mocap blending rig: motion capture data could be imported and the character rig would follow it. The character rig could then be offset and manipulated without destroying the original motion capture animation, and any key-frame animation could be layered on top of the existing motion capture.
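The non-destructive layering described above can be sketched conceptually as additive channels evaluated on top of an untouched base curve. This is an illustrative sketch with hypothetical names, not the actual rig implementation:

```python
# Conceptual sketch of non-destructive animation layering, similar in
# spirit to the mocap blending rig: the base capture is never edited,
# offsets and hand-keyed corrections stack additively on top.

def evaluate_channel(mocap, offset_layer, keyframe_layer, frame):
    """Combine a baked mocap curve with additive layers at a given frame.

    mocap          -- dict of frame -> value (the untouched base data)
    offset_layer   -- rig-level offset applied on top
    keyframe_layer -- sparse hand-keyed corrections, also additive
    """
    base = mocap.get(frame, 0.0)           # original capture, never modified
    offset = offset_layer.get(frame, 0.0)  # manipulation of the rig
    key = keyframe_layer.get(frame, 0.0)   # animator's layered key-frames
    return base + offset + key

# Usage: the base curve stays intact; layers stack on top of it.
mocap = {1: 10.0, 2: 12.0, 3: 11.5}
offset = {1: 2.0, 2: 2.0, 3: 2.0}
keys = {3: -1.5}
final = [evaluate_channel(mocap, offset, keys, f) for f in (1, 2, 3)]
# final -> [12.0, 14.0, 12.0]; deleting the layers recovers mocap exactly
```

Because each layer is stored separately, removing or muting a layer restores the original capture, which is the point of the workflow.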
-Iron Man 2 (SEGA Studios / Secret Level)-
Facial animation for in-game cinematics was captured and retargeted directly onto the in-game character joint hierarchy. The facial rig was primarily composed of joints and skin-weighted geometry.
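A joint-based rig with skin-weighted geometry typically deforms vertices by a weighted blend of joint transforms (linear blend skinning). A minimal sketch follows, using hypothetical 2D translation-only data for brevity; a real rig uses full 4x4 joint matrices:

```python
# Minimal linear blend skinning sketch: each vertex is deformed by a
# weighted sum of its influencing joints. Data here is hypothetical and
# 2D / translation-only to keep the idea visible.

def skin_vertex(vertex, joint_translations, weights):
    """Deform one vertex by its weighted joint translations.

    vertex             -- (x, y) rest position
    joint_translations -- list of (dx, dy) per influencing joint
    weights            -- skin weights per joint (normalized, sum to 1.0)
    """
    x = sum(w * (vertex[0] + j[0]) for j, w in zip(joint_translations, weights))
    y = sum(w * (vertex[1] + j[1]) for j, w in zip(joint_translations, weights))
    return (x, y)

# A lip vertex influenced 70/30 by a jaw joint and a mouth-corner joint
joints = [(0.0, -1.0), (0.5, 0.0)]
weights = [0.7, 0.3]
assert abs(sum(weights) - 1.0) < 1e-9  # weights must be normalized
deformed = skin_vertex((1.0, 2.0), joints, weights)
```

The weight normalization check mirrors what skinning tools enforce: un-normalized weights make the mesh drift away from the joints.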