I was the character rigger for Total Immersion / Intific's Alameda office, where we worked on various simulation software projects for DARPA, including a PTSD Simulator, a HAPE Simulator, and a DARPA Dog Simulator.

Auto Rigging system written in Python

Auto Rigging System

This is an example of the Python-scripted auto-rigging system I developed while at Total Immersion / Intific. It shows a bipedal rig being built from a joint layout.
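To give a sense of the approach, here is a minimal sketch of one auto-rigging step, written against Maya's Python API (maya.cmds). The joint names and the build_fk_chain helper are illustrative only, not the actual production system.

```python
import maya.cmds as cmds

def build_fk_chain(joints, radius=3.0):
    """Create an FK control for each joint and parent the controls into a chain."""
    controls = []
    for jnt in joints:
        # NURBS circle as the animator-facing control, aligned to the joint.
        ctrl = cmds.circle(name=jnt + "_ctrl", normal=(1, 0, 0), radius=radius)[0]
        grp = cmds.group(ctrl, name=jnt + "_ctrl_grp")
        cmds.matchTransform(grp, jnt)
        # Drive the joint from the control.
        cmds.parentConstraint(ctrl, jnt, maintainOffset=True)
        # Chain the controls so FK rotation flows down the limb.
        if controls:
            cmds.parent(grp, controls[-1])
        controls.append(ctrl)
    return controls

# Example: build FK controls over an arm chain from the joint layout.
build_fk_chain(["l_shoulder_jnt", "l_elbow_jnt", "l_wrist_jnt"])
```

The full system repeats steps like this per limb from the joint layout, so a whole biped can be assembled without hand-building each control.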


PTSD SIMULATOR

PTSD Simulator

This project was a PTSD simulator in which personnel could study the signs of PTSD through micro-expressions on an avatar's face. I was responsible for the facial rig setup and motion capture implementation. Motion capture was shot at House of Moves.

The video above displays the bone-based facial rig, which could accept motion capture and still allow the animation to be tweaked through traditional key-frame animation on the facial controls.
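One common way to keep imported mocap editable under animator keys is Maya's animation layers; the sketch below shows that general idea, assuming the mocap has been baked onto the facial controls on the base layer. The control names are placeholders, not the actual rig's naming.

```python
import maya.cmds as cmds

def add_tweak_layer(controls, layer_name="face_tweaks"):
    """Put facial controls on an additive animation layer above baked mocap."""
    if not cmds.animLayer(layer_name, query=True, exists=True):
        cmds.animLayer(layer_name)
    # Adding objects to a layer leaves the base (mocap) keys intact;
    # keys set on the new layer blend additively on top of them.
    cmds.select(controls)
    cmds.animLayer(layer_name, edit=True, addSelectedObjects=True)
    return layer_name

add_tweak_layer(["brow_l_ctrl", "brow_r_ctrl", "mouth_corner_l_ctrl"])
```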

HAPE SIMULATOR


HAPE Simulator

This project was a HAPE (High-Altitude Pulmonary Edema) simulator in which Navy Corpsmen could study the physical effects of high altitude on Marines. I was responsible for character rigging, directing the motion capture actors, and implementation. Motion capture was shot at House of Moves.

The video above shows the mocap blending rig: imported motion capture drives the character rig, which can then be offset and manipulated without destroying the original motion capture animation. Key-frame animation can then be layered on top of the existing motion capture.
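As a rough illustration of the offset idea, the sketch below constrains a rig control to its mocap joint through an offset group, assuming Maya (maya.cmds). The joint and control names are hypothetical, not the project's actual naming.

```python
import maya.cmds as cmds

def attach_to_mocap(mocap_joint, control):
    """Constrain a rig control to its mocap joint through an offset group.

    The mocap joint keeps its imported animation untouched; keying the
    control layers adjustments on top of the mocap motion.
    """
    offset = cmds.group(empty=True, name=control + "_mocap_offset")
    cmds.matchTransform(offset, control)
    # The offset group follows the mocap joint; the control rides under it,
    # so its own keys are applied relative to the mocap.
    cmds.parentConstraint(mocap_joint, offset, maintainOffset=True)
    cmds.parent(control, offset)
    return offset

attach_to_mocap("mocap_spine_01", "spine_01_ctrl")
```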