Final Major Project – Rigging & Motion Capture


To rig both characters I created a skeleton using the HumanIK menu, scaled each skeleton to fit its model and repositioned the joints as needed. I then created a character definition for each character by clicking on each bone and assigning it to the corresponding bone in the definition. After this I bound the skin (the model) to the skeleton and painted the weight influences of each joint. The eyes are parented to the head joint and so follow the direction of the head; to have more control over their movement I made a locator and an aim constraint for the eyes.

The motion capture suit doesn't animate the fingers, so I created set driven keys to do this. I parented a NURBS curve to each wrist and added attributes to it, one for each finger: Index, Middle, Ring, Pinky and Thumb, each with a maximum value of 10, a default value of 0 and a minimum value of -10. These attributes appear in the channel box for the NURBS curve and can be keyframed by typing a value between -10 and 10 into the box: -10 is fingers raised, 0 is neutral and 10 is fingers fully curled.
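The same finger and eye set-up can also be scripted. The snippet below is only a rough maya.cmds sketch: the control and joint names, the curl axis and the rotation values are all placeholders and would need matching to the actual rig.

```python
# Sketch of the finger set driven keys and eye aim constraints (maya.cmds).
# All names (l_hand_ctrl, l_index_01_jnt, ...) and the curl axis/angles are
# assumptions -- adjust them to the real rig.
import maya.cmds as cmds

hand_ctrl = 'l_hand_ctrl'  # NURBS curve parented to the wrist
fingers = {
    'Index':  ['l_index_01_jnt',  'l_index_02_jnt',  'l_index_03_jnt'],
    'Middle': ['l_middle_01_jnt', 'l_middle_02_jnt', 'l_middle_03_jnt'],
    'Ring':   ['l_ring_01_jnt',   'l_ring_02_jnt',   'l_ring_03_jnt'],
    'Pinky':  ['l_pinky_01_jnt',  'l_pinky_02_jnt',  'l_pinky_03_jnt'],
    'Thumb':  ['l_thumb_01_jnt',  'l_thumb_02_jnt'],
}

for attr, joints in fingers.items():
    # Custom attribute on the control: -10 (raised) to 10 (curled), default 0.
    cmds.addAttr(hand_ctrl, longName=attr, attributeType='double',
                 minValue=-10, maxValue=10, defaultValue=0, keyable=True)
    driver = '{}.{}'.format(hand_ctrl, attr)
    for jnt in joints:
        driven = jnt + '.rotateZ'  # assumed curl axis
        # Three driven keys: raised, neutral and fully curled.
        cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=-10, value=-30)
        cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=0,   value=0)
        cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=10,  value=90)

# Eye aim: a locator per eye drives an aim constraint, so the eyes can be
# animated independently of the head joint they are parented under.
for eye in ['l_eye_jnt', 'r_eye_jnt']:
    loc = cmds.spaceLocator(name=eye.replace('_jnt', '_aim_loc'))[0]
    cmds.aimConstraint(loc, eye, maintainOffset=True,
                       aimVector=(0, 0, 1), upVector=(0, 1, 0),
                       worldUpType='scene')
```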


For the facial animation I'm using blend shapes. I made them by extracting the face from the head mesh, duplicating it for each blend shape and moving the vertices to create the expressions. I used Paul Ekman's book 'Unmasking the Face' to help with creating these shapes. As the boy character is wearing a mask that covers the mouth and nose, I didn't worry too much about moving the mouth vertices, just enough to distort the cheek areas that will be on display.
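Hooking the sculpted targets into a blendShape node can be done from the Deform menu or with a couple of lines of maya.cmds. In this sketch the mesh and target names are placeholders for the extracted face and the duplicated expression shapes.

```python
# Sketch of wiring sculpted targets into a blendShape node (maya.cmds).
# 'boy_face_geo' and the target names are placeholders for the extracted
# face mesh and the hand-sculpted expression duplicates.
import maya.cmds as cmds

base_face = 'boy_face_geo'
targets = ['browRaise_geo', 'browFurrow_geo', 'cheekRaise_geo', 'eyeSquint_geo']

# One blendShape node with a keyable weight per expression target
# (the base mesh goes last in the argument list).
blend_node = cmds.blendShape(targets + [base_face], name='boy_face_blends')[0]

# Each target then shows up as a 0-1 weight in the channel box, e.g.:
cmds.setAttr('{}.browRaise_geo'.format(blend_node), 0.5)
```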


My model for the motion capture was fellow animation student Scott Knight, who acted the roles of both the robot and the young boy. After capturing the data I went through it all and cleaned it up by simplifying the animation curves to make the keyframes more visible and tweaking them where needed. The foot sensors in the motion capture suit were faulty, so the skeleton in Maya was rooted to a zero position on the ground, which made the walk cycles look as if they were on the spot. To overcome this I parented the skeleton to a locator, which I could then move around the grid as needed. I also had to do more clean-up than intended on the legs and feet, as they weren't always following the correct axis.
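The curve clean-up and the locator fix can also be batched with a short script. The sketch below assumes a HumanIK-style root joint called 'Hips' and uses arbitrary tolerances, so both would need adjusting to the actual scene.

```python
# Sketch of the mocap clean-up steps (maya.cmds). 'Hips' is an assumed root
# joint name and the tolerances are arbitrary -- tighten or loosen them
# depending on how much detail the capture needs to keep.
import maya.cmds as cmds

root_joint = 'Hips'

# Gather every joint in the mocap skeleton and the baked curves driving them.
joints = [root_joint] + (cmds.listRelatives(root_joint,
                                            allDescendents=True,
                                            type='joint') or [])
anim_curves = []
for jnt in joints:
    anim_curves.extend(cmds.listConnections(jnt, type='animCurve') or [])

# Reduce key density so the curves are easier to read and tweak by hand.
for curve in anim_curves:
    cmds.simplify(curve, timeTolerance=0.05, valueTolerance=0.01)

# Euler-filter the rotation curves where the legs and feet flip between axes.
rot_curves = [c for c in anim_curves if 'rotate' in c.lower()]
if rot_curves:
    cmds.filterCurve(rot_curves)

# Parent the skeleton under a locator so the on-the-spot walk can be moved
# across the grid by animating the locator instead of the mocap keys.
offset_loc = cmds.spaceLocator(name='mocap_offset_loc')[0]
cmds.parent(root_joint, offset_loc)
```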

Yahni 🙂