
Rigging and Creature FX

Rigging Overview:

Rigging is the process of adding a skeleton to a mesh, allowing the mesh to be animated and manipulated. The images below show the process of creating a simple rig, animating it using expressions and using MASH to expand the animation.


The first step is to create a mesh, in this case a simple tentacle shape.


The second step is to create a joint hierarchy. The joints are placed in line with the edges of the mesh; this should result in cleaner deformation when the mesh is skinned.


The next step is to skin the mesh. This links the geometry to the joints; you can see in the image above how the mesh now follows the skeleton.


To animate the tentacle in a random, organic way, expressions can be used. The example above uses a sine wave to change the rotation of a joint around a given axis, in this case the X axis. The (time*10) part of the expression controls the speed of the wave, and the *70 outside the brackets sets the maximum angle of change, in this case 70 degrees. This expression is applied to each rotational axis of each joint, with slight variations, to create the animation seen above.
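The logic of that expression can be sketched in plain Python. This is a minimal illustration of the same sine formula, not the actual Maya expression; the function name and `phase` parameter are my own additions:

```python
import math

def joint_rotation(time, speed=10.0, max_angle=70.0, phase=0.0):
    """Sine-based rotation mirroring an expression of the form
    rotateX = sin(time * 10) * 70. The optional phase shift is the
    'slight variation' applied per joint and per axis."""
    return math.sin(time * speed + phase) * max_angle

# The rotation oscillates smoothly between -70 and +70 degrees.
print(joint_rotation(0.0))  # 0.0 at time zero
```

Varying `speed`, `max_angle` and `phase` per joint is what keeps the tentacle's motion from looking uniform.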


MASH is a plugin that allows for instancing and procedural animation. The example above shows how you can create a small creature using MASH and the animated tentacle. Using MASH to instance the tentacle, you can distribute the instances over the surface of a mesh, in this case a sphere. Adding a Time node to the MASH network lets you offset the animation of each tentacle, which adds randomness to the animation of the creature.
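The per-instance offset can be sketched as follows. This is only an illustration of what a time offset does to the sine expression, not MASH's internals; the function name and `step` value are illustrative:

```python
import math

def tentacle_rotations(time, count, speed=10.0, max_angle=70.0, step=0.2):
    """Each instance samples the same sine expression at a shifted
    time, so the tentacles move out of phase with one another."""
    return [math.sin((time + i * step) * speed) * max_angle
            for i in range(count)]

rots = tentacle_rotations(0.0, 3)
print(rots[0])  # 0.0 -- instance 0 has no offset, the others are out of phase
```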

Above is an example of how you can instance the previous MASH network to create a gaggle of the creatures. This network also has a Signal node applied, which is what gives each creature its own motion; the node uses trigonometric functions to create the motion, similar to the expression used to animate the individual tentacles.

Joint Creation:

Expanding on the basic joint hierarchy used with the tentacle model, we went through the process of rigging a humanoid robot character (below). This involved using several new techniques.

JointCreation_01.JPG
JointCreation_02.JPG

The first step was to create the joints for the arm of the character. Using the Joint tool, you can see that the primary axis of the created joints will be the X axis. Using this information, you create a line of joints along the grid in the X axis, which ensures that the joints will rotate correctly once in position. You place one more joint than the model requires; this forces what will be the last joint in the chain to also have the correct orientation.

JointCreation_03.JPG

Next, using the front, side and top orthographic views, you align the joints to the centres of the parts of the model that they will control. When positioning the joints you should only translate them in the X axis and use the rotation of the previous joint to move them up and down; this ensures that the primary axis follows the direction of the joint chain.

JointCreation_04.JPG

Using this technique results in a rotation value being stored in the joints, which would be confusing when it comes to animation, as a value of 0 wouldn't be the joint's original orientation. To fix this problem, the rotation value should be copied and pasted into the respective axis of the 'Joint Orient' attribute. This keeps all of the joint chain in the correct position while resetting all of the rotation values to 0.
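The idea can be shown with a toy sketch. For a single-axis rotation the transfer reduces to a simple copy-and-zero; real joints combine orientations as rotations, so this dict stand-in for a joint's attributes is purely illustrative:

```python
def freeze_rotation_to_orient(joint):
    """Move the rotate values into jointOrient so the joint keeps its
    pose while its rotation channels read zero. `joint` is a plain
    dict standing in for a Maya joint's attributes."""
    for axis in ("X", "Y", "Z"):
        joint["jointOrient" + axis] += joint["rotate" + axis]
        joint["rotate" + axis] = 0.0
    return joint

j = freeze_rotation_to_orient(
    {"rotateX": 0.0, "rotateY": 0.0, "rotateZ": 35.0,
     "jointOrientX": 0.0, "jointOrientY": 0.0, "jointOrientZ": 0.0})
print(j["jointOrientZ"], j["rotateZ"])  # 35.0 0.0
```

The joint ends up in the same pose, but an animator keying it starts from clean zeroed rotation channels.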

JointCreation_05.JPG
JointCreation_06.JPG

When rigging a humanoid character you can utilise the 'Mirror Joints' tool, which allows you to mirror one side of the character's joints to the other and really helps to speed up the rigging workflow. It is important to correctly name your joints as you create them, as the Mirror Joints tool can replace letters or words in the duplicated joints. The example below shows the setup used to change any 'L' found in the duplicated joint names to an 'R'.

JointCreation_08.JPG

After the joints have been mirrored they need to be connected. This forms a complete hierarchy, as opposed to the separate joint chains you have at this stage. The options of the 'Connect Joints' tool need to be changed; the parent joint option is used to correctly create a connection between the joints.

JointCreation_07.JPG

The image below shows the finished rig.

JointCreation_09.JPG
Skinning:

Skinning is the process of binding a skeleton to a mesh so that when the skeleton is moved, the mesh deforms and follows it. For this exercise we were provided with a basic arm mesh and joint chain (below).

Skinning_01.JPG

The first step is to bind the mesh to the joints using the 'Bind Skin' tool. When using this tool Maya automatically assigns weight maps to each joint in the chain; these weight maps control which parts of the mesh move when a joint moves, and by how much.
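The underlying deformation is a weighted blend: each vertex follows every joint in proportion to its weight. Here is a deliberately tiny 2D sketch of that blend, using plain translations in place of full joint matrices (all names are illustrative):

```python
def skin_vertex(rest_pos, weights, joint_offsets):
    """Linear-blend a vertex: the deformed position is the weighted
    sum of the vertex as moved by each joint. `joint_offsets` are
    simple (dx, dy) translations standing in for joint transforms."""
    x = sum(w * (rest_pos[0] + t[0]) for w, t in zip(weights, joint_offsets))
    y = sum(w * (rest_pos[1] + t[1]) for w, t in zip(weights, joint_offsets))
    return (x, y)

# A vertex weighted 50/50 between a static joint and one moved 2 units in X
print(skin_vertex((1.0, 0.0), [0.5, 0.5], [(0.0, 0.0), (2.0, 0.0)]))  # (2.0, 0.0)
```

The painted weight maps described below are exactly these per-vertex `weights`.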

Skinning_02.JPG

These weight maps often have issues; for example, parts of the mesh that should not be influenced by a joint will deform slightly when the joint is moved. Once a mesh has been skinned, the 'Paint Skin Weights' tool can be accessed by holding down right-click. This tool allows you to manually paint weight maps for the joints.

Skinning_03.JPG
Skinning_04.JPG

Each vertex of the mesh has to have a total influence of 1.0 applied to it. This means that when you are painting weights on a joint and you remove its influence on a vertex, that vertex has to replace the lost influence from another joint, which can result in weight maps changing on joints that you are not actively editing. To get around this problem you can lock joints while editing others. In the example below the shoulder joint is having its weight map edited; to avoid changing the weights of the root joint, which have already been fixed, that joint has been locked. The joint below the shoulder in the hierarchy is left unlocked; this has to be done so that any vertices that need to make their influence up to 1.0 can do so using the unedited weights of the elbow joint.
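The normalisation rule can be sketched for a single vertex. This is a simplified stand-in for Maya's locked-influence behaviour, with made-up joint names:

```python
def repaint_weight(weights, edited_joint, new_value, locked=()):
    """Set one joint's weight on a vertex, then renormalise so the
    influences still sum to 1.0, spreading the difference only
    across unlocked joints (locked joints keep their weights)."""
    weights = dict(weights)
    weights[edited_joint] = new_value
    adjustable = [j for j in weights
                  if j != edited_joint and j not in locked]
    remainder = 1.0 - new_value - sum(weights[j] for j in locked)
    current = sum(weights[j] for j in adjustable)
    for j in adjustable:
        share = weights[j] / current if current else 1.0 / len(adjustable)
        weights[j] = remainder * share
    return weights

w = repaint_weight({"root": 0.2, "shoulder": 0.5, "elbow": 0.3},
                   "shoulder", 0.6, locked=("root",))
print({j: round(v, 3) for j, v in w.items()})
# {'root': 0.2, 'shoulder': 0.6, 'elbow': 0.2} -- only the elbow absorbed the change
```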

Skinning_05.JPG
Controls:

Adding controls to a rig makes the animation process a lot easier and faster. With controls you can also add shortcuts to specific movements and actions. Adding controls to a rig is usually the last step when creating a character rig. Below is an example of a finished rig with controls.

Rig_Controls.JPG

Limbs like the arms and legs can be controlled in two different ways; below is an example of an FK system. An FK system uses rotation controls on each joint in the joint chain. This system can be quite time-consuming to animate, as all of the joints need to be animated separately; however, it does give a lot of control over the motion of the limb. The control system is set up using NURBS curves as drivers for the joints, which allows the controls to be easily selected during animation. The joints are then orient constrained to the curves, allowing each curve to control the rotation of its joint. Once all of the controls have been placed and constrained they are parented, so that the control curves follow the position of the joints.

Controls_01.JPG

Below is an example of an IK system. Unlike the FK system, an IK system only requires one control, because it uses the position of the last joint in the chain, the wrist in this case, to drive the position and rotation of the joints further up the chain. To set the IK system up, a control is placed at the wrist joint; with the wrist joint selected, shift-select the shoulder joint and add an IK handle. The IK handle can then be point constrained to the wrist control, allowing you to drive the whole IK system with that one control.
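For a two-bone chain like this arm, the angles an IK solver has to find can be computed analytically with the law of cosines. The sketch below illustrates that maths only; it is not Maya's rotate-plane solver:

```python
import math

def two_joint_ik(target_dist, upper_len, lower_len):
    """Return the interior elbow angle (degrees) needed for a
    shoulder-elbow-wrist chain to reach a target at the given
    distance, via the law of cosines."""
    d = min(target_dist, upper_len + lower_len)  # clamp to reachable range
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_elbow))))

# Target at full reach: the arm is straight, so the elbow opens to 180 degrees
print(round(two_joint_ik(2.0, 1.0, 1.0)))  # 180
```

Moving the single wrist target changes this angle automatically, which is why IK needs only one control where FK needs one per joint.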

Controls_02.JPG
Assignment 1:
Mechanical Rig

The above videos show the final mechanical rig and its capabilities. (Technical animation left, Render with Mocap data applied right).

Process

The first step was to create a model to rig. I decided to go for a humanoid robot with joints located in the same places as a human's. I decided against an articulating spine, as I thought it would result in too human a look, and I wanted the character to move in a distinctively robotic way. I used large, blocky shapes to try and create a distinctive silhouette.

Model_WiP.JPG

Once the character was modeled and unwrapped I took it into Substance Painter to create its textures. When unwrapping in Maya I decided to create multiple UDIM tiles (see below); this allowed me to edit the whole character at once in Substance Painter and use one material in Maya. The material is made up of two elements: a dark metal base and a golden-yellow metal pattern that is visible as wear. I used the instancing feature in Substance Painter to repeat the material setup over the whole model.

Substance.JPG
Substance_Udim_Instancing.JPG

The next step was to create the rig. To ensure correct joint orientation I created the joint chains along the X axis of the grid and then moved the joints into place using only X-axis translations and rotations. The rig has the same joint placement and structure as a human rig would, apart from having fewer spine joints and an additional shoulder joint. The additional shoulder joint is there because of the way the robot is modeled: the shoulder area has one part that rotates around the Z axis and another connected part that rotates around the X axis, giving the shoulder a full range of motion without using a ball-joint system (see video below).

Rig_Polys.JPG
Rig.JPG

I then moved on to creating controls for the rig, using NURBS curves as easily selected objects to drive it. Once created, the NURBS curves were moved into place using the 'Match Transformations' tool, which lets you easily match the location of one object to another. Once placed, it is important to remember to freeze the transformations on the curves; this resets all of the input translations and rotations to 0, making animation a lot easier. Next, the curves are constrained to their respective joints using an orient constraint. Once constrained, each curve needs to be parented to the joint at the start of its chain; if a control is for a joint that isn't at the top of the chain, it can instead be parented to the control one higher up the chain.

Rig_Controls.JPG

I decided to make use of an IK system for the legs, added to the rig using the IK Handle tool (see below). An IK system uses the position of the joint at the end of a chain to drive the position and rotations of the joints above it in the chain, which can result in more organic movement, especially for arms and legs. When initially created, the IK system points the knee in the wrong direction, facing along the X axis. This problem is solved by creating a knee control that acts as a pole vector; a pole vector ensures that the IK handle points in the right direction, and it also allows the rotation of the knee to be controlled.

Rig_IKHandles.JPG

The final part of the rigging process was to create some extra attributes for movements that would be commonly used when animating; this means a slider can be used to perform a certain motion rather than it having to be animated by hand each time. This is done using the 'Set Driven Key' function. A driven key allows you to control the movement of one object using an attribute of another; this could be a rotation, a translation or a custom attribute. In this case I used custom attributes, which can be created in the Attribute Editor using the 'Add Attribute' option. When adding a new attribute you can define its name, data type, and minimum and maximum values. In this case the data type was set to 'Float', with a minimum value of 0 and a maximum value of 1.

AddAtrribute.JPG
Rig_DrivenKey.JPG
AddAttribute2.JPG

Once the custom attribute has been set up, the driven key can be created. This is accessed via the Animation menu set > Key > Set Driven Key > Set. The Set Driven Key window allows you to set a driver, in this case the L_Arm_Spin attribute of the shoulder control, and a driven object, in this case the Rotate X of the shoulder joint. A key can then be created with the attribute and rotation in their default states; the attribute is then changed to 1 and the rotation of the joint set to 360, and another key is set. This means that when the attribute is animated from 0 to 1 the rotation of the joint changes with it, allowing the spin of the first shoulder joint to be easily controlled using a single attribute.
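Between the two keys, Maya interpolates the driven value from the driver. A minimal sketch of that mapping, assuming simple linear interpolation between the keyed pairs (Maya's curves can also be eased):

```python
def driven_key(driver, keys):
    """Interpolate a driven value from sorted (driver, driven) key
    pairs, clamping outside the keyed range -- a linear sketch of
    the Set Driven Key relationship."""
    if driver <= keys[0][0]:
        return keys[0][1]
    if driver >= keys[-1][0]:
        return keys[-1][1]
    for (d0, v0), (d1, v1) in zip(keys, keys[1:]):
        if d0 <= driver <= d1:
            t = (driver - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)

# The L_Arm_Spin setup: attribute 0 -> rotateX 0, attribute 1 -> rotateX 360
spin = [(0.0, 0.0), (1.0, 360.0)]
print(driven_key(0.5, spin))  # 180.0 -- halfway through the spin
```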

SetDrivenKey.JPG
SetDrivenKey2.JPG
Assignment 2:
Organic Rig

The above videos show the final organic rig, showing off some of its capabilities in a short rendered sequence. The gallery below includes a handful of stills, showing the rig in different poses.

The videos below show the same animation as the rendered sequence above, but without the fur and showing the rig and controls.

Breakdown

The images below show a step-by-step breakdown of the creation of the rig.

ZbrushMesh.PNG
Cat Ref.jpg

Above you can see a screen grab of the mesh from ZBrush. I decided to sculpt the mesh in ZBrush as it gave me a greater amount of creative control than the sculpting tools in Maya. You can also see an example of the type of reference images that I used; I also utilised skeletal reference images later on when rigging.

MayaRig_NoControls.PNG
MayaRig_Final.PNG
MayaRig_Bare.PNG

The above images show the rig created in Maya. The first image shows the rig placed in the character, the second shows the rig with controls, and the final image shows the rig isolated. The main features of the rig include IK-driven fore and hind legs, secondary sub-controls on the fore and hind legs/feet (see below), a spline-IK-driven spine and tail, and individually rigged ears with controls.

FootSubControls.PNG

One of the main features of the completed character is the inclusion of XGen fur. Below is an image of the final character as seen in the Maya viewport, showcasing the XGen fur.

CharacterFinal.PNG

The images and descriptions below show the process of creating and grooming the XGen fur.

Xgen_Groom01_Fur.PNG
Xgen_Groom01.PNG

The first step was creating a density map for the fur, which dictates where fur will appear. In this case there is fur over most of the character, apart from the nose, the area around the mouth, the bottoms of the feet and, as the eyes are separate pieces, the area behind the eyes. With the density map applied, the image above shows the base fur that is created.

Xgen_Groom02_Fur.PNG
Xgen_Groom02.PNG

The image above shows the fur after a sculpt of the guides. In the XGen Interactive Groom Editor window you can see the separate layers used: a comb layer, a smooth layer and a length layer. Combined, these layers bring the correct shape and flow to the fur.

Xgen_Groom03_Fur.PNG
Xgen_Groom03.PNG

The next step was to use a clump modifier, which clumps areas of fur together, helping to create a more realistic groom. For the clumps to be placed correctly, following the fur guides, the Node Editor had to be used to feed the data from the guides into the clump modifier; without this change the clumps are randomly generated.

Xgen_Groom04_Fur.PNG
Xgen_Groom04.PNG

The next step was to add another clump modifier. A second clump modifier clumps the previously generated clumps together, which also helps with the final look of the fur, creating a more natural, imperfect result.

Xgen_Groom05_Fur.PNG
Xgen_Groom05.PNG

The final step was to add a noise modifier. This adds noise to the attributes of the fur, resulting in a more natural-looking disorder. When rendering the sequence at the start of this section the fur had to be cached, meaning Maya and XGen don't have to recalculate the fur every frame when rendering, which improves render times and reduces the likelihood of a crash.

Bifrost Graph

Below is an example of a fire simulation that uses an animated character as a source. I used a free rig from Truongcg, which I then retextured and shaded.

The first step was to acquire motion capture data from the Rokoko Motion Library and apply it to the character. Due to the complexity of the downloaded rig I was unable to create a working HumanIK character definition for it, so I removed the rig from the character, leaving only the mesh in the scene. I then used the HumanIK auto-rig function to create a new rig and character definition, which allowed me to retarget the motion capture animation onto the character; you can see a playblast of the character in the scene above.

BifrostGraphEditor.PNG

With the motion capture data applied to the character, I could then create an Alembic cache of the animated mesh. This .abc cache file allows the animation to be used within the Bifrost Graph Editor much more efficiently than an uncached, keyframed rig and mesh. Above you can see the Bifrost Graph window; I used a preset torch flame to create the main part of the graph and later edited various parameters to create a simulation that I was happy with. A read_Alembic node is used to input the cached animation; however, a time node has to be connected to it in order for Bifrost to play through the whole animation, otherwise it only uses a single frame. The read node is then fed into the air source node, which acts as a source for the aero simulation. In the playblasted example the read node is also connected to the burning_geo output, to show how the final comp might look when combined with the rendered animation. Below is a final shot with the fire and character composited onto a live-action background.

MASH

MASH is a plug-in for Maya that allows for the creation of motion graphics and other procedural animation. MASH lets you use instancing to create large amounts of geometry without huge performance issues; below is a short introduction to MASH.

Below is a final render of a dynamic MASH network applied to a mesh animated with motion capture data.

Mesh As An Influence

When a mesh is skinned, a basic deformation is used when animating; however, this doesn't always create realistic results. For example, when a humanoid arm bends at the elbow the bicep contracts, creating a bulge. You can recreate this effect by using a mesh as an influence for skin weights. An example can be seen below.

The first step is to create a mesh sphere and position it inside the arm mesh; the sphere is also scaled to roughly resemble the shape of the bicep and tricep muscles. The sphere is then parented to the shoulder joint.

SphereWireframe.PNG

Next, the sphere is added as an influence; this is done by navigating to Skin > Edit Influences > Add Influence. Once the sphere has been added as an influence it will appear in the skin weight painting editor.

AddInfluence.PNG

Next, the weights of the sphere have to be painted. You can see below how an area around the upper arm has been painted, with most of the influence on the frontal, bicep area and the influence dropping off around the sides and back of the arm.

WeightPainting.PNG

With the weights correctly set up, the rotation of the elbow joint can now be used to drive the scale of the sphere. This means that when the elbow joint is rotated the scale of the sphere changes in relation to it, which in turn deforms the mesh using the newly painted skin weights, creating a bulge at the bicep of the arm.
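The driven-key relationship between elbow bend and sphere scale can be sketched as a simple remap. The angle range and bulge amount here are illustrative values, not the ones used in the rig:

```python
def bicep_scale(elbow_angle, max_angle=120.0, max_bulge=1.4):
    """Map elbow bend (degrees) to the muscle sphere's scale:
    a straight arm gives scale 1.0, a fully bent arm gives
    max_bulge, with linear interpolation in between."""
    t = max(0.0, min(1.0, elbow_angle / max_angle))
    return 1.0 + t * (max_bulge - 1.0)

print(bicep_scale(0.0), round(bicep_scale(120.0), 3))  # 1.0 1.4
```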

SetDrivenKey.PNG
Blendshapes

Blendshapes allow you to animate a mesh between different poses. These poses can be created by either sculpting the mesh, using skinned joints, or a combination of both. Below is an example of using blendshapes to create a smiling mouth, with individual control over each side of the mouth.
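Under the hood a blendshape offsets the base mesh by each target's delta, scaled by that target's weight. A toy stand-in for that evaluation (the single-vertex meshes are purely illustrative):

```python
def blend(base, targets, weights):
    """Blendshape evaluation: for each vertex, add each target's
    delta from the base, scaled by the target's weight. Vertices
    are (x, y, z) tuples."""
    out = []
    for i, v in enumerate(base):
        delta = [0.0, 0.0, 0.0]
        for target, w in zip(targets, weights):
            for axis in range(3):
                delta[axis] += w * (target[i][axis] - v[axis])
        out.append(tuple(v[axis] + delta[axis] for axis in range(3)))
    return out

base = [(0.0, 0.0, 0.0)]
smile = [(0.0, 1.0, 0.0)]           # the mouth-corner vertex moved up
print(blend(base, [smile], [0.5]))  # [(0.0, 0.5, 0.0)] -- a half-strength smile
```

Giving each side of the mouth its own target (and weight) is what provides the individual control mentioned above.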

The shape editor is used to create and edit blendshapes. First a base blendshape is created, this is the unedited mesh.

BS_01.PNG

A target is then created, using the add target button. This creates an editable layer for the original mesh.

BS_02.png

The mesh can then be edited, in this case by moving the vertices at the corner of the mouth upwards. You can see a red edit indicator on the right-hand side of the screen; this shows that the layer is recording the changes being made to it.

BS_03.PNG

Once one side of the mouth has been edited, the target can be duplicated, renamed and then flipped to the opposite side of the mouth using the flip target option. This means that you don't have to try to match the original target by hand.

BS_04.PNG

The eyelid below has been animated in the same way as the mouth using blendshapes to drive custom deformation.

The animation below was created using the Pose Editor. This allows you to create blendshapes using a combination of joint rotations and sculpting of the mesh.

Squash and Stretch

Squash and stretch is one of the 12 principles of animation. The animation below shows how this can be added to a rig, as opposed to having to manually animate the scale of the mesh.

One of the main challenges with a squash and stretch system is that the volume of the spheres needs to be maintained. This can be done by using several multiplyDivide nodes in the Maya Node Editor; the outputs of these nodes are then fed into the scale attributes of the rigged spheres.
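The maths those nodes implement can be sketched directly. Assuming the stretch happens along Y, the side axes shrink by one over the square root of the Y scale, so the product of the three scales, and therefore the volume, stays constant:

```python
import math

def stretch_scales(scale_y):
    """Volume-preserving squash and stretch: when Y stretches, X and
    Z shrink by 1/sqrt(scaleY) so scaleX * scaleY * scaleZ stays 1.0,
    which is what the multiplyDivide node network computes."""
    side = 1.0 / math.sqrt(scale_y)
    return (side, scale_y, side)

sx, sy, sz = stretch_scales(4.0)
print(sx, sy, sz, sx * sy * sz)  # 0.5 4.0 0.5 1.0 -- volume preserved
```

The same formula handles squashing: a `scale_y` below 1.0 makes the side axes bulge outwards instead.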

NodeEditor.PNG