Saturday 21 May 2011

Post 51 - Submission & Presentation

I have uploaded my rig files as well as the capture files. After taking part in the presentation, the initial feedback from both my supervisor and Brian Larkman was very positive. I feel that I have demonstrated a wide variety of rigging techniques and documented them in a constructive and helpful manner. I have managed to follow the initial proposal very closely and have picked up a number of skills along the way.


I have learnt how to rig using bones, joints and blend shapes / morph targets (amongst many other techniques).


Additional skills I have picked up include the Maya software itself, muscle systems and skinning strategies (as well as blend shape modelling).


As for future improvements and development, there are a couple of things I would be interested in pursuing. One would be to add skin influence to the eyes, so that when they rotate they move the surrounding skin slightly. Additionally, there are a couple of items which, rather than being directly parented, would benefit from being position constrained. This would effectively do the same job, but with a slightly different hierarchy, and is more animation friendly.
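For reference, a position constraint in Maya only takes a line of script. A minimal sketch, using hypothetical object names (the real rig's names will differ):

```python
import maya.cmds as cmds

# Hypothetical names: 'hand_ctrl' drives the position of 'prop_geo' without
# changing prop_geo's place in the hierarchy. maintainOffset keeps the current
# spatial relationship at the moment the constraint is created.
cmds.pointConstraint('hand_ctrl', 'prop_geo', maintainOffset=True)
```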


I await the feedback, but I am very pleased with the outcome of this module, as I viewed it as an opportunity to learn something I had almost no knowledge of at the beginning. I hope the videos I have put on YouTube are useful to people; so far I have had some responses and have been able to help a few people with rigging-related questions.














> I'm so happy, but my hands and feet have fallen off.

Post 50 - Desktop recording procedures


Now that I have finished the rig, I am going to record myself using it. The software I used was Cam Studio, with the Xvid codec to compress the footage. This is the same setup I have used for all of the other videos I have captured.




Cam Studio - LINK

Post 49 - Joint Facial Rig Evaluation

THE MAYA RIG IS DONE.



I have added some final touches such as the following:


- Skinning tweaked, as the right shoulder had 0.05% influence on the chin (a weight painting accident)


- Bicep pole vector influence was not equal on both sides, so the bicep pole vectors were redone


- A third hip manipulator has been created, so that the main one moves everything, the middle one rotates the hips and the smaller one moves everything except the hands (useful if an animation sequence requires the hands to stay in place, such as holding onto a railing). The other two have now been parented to the main one, so when the main one is moved they move with it.


I am very pleased with the results of my Maya rig. It demonstrates a fundamentally different strategy, which uses joints. This system is very good and is similar to the systems seen in the games industry, because it uses real-time manipulation of the joints through direct connections. In other words, it does not rely on blend shapes for the majority of its features and so takes up much less memory. It is a versatile rig, as the advanced setting allows individual points to be animated on all axes. I have enjoyed this process and have successfully shown that using a mixture of joints and 'corrective' blend shapes can work well. This rig also has the most elaborate user interface, which incorporates a lot of key-driven manipulation.


There are a few implications with this joint-based Maya system. The main one is the workaround needed for naming mirrored blend shapes, plus the fact that you cannot seem to key-frame a motion in order to record a driven key, as the process relies to an extent on the key frames on the time slider. This is quite a big flaw for me, and although there is a workaround, it would be much easier to use the same pipeline as 3ds Max. In addition, there were a number of occasions where animating a particular controller or joint was impossible due to a parenting issue. These have been resolved, but any that have not would simply need to be re-parented to a similar item (for example, the eye controllers can be parented to the neck rather than the spine).


However, this is a good system and one which I would like to look further into in the future.








Final Maya rig demo - LINK

Post 48 - Face controls & Additional Manipulation

I have finally finished constructing the Maya face user interface (seen above). It took a very long time to construct, but I used a lot of the principles seen in the Max rig. The slider and joystick controllers were very easy to make (the clip from PolyFaceCom in the previous log demonstrates this). In Max, however, constraining a controller inside a box was done using scripting (although I have heard there is a much simpler way to do this in 3ds Max 2011). In Maya, it is a case of the following (a scripted sketch of the same steps follows the list):


1. create a circle (joystick)
2. create a box to house it
3. parent the circle to the box
4. ensure that the transformations for both items have been frozen (zeroed out)
5. go into the Attribute Editor's limit information (constraints) section and set the maximum and minimum translation values to those of the box
6. DONE
7. HAPPY
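The same steps can be scripted. A rough sketch using maya.cmds, with hypothetical names for the joystick and its box:

```python
import maya.cmds as cmds

# 1. create a circle (the joystick) and 2. a square curve to house it
joystick = cmds.circle(name='jaw_ctrl', normal=(0, 0, 1), radius=0.2)[0]
box = cmds.curve(name='jaw_ctrl_box', degree=1,
                 point=[(-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0), (-1, -1, 0)])

# 3. parent the circle to the box
cmds.parent(joystick, box)

# 4. freeze (zero out) the transformations on both items
cmds.makeIdentity(joystick, box, apply=True,
                  translate=True, rotate=True, scale=True)

# 5. limit the circle's translation to the extents of the box (here +/- 1 unit)
cmds.transformLimits(joystick,
                     translationX=(-1, 1), enableTranslationX=(True, True),
                     translationY=(-1, 1), enableTranslationY=(True, True))
```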


Very straightforward and clean. Once these were made, it was a case of assessing the face's / animator's needs. Some sliders control several joints. All connections to joints or blend shapes are made using driven keys, as they can be used to wire up pretty much anything. Above is the UI, with descriptions of what each control drives. On top of the jaw, phoneme and brow controls, I have also created various eye controllers, as well as a sneer function and pupil sliders. The pupils are also a pair of blend shapes (dilate and contract) which have been wired using driven keys, so when the slider is down the dilate shape kicks in, and when it is up the contract shape kicks in. Another way to manipulate the eyes would be to have a set driven key on the scale of the pupil (similar to the 3ds Max rig). This way, the pupils would be self-sufficient and would not have to rely on blend shapes, but for the purposes of this rig I think using blend shapes is fine.
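A minimal sketch of the pupil wiring with driven keys, assuming a hypothetical blend shape node 'faceShapes' holding 'pupil_dilate' and 'pupil_contract' targets, and a slider called 'pupil_slider' travelling from -1 to 1 on translateY (the names and directions are assumptions):

```python
import maya.cmds as cmds

# Directions assumed: slider down (-1) -> dilate, slider up (+1) -> contract.
for target, end_value in (('pupil_dilate', -1.0), ('pupil_contract', 1.0)):
    # slider at rest -> target off
    cmds.setDrivenKeyframe('faceShapes.' + target,
                           currentDriver='pupil_slider.translateY',
                           driverValue=0.0, value=0.0)
    # slider at its end of travel -> target fully on
    cmds.setDrivenKeyframe('faceShapes.' + target,
                           currentDriver='pupil_slider.translateY',
                           driverValue=end_value, value=1.0)
```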


Blend Shapes applied to main eyes


I have also created an advanced toggle slider, which controls the visibility channels of all the face manipulators: when it is toggled to the right, the manipulators become visible so you can animate individual points. In addition, there is a main brow control, which moves all six of the joint controls that make up both brows.
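A sketch of how such a visibility toggle could be wired with driven keys; every name here is a placeholder for the rig's actual controls:

```python
import maya.cmds as cmds

# Hypothetical names: 'advanced_toggle' slides from 0 to 1 on translateX and
# the listed manipulators stand in for the rig's face controls.
face_manipulators = ['brow_L_ctrl', 'brow_R_ctrl', 'cheek_L_ctrl', 'cheek_R_ctrl']

for ctrl in face_manipulators:
    for driver_value, visible in ((0.0, 0), (1.0, 1)):
        cmds.setDrivenKeyframe(ctrl + '.visibility',
                               currentDriver='advanced_toggle.translateX',
                               driverValue=driver_value, value=visible)
```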



Post 47 - Jaw Skinning & Corrective / Phoneme blend shapes via driven keys



Making blend shapes in Maya is much the same as in 3ds Max, except it is a menu option rather than a modifier: after selecting the rig head and the modified head, go up to the deformer menu and select Blend Shape. The corrective shapes are then wired into the jaw using set driven keys, so the jaw is the driver and the corrective shapes are the driven. As in Max, when the jaw opens the blend shape weight is set to 1 (100% on), so the corrective kicks in as the jaw opens. The same is done for the closed, left and right positions.
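As a concrete example, the jaw-open corrective could be wired like this. A sketch with hypothetical names ('head_geo', 'jaw_open_corrective', 'jaw_jnt') and an assumed fully-open rotation of -25 degrees:

```python
import maya.cmds as cmds

# Add the sculpted corrective as a blend shape target on the skinned head.
# frontOfChain makes the shape evaluate before the skinCluster (a common choice).
bs_node = cmds.blendShape('jaw_open_corrective', 'head_geo',
                          name='jawCorrectives', frontOfChain=True)[0]

# Jaw closed (rotateZ = 0) -> corrective weight 0.
cmds.setDrivenKeyframe(bs_node + '.jaw_open_corrective',
                       currentDriver='jaw_jnt.rotateZ',
                       driverValue=0.0, value=0.0)

# Jaw fully open (rotateZ = -25, an assumed value) -> corrective weight 1.
cmds.setDrivenKeyframe(bs_node + '.jaw_open_corrective',
                       currentDriver='jaw_jnt.rotateZ',
                       driverValue=-25.0, value=1.0)
```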





Tutorial Stuff
Jaw skinning and corrective shapes - LINK


My Stuff
Jaw skinning and corrective shapes - LINK
Phonemes - LINK

Post 46 - Maya Mirroring Blend Shapes (work-around)

As with 3ds Max, individual vertices cannot be mirrored over. The work-around for this is almost identical to the 3ds Max solution, which is to create a skin wrap on a mirrored default model so that the verts are truly on the opposite side. This technique was only needed for one blend shape (mirroring a jaw-right into a jaw-left). Although this is a joint-based facial rig, there is still a need for corrective shapes to fix mesh issues around the mouth and to simulate skin moving across the face.




This can be done by driving several joint manipulators with the rotational values of the jaw. However, I feel that the blend shape system is much easier to put into practice and relies on one deformer rather than several points. There shouldn't be any reason why both techniques can't be used on the same rig, so I have started to experiment. The precise way the mirroring technique was done is explained in the link below.
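As a scripted alternative to the skin-wrap route (not what I did here, just another way of getting the same result), the mirror can also be built by flipping each vertex delta across X and applying it to the matching vertex on the other side. A rough sketch, assuming a symmetrical base head and hypothetical object names:

```python
import maya.cmds as cmds

def mirror_target(base='head_base', source='jaw_right_tgt',
                  new_name='jaw_left_tgt', decimals=3):
    """Build a mirrored blend shape target by flipping vertex deltas across X.

    Assumes 'base' is symmetrical across X and that 'source' was sculpted from
    a duplicate of it (same vertex count and order)."""
    mirrored = cmds.duplicate(base, name=new_name)[0]
    vert_count = cmds.polyEvaluate(base, vertex=True)

    # Look-up table: rounded base position -> vertex index.
    base_pos = []
    index_by_pos = {}
    for i in range(vert_count):
        p = cmds.xform('%s.vtx[%d]' % (base, i), q=True, t=True, os=True)
        base_pos.append(p)
        index_by_pos[(round(p[0], decimals),
                      round(p[1], decimals),
                      round(p[2], decimals))] = i

    for i in range(vert_count):
        sx, sy, sz = cmds.xform('%s.vtx[%d]' % (source, i), q=True, t=True, os=True)
        bx, by, bz = base_pos[i]
        dx, dy, dz = sx - bx, sy - by, sz - bz

        # Find the vertex sitting at the mirrored base position.
        j = index_by_pos.get((round(-bx, decimals),
                              round(by, decimals),
                              round(bz, decimals)))
        if j is None:
            continue  # no symmetrical partner found; skip this vertex

        mx, my, mz = base_pos[j]
        # Apply the delta flipped in X to the opposite-side vertex.
        cmds.xform('%s.vtx[%d]' % (mirrored, j),
                   t=(mx - dx, my + dy, mz + dz), os=True)

    return mirrored
```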








Mirroring using skin wrap article - LINK

Post 45 - Face Joints, manipulators and linking strategies

Now that the eyes have been created, the next phase is to create the face structure. The concept is that joints are made in the same way as the eyes (created from the head base), with the ends of these joints positioned at the various points of the face where manipulation is desired, and then added as influences to the skinned geometry. Manipulators are then parented to each joint so that they can be animated. I found THIS clip, which gave me a good idea as to where the joints would need to be for face manipulation. The reason for having them joined to the head is to keep the hierarchy clean: the joints are parented to the head, so when the head moves the joints move with it as part of the head's hierarchy. However, one major issue I came up against was that, with 20-30 joints in the head, it was almost impossible to select what you wanted when trying to add NURBS controllers / geometry etc. A simple solution is to move the joints to their own layer and set it to 'reference' mode, which makes them unselectable.


Then I found THIS video clip by 'Polyfacecom', which shows the process of creating joints using joint nubs only. It demonstrates an eyebrow being rigged in this way and helped me a lot in understanding which procedures would need to be followed. This was good, but I came across a big problem: you need the joints to be influenced 100% by their manipulators, but you also need both the joints and the manipulators to be influenced 100% by the head, so that when the head is animated everything involved moves with it completely.




Early tests resulted in the influence being shared, so the manipulators only had 50% influence and the joints only moved 50% when the head was moved (very messy and not a usable system). This problem resulted from using the joint nubs and not having them jut out of the head (cleaner, but with its own implications, as they are no longer parented sufficiently). The work-around that I came up with was as follows:


1) Parent a face joint (an eyebrow joint, for example) to its NURBS 'disc' manipulator.
2) Parent the NURBS disc manipulator to the head.


Although the joints are no longer directly parented to the head, this means that when the head is moved, the manipulators (parented 100%) move with it, and as the face joints are parented 100% to the manipulators, they follow suit. This allows the head to be moved while the joints remain fully controllable. I am quite pleased with this work-around, as I came to the solution without the aid of tutorials or videos. The only issue I can think of with this method is that the joints are not connected directly to the head, but since you need manipulators to animate the joints anyway, as long as the manipulators are present it is fine. And if a manipulator were no longer needed, it would be a simple case of either deleting the disused joint or re-parenting it to the head.
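A minimal scripted version of this work-around, with hypothetical names for the brow joint, its disc manipulator, the head joint and the skin cluster:

```python
import maya.cmds as cmds

# Hypothetical names and position; the real rig's joints and controls will differ.
cmds.select(clear=True)
brow_jnt = cmds.joint(name='brow_inner_L_jnt', position=(1.0, 10.0, 1.5))

# A flat NURBS circle used as the 'disc' manipulator, snapped to the joint
# via a temporary point constraint, then frozen (zeroed out).
brow_ctrl = cmds.circle(name='brow_inner_L_ctrl', normal=(0, 0, 1), radius=0.3)[0]
cmds.delete(cmds.pointConstraint(brow_jnt, brow_ctrl))
cmds.makeIdentity(brow_ctrl, apply=True, translate=True, rotate=True, scale=True)

# 1) face joint -> its disc manipulator, 2) manipulator -> the head joint.
cmds.parent(brow_jnt, brow_ctrl)
cmds.parent(brow_ctrl, 'head_jnt')

# Finally, add the new joint as an influence on the skinned head geometry.
cmds.skinCluster('head_skinCluster', edit=True,
                 addInfluence=brow_jnt, weight=0.0)
```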


So to sum up, for a cleaner rig you can either put the head joints on a separate layer and hide / reference them (in hindsight, probably the quicker solution), or do what I did: create face joints, parent these to controllers, then parent the controllers to the head, which also works. In addition, the head joints can at any point be parented back into the head if that system suits you better (select the joints, then hit P to parent), so there is good flexibility available.
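For the layer route, a sketch of dropping the joints onto a display layer set to reference mode (visible but unselectable), assuming the joints follow a hypothetical naming convention:

```python
import maya.cmds as cmds

# Gather the face joints (placeholder naming pattern) and put them on their
# own display layer, then switch the layer's display type to 'reference'.
face_joints = cmds.ls('face_*_jnt', type='joint')
layer = cmds.createDisplayLayer(face_joints, name='faceJoints_lyr', noRecurse=True)
cmds.setAttr(layer + '.displayType', 2)   # 0 = normal, 1 = template, 2 = reference
```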




Joint placement example - LINK
Joint and brow set up - LINK