Saturday 21 May 2011

Post 51 - Submission & Presentation

I have uploaded my rig files as well as the capture files. After taking part in the presentation, the initial feedback from both my supervisor and Brian Larkman was very positive. I feel that I have demonstrated a wide variety of rigging techniques and documented them in a constructive and helpful manner. I have managed to follow the initial proposal very closely and have picked up a number of skills along the way.


I have learnt how to rig using bones, joints and blend shapes / morph targets (amongst many other techniques).


Additional skills I have picked up include the Maya software itself, muscle systems and skinning strategies (as well as blend shape modelling).


As for future improvements and development, there are a couple of things I would be interested in. One would be to add skin influence to the eyes so that when they rotate, they move the surrounding skin slightly. Additionally, there are a couple of items which, rather than being directly parented, would benefit from being position constrained. This would effectively do the same job, but has a slightly different hierarchy and is more animation friendly.


I await the feedback, but I am very pleased with the outcome of this module, as I viewed it as an opportunity to learn something that I had almost no knowledge of at the beginning. I hope the videos that I have put on YouTube are useful to people; so far I have had some responses and have been able to help a few people with rigging related questions.














> I'm so happy, but my hands and feet have fallen off.

Post 50 - Desktop recording procedures


Now that I have finished the rig, I am going to record myself using it. The software I used was CamStudio, with the Xvid codec to compress the footage. This is the same setup as for all of the other videos I have been capturing.




Cam Studio - LINK

Post 49 - Joint Facial Rig Evaluation

THE MAYA RIG IS DONE.



I have added some final touches such as the following:


- Skinning tweaked, as the right shoulder had 0.05% influence on the chin (a weight painting accident)


- Bicep pole vector influence was not equal on both sides, so the bicep pole vectors were redone


- A third hip manipulator has been created: the main one moves everything, the middle one rotates the hips and the smaller one moves everything but the hands (for when an animation sequence requires the hands to stay in the same position, such as holding onto a railing). These have now been parented to the main one so that when it is moved, the other two move with it.


I am very pleased with the results of my Maya rig. It demonstrates a different fundamental strategy, which uses joints. This system is very good and is similar to the systems seen in the games industry, because it uses real time manipulation of the joints and is a direct connection. In other words, it does not rely on blend shapes for the majority of its features and so takes up much less memory. It is a versatile rig, as the advanced setting allows individual points to be animated on all axes. I have enjoyed this process and have successfully shown that using a mixture of joints and 'corrective' blend shapes can work well. This rig also has the most elaborate user interface, which incorporates a lot of key driven manipulations.


There are some implications with this joint based Maya system. The main one is the work around needed for naming mirrored blend shapes, plus the fact that you can't seem to key frame a motion in order to record a driven key, as it relies to an extent on the key frame on the time slider. This is quite a big flaw for me and although there is a work around, it would be much easier to use the same pipeline as 3ds Max. In addition, there were a number of occasions where animating a particular controller or joint was impossible due to a parenting issue. These have been resolved, but any which have not would simply need to be re-parented to a similar item (for example, the eye controllers can be parented to the neck rather than the spine).


However, this is a good system and one which I would like to look further into in the future.








Final Maya rig demo - LINK

Post 48 - Face controls & Additional Manipulation

I have finally finished constructing the Maya face user interface (seen above). This took a very long time, but I used a lot of the principles seen in the Max rig. The slider and joystick controllers were very easy to make (the clip from PolyFaceCom demonstrates this in the previous log). In Max, however, the method for a constrained controller in a box was done using scripting (although I have heard that there is a much simpler way to do this in 3ds Max 2011). In Maya, it is a case of:


1. create a circle (joystick)
2. create a box to house it
3. parent the circle to the box
4. ensure that the transformations for both items have been frozen (zeroed out)
5. go into the Attribute Editor and, under the limit information section, set the minimum and maximum translate values to those of the box
6. DONE
7. HAPPY


Very straightforward and clean. Once these were made, it was a case of assessing the facial / animator's needs. Some sliders control several joints. All connections with joints or blend shapes are done using driven keys, as they can be used to wire pretty much anything. Above is the UI, with descriptions of what each control drives. On top of the jaw, phoneme and brow controls, I have also created various eye controllers, as well as a sneer function and pupil sliders. The pupils are also a pair of blend shapes (dilate and contract) which have been wired using driven keys, so when the slider is up, the dilate shape kicks in and when it is down, the contract shape kicks in. Another way to manipulate the eyes would be to have a set driven key on the scale of the pupil (similar to the 3ds Max rig). This way, the pupils would be self sufficient and wouldn't have to rely on blend shapes. But for the purposes of this rig, I think using blend shapes is fine.
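Outside of Maya, a driven key like the pupil setup boils down to a clamped linear mapping from the driver value to the driven weights. A minimal standalone Python sketch of that relationship (the function name and slider range are my own illustration, not anything from Maya):

```python
def pupil_weights(slider, s_min=-1.0, s_max=1.0):
    """Map one slider value to the (dilate, contract) blend shape weights.

    Slider fully up (+1) drives 'dilate' to 1.0, fully down (-1) drives
    'contract' to 1.0, and at rest (0) both shapes are off -- the same
    behaviour a pair of set driven keys would give.
    """
    s = max(s_min, min(s_max, slider))   # clamp to the slider's travel
    dilate = max(0.0, s / s_max)         # positive travel only
    contract = max(0.0, s / s_min)       # negative travel only
    return dilate, contract
```

For example, a half-raised slider gives the dilate shape half strength while the contract shape stays off.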


Blend Shapes applied to main eyes


I have also decided to create an advanced toggle slider, which controls all of the face manipulators' visibility channels; when it is toggled to the right, the manipulators become visible, so you can animate individual points. In addition, there is a main brow control, which moves all six joint controls that make up both brows.



Post 47 - Jaw Skinning & Corrective / Phoneme blend shapes via driven keys



Making blend shapes is much the same as in 3ds Max, except it is a deformer rather than a modifier: after selecting the morphed head and then the rig head, go to the deform menu and select blend shape. The corrective shapes are then wired into the jaw using set driven keys, so the jaw is the driver and the corrective shapes are the driven. As in Max, when the jaw opens the blend shape is set to 1 (100% on), so opening the jaw initiates it. The same goes for the closed, left and right positions.





Tutorial Stuff
Jaw skinning and corrective shapes - LINK


My Stuff
Jaw skinning and corrective shapes - LINK
Phonemes - LINK

Post 46 - Maya Mirroring Blend Shapes -work around-

As with 3ds Max, individual vertices cannot simply be mirrored over. The work around for this is almost identical to the 3ds Max solution, which is to create a skin wrap on a mirrored default model so that the verts are truly on the opposite side. This technique was used for only one blend shape (mirroring a jaw right to a jaw left). Although this is a joint based facial rig, there is still a need for corrective shapes to fix mesh issues with the mouth and to simulate skin moving across the face.




This can be done by driving several joint manipulators with the rotational values of the jaw. However, I feel that the blend shape system is much easier to put into practice and doesn't rely on several points, but rather one deformer. There shouldn't be any reason why both techniques can't be used on the same rig either, so I have started to experiment. The precise way the mirroring technique was done is explained in the link below.








Mirroring using skin wrap article - LINK

Post 45 - Face Joints, manipulators and linking strategies

Now the eyes have been created, the next phase is to create the face structure. The concept is that joints are made in the same way as the eyes (created from the head base). The ends of these joints are positioned at the various points of the face where manipulation is desired, and they are then added as influence to the skinned geometry. Manipulators are then parented to each joint so that they can be animated. I found THIS clip which gave me a good idea as to where the joints would need to be for face manipulation. The reason for having them joined to the head is so that the hierarchy is clean: the joints are parented to the head, so when the head moves, so do the joints, as they are part of the head's hierarchy. However, one major issue I came up against was that when trying to add NURBS controllers / geometry etc, with 20-30 joints in the head it was almost impossible to select what you wanted. A simple solution to this would be to move the joints to their own layer and select the 'reference' mode, which makes them unselectable.


Then I found THIS video clip by 'Polyfacecom', which shows the process of creating joints using joint nubs only. It demonstrates an eyebrow being rigged this way and helped me a lot in understanding what procedures would need to be adhered to. This was good, but I came across a big problem: you need the joints to be influenced 100% by their manipulators, but you also need both the joints and the manipulators to be influenced 100% by the head, so that when the head is animated, everything involved moves 100%.




Early tests resulted in influence being shared, so manipulators would only have 50% influence and the joints would only move 50% when the head was moved (very messy and not a usable system). This problem resulted from using the joint nubs and from not having them jut out of the head (cleaner, but with its own implications, and no longer parented sufficiently). The major work around that I came up with was as follows:


1) Parent a face joint (an eyebrow joint, for example) to its NURBS 'disc' manipulator.
2) Parent the NURBS disc manipulator to the head.


Although the joints are not directly parented to the head, this means that when the head is moved, the manipulators (parented 100%) move with it. As the face joints are parented to the manipulators 100%, they follow suit. This allows the head to be moved and the joints to be controlled 100%. I am quite pleased with this work around, as I came to the solution without the aid of tutorials or videos. The only issue I can think of with this method is that the joints are not connected directly to the head, but as you need manipulators to animate the joints, as long as the manipulators are present it works fine. Plus, if a manipulator were no longer needed, it would be a simple case of either deleting the disused joint or re-parenting it to the head.
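The two-step parenting above is really just transform inheritance. A tiny standalone Python sketch (a toy node class of my own, positions only) shows why nothing gets diluted to 50%: each child inherits its parent's world position in full, so the chain head → manipulator → joint moves everything together.

```python
class Node:
    """Toy scene node: world position = parent's world position + local offset."""
    def __init__(self, name, local=(0.0, 0.0, 0.0), parent=None):
        self.name, self.local, self.parent = name, list(local), parent

    def world(self):
        if self.parent is None:
            return tuple(self.local)
        p = self.parent.world()
        return tuple(p[i] + self.local[i] for i in range(3))

head = Node("head")
# manipulator parented to the head, joint parented to the manipulator
brow_ctrl = Node("brow_disc", local=(0.0, 2.0, 1.0), parent=head)
brow_joint = Node("brow_joint", local=(0.0, 0.0, 0.2), parent=brow_ctrl)

head.local[1] += 5.0  # animate the head: controller and joint both follow at 100%
```

After the head moves up by 5, the joint's world position rises by exactly 5 as well; no influence is shared or halved anywhere in the chain.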


So to sum up, for a cleaner rig, you can either add the head joints to a separate layer and hide / reference them (in hindsight, probably the quicker solution), or do what I did and create face joints, parent these to controllers, then parent the controllers to the head, which also works. In addition, head joints can at any point be parented back into the head if that system suits you better (select the joints, then hit P to parent). So there is good flexibility available.




Joint placement example - LINK
Joint and brow set up - LINK

Wednesday 18 May 2011

Post 44 - Maya facial rig - Eyes

Now that I've finished the 3ds Max head rig using morph targets and the Reaction Manager (and applied it to the full figure rig), I felt it was time to explore another method of face rigging. Although you can also rig faces using stretchy bones and skinning them as influence to the mesh, I wanted to return to Maya to 'mix it up a bit'.




I began by importing the Max eye geometry into the Maya file. These then needed to be given basic textures (which, despite my not knowing how to texture, was very easy to do). The technique for rigging eyes is different from that of 3ds Max: place the eyes into position, create joints from the head joint, align them, and then add them to the skin as influence. Now skinned, they will move with the head. Next up was to create look at controls. In Maya, this is done by applying an aim constraint to the eyes, making them aim towards a spline controller. This is essentially the same method as Max, the main difference being the way in which they are parented to the head (however, there are other strategies for parenting, which I will get into with the face joints in the next post). Influenced by what I learnt in 3ds Max, I did try to do most of this process myself, but I found some tutorial clips which confirmed that what I was doing was the right way to go about it.
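An aim constraint boils down to computing the rotation that points one axis of an object at a target. A rough standalone sketch of the maths (my own simplification: yaw and pitch only, treating +Z as the eye's forward axis; this is not Maya's actual solver):

```python
import math

def aim_angles(eye, target):
    """Return (yaw, pitch) in degrees that point the eye's +Z axis at target."""
    dx, dy, dz = (target[i] - eye[i] for i in range(3))
    yaw = math.degrees(math.atan2(dx, dz))                    # turn left/right
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # tilt up/down
    return yaw, pitch
```

A target directly ahead gives no rotation at all, while a target off to one side yields a pure yaw, which matches how an eye tracking a spline controller behaves.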



Joint based eye set up - LINK
Aim constraint - LINK

Post 43 - Full Figure rig: facial additions



The reason I said in my proposal that I was going to do the face rigging on a separate head model was that I wasn't convinced the full figure model had a good enough head. I had another look at it and it seems it is more than capable of taking on morph targets and would work well for face rigging. So to cut a long story short, I have repeated all of the steps taken to make the head rig and applied them to the original full figure rig. This process took a week or so, but it was well worth the time, as I learnt the methods whilst making the first one and then applied that knowledge to the second.


The set up is identical: I copied over both the UI and the eye system, then relinked the eyes and made morph targets out of the figure's face. Considering the rig was originally going to have nothing done to its face, I am very pleased I made this decision. Although it meant going over the same process, I feel it is important to repeat these processes so that they stick and I get better at doing them. The links below are the desktop capture that I handed in as part of my submission, as well as another example of the jaw rigging.






Full figure rig Reaction Manager jaw demo - LINK
Full figure rig with facial rigging clip - LINK

Post 42 - Final head setup rig: Evaluation



The head rig is now finished. Since my last post, I went ahead and made additional morph targets for the eyelids opening and closing. These were then wired into the face UI. I decided, after a discussion with Darren, to create some phonemes. Following on from the last post, I have made phonemes for A, W, O, B/M/P, U and V/F. I then redesigned the face UI to incorporate these extra features. In theory, it is possible to relocate the pupil controls into some sliders, but this would mean the original pupil sliders no longer working, plus I feel that using numbers is much more accurate than judging by sight.


New Face UI
I am very happy with the outcome of this face rig. It was a lot more understandable than I first thought, and the use of morph targets made the process linear for much of the time. Using the Reaction Manager to take it a step further was a great way to make the rig a lot more animator friendly. Being introduced to the notion of corrective shapes was very helpful and allowed me to see what was important and at what time it was useful to do it.


However, there were a few issues. Firstly, the Reaction Manager is at times very confusing, as it displays all reactions unless you toggle the show selected button a few times. Also, one major problem with the use of morph targets is that asymmetrical morph targets cannot simply be mirrored over. A very long winded work around needs to be used in order for this to work. There are scripts out there which apparently mirror geometry and vert information, but I have not been able to make any of them work.




But finding the work around allowed me to seek out the solution and made the project all the better for it. In addition, morph targets are great, but currently aren't widely used in games due to the extra data needed. Games usually use a few corrective morph targets, but mostly bones / joints, as these do not require 20-30 morph targets. The morph target system is more suited to CG film and animation.


Another issue I have with morph targets is that unless the targets are very exaggerated and numerous, the motions possible when wiring them into a controller are limited to an extent, as they are already based on morphed models. It isn't direct manipulation, but for a first head / face rig I am very pleased with the outcome, and I am sure this will help when I move onto a different method for rigging a face.






Final head demonstration rig - LINK

Post 41 - Parenting considerations and work around

After working with phonemes (mouth shapes used in speech, which can be difficult to animate) I came across a Reaction Manager / rotation problem. The jaw is already wired to the jaw controller and so cannot be given additional driven keys. However, there is a way around this. I have created several morph targets for some phonemes, but both the 'A' and the 'O' need a slight jaw movement. Setting a state and slave would be the way to do this, but we need to move / influence the jaw bone without interfering with the jaw control reaction. I ended up using the rotation of the point helper which is parented to the top of the jaw (as used earlier to zero out the transformation information). As this is connected to the jaw, it moves the jaw when it is rotated, so this was used for the 'A' phoneme. Next is the 'O' shape. The jaw bone has a 'cap', or 'nub', bone at the end of it which sticks out of the chin. As it was a small amount of influence I needed, this one was connected using its position rather than its rotation.


This work around has paid off and is an industry standard way of working. One of the main rules of rigging is to try to avoid parenting or having direct connections with the geometry, and to use parented point helpers instead. This also holds true for the full figure model in the arms, legs, back, eyes and muscle bones.

You will notice in the video that the jaw has been animated to open twice. The first 'jaw open' is driving the jaw directly, as the point helper does not move, but the second 'jaw open' is the 'O' phoneme and so is being driven by the point helper (seen by it rotating, causing the jaw to rotate also).










Jaw parenting clip - LINK

Post 40 - Eye brow set up

The eyebrows have been a little bit trickier to rig. With the help of the Paul Neale rigging DVD set, I was able to do this and to understand the process. Each motion of the eyebrow has been created out of morph targets (and again, using the mirror morph target work around, these have been mirrored for the other side). So the morph targets for each brow are:


Up
Outer corner, Middle, Inner corner
Down
Outer corner, Middle, Inner corner


This is also done using the Reaction Manager. The main difference with the eyebrows is that, ideally, the controller should be able to move them using both the slider's X and Y directions to get full motion of the corners of the brows. The problem is that whilst each morph target needs the local Y (up/down) axis, the local X (left/right) control would do nothing, as the brow's channel has already been used. The way around this is to add a second Euler layer (bezier float). So when using the Reaction Manager, the Y axis influences the up and down positions, while the X axis influences the outer and inner morph targets.




So, this requires a 'counter reaction'. When the controller is in the bottom left corner, the Y axis has full influence over all of the 'down' morph targets, but the second layer, controlled by the X, has a series of counters: the outer has 0% influence, the middle -50% and the inner -100%. It's a bit confusing until you actually do it, but essentially the result is what appears to be a smooth controller which seamlessly blends all the morph targets together, as they are constantly influencing and counter influencing each other to get the correct kind of look.
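To make the counter-influence idea concrete, here is a standalone Python sketch of one way a 2D brow controller could resolve to morph target weights. The triangular falloff is my own stand-in for the Reaction Manager states, not Max's exact maths, but it shows the key property: neighbouring targets counter each other, so moving along X smoothly shifts weight between the corners.

```python
def brow_weights(x, y):
    """Blend one brow's six morph targets from a 2D joystick position.

    y picks the up (+) vs down (-) targets with |y| strength; x shifts
    influence between outer (x = -1), middle (x = 0) and inner (x = +1)
    corners using overlapping triangular falloffs, so neighbouring
    targets counter each other and the blend stays smooth.
    """
    def falloff(centre):
        return max(0.0, 1.0 - abs(x - centre))

    spread = [falloff(-1.0), falloff(0.0), falloff(1.0)]  # outer, middle, inner
    up, down = max(0.0, y), max(0.0, -y)
    keys = ("outer", "middle", "inner")
    return ({k: up * s for k, s in zip(keys, spread)},
            {k: down * s for k, s in zip(keys, spread)})
```

Pushing the joystick to the bottom left corner, for instance, drives only the 'down outer' target at full strength, while centring X drives the middle target alone.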


I've recorded a video showing the results of this where I try to highlight what is happening in the scene.




Eye brow demo clip - LINK

Post 39 - Reaction Manager

Just a quick post this time to explain the key similarities and differences between Max's Reaction Manager and Maya's driven keys. Both systems are essentially used to do the same thing, which is to drive a transformation of certain objects; namely, to control one object using another. In both systems, you are required to load a master and a slave (driver and driven in Maya). In Max, you set a series of states, which house the translational information of the objects. In Maya, instead of states, it is keys, but apart from the name difference they are the same. I found the Max system slightly confusing to start with, as when you load your master and slave objects, it automatically creates an initial state.


The problem with this is that if you want several things to be controlled at once, it creates 3 or more states. However, it is simply a matter of deleting these and creating a new state which includes all of the slaves. Also, in Max, all of the reactions which have been set up are displayed, so if you are trying to locate a specific rotational value of a bone, for example, it can be a tad confusing. But there is a work around: by toggling the show selected button a few times, only the master object being added to will be shown.


In Maya, there is almost the opposite problem. Setting up a driven key is extremely simple and easy to do. However, once a driven key has been made, it can be difficult to ever find it again. This can lead to having to delete the object which has the driven key and starting over. It can be found by selecting the controlled object and finding the link in the hierarchy view, but a bunch of options need to be toggled before you can find it. Additionally, Max's advantage over Maya is that in Max, key framed objects can be used when recording states. This makes it very clear and quick to set states up. Maya, on the other hand, isn't as key frame friendly, and this can often lead to the key frame itself becoming a driven key. A major issue I had with this was that when I deleted the key frames used to set up driven keys, the driven objects were still being driven, but by nothing, which is not a good thing to happen. I resorted to not putting down key frames and setting the driven information by hand.


Both systems can be used to great effect, but they do have their software specific problems. They are, however, extremely useful and tend to be the main method for wiring bones / joints to controls (essentially it is saying: when object A moves up, object B reacts to it, and the reaction is whatever you tell it to be, such as rotate, scale, move, or a combination).





Tuesday 17 May 2011

Post 38 - Spline face user interface



For the face controls, I used a series of spline based shapes. There was a really good tutorial on how to create a slider on Paul Neale's DVD set, which involved scripting. The script essentially says that as the child 'joystick' is moved inside the space of its parent 'rectangle', it can only travel a certain distance before stopping. To create the illusion that it is stuck in the box, the script called for a lot of if statements and needed the rectangle's unit measurements. The script was cleverly written using length data rather than being name dependent. This allows the controllers to be scaled or altered into sliders or squares and yet the joystick will still remain inside the box. This is good because when controlling a rig, it is crucial that there are limits to what can be animated.
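I haven't reproduced Paul Neale's script here, but the core of a script like that is an if-chain clamping the joystick's local position to limits measured from the rectangle itself. A hedged Python re-sketch of that logic (my own version, not the DVD's MAXScript):

```python
def clamp_to_box(x, y, box_width, box_height):
    """Keep a joystick's local (x, y) inside its parent rectangle.

    The limits come from the rectangle's measured dimensions rather
    than from object names, so the same logic keeps working after the
    box is scaled or reshaped into a slider.
    """
    half_w, half_h = box_width / 2.0, box_height / 2.0
    if x > half_w:
        x = half_w
    if x < -half_w:
        x = -half_w
    if y > half_h:
        y = half_h
    if y < -half_h:
        y = -half_h
    return x, y
```

Shrinking the box to a tall thin rectangle turns the same controller into a one-axis slider, since X is clamped to near zero while Y still travels.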


So far I have rigged up the jaw with the corrective shapes, as well as some motions for the corners of the mouth to go up, down, in and out. I have still yet to do the eyelids and eyebrows.

Post 37 - Mirroring problem and work around


One major problem I came up against is that for symmetrical morph targets, the best thing to do would seem to be to simply mirror the morph target and job done. However, each vertex has a unique ID, so mirrored geometry will not be a true mirror and will only morph what has already been done. I came across a video online which explains the problem and the work around. The solution, involving skin wrapping and key framing morph targets, can be seen below. The stages needed are as follows:

1. Copy the morph target you wish to mirror (i.e. right smile, and call it left smile).

2. Add a morpher modifier to the left smile head and select the 'default' head (usually a copy of the actual rigged head).

3. Now mirror the left smile head.

4. Set a key frame at frame 0 and another at frame 10 (at frame ten, select the morpher channel and turn it up to 100).

5. Turn Auto Key off.

6. Make a copy of the default head and align it to the left smile head.

7. Add a skin wrap modifier to the copied default head and add the left smile head as the influence. 

8. Now move the time line back to zero (the copied default head should now mimic what the left smile head does).

9. After slight tweaking, collapse the copied default head, delete the left smile head and rename the copied default head 'left smile'.




You will now have a head which has the correct vertex IDs and so can be used as a mirrored morph target. It is very surprising that there is no standard button or option for this function. I have used this technique to make all of the jaw and eye morph targets. I have yet to make the eyebrow targets, so these are still to come.
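The nine steps above exist because vertex IDs don't swap sides when geometry is mirrored. A standalone Python sketch of the underlying idea — re-associating each vertex with its mirror twin by position before writing the mirrored shape. This is a simplification of what the skin wrap achieves (naive nearest-point matching on a symmetrical mesh), not the actual Max procedure:

```python
def mirror_morph(base, target):
    """Mirror a morph target across X while keeping vertex IDs valid.

    Naively mirroring the geometry scrambles the result because each
    vertex keeps its own ID. Instead, for every base vertex, find its
    mirror twin on the base mesh by position, then write that twin's
    mirrored target position under the original vertex's ID.
    """
    def closest(p, pts):
        return min(range(len(pts)),
                   key=lambda i: sum((p[k] - pts[i][k]) ** 2 for k in range(3)))

    mirrored = []
    for v in base:
        twin = closest((-v[0], v[1], v[2]), base)  # twin on the other side
        tx, ty, tz = target[twin]
        mirrored.append((-tx, ty, tz))
    return mirrored
```

On a two-vertex toy 'mesh', a target that only moves the right-hand vertex comes back as one that only moves the left-hand vertex, with both vertex IDs untouched.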






Mirroring for morph targets - LINK

Monday 16 May 2011

Post 36 - Blend Morph technique



So, for this rig I have decided to learn how to set up a series of blend morphs. It's fairly straightforward: firstly, apply a morpher modifier to the default head. Then, copy the default head as the base for all of the morphed heads. Then it's a case of altering the copy to create a pose, phoneme or action (eyelids closing, for example). I have also begun to look into corrective shapes. Corrective shapes allow you to fix any issues which occur when, for example, the jaw is opened or rotated sideways. A common issue that riggers and animators have is teeth crashing into the mesh when the jaw is manipulated. Although you have to find a balance between the animator's freedom and the corrective procedures, you can fix pretty much all problems this way. I have made corrective shapes for the jaw opening, closing (when closed too much, to produce a squash effect), and for left and right rotations (grinding teeth, for example).


Once made, it is a case of using the Reaction Manager to manage several transformations (set driven keys in Maya). So, by selecting the jaw's Z rotation channel (master) and setting the open corrective shape as a slave to the jaw, a series of states can be made. I then animate the jaw moving through a series of extremes. Using the Reaction Manager is also eventually how I intend to control the face using a UI.






Morph Target setup clip - LINK

Post 35 - Jaw skinning

The rig is operational, but is not skinned. I should point out that I did not factor in having to learn the skinning side of rigging, but I am very pleased that I decided to jump into it and can now happily skin, so I have now skinned the head to the rig. However, there is a knack to skinning the jaw so that it deforms the skin correctly. Firstly, add the rig bones to the skin using a skin modifier on the head (standard procedure). Play with the envelopes. When they are as good as they can be, bake the envelopes and finish the job by painting influence. HOWEVER, the jaw area needs to allow the jaw bone to rotate and stretch the skin from the top to the bottom. Essentially, you need the majority of the influence on the actual jaw and have it fade out towards the cheek area. Special attention also needs to be paid to the neck. There are a couple of methods for doing this level of tweaking.




One is painting with a very low value. Another is to use the blend paint weights tool, which reads the surrounding area and normalises the verts, but this can be tricky if the brush is too large, as you'll spend ages going back and forth. My favourite method, which gives true control on a single point basis, is to allow vertices to be selectable. Then, by using a slider called absolute influence, you can refine a vertex's influence on the mesh and get instant feedback as to how it looks. This is good for when verts get lost under geometry and you need to fish them out. So I would recommend block painting, then blending verts, and then using the absolute vertex manipulation tool for any renegade geometry.
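The absolute-influence style of tweak can be sketched in plain Python: set the chosen bone's weight on a vertex exactly, then renormalise the remaining bones in their original proportions so the vertex's total weight stays at 1. This is my own illustration of the normalisation idea (bone names are hypothetical), not Max's implementation:

```python
def set_absolute_weight(weights, bone, value):
    """Set one bone's weight on a vertex and renormalise the others.

    The edited bone gets exactly `value`; the remaining bones share the
    leftover in their original proportions, so the weights still sum to 1.
    """
    value = max(0.0, min(1.0, value))
    others = {b: w for b, w in weights.items() if b != bone}
    if not others:
        return {bone: 1.0}
    total = sum(others.values())
    out = {bone: value}
    for b, w in others.items():
        share = w / total if total > 0 else 1.0 / len(others)
        out[b] = share * (1.0 - value)
    return out
```

For instance, pushing the jaw's weight on a chin vertex up to 0.8 automatically scales the head and neck weights down to fill the remaining 0.2.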


Also, animating the jaw to open and close is a great way of getting feedback as to how it looks in motion.






Jaw skinning - LINK

Post 34 - Eyes modelling and rigging



It wasn't in my original proposal, as it hadn't occurred to me that eyes are extremely important in a head rig. Using the DVD rigging series by Paul Neale (which has been a tremendous help so far) I quickly picked up how to model and rig a pair of eyes. The eye (which when finished is simply duplicated) consists of two spheres: the eye and the lens, which engulfs the eye. The pupil and iris of the eye are scaled back so that they are flat (as eyes are in real life). Then a multi/sub-object material was applied, allowing the pupil, iris and body of the eye to be appropriate colours. I then made a spline controller and applied a look at constraint to the eyes so that they follow the controller. I made a separate controller for each eye and parented them to the main one (the main one moves both eyes, the individual ones move the corresponding eye). Then, in order to attach them to the head correctly, a point helper was aligned to each eye and the eye was linked to the helper. The helper was then linked to the head. This way, the linking is not done directly onto the geometry. By avoiding direct linking, alterations can be made without upsetting the connections (i.e. if you needed to change the eyes, you could unlink them from the point helpers, alter them, and then reattach).


In order to control the pupil dilation, an XForm modifier was placed on the vertices which make up the pupil. A bounding box was then provided. Then, by creating a custom attribute on the eye controllers and a 'float expression', the scale of the pupil (the XForm modifier) could be manipulated by a slider attribute.
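Under the hood, the XForm trick is just scaling the pupil's vertices about a pivot. A quick standalone Python illustration (uniform scale about the cluster's centroid; the details are my own, not Max's gizmo maths):

```python
def scale_about_centroid(points, factor):
    """Scale points about their centroid, like an XForm gizmo on a pupil."""
    dims = len(points[0])
    n = float(len(points))
    centroid = tuple(sum(p[i] for p in points) / n for i in range(dims))
    return [tuple(centroid[i] + factor * (p[i] - centroid[i]) for i in range(dims))
            for p in points]
```

A slider value below 1 contracts the pupil towards its centre and a value above 1 dilates it, which is exactly the behaviour the custom attribute drives.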




This was fairly straightforward, but I would prefer to avoid scripting if possible. I do understand it, but feel that it is still a case of 'monkey see, monkey do'. I should point out that I have also added a set of teeth and a tongue (as seen in the clip below). The top set of teeth was linked to the head, and the bottom set of teeth plus the tongue to the jaw.




Pupil Manipulation clip - LINK
Eye Manipulation clip - LINK

Post 33 - Return of the Max: Face rigging

Now that the Maya rig is finished for the time being, I have decided to go back to 3ds Max and look at rigging a face / head. As set out in the Task One proposal, I plan on rigging a full figure rig (done) and a separate head rig. This will be done using mostly what is known as the morph target system. This is also known as the blend shape method, which essentially means modelling a series of poses, expressions etc. and then adding them to the original model using a 'morpher' modifier.




This then allows the various targets to be morphed or blended into the original, giving a natural transition. As this is just a head mesh, the first stage is to quickly add some bones for the neck and head, and a separate jaw bone. As with the first bone based rig, I have constrained the neck and head to spline based manipulators so that they only rotate when the controller is rotated. The jaw has been linked to a helper node which has been aligned to it (the green cube). This will be its pivoting origin, and by parenting the jaw to this helper, it zeros out the transforms. This will help with setting the jaw up, as well as making the manipulation clearer to animators (starting from zero rather than a random number).

Post 32 - Rig tweaks Two: Muscle deformations & driven key techniques

I have recently been looking into a series of techniques for creating muscle deformations. In the Max rig, stretchy bones were used and then partially skinned. This worked rather well and helped to correct some mesh issues. The first method I tried was modelling a bicep and tricep, parent-constraining them to the arm, and then setting a series of driven keys, so that when the forearm was rotated inwards, the bicep would flex and the tricep would stretch. This worked very well and can be seen working in the link below. I later found that there is a tool in Maya specifically used to model muscles.




I am very pleased at this point, as it uses the same method I was experimenting with, so I had worked it out for myself using my intuition. The main difference with the muscle tools is that instead of setting driven keys, this has been replaced with a simple squash, stretch and relaxed button selection. This technique is very interesting, but I decided that this could be another project in itself, so I have opted to just manipulate the biceps using set driven keys in a different way. I have parented a helper node to the arm, then set a series of driven keys so that when the forearm is rotated inwards, the helper moves away from the arm. Then, by adding the helper to the skin as an 'influence object', it essentially pulls the bicep geometry and simulates a muscle flex. This technique is quick to set up and very logical.
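A set driven key is essentially a lookup curve from a driver attribute to a driven one; a Python sketch with linear interpolation (Maya also offers spline tangents, omitted here) and made-up key values:

```python
def driven_key(driver, keys):
    """Set-driven-key evaluation: 'keys' maps driver values to driven
    values; between keys the result is linearly interpolated."""
    pts = sorted(keys.items())
    if driver <= pts[0][0]:
        return pts[0][1]
    if driver >= pts[-1][0]:
        return pts[-1][1]
    for (d0, v0), (d1, v1) in zip(pts, pts[1:]):
        if d0 <= driver <= d1:
            t = (driver - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)

# Forearm rotation (degrees) drives the helper's offset from the arm.
# The key values are illustrative, not taken from the rig.
flex = {0: 0.0, 120: 1.5}
print(driven_key(0, flex))    # 0.0  -> arm straight, bicep relaxed
print(driven_key(60, flex))   # 0.75 -> halfway through the flex
print(driven_key(120, flex))  # 1.5  -> fully bent, helper pulled out
```

Because the helper is an influence object in the skin, that driven offset drags the bicep vertices with it, which is what reads as the flex.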






Initial NURBS modelled muscle with driven transform keys experiment clip - LINK
Maya Muscle tool video LINK
Chosen locator muscle deformation strategy clip - LINK

Post 31 - Rig tweaks One: Belly Jiggle

For the 3ds Max rig, I rigged up a controller to manipulate the stomach. In Maya, I used the jiggle deformer: a tool specifically designed to create a delayed recoil for geometry. This can also be used for any part of the anatomy in order to create more realistic motion without key framing. First, you select the vertices that you wish to deform, then apply the jiggle deformer. By typing the deformer's name in the command area, you can then adjust settings such as the weights and amounts. This rig is a slim male and so does not require a lot of influence, but the weighting would be increased should the character be larger or heavier, or in order to enhance motion (e.g. in cartoon-like characters).
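Under the hood a jiggle deformer behaves like a damped spring chasing the skinned position; a one-dimensional Python sketch (the parameter values are illustrative, not the deformer's actual defaults):

```python
def jiggle(goal_positions, stiffness=0.2, damping=0.75, weight=1.0):
    """1-D sketch of a jiggle deformer: each frame the point is pulled
    toward its goal (the skinned position) by a spring, while damping
    bleeds off velocity -- producing the delayed recoil."""
    pos, vel, out = goal_positions[0], 0.0, []
    for goal in goal_positions:
        vel += stiffness * (goal - pos)   # spring pull toward the goal
        vel *= damping                    # damping tames the oscillation
        pos += vel
        out.append(goal + weight * (pos - goal))  # weight blends the effect
    return out

# The belly's goal jumps from 0 to 1 and stays; the jiggled point lags
# behind, then overshoots before recoiling back.
trail = jiggle([0.0] * 2 + [1.0] * 8)
print([round(p, 3) for p in trail])
```

Raising the weight (or lowering the damping) for a heavier character gives a larger, longer-lasting wobble, which matches how the deformer's settings are tuned.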





Belly Jiggle set up clip - LINK

Wednesday 13 April 2011

Post 30 - Skinning

I have begun the skinning process. This is a very similar, if not identical, process to that of 3ds Max. However, in Maya I had to select every bone individually. This was because not all bones required skinning (e.g. the hip bone, as there is a hip override on top of it). There is a way of using envelopes, which is the 'Interactive Skin Bind' tool. However, I used the 'Smooth Bind' tool, as it is the most mainstream and commonly used. The skin looked rather good from the start and seemed to work well. I then proceeded to use the 'Paint Skin Weights' tool. The video link below demonstrates this tool and how to use it when faced with bad deformation.


The paint weights tool is almost the same as the one in 3ds Max, but with a couple of improvements. The scale isn't shown by colour but rather from black to white (white: high influence; black: little to no influence). 3ds Max has a blend weights tool, which did work but was a tad slow. Maya's equivalent is the flood feature: when the paint brush is set to smooth, clicking the flood button smooths all of the influence out over the selected joint. Very quick and useful. What took 3-5 days in Max took about a day in Maya.
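The flood-smooth behaviour can be pictured as averaging each vertex's weight with its neighbours; a Python sketch over a 1-D strip of vertices (`flood_smooth` is a hypothetical stand-in for the tool, not Maya code):

```python
def flood_smooth(weights, iterations=1):
    """Sketch of a flood-smooth: replace each vertex weight with the
    average of itself and its neighbours (a 1-D strip of vertices
    here), evening out harsh influence boundaries."""
    w = list(weights)
    for _ in range(iterations):
        smoothed = []
        for i in range(len(w)):
            seg = w[max(0, i - 1): i + 2]   # self plus neighbours
            smoothed.append(sum(seg) / len(seg))
        w = smoothed
    return w

# A hard 1-to-0 influence edge, e.g. at an elbow...
hard = [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
print([round(x, 2) for x in flood_smooth(hard)])
# -> [1.0, 1.0, 0.67, 0.33, 0.0, 0.0]
```

One click over a whole joint beats painting the falloff by hand, which is why the flood feature saves so much time compared with Max's blend weights tool.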

I am happy with the result. There are other things you can do to tweak it, such as deformers and driven keys if there are problems with the mesh but the smooth bind skin tool is so effective there is no need to do so at this stage.




'Paint Weights' tool clip - LINK

Post 29 - Maya Joint system: Summary & Evaluation

Here is the completed Maya rig using the joint system. As with the 3ds Max 'bones' rig, I also intend to use Maya's skinning tools to bind the rig to the model, as I believe that a rig's true capabilities can only be measured when taken to this stage. I found the joint system of creating a rig extremely flexible and intuitive. The ability to alter the pivot orientation, position and other attributes after a joint has been created only adds to its appeal, and allows mistakes and issues to be addressed rather than ignored or left to limit the animator. The rig is in fully working order, from the toe and finger manipulators to the hips, IK solvers and pole vectors. There is a huge range of real-world and industry applications for this method of rigging, as there is so much flexibility; it is very evident when using these systems that a quick turnover and production workflow have been taken into consideration. This has resulted in an array of tools for adding attributes, editing them and connecting them. This was a main disadvantage of 3ds Max: as it stands, creating custom attributes there either means that they cannot be edited, or requires knowledge of MAXScript (both of which take time). These rigging techniques can be applied to both film and games.

In addition, many driven key and connection tools can be used not just on joints but on any other vectors and manipulators that need this type of relational transformation.
 
Maya Final Rig

Customisation is key to this tool set, and with it the learning curve is very gentle. Joints can even be used to rig animals, such as a bird's wings or a fish's bone system. The amount of time taken to rig in Maya was significantly less than in 3ds Max. However, a couple of negative areas I came up against are as follows:


Orientation confusion
Manipulators need to have zeroed-out attributes. This is done by freezing the transformations. However, when this is done the orientation is reset (a problem I had with the wrist manipulator). Not all manipulators are perfectly positioned in 3D space; it depends on the mesh. As I researched this problem, I found many other people were also stumped by this issue. I did find a work-around (involving grouping, parenting and un-grouping), but it was rather tedious and seemed to me that it could have been foreseen by the developers. I can see that these sorts of issues could potentially stop a project in its tracks and create a hazardous and erratic workflow.
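The grouping work-around relies on the parent group absorbing the manipulator's transform, so the manipulator's own channels read zero while it stays put in the scene. A translation-only Python sketch of the idea (rotation works the same way with matrices):

```python
def zero_out(world_translation):
    """The group work-around: parent the manipulator under a new group
    and give the GROUP the manipulator's world transform; the
    manipulator's own channels then read (0, 0, 0) while it stays put."""
    group_offset = tuple(world_translation)   # group absorbs the transform
    local = (0.0, 0.0, 0.0)                   # manipulator channels now zero
    # world position = group offset + local channel values
    world = tuple(g + l for g, l in zip(group_offset, local))
    return local, world

local, world = zero_out((3.0, 14.2, -0.5))
print(local)   # (0.0, 0.0, 0.0)  -- clean channels for the animator
print(world)   # (3.0, 14.2, -0.5) -- unchanged in the scene
```

Unlike Freeze Transformations, nothing on the manipulator itself is reset, which is why the orientation survives.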

Grouping and multiple views
Another area of confusion for someone starting out in Maya was the grouping and the multiple views of the scene. There is an array of schematic views, and each serves a different purpose: the Outliner, the Hierarchy Hypergraph, the Connection Hypergraph, the Connection Editor and so on. It is true that they are useful for picking items which may be lost in a mesh or hidden from the main viewport, but there was a period for me when they all just became a bit confusing. One or two panels, with tabs for switching views, would possibly be better and less cluttered. Grouping and un-grouping items in a scene was also touch and go for a bit. It makes perfect sense to me now, but in the middle of the project you had to be very careful when dealing with joints and IK solvers. Un-group the wrong thing and the whole of the arm's hierarchy (for example) would detach from the main hierarchy and become its own entity in the Hypergraph. However, as it is a flexible system, there are ways of re-introducing it into the chain. Maya may benefit from warnings when it detects major changes resulting from the operation the user is about to perform.

Nonetheless, I am very happy with this Maya rig; joints are definitely an improvement on 3ds Max's bones, as they provide flexibility after they have been created. Also, the custom attribute tools speed up the process and do not require scripting. As this was the first time I had used Maya, I found the process to be a good start.

The next stage is to skin it. This process will also be evaluated against 3ds Max and in relation to industry usage.
 

Post 28 - Hips, jaw and master controls

The rig is now technically fully established; the last processes to take care of are to create the final manipulators and to colour-coordinate the rig (the same as in 3ds Max: green for the right and red for the left).

Hip & Hip Override Manipulators
Very quick and simple, first up are the hip manipulators. These (also like 3ds Max) are circular 'hoops'. Their pivot points are snapped and aligned to the hip joint, and then the hip joint is parented to the hip manipulator. This one controls the upper body and, when lowered, affects the bend of the legs. A smaller circle is next; the hip override joint is parented to this one. This will control the rotation of just the hips and so will not affect the upper body.


Jaw Manipulator
Next up is a control for the jaw. For facial rigging, which will be covered in a few weeks' time, blend shapes tend to be favoured, but for the purposes of rigging I shall rig a simple jaw as well. A custom controller is made, then its pivot point is aligned with the jaw's pivot point. The jaw is then given a parent constraint, which means that the jaw will 'look at' wherever the jaw manipulator is (the look-at constraint in 3ds Max). All but the jaw manipulator's translation attributes are then locked and hidden to avoid unwanted transforms.





Master Manipulator
The final control is the master control. This is a NURBS curve, which has been used to create an N, S, E, W arrow controller. This is placed at the centre, under the rig. Then, all of the rig items are grouped to it. This means that the whole rig and its controllers will move with the master control.




 

Here is a breakdown of the rig so far.

Progress status...

Coords X, Y, Z: 0, 0, 0

"1" (
(Toes_Left&Right:CustAttributes_SetDrivenKeys)
(Foot_Left&Right:IKSCSolver_CustAttributes+MinMax_ConnectionEditor)
(Leg_Left&Right:IKRPSolver_CustAttributes_ConnectionEditor_IKFKBlend)
(Hip)
(Hip Override)
(Arm_Left&Right:IKRPSolver_CustAttributes_ConnectionEditor_IKFKBlend)
(Hand_Left&Right)
(Fingers_Left&Right:CustAttributes_SetDrivenKeys)
(Spine:IKSplineTool_VertToCluster_CustControl_IKFKBlend)
(Neck&Head:IKSplineTool_VertToCluster_CustControl_IKFKBlend)
(ParentManipToJoint/s)
(Group)
)

"2" (
(Manipulators:_Toes_Foot_Knee_Hip_HipOveride_Clavicle_Elbow_ArmWrist_Fingers_Spine_Neck_Jaw_Master)
)

In addition to this breakdown, many other common procedures and tools were used, such as: Freeze Transformations, Channel Controls (Lock and Hide), and mirroring items and joints (many IK systems cannot simply be mirrored over, so the process had to be repeated from left to right).

Colour Coordinated with All controls

Here is a clip of the controls being used.

Main control usage clip - LINK