
AN / Pipeline MDD Cache and Offset Animations

PYTHON AND ANIMATION PIPELINE


You can skip all my rambling and jump directly to the actual topic at (*) below.

A few days ago, I was watching a two-hour video on "Python and the CG Pipeline". There is one part about ANIMATION that I am interested in: a mention of MDD Cache for animation transfer between software packages.

You can watch the video here (the original is in Russian by a Russian TD, later translated by another Technical Director):
https://vimeo.com/88080700

Basically, this Technical Director (TD) gives an example scenario where Python is useful for passing DATA between software packages, in this case from Maya to Houdini (it can go in and out of Blender too, of course).

The FBX format is apparently (still) not perfect for exporting and importing. Importing an FBX from Maya into Houdini does not guarantee that the data is read correctly and completely. This is a common situation in any production; format conversion is often not perfect out of the box. Using Python, the TD wrote a "translator" that exports MESH data in the BGEO format (Houdini native, more robust than basic OBJ) and ANIMATION data as an MDD Cache.

I remember that Blender has a pretty good MDD toolset built in; the MDD reader is part of Blender's modifier stack (the Mesh Cache modifier). Cool, let's talk about how this relates to the AN business later.

ROUGH PIPELINE SUMMARY

I will try to give a brief summary of the "CG Pipeline":
In CG production, there will always be a stage where everything is "baked", prepared for (Final) Rendering. Assets and rigs are baked, camera animation is baked, shaders are baked, character animation is baked, VFX is baked, and so on.

BAKED DATA can be "final", but not always; artists can still go back and make edits in case there is an issue.

Having this BAKING STEP allows for CHECKPOINTS and VERSIONING before re-assembly, which gives flexibility in a lot of situations.

BAKING can sometimes be expensive, especially for VFX simulations.

Ideally, the whole production happens inside "just one" tool that handles every stage smoothly. Blender, for example, is a single tool with a complete and robust toolset, including compositing.

In typical film or game productions, many artists may use different tools:
- MODELING: ZBrush, Mudbox, Sculptris, Blender
- RIGGING: Maya, Houdini, Softimage
- ANIMATION: Maya, Houdini, Softimage
- VFX: Houdini, Maya, Softimage
- RENDERING: Arnold, Renderman, VRay, Octane
- COMPOSITING: Nuke, After Effects
- VIDEO EDITING: Premiere

Blender, I believe, WILL eventually fit into every one of these stages to help artists, if it does not already.

What is really important is usually the way we "BAKE" animation or "TRANSFER" data between tools along the pipeline, to ensure a smooth process.

Remember this:
For any "procedural" workflow, at some point we need to bake.

WATCH OUT:
Although baking animation is nice, it is not always possible. Sometimes the Anim Curves need to be retained and re-applied to the RIG ASSET.

I believe the current convention in the CG industry is to use Alembic for baking mesh data. Sometimes we actually need to bake the Animation Curves into generic data that can be passed between software packages. All of this needs to be taken into consideration.

In the past, I have had experience using Maya and Houdini as ASSEMBLY tools. Blender actually has the potential to be an assembly tool as well, because it is open source and always available on any system.

But well, it all comes back to the studio's production pipeline.

MDD CACHE FOR BAKING ANIMATION

MDD Cache, although old and seemingly unremarkable, has a lot of potential and advantages. It is really simple to use too.


ALEMBIC is probably the next thing to look at, once Blender implements it (!)

Anyway, MDD Cache allows DEFORMING MESH ANIMATION to be baked fully. By doing this, we are no longer dealing with hundreds of animation curves (f-curves), which is good!
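As I understand it, the MDD file format itself is very simple: a big-endian header with frame and vertex counts, one time stamp per frame, and then raw vertex positions for every frame. A minimal Python sketch that reads just the header of a hypothetical cache file:

    import struct

    # The MDD layout (as I understand it), all big-endian:
    # int32 numFrames, int32 numPoints, float32 times[numFrames],
    # then float32 x, y, z per point, per frame.
    with open("/tmp/example_cache.mdd", "rb") as f:  # hypothetical path
        num_frames, num_points = struct.unpack(">2i", f.read(8))
        times = struct.unpack(">%df" % num_frames, f.read(4 * num_frames))

    print("frames:", num_frames, "points:", num_points)
    print("first frame time:", times[0])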

NOTE:
One note: although mesh caching is great for film animation, it may not suit game animation that requires joints. However, there is always a way to "reverse engineer" MESH ANIMATION back to JOINT ANIMATION.

ANIMATION NODES TO CONTROL MDD ANIMATION (*)

Ok, FINALLY ....
we get to the actual topic: using the Animation Nodes toolset to control MDD animation.

Above, I tried to give some background on typical MDD usage for baking animation in a pipeline. Beyond that, MDD can be used to create complex animation. For example, maybe we want to:
- Generate multiple VARIATIONS of animations
- Create OFFSET ANIMATIONS for same rig
- Have easier control over thousands of rigs (simple crowd system)

What I really want to cover here is one way we can think procedurally using nodes to offset animations :-)

Simple right? Well, apparently not that simple.

We already have the "F-Curve Animation Offset" AN template, which can help offset animation curves, but I believe it works only on a single object's transformation, and its function is simply to offset that animation onto INSTANCES (duplicates) of the same object.

What if we have a rig with many controls? Do we need to deal with a bunch of controls and f-curves and then re-apply them back onto the rig? Although this is probably possible, I think it is really too complicated for now.

That is why I am looking at MDD Cache for now.

Caching/baking animation removes a lot of complexity. By baking, we don't have to deal with hundreds if not thousands of animation controls, curves, and attributes.


BLENDER ACTION BLOCKS AND NLA (Non Linear Animation)
Blender has a built-in function to turn animation curves into blocks of animation (Actions), which we can then arrange and edit much like in a non-linear video editor.

It might be handy for certain scenarios, but I honestly have very little experience with Blender's NLA, animation block layering, etc. I think it can do what we are trying to do (offsetting animation) using blocks of Animation Actions, but we will take it a little further and tackle the problem procedurally using Animation Nodes.
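For reference only (this is not the approach we take below), here is a rough Python sketch of offsetting an Action via the NLA; the object and action names are hypothetical, and the object must already have animation data:

    import bpy

    obj = bpy.data.objects["MyRiggedObject"]    # hypothetical object name
    action = bpy.data.actions["MyWalkAction"]   # hypothetical action name

    # Push the action onto a new NLA track as a strip that starts at frame 20,
    # so it plays offset relative to another copy starting at frame 1.
    track = obj.animation_data.nla_tracks.new()
    strip = track.strips.new(name="OffsetWalk", start=20, action=action)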

I am definitely more familiar with MANUAL ANIMATION and PROCEDURAL ANIMATION, and SEMI-PROCEDURAL ANIMATION is really what I want to achieve using Blender and Animation Nodes. The concept needs to be simple enough for animators to learn at a technical level.

LET'S BEGIN...

We will use a simple rig to explain this concept. By understanding this example, you will also learn more about AN's LOOP and data GENERATOR features.

1 Rig and Multiple Controls
You can use your own rig or download one from Blend Swap. Basically, you need a deforming mesh whose animation you can export as MDD.





2 Export out MDD
Activate the MDD exporter add-on from the User Preferences. Maybe it should be ON by default, since Blender's MDD reader is already built into the Mesh Cache modifier. Anyway, "Save User Settings" if you like.

Then select the deforming mesh and export the animation as MDD.
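If you prefer to do this from a script, here is a minimal sketch. I am assuming the bundled "NewTek MDD format" add-on (io_shape_mdd), the 2.7x selection API, and a hypothetical object name and path, so double-check the operator options in your Blender version:

    import bpy
    import addon_utils

    # Enable the bundled MDD exporter add-on.
    addon_utils.enable("io_shape_mdd")

    # Select the deforming mesh (hypothetical name) and make it active (2.7x API).
    obj = bpy.data.objects["simple_single_surface_rig"]
    bpy.context.scene.objects.active = obj
    obj.select = True

    # Export the deformation of the selected mesh as an MDD cache.
    bpy.ops.export_shape.mdd(
        filepath="//MDD_TEST/simple_single_surface_rig_001.mdd",
        fps=24,
        frame_start=1,
        frame_end=100,
    )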




3 Use the MDD Cache
Once you have the MDD, all you need is a duplicate of the "original mesh" (or an imported copy) in the 3D scene. Add a Mesh Cache modifier, point it to the MDD cache on disk, and the animation is applied back onto the mesh.

The mesh needs to be identical to the mesh used for rigging and animation in terms of vertex order. We will be using AN to "drive" the "Evaluation Frame", so I set the Play Mode to "Custom".
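The same modifier setup can also be sketched in Python; the object name here is hypothetical, but the property names belong to the Mesh Cache modifier:

    import bpy

    # A duplicate mesh that matches the original vertex order (hypothetical name).
    obj = bpy.data.objects["simple_single_surface_rig_copy"]

    mod = obj.modifiers.new(name="Mesh Cache", type='MESH_CACHE')
    mod.cache_format = 'MDD'
    mod.filepath = "//../MDD_TEST/simple_single_surface_rig_001.mdd"
    mod.play_mode = 'CUSTOM'   # exposes the "Evaluation Frame" value
    mod.time_mode = 'FRAME'    # evaluate the cache by frame number
    mod.eval_frame = 1.0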



WEIRD BUG:
Sometimes after you save the Blend file and re-open it, the relative path to the MDD might change, for example:

Correct:
//../MDD_TEST/simple_single_surface_rig_001.mdd

Wrong (this is what appeared after re-opening, even though it was working before):
//MDD_TEST/simple_single_surface_rig_001.mdd

Make sure you keep the //../ ("dot dot") prefix in the path.
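A quick way to sanity-check what the relative path actually resolves to (assuming the modifier is named "Mesh Cache" on the active object):

    import bpy
    import os

    mod = bpy.context.object.modifiers["Mesh Cache"]

    # bpy.path.abspath() expands the leading // to the blend file's directory.
    abs_path = bpy.path.abspath(mod.filepath)
    print(mod.filepath, "->", abs_path)
    print("exists on disk:", os.path.exists(abs_path))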

4 Offsetting the MDD Cache using an AN Node
This part is pretty straightforward. You just need the "Object Attribute Output" node, which can directly drive any attribute of an object.


The DATA PATH here is: modifiers["Mesh Cache"].eval_frame.
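To make the idea concrete, what the node effectively does per object is something like this in plain Python (the object name and delay value are just hypothetical examples):

    import bpy

    obj = bpy.data.objects["simple_single_surface_rig_copy"]  # hypothetical name
    delay = 12.0  # hypothetical per-object offset, in frames

    # Evaluate the cache a few frames "behind" the current scene frame.
    current = bpy.context.scene.frame_current
    obj.modifiers["Mesh Cache"].eval_frame = current - delay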

At this stage, note that we can actually have MULTIPLE caches for different kinds of object ANIMATION. Later on, when things get serious, you will see how we can orchestrate many objects using an AN LOOP node.


5 Using Loop Node
Now, this is the FUN part.

If we were to manually connect nodes for multiple objects with MDD Caches and create offsets, we might end up with something like below:


The above is not wrong; in many software packages (such as Maya), this is actually what happens under the hood. But it is messy: however easy it is to connect the nodes via script, the user still ends up with a tangled web of noodles and nodes. What if we have hundreds of objects? It would be a pain to connect.

You could write Python that randomizes each "Object Attribute Output" node's attribute (the "Delay" attribute, in this case), but that is still not efficient.

AN actually has a nice looping system that is pretty robust, yet pretty simple to understand.

How do we reduce the connections above into a nice system? First, I look at which ATTRIBUTES will have MULTIPLE values applied FOR EACH object. In that way, the AN Loop is conceptually similar to Houdini's "For Each" node.

Above we can see:
- DELAY: will have a different value per object. How many? As many as the user wants.
- OBJECT: will have a different input per iteration, whether the user supplies objects via a "Group" node or some other object list.

So, I first experimented with generating the random values; this will be a LIST of random floating-point values that we can plug into the DELAY attribute.


A LOOP in AN will do either of the following:
- "Do something" for each element of an input LIST
- Generate a LIST over a certain number of iterations

6 Loop that generates LIST of OFFSET values

Above, the part of the network in light maroon is responsible for generating the LIST of values for our OFFSET DELAY attribute.
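Conceptually, in plain Python this generator loop does something like the following (the count and range are hypothetical):

    import random

    object_count = 50    # hypothetical: how many cached copies we have
    max_delay = 24.0     # hypothetical: maximum offset, in frames

    # Build a LIST of random offset values, one per object.
    delays = [random.uniform(0.0, max_delay) for _ in range(object_count)]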

7 Loop that takes the LIST of OFFSETS and applies it to MANY OBJECTS with MDD


I then plug the OUTPUT into another LOOP. The setup magically takes the generated offset values and iterates over however many objects we supply via the "Group".
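As a rough Python equivalent of what this second loop does (the group name is hypothetical, and I am using the 2.7x group API):

    import bpy

    frame = bpy.context.scene.frame_current
    objects = bpy.data.groups["MDD_Copies"].objects   # hypothetical group of cached meshes

    # Pair each object with its generated delay and drive its Mesh Cache frame.
    for obj, delay in zip(objects, delays):           # 'delays' from the sketch above
        obj.modifiers["Mesh Cache"].eval_frame = frame - delay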

Don't you think it's such a revelation?


At this stage, you should already see things happening. You can duplicate the mesh (with its MDD Mesh Cache modifier) in Blender, and each of these objects will be controlled by our setup, each offset slightly in a random fashion.

This is a rather complex setup, although still a "simple" example of basic procedural magic. But creating this any other way would probably not be as elegant.

HEY, CAN'T ONE JUST USE PYTHON?

SURELY, you can always write a Python script that generates random values and applies them ONCE OFF to randomize the offsets, then creates driver expressions that drive each MDD object's "Evaluation Frame" to get the same effect.
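As a hedged sketch of that one-off, driver-based approach (the group name and offset range are hypothetical; double-check the driver API against your Blender version):

    import random
    import bpy

    scene = bpy.context.scene

    for obj in bpy.data.groups["MDD_Copies"].objects:   # hypothetical group name
        # Add a driver on the Mesh Cache modifier's Evaluation Frame.
        fcurve = obj.modifiers["Mesh Cache"].driver_add("eval_frame")
        driver = fcurve.driver
        driver.type = 'SCRIPTED'

        # Driver variable that reads the current scene frame.
        var = driver.variables.new()
        var.name = "frm"
        var.type = 'SINGLE_PROP'
        var.targets[0].id_type = 'SCENE'
        var.targets[0].id = scene
        var.targets[0].data_path = "frame_current"

        # Bake a random per-object delay directly into the expression.
        driver.expression = "frm - %.2f" % random.uniform(0.0, 24.0)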

However, with the node setup, once we set everything up nicely, we have full control over the whole thing. Moreover, you can expand from there and make all kinds of tweaks and variations.

I WANT MORE...
The setup can be more elegant and reusable if we create an AN GROUP CONTAINER for our nodes. AN provides this kind of organization to reduce the clutter of node noodles.

You might want to check the AN Subprogram "Group". It is a way to wrap a node network into a CUSTOM NODE container that takes INPUT(S) and spits out OUTPUT(S).

If you check some of the AN Templates, you will see this "Group" custom node used quite often. At first it might feel like an unnecessary extra step, but when a network grows beyond 100 nodes, or when we reuse a certain node setup often, it becomes very helpful.

Maybe I will cover that in the next article.

Good luck!









