p14 Curve Flow Dynamics


Creating the dynamics for the swarm of p14 leaving the nucleus. I used techniques similar to those I used to create the fish for “Going Live”, randomising the creation scale and rotation, but with the addition of random “tumbling” via a runtime expression. This was done with the help of Gnomon Dynamics.

Here’s the MEL script:

Creation:
float $mol = rand(0.9,1.1);
Flow_particleShape.custom_Scale = <<$mol, $mol, $mol>>; // uniform random scale
Flow_particleShape.random_Number = rand(-5,5); // per-particle tumble speed
Flow_particleShape.custom_Rotation = <<rand(360), rand(360), rand(360)>>; // random initial rotation

Runtime:
Flow_particleShape.custom_Rotation += Flow_particleShape.random_Number;
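
(A note for anyone copying this: custom_Scale, custom_Rotation and random_Number aren’t built-in attributes. They need adding to the particle shape as per-particle attributes first, then mapping to the Scale and Rotation slots in the instancer options. Roughly, as a sketch:)

// each per-particle attribute needs a working and an initial-state ("0") version
addAttr -ln "custom_Scale" -dt vectorArray Flow_particleShape;
addAttr -ln "custom_Scale0" -dt vectorArray Flow_particleShape;
addAttr -ln "custom_Rotation" -dt vectorArray Flow_particleShape;
addAttr -ln "custom_Rotation0" -dt vectorArray Flow_particleShape;
addAttr -ln "random_Number" -dt doubleArray Flow_particleShape;
addAttr -ln "random_Number0" -dt doubleArray Flow_particleShape;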

Exterior Comping, Gnomon Underwater Scenes and Ed Whetstone

I have been putting together the lighting for the exterior scenes, with a little help from Gnomon Underwater Environments.


Including:

  • Setting up “fake caustics” by attaching an image sequence to a directional light (sketched in MEL after this list).
  • Creating ambient light by setting up a “dome” of directional lights with no shadowing, which gives a kind of hazy, desaturated effect.
  • [tutorial missing from DVD… no idea why] Setting up animated spotlight fog, driven by the same image sequence I used for the caustics. I figured it out based on the provided final Quicktime. The most difficult part was working out how to attach a custom curve to control the falloff of the fog.
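
The fake-caustics hookup, for the curious, is roughly the following – a sketch, with a hypothetical file name for the image sequence:

// map an animated file texture to a directional light's colour
string $file = `shadingNode -asTexture file`;
string $p2d = `shadingNode -asUtility place2dTexture`;
connectAttr -f ($p2d + ".outUV") ($file + ".uvCoord");
connectAttr -f ($p2d + ".outUvFilterSize") ($file + ".uvFilterSize");
setAttr -type "string" ($file + ".fileTextureName") "caustics.0001.tif"; // hypothetical sequence
setAttr ($file + ".useFrameExtension") 1; // step through the sequence as the frame changes
string $light = `directionalLight`;
connectAttr -f ($file + ".outColor") ($light + ".color");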

I have also been watching Guerrilla Compositing Tactics in Maya and Nuke with Ed Whetstone, as well as his other tutorial, Creating Light Rigs in Maya, a truly fantastic initiation into 3-point CG lighting. From Guerrilla Compositing Tactics I learned how the depth pass can be used for all kinds of things. I already knew it could drive depth-based fog and haze, but the true revelation was the idea of targeted colour correction based on the depth pass.


In my own render, I have decreased the colour temperature based on distance from the camera.

In the end I found that simply emulating the underwater environment, while convincing in its realism, was a bit too hazy and would impair the clarity of the storytelling. Therefore I used a combination of underwater and 3-point lighting techniques. The only thing missing is particles, which I will add in Nuke now that the Bokeh effect has been improved.

Speaking of which, I recently bought NukeX 7, which has a fantastic upgrade to the ZBlur node: ZDefocus. Instead of adjusting the image plane, it uses a “focal point” that I can drag, and it automatically recalculates. It also vastly improves the quality of the blur: with ZBlur, the only filter options were “Gaussian” or “Disk” – now, with ZDefocus, I can create bokeh effects that are pentagonal or hexagonal or even heart-shaped.

In other news, Angus has approved the animation so far, so I shall be working on those renders today.

Fantastic.

Full steam ahead!

Software Issues and Dead Ends

The last week or so has been a detailed study into Sod’s Law, or rather The Tendency For Maya To Suddenly Break When You Have A Deadline Approaching.

First, there was the Mental Ray Proxies issue.  I mentioned in a previous post that the proxies did not seem to like being moved around, and had a tendency to disappear from the scene – however, using Instanced proxies on particles seemed to work fine.

All well and good, except that I later found out that the Goals method I spoke of earlier ONLY seems to work when the geometry is in this configuration, i.e. straight up and down. Otherwise, the orientation of the nucleosomes was completely wrong. Not very useful. So I watched some of Gnomon Workshop’s Grass and Plant Instancing in Maya/Mental Ray with Alex Alvarez, and tried out two possible solutions.

The first involved emitting from a black-and-white texture, telling Maya to “emit from dark” only, using a surface emitter. Unfortunately the random nature of the Surface Emitter still came through, with some vertices emitting more than one piece of geometry and others emitting none at all.
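
For reference, the texture-driven emission boils down to something like this – a sketch, with hypothetical plane and map names:

// drive a surface emitter's rate from a black-and-white map
emitter -type surface -rate 100 -name "texEmitter" pPlane1;
string $map = `shadingNode -asTexture file`;
setAttr -type "string" ($map + ".fileTextureName") "emitMap.tif"; // hypothetical map
connectAttr -f ($map + ".outColor") "texEmitter.textureRate";
setAttr "texEmitter.enableTextureRate" 1;
// invert the map (or paint it inverted) so the dark areas are the ones that emit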

The second involved MEL scripting and a built-in Maya plug-in called nearestPointOnMesh, to align the Y-axis of the particle with the surface normal of the emitter. And if anyone’s interested, here’s the script:

// requires the bundled plug-in: loadPlugin "nearestPointOnMesh";

// sets the up axis for the instance
particleShape1.aimUp = <<0,1,0>>;

// get the surface normal at the particle's position
vector $p = particleShape1.position;
vector $normal = `nearestPointOnMesh -ip ($p.x) ($p.y) ($p.z) -q -normal pPlane1`;

// orient the instance along the normal
particleShape1.aimDir = $normal;

At this point I really thought that I might have cracked it. Sadly, it turned out that while the instances were oriented correctly in Y, there was nothing binding the orientation of the X-axis to the edgeflow, and once I started to bend the plane in the X and Z axes instead of just the Y, the orientation again got messed up.

I know that using MEL there will be a way to orient the X-axis according to edgeflow – Alex Alvarez could probably do it in a heartbeat, but I would probably need a whole extra year to figure it out, and I have a film to make.

So I went back to my original method of simply duplicating the geometry, with the aim of finding out what it was that was making the Proxies disappear.

As with so many of these problems, it would seem that a single button was the cause of my woes.

Mental Ray Proxies do not like to have their Transformations Frozen.

When I froze the transforms, it reset the translation of the proxied geometry to the origin. Bit inconvenient.
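
(For the record, the button in question is Modify > Freeze Transformations, which is more or less the following – so this is the operation to avoid running on placed proxy stand-ins:)

// Modify > Freeze Transformations, roughly:
makeIdentity -apply true -t 1 -r 1 -s 1 -n 0;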

Hopefully now I’ve figured that one out, I can finally get on with my film.

Mental Ray Proxies, Instancing and Goals

I think I have reached something of a breakthrough on the problems I was having with the massively high poly count of my scenes.

I have been learning about Mental Ray Render Proxies from the Gnomon DVD Creating Trees in Maya / Onyx: Forest Techniques, Part Four with Alex Alvarez. Proxies allow you to export your high-poly mesh as a Mental Ray Render Proxy (Assembly) file and substitute it with a low-poly stand-in mesh, or proxy. Not only does this speed things up in the viewport, it also massively reduces the render time of scenes, as Mental Ray only reads one file and then copies it, as opposed to reading each piece of geometry separately. So now I can render this 52-million-poly scene in 13 minutes.

Chromatin (section)

And this is what that scene looks like in the viewport, with the proxy geometry:

All very well, except that I began to run into problems when I started to move the proxies around too much. The chromatin isn’t straight in the scene; it needs to be bent into loops and curves. Without moving each proxy individually (a very long-winded task), I would need to deform them in some way. The one caveat with proxies, as Alvarez explains, is that they cannot be animated, and as I found out, this appears to include applying any kind of deformer to them (including Soft Select set to “Object” mode…), as the proxied files seemed to disappear from the render every time I tried to rearrange them using a deformer.

I noticed that Alvarez was using a hill-shaped geometry plane as a surface emitter, emitting particles which were immediately keyed to cease being dynamic, then instancing his tree proxies to the emitted particles.  This allowed him to populate his forest scenes very quickly.

I tried to do the same thing, except using an Omni emitter instead. My reasoning was that an Omni emitter emits from vertices rather than the whole surface, so if I arranged the vertices in an upwards spiral then the assets would arrange themselves that way. (I would also have to change the Aim Direction of the instance to “Velocity” for this to work.) I could then deform this geometry into the shape I wanted, without deforming the proxies in any way. Which worked, except this happened:

The particles were in the right place, but oriented completely wrong, as Omni emitters emit in random directions rather than along the surface normal. What I needed, then, was something that was a cross between a Surface and an Omni emitter – something that would emit only from vertices like an Omni emitter, but also take into account the normal direction like a Surface emitter.

After fruitlessly Googling for a while, I suddenly remembered that “terrible” Goals test I did a while back, which I didn’t think I would have any use for.

I realised that this was exactly what I needed – the particles would be oriented correctly, as they would be travelling toward the surface points. (Strangely, in practice I found that on reaching the surface they would “flip” onto the X-axis, so I had to key the particles to “switch off” on the frame before that happened.) So I tried it, and this was the result:

It’s almost perfect; the only problem is the particles near the bottom, which are slightly skewed in their Z-axis. This is probably because the emitter is positioned at the bottom, so they are travelling in a slightly different direction. I’ll try to find a way to change this, but compared to the problems I had in the previous tests, it’s probably not a matter of life and death…
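
For anyone wanting to try the same trick, the setup boils down to roughly this – a sketch, with hypothetical node names and frame numbers:

// make the emitted particles seek the vertices of the goal mesh
goal -w 1.0 -g chromatin_geo particle1;
// freeze the particles by keying dynamics off just before the "flip" frame
setKeyframe -at "isDynamic" -t 45 -v 1 particleShape1;
setKeyframe -at "isDynamic" -t 46 -v 0 particleShape1;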

Dynamic Fishing…

Sounds like the best extreme sport ever, but sadly in real life it only involves Maya – more particularly, MEL. This is a shoal simulation of fish that I will be using as a base for my simulations in the Going Live project. I instanced Sean’s swim cycle and have used MEL to:

– Randomise the start point of the swim cycle.

– Randomise the scale of individual fish.

– Connect the speed of the swim cycle to the velocity of the fish, so that the cycle speeds up if the fish are travelling faster (a sketch of these expressions is below).
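
Roughly what those expressions look like – a sketch, with hypothetical attribute and shape names, assuming a 24-frame swim cycle instanced as an animation sequence:

// creation expression: random start frame and per-fish scale
fishShape.cycleOffset = rand(0, 23);
float $s = rand(0.8, 1.2);
fishShape.fishScale = <<$s, $s, $s>>;

// runtime expression: advance the cycle with the fish's speed
float $speed = mag(fishShape.velocity);
int $frame = fishShape.cycleOffset + fishShape.age * $speed * 24;
fishShape.cycleIndex = $frame % 24; // mapped to the instancer's Object Index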

Maya Goals Experiment

Remember that terrible Goals test I did last December? Well I was convinced that I could do more with this feature than I had so far – it’s just that I hadn’t been using it in the correct context. So I did an experiment to teach myself more about it, using the Gnomon Dynamics DVD and a horse run cycle from the Digital Tutors Asset Library. I animated the Goal Weight value and Shader Transparency to make the particles form and then disperse, and also applied Turbulence and Vortex fields to make it look less uniform. I learned that when you’re using goals to make a character out of particles, it’s best to break up the geometry into lots of different sections, to give full control over not only how the character forms but also the size and goal weight of the particles forming different body parts.
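
The forming and dispersing comes down to keys on each section’s goal weight – roughly this, as a sketch with hypothetical names and frames:

// ramp this section's goal weight down to release its particles
setKeyframe -at "goalWeight[0]" -t 100 -v 1.0 torsoParticleShape;
setKeyframe -at "goalWeight[0]" -t 130 -v 0.0 torsoParticleShape;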

In my own film, I will be able to use Goals to control the movement of swarms of particles – probably using a locator.

This is not just storyboarding…

…this is CONCEPTUAL storyboarding (Gnomon style)! I have been watching the Gnomon Conceptual Storyboarding series narrated by Derek Thompson, a storyboard artist at Pixar. It’s totally changed my mind about storyboarding – I hated doing it before, but I love it now!

I had previously assumed that storyboard artists only started work once the designs for the characters and environment had been fully finalised, but I know now that this is not the case. Storyboarding is a much more organic process than I had originally thought, with the look of a film developing as the artist works – killing two birds with one stone, so to speak. As I’m working to a rather limited time frame, this suits me perfectly, as I can develop the look of the piece while designing shots. It also means that I can work out whether the environment will actually co-operate with the shots I have in mind. Technically speaking, I also now know how to use Layer Comps in Photoshop, and sorely wish I’d known about them before.

Progress has been slow – I was bedridden most of yesterday and today with a bout of food poisoning – but here’s one I was working on today.

[Storyboard panel 0012]

It’s quite a departure from the way I had previously staged this shot. The reason for changing it is that Angus said that the nucleus was much more densely packed than I had represented it in my pre-viz shot. This presented quite a challenge, as filling the nucleus up with stuff could potentially distract the viewer’s eye from the action. We noticed that in Hidden Life of the Cell, the BBC had avoided this particular scientific truth and built quite a sparse nucleus. Wanting to stay as accurate as possible, Angus said, “I am sure you can do better!”

Better than the Beeb? Hmm… big ask.

So after spending a little while moving things around and playing with colour values and shapes, I hit on a possible solution…LOADS of depth of field!  Focusing only on where the action was happening and blurring the rest would still keep the impression of a densely packed nucleus, but not distract the eye. At least, I hope that it will work. And now, armed with my new conceptual storyboards, I’m fairly confident that it will.

Two more Dynamics tests, Gnomon… and MEL!!

So I broke into the massive, 28-hour-long Gnomon Dynamics series for the first time yesterday. This series was made in 2001 but is still extremely relevant; no one has gone into as much detail about Maya’s Particles system before or since. I’m only about two hours into it but it’s been very useful so far. I’m aiming to get through all of it by New Year, but that might be a bit much, considering that it’s quite a lot to get my head around!

This test uses the Instancer with nParticles – and some custom scripting to randomise the scale and rotation. I’d also written a script to make the particles tumble, but it doesn’t seem to be working just now. Considering that Alex Alvarez wrote the original script for classic Particles, it may be an nParticles compatibility issue. Here’s the script:

// creation expression: random uniform scale, tumble speed and initial rotation
float $foo = rand(.5,1);
nParticleShape1.custom_scale = <<$foo, $foo, $foo>>;
nParticleShape1.random_number = rand(-.001,.001);
nParticleShape1.custom_rotation = <<rand(360), rand(360), rand(360)>>;

// runtime expression: tumble by the per-particle random amount each frame
nParticleShape1.custom_rotation += nParticleShape1.random_number;

Lovely, isn’t it? The result was something that looked less uniform than the previous test, but with the drawback that the solid particles don’t “merge” quite as well as the Blobby Surface nParticles from before.

This second one was a bit of a departure, using Goals. It was an experiment… it didn’t work for a couple of reasons:

– The particles attach to individual vertices, meaning that I couldn’t “stretch” the geometry without increasing the space between the particles.

– The particles attach to the vertices according to their number, resulting in things occurring in an extremely odd order.

Goals may prove useful for another part of the film, but I think I’ll leave them for now.

So after all that experimenting, it may be that I end up using something quite similar to my first dynamics test. Still, it never hurts to explore new avenues.