Tracking, and a catch-up.

Video

Well, it looks like someone fell off the blogging wagon… A bit silly of me, as I've done an awful lot over the past couple of months and I may well have remembered more of the techniques if I'd blogged about them at the time. However, what's done is done; I'll just have to make up for it now.

First of all, I’ve been learning about tracking. I created this reel for an application to a Matchmover/Tracker freelance job at a company I really want to work for, but unfortunately the listing was taken down a day before the reel was ready. Sod’s Law in action, but not a completely wasted effort, as I now have a reasonable grasp of tracking techniques where previously I knew nothing.

01 – Camera Tracking

In this one I finally finished the “shoe” project from the first semester of the Master’s. It was in equal parts nice and mortifying opening the file for the first time in eight months: nice, because I realised I’ve learned so much since then; mortifying because I’ve learned so much since then and by my higher standards the file was a mess.

– The displacement map didn’t work first time around, so I chucked it before the final render. I now know how displacement maps work (32-bit depth ONLY; really neat UVs; vector displacement maps where possible because they are SO much more stable), so I set myself the challenge of getting it to work this time, which it did, fairly easily compared to last time. (There’s a rough hook-up sketch at the end of this list.)

– I cheated with the reflection on the table the first time by just flipping the shoe – it worked, but I wanted to do it properly this time as that particular method won’t hold when dealing with animated characters (unless you Cache Geometry, then flip it, but still… messy).

I was stuck on this one for days, even though I knew that when I finally figured it out the answer would be really simple. And it was: it involved going into the mip_matteshadow shader and ticking a box called “Reflection.”

I blame the naming conventions. It wouldn’t have been as hard if mip_matteshadow was renamed to mip_DOES_EVERYTHING-NOT_JUST_SHADOWS. Maybe it’s just me…

– I also used a linear workflow and sub-surface scattering when creating the lighting and shaders, neither of which I did last time.
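
Going back to the displacement point, the basic hook-up is worth writing down. Here’s a minimal sketch in MEL, with placeholder names (shoe_displacement.exr, shoe_mat_SG) rather than the actual scene’s:

// file texture pointing at the 32-bit displacement map
string $file = `shadingNode -asTexture file`;
setAttr -type "string" ($file + ".fileTextureName") "shoe_displacement.exr";
setAttr ($file + ".alphaIsLuminance") 1;  // drive the displacement from the map's luminance

// pipe its alpha into a displacementShader node...
string $disp = `shadingNode -asShader displacementShader`;
connectAttr -f ($file + ".outAlpha") ($disp + ".displacement");

// ...and plug that into the material's shading group
connectAttr -f ($disp + ".displacement") "shoe_mat_SG.displacementShader";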

Compared to that lot, actually tracking the scene in NukeX seemed relatively easy. I roto’d out the middle of the table, as the markers there seemed to be sliding around a lot, then steadily deleted trackers to get the average tracking error below 1 (0.581, in this case), while making sure the total number of trackers stayed above the minimum of 100 required for a stable track.

02 – Object Tracking

This was a nightmare! It was also my first time tracking with Autodesk MatchMover, which I’m told is used quite often in industry. There’s a lack of decent tutorials on object tracking, so I had to more or less wing it. The main difficulty was the busy background, along with the fact that the markers would often disappear during the sword’s rotation. I had eight tracks in all, a large portion of which were done manually because MatchMover couldn’t pick the markers out from the background. The sword asset was from the Digital Tutors Asset Library and was particularly useful: because it glows, lighting accuracy was less important. I did find and use a similar HDRI from HDR Labs, though.

03 – Planar Tracking

Compared to the other two this was very simple; once I’d watched the tutorial on how it was done, it only took about ten minutes. The tracking and replacement were done entirely within NukeX.

So there you go – my tracking reel. I’ll be updating with more posts over the next few days, including Non-Organic Modelling, Multi-Tile Workflows, and my first ever VFX job…!

Bye for now,
S.

More Animations

Video

As the title suggests, here is some more animation. These shots are not exactly in order as I decided to get all the exterior scenes done first.

So, a breakdown:

1. Shot 006: p53 destruction. Texture emission of nParticles converted into polygons, as previously outlined.

2. Shot 010: p14 leaving the nucleus. Camera animated along a motion path (with a cluster attached to the curve to slowly swing it round at the end), plus instanced Maya particles, again as shown in a previous post. There’s a rough sketch of the motion-path setup after this list.

3. Shot 011: p14 binding to HDM2, p53 entering the nucleus. This one was a pain: I had to be very precise in how the motion paths were laid out, and I had to delete the jiggle deformer as it was going mental.

4. Shot 015: mRNA leaving the nucleus. This was a big “whoops” moment when I realised that my scale was all out. The mRNA was scaled correctly for the interior scenes, but in the exterior it was tiny. It was originally meant to be creeping out of the nuclear pore, but it wasn’t even the size of one of the arms on the pore. I didn’t much fancy redoing the rig, so I used the old compositor’s trick of “LET’S PUT IT RIGHT NEXT TO THE CAMERA SO IT LOOKS REALLY BIG!” I was pleased with the swim cycle in the end though, which was inspired by the swimming sperm from Inside the Human Body.

5. Shot 016: mRNA translation by the ribosome. The mRNA was again far too small and would have been swallowed up by the ribosome, so I did away with the rig entirely, bent the mesh into a curve using a wire deformer and scaled it up 8x (sketch after this list). This has probably messed up the shaders, so I’ll need to look into that tomorrow, but fixing those is far easier than fixing the rig.

6. Shot 017: creation of the tumour suppressor protein. Of all these shots, this is the only one Angus wasn’t so sure about, because of its lack of clarity. I think it’s a silhouette issue: the protein is sitting in front of the ribosome rather than to the side, so it kind of blends in. I’ll move it to the side and perhaps make the chain a little longer, too. Angus wanted me to do some crazy stuff with slow-mo and a zoom, but I’m afraid I’ll have to tell him that it’s probably beyond my capabilities at the moment. Although Maya time warps are something I’ve always wanted to learn about, their absence from the piece so far would make adding one now look a little incongruous.
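
As promised in shot 010’s note, the motion-path-and-cluster setup boils down to something like this in MEL. It’s only a sketch, with placeholder names (shotCam, camPath_curve) and a made-up frame range and CV indices:

// attach the camera to the curve and have it follow the tangent over the shot
pathAnimation -curve camPath_curve -fractionMode true -follow true -followAxis "z" -upAxis "y" -startTimeU 1 -endTimeU 240 shotCam;

// cluster the last few CVs of the path; keying the cluster late in the shot
// is what slowly swings the camera round at the end
cluster -name camSwing_cluster camPath_curve.cv[6:9];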
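
And the shot 016 trick: bend the mesh along a hand-drawn curve with a wire deformer instead of fighting the rig, then scale it up. Again just a sketch with placeholder names (mRNA_geo, mRNA_pathCurve) and a guessed dropoff distance:

// wire-deform the mRNA mesh along the curve
wire -name mRNA_wire -groupWithBase false -envelope 1 -wire mRNA_pathCurve mRNA_geo;
setAttr mRNA_wire.dropoffDistance[0] 100;  // big enough for the whole mesh to follow the curve

// then bring it up to the exterior scene's scale
scale -relative 8 8 8 mRNA_geo;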

Three more exterior shots to go and then I am on to the nine DREADED PROXY SCENES, which I am absolutely terrified about because the proxies will probably magically disappear, as they have a habit of doing. I’m hoping that the months of R&D I did on this issue will guide me through any problems. As for the renders, I am now at 628 out of 2569 frames, with another 243 (fingers crossed) ready by morning. Until all the rendering is done I will present this figure at the end of every blog post (more for my benefit than anyone else’s):

628/2569 (243)

p14 Curve Flow Dynamics

Video

Creating the dynamics for the swarm of p14 leaving the nucleus. I used similar techniques to those used for the fish in “Going Live,” randomising the scale and rotation at creation, but with the addition of random “tumbling” via a runtime expression. This was done with the help of Gnomon Dynamics.

Here’s the MEL script:

Creation:
// runs once, when each particle is born
float $mol = rand(0.9, 1.1);
Flow_particleShape.custom_Scale = <<$mol, $mol, $mol>>;  // uniform random scale
Flow_particleShape.random_Number = rand(-5, 5);          // per-particle tumble speed, added to the rotation every frame
Flow_particleShape.custom_Rotation = <<rand(0, 360), rand(0, 360), rand(0, 360)>>;  // random starting rotation (the exact values were eaten by the original post, so this range is a guess)

Runtime:
// runs every frame, adding each particle's own tumble speed to its rotation
Flow_particleShape.custom_Rotation += Flow_particleShape.random_Number;
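
For completeness: those custom attributes have to be added to the particle shape before the expressions will run, and the instancer needs to be told to read them. Something like the below, though this is a reconstruction with a placeholder object name (p14_geo) rather than the original script:

// each per-particle attribute needs an initial-state "0" twin alongside the working attribute
addAttr -longName "custom_Scale0" -dataType vectorArray Flow_particleShape;
addAttr -longName "custom_Scale" -dataType vectorArray Flow_particleShape;
addAttr -longName "custom_Rotation0" -dataType vectorArray Flow_particleShape;
addAttr -longName "custom_Rotation" -dataType vectorArray Flow_particleShape;
addAttr -longName "random_Number0" -dataType doubleArray Flow_particleShape;
addAttr -longName "random_Number" -dataType doubleArray Flow_particleShape;

// point the instancer's scale and rotation at the custom attributes (p14_geo stands in for the instanced mesh)
particleInstancer -addObject -object p14_geo -scale custom_Scale -rotation custom_Rotation Flow_particleShape;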