Tracking Objects In NUKE! - ExpoTees update #20


(This post was saved as a draft but wasn't uploaded)

Now that I have my footage, I'm going to import it into Nuke so that I can track it and export the point cloud data and camera into Maya. Tracking is a fairly easy process: I started by plugging a CameraTracker node into my footage. I pretty much left the settings at their defaults, as these usually track quite well, so I didn't need to tamper with them much. Once the footage was tracked, the data needed to be solved. Solving is Nuke determining which tracker points aren't needed and refining the points that have a stronger track record. Once the points had been solved, it was time to set up the scene, which I did simply by clicking 'Create Scene'; Nuke does the rest for you.
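Conceptually, the pruning the solve performs can be pictured as a simple filter over each track's solve error and lifetime. This is a hedged sketch only; the field names and thresholds are made up for illustration and are not Nuke's actual solver logic.

```python
# Conceptual sketch (NOT Nuke's real solver): keep tracks whose
# reprojection error is low and whose lifetime is long enough,
# and reject the weak ones -- the idea behind "solving" the track data.
def refine_tracks(tracks, max_error=1.0, min_length=10):
    """Return only the tracks with a strong track record."""
    return [t for t in tracks
            if t["error"] <= max_error and t["length"] >= min_length]

tracks = [
    {"name": "t1", "error": 0.3, "length": 40},  # strong track: kept
    {"name": "t2", "error": 2.5, "length": 35},  # high error: rejected
    {"name": "t3", "error": 0.4, "length": 4},   # too short-lived: rejected
]
kept = refine_tracks(tracks)
```

In the real CameraTracker the rejection threshold is a knob you can tweak after the solve, but the principle is the same: points with a weak history don't make it into the final point cloud.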



Once the scene was set up, I took a look at it in 3D space. The point cloud was rotated quite a lot at first; to fix this, I had to find three points that almost perfectly aligned to form a line along the X axis. I selected these points and set them to define the X axis for the point cloud. It actually worked perfectly on the first try, which never normally happens. I looked at the scene in 3D again, studied the axis, and found that the points were completely straight. With this finished, I moved on to the next step: checking the quality of the track. To do this, I added geometry into the scene. A cube is good to use, as it has sharp edges. After scrubbing through the timeline, I found that the cube tracked really well. It did have some jitter, but I should be able to fix that. The problem is that the shot doesn't contain much sharp geometry, so I had to rely on colour patterns from the grass and the daisies.
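The maths behind that axis-setting step is just a rotation: take the direction through the chosen points and build the rotation that lays it along +X. A hedged sketch of that idea (not Nuke's internal code) using Rodrigues' rotation formula:

```python
import math

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(_dot(v, v))
    return tuple(x / n for x in v)

def rotation_to_x_axis(p_first, p_last):
    """Rotation matrix mapping the direction p_first -> p_last onto +X."""
    d = normalize(tuple(b - a for a, b in zip(p_first, p_last)))
    axis = _cross(d, (1.0, 0.0, 0.0))
    s = math.sqrt(_dot(axis, axis))      # sin of the rotation angle
    c = _dot(d, (1.0, 0.0, 0.0))         # cos of the rotation angle
    if s < 1e-9:                         # already (anti)parallel to X
        f = 1.0 if c > 0 else -1.0
        return [[f, 0.0, 0.0], [0.0, f, 0.0], [0.0, 0.0, 1.0]]
    k = tuple(x / s for x in axis)       # unit rotation axis
    K = [[0.0, -k[2], k[1]],
         [k[2], 0.0, -k[0]],
         [-k[1], k[0], 0.0]]
    K2 = [[_dot(K[i], [K[0][j], K[1][j], K[2][j]]) for j in range(3)]
          for i in range(3)]
    # Rodrigues' formula: R = I + sin(t)*K + (1 - cos(t))*K^2
    return [[(1.0 if i == j else 0.0) + s * K[i][j] + (1.0 - c) * K2[i][j]
             for j in range(3)] for i in range(3)]

def apply_rotation(R, p):
    return tuple(_dot(R[i], p) for i in range(3))

# Example: a row of points along the XY diagonal gets rotated onto +X.
R = rotation_to_x_axis((0.0, 0.0, 0.0), (2.0, 2.0, 0.0))
aligned = apply_rotation(R, normalize((1.0, 1.0, 0.0)))
```

Applying that rotation to the whole cloud (and the camera with it) is what makes the points read as "completely straight" along the axis afterwards.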

Now that the tracking points are sorted out, I have to import them into Maya, but first I need to export the point cloud data and the camera from Nuke. To do this, you need to add a MergeGeo node, which allows the geometry to be grouped together in 3D space. Then a WriteGeo node needs to be plugged in, so that the geometry can be written out in 3D, instead of in 2D like the usual Write node. I then had to render out the camera and point cloud data separately: I deselected everything apart from the camera and rendered the camera out, then did the same with the point cloud.
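WriteGeo handles proper interchange formats, but "writing geometry instead of an image" boils down to something very simple for a bare point cloud: each point becomes a `v x y z` vertex line in a Wavefront .obj file. A hedged sketch of that idea (not what WriteGeo itself emits; the filename is just an example):

```python
# Minimal sketch: serialise a point cloud as Wavefront .obj vertex
# lines ("v x y z"), the simplest 3D geometry format Maya can read.
def write_pointcloud_obj(points, path):
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"v {x} {y} {z}\n")

write_pointcloud_obj([(0.0, 1.0, 2.0), (3.0, 4.0, 5.0)], "pointcloud.obj")
```

This also shows why the camera has to go out separately: it isn't geometry at all, it's a transform plus lens settings, so it travels in its own export rather than through the geometry writer.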



After I had exported the point cloud and the camera, I imported them into Maya and brought the footage in as the background plane in the camera's view. This places the tracker points onto the footage the same way it does in Nuke, meaning I can now merge the logo into the scene in Maya and begin working on the lighting for it.
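The reason the points land on the footage in the same place is straightforward pinhole projection: the tracked camera projects each 3D point back onto the image plane at the pixel where the 2D track originally sat. A hedged sketch of that projection; the focal length, aperture, and resolution here are made-up example values, not taken from the actual shot or Maya's internals:

```python
# Conceptual pinhole projection: camera-space 3D point -> pixel coords.
# Example camera values (35mm lens, 36mm horizontal aperture, HD frame).
def project(point, focal=35.0, width=1920, height=1080, aperture=36.0):
    """Project a camera-space 3D point (camera looks down -Z) to pixels."""
    x, y, z = point
    # Perspective divide: points further from the camera shrink.
    u = focal * x / -z
    v = focal * y / -z
    # Map from the film plane (millimetres) to pixel coordinates.
    px = width / 2 + u * width / aperture
    py = height / 2 - v * width / aperture
    return px, py

# A point straight down the lens axis lands in the centre of frame.
centre = project((0.0, 0.0, -10.0))
```

Because Nuke and Maya agree on the camera's transform and film back, running this projection in either package puts the same point on the same pixel, which is exactly the alignment the background plane check confirms.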




- Josh Docherty - 3D Modeller & VFX Artist
