Blender 3D: Noob to Pro/Motion Tracking with Icarus

From Wikibooks, open books for an open world

Motion tracking, also called match moving, is an essential element when integrating 3D elements with live footage. Motion tracking software is usually quite expensive, but the Icarus application (Windows and Mac) is available free of charge for educational use. Icarus, which has not been updated for a while, was later superseded by the commercial application PFTrack. Other popular motion tracking applications are PFMatchit and PFHoe (both also from The Pixel Farm), Voodoo (for Windows/Linux; free for non-commercial use), SynthEyes, Boujou, and 3D-Equalizer (all commercial).

The excellent CG prodigy Colin Levy hosts Icarus (by kind permission of The Pixel Farm Ltd), the Icarus import script for Blender, and a splendid video tutorial (see Download Icarus and Video Tutorial). However, I could not find a brief text tutorial about motion tracking, so I decided to write my own. This tutorial is extremely brief and high-level, and requires some previous knowledge of video editing, 3D, and Blender.

Note: this tutorial was created using Mac OS X 10.5 Leopard, Blender 2.46, Icarus 2.09, and the Icarus Import Script for Blender v1.07e (for Blender 2.41, written by Alfredo de Greef).

Tutorial

Phase 1: Preparing the Video Footage

Note: this tutorial explains the Auto-feature Tracking mode in Icarus. There are other options which give more user control - see the Icarus UserGuide.pdf for more information.

  1. Record your video footage. Having the camera on a tripod (thus limiting the motion to panning/rotating) simplifies the tracking, but Icarus can handle a hand-held camera as well. Filming a background with orthogonal lines (that can be aligned to the X/Y/Z dimensions), such as a room, also helps the tracking.
  2. Capture/import your video footage to your computer. Icarus handles video up to DV resolution (720×576 pixels).
  3. Start the Icarus Calibration application (there is also a Distortion and a Reconstruction application).
  4. Create a new project (Project->New).
  5. Import your video footage (Project->Import Movie).
  6. Fill in the Camera Parameters information in the window that pops up - especially the Camera Motion and the Pixel Aspect options.
  7. In the left panel, expand the group called Coordinate Frame. You should see X Axis, Y Axis, etc.
  8. Click the Z Axis tool (blue) and mark vertical lines in your video footage. Use the X Axis (red) and Y Axis (green) tools to mark horizontal lines (up to you to decide which should be X and Y).
  9. Estimate the focal length (Camera->Estimate Focal Length).
  10. Navigate in time in your video footage using the time slider (beneath the video image). Add more X/Y/Z marker lines on a few key frames, especially as new pieces of the background are revealed when the camera moves.
  11. Save your project (Project->Save).
  12. Start the tracking process (Camera->Track and Calibrate). This will take some time.
  13. Export the results in human-readable form (Project->Export 3D Motion, select Human Readable (*.txt) as file type).
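The human-readable export is just a text file of per-frame camera samples, which you can inspect or post-process yourself. As a minimal sketch, here is a parser for a hypothetical line layout of `frame tx ty tz rx ry rz` (the actual column layout of Icarus's export may differ - check your exported file):

```python
def parse_camera_export(text):
    """Parse per-frame camera motion lines of the assumed form:
    frame tx ty tz rx ry rz
    Blank lines and '#' comments are skipped. The exact layout of
    Icarus's human-readable export may differ; adjust the indices
    to match your file.
    """
    frames = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        parts = line.split()
        frames.append({
            "frame": int(parts[0]),
            "loc": tuple(float(v) for v in parts[1:4]),
            "rot": tuple(float(v) for v in parts[4:7]),
        })
    return frames

# Example input in the assumed layout:
sample = """\
# frame tx ty tz rx ry rz
1 0.0 0.0 5.0 0.0 0.0 0.0
2 0.1 0.0 5.0 0.0 1.5 0.0
"""
motion = parse_camera_export(sample)
```

This kind of parsing is essentially what the Blender import script does for you in Phase 2.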

Phase 2: Importing the Motion Tracking Data into Blender

  1. Start Blender, and open a Text Editor view.
  2. Open the Icarus import script ICARUS_import241.py (File->Open).
  3. Start the script (File->Run Python Script). You should now see the Icarus Import screen.
  4. Press the FSEL button, and open the results you exported from Icarus.
  5. Press the Create Curves button. This imports the camera motion from the Icarus data and applies it to the Blender default camera.
  6. Press the Feature Points Mesh button. This imports 3D shape dots from the Icarus data, which helps as reference when you want to align your own 3D elements to the video footage.
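Conceptually, the Create Curves step splits the per-frame camera samples into one animation curve per channel (LocX, LocY, LocZ, RotX, and so on) and keys them onto the camera. A minimal sketch of that bookkeeping, assuming the motion data is available as a hypothetical list of per-frame dicts (this is an illustration, not the import script's actual code):

```python
def motion_to_channels(motion):
    """Split per-frame camera samples into per-channel keyframe lists,
    mirroring how an import script builds animation curves.
    `motion` is a hypothetical list of
    {"frame": int, "loc": (x, y, z), "rot": (rx, ry, rz)} dicts."""
    names = ("LocX", "LocY", "LocZ", "RotX", "RotY", "RotZ")
    channels = {name: [] for name in names}
    for sample in motion:
        values = sample["loc"] + sample["rot"]  # six values per frame
        for name, value in zip(names, values):
            channels[name].append((sample["frame"], value))
    return channels

# Two frames of example data:
motion = [
    {"frame": 1, "loc": (0.0, 0.0, 5.0), "rot": (0.0, 0.0, 0.0)},
    {"frame": 2, "loc": (0.1, 0.0, 5.0), "rot": (0.0, 1.5, 0.0)},
]
channels = motion_to_channels(motion)
```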

You are now ready to add your own 3D elements to the Blender scene.
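When aligning your own objects to the footage, the feature points mesh gives you real-world anchor positions. One simple trick is to place an object at the centroid of the tracked points that lie on a surface (a tabletop, a floor). A small sketch, with hypothetical point coordinates:

```python
def centroid(points):
    """Average a list of (x, y, z) points - a quick way to find a
    placement position among tracked feature points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

# Hypothetical feature points tracked on a tabletop:
table_points = [(0.9, 1.1, 0.0), (1.1, 0.9, 0.02), (1.0, 1.0, -0.02)]
position = centroid(table_points)
```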

Phase 3: Compositing 3D Elements on top of Video Footage

If you want to easily composite the 3D elements on top of an image, you can add the image as the rendering back buffer in Blender (Scene tab in the Buttons view). However, this doesn't work for videos, so we need another solution.

  1. In Blender, switch to SR:4 - Sequence in the layout dropdown menu at the top of the screen.
  2. In the Video Sequence Editor view (middle of screen), add your video file (Add->Movie). Move the new strip to layer 1, frame 1.
  3. Add the current scene to the sequence (Add->Scene, Scene). Move the new strip to layer 2, frame 1.
  4. Select the scene strip on the second layer (right-click).
  5. Open the Scene panel in the Buttons view, and then open the Sequencer sub-panel.
  6. Change the Blend Mode dropdown from Replace to Alpha Over. Your 3D elements should now render over the background video in the top-right preview screen.
  7. In the Render panel, enable Do Sequence just below the ANIM button. This will enable the background video when rendering.
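The Alpha Over blend mode uses the rendered scene's alpha channel to mix each pixel over the video: where the render is opaque you see your 3D elements, where it is transparent you see the footage. Per channel this is the classic over operator; a sketch assuming straight (non-premultiplied) alpha and an opaque background:

```python
def alpha_over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a straight-alpha foreground pixel over an opaque
    background pixel: result = fg * a + bg * (1 - a) per channel."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red render pixel over a blue video pixel:
pixel = alpha_over((1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0))
```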

Troubleshooting

  • If your imported Feature Points Mesh looks a bit spherical, you need to generate camera distortion data using the Icarus Distortion application.
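The spherical look comes from uncorrected lens distortion: real lenses bend straight lines, most visibly near the frame edges, and the reconstruction inherits that curvature unless it is modelled. Icarus's actual distortion model is not described here, but a common one-parameter radial model illustrates the effect (k1 is a hypothetical coefficient; positive gives barrel distortion, negative pincushion):

```python
def distort(point, k1):
    """Apply a one-parameter radial distortion model
    x_d = x_u * (1 + k1 * r^2) to a 2D point in normalized image
    coordinates. This is a common textbook model, used here only
    to illustrate why uncorrected footage curves straight lines."""
    x, y = point
    r2 = x * x + y * y            # squared distance from the image center
    scale = 1.0 + k1 * r2         # points farther out are scaled more
    return (x * scale, y * scale)

# A point halfway to the edge, with mild pincushion distortion:
moved = distort((0.5, 0.0), -0.1)
```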