Try to get the leurre example running; it is included both in the Subversion repository and in the released bundle. It ships with an AVI file, so you can run it without setting up a camera or adapting it to your own video. Make sure you can use the display to view the images produced at each stage of the pipeline.
After that, create a pipeline of your own by selecting roughly one component from each category. A typical pipeline looks like this:
- Trigger component, usually a timer trigger (optional).
- Input component, either a camera or an AVI file (for testing, an image file also works).
- Conversion to color.
- Adaptive background subtraction.
- Binary mask (optional).
- Blob detection.
- Calibration with TSAI.
- Tracking, if desired (you get positions even without this).
Build the pipeline gradually, using the display after each step to verify that it works. Tune the parameters until the pipeline performs well on your video, and look at the examples to see the best way to set things up.
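SwisTrack saves a pipeline as an XML configuration file, so the component sequence above ends up as a list of component entries. The sketch below is only illustrative: the element layout and the component names are assumptions, not the exact SwisTrack schema. Build your pipeline in the GUI, save it, and inspect the resulting file to see the real format.

```xml
<!-- Hypothetical sketch of a saved pipeline: element names and component
     identifiers are illustrative only; save a pipeline from the SwisTrack
     GUI to see the authoritative file format. -->
<swistrack>
  <components>
    <component type="TriggerTimer"/>              <!-- optional timer trigger -->
    <component type="InputAVI"/>                  <!-- or a camera input -->
    <component type="ConvertToColor"/>
    <component type="AdaptiveBackgroundSubtraction"/>
    <component type="BinaryMask"/>                <!-- optional -->
    <component type="BlobDetection"/>
    <component type="CalibrationTSAI"/>
    <component type="Tracking"/>                  <!-- optional -->
  </components>
</swistrack>
```

Each entry corresponds to one pipeline stage, in the order the stages run; adding a component in the GUI appends such an entry.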
The following examples may serve as a good starting point for your application. You'll find these examples with all necessary files in the Example folder of SwisTrack.
- RobotTracking: Tracking robots with an overhead camera
Feature Presentations and Utilities
The following examples each present a feature (usually a component) and show how and in which setup it is used. You'll find these examples with all necessary files in the Example folder of SwisTrack.
- OutputProcessing: Templates for interfacing SwisTrack
- MultiCamera: Program to start/stop multiple cameras with a single click
- IDReaderRing: Reading circular barcodes around a blob
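For interfacing with SwisTrack (the topic of the OutputProcessing templates), tracking results are broadcast over TCP as NMEA-0183-style sentences of the form `$NAME,field,...*hh`, where `hh` is a two-digit hex XOR checksum. The parser below is a minimal sketch under that assumption; the `PARTICLE` message name and its field layout (id, x, y, angle) are illustrative guesses, so consult the OutputProcessing example for the authoritative message set.

```python
# Hedged sketch of a parser for SwisTrack's NMEA-style TCP output.
# The message name and field layout used in the demo are assumptions;
# see the OutputProcessing example for the exact protocol.

def nmea_checksum(payload: str) -> str:
    """XOR of all characters between '$' and '*', as two hex digits."""
    value = 0
    for ch in payload:
        value ^= ord(ch)
    return f"{value:02X}"

def parse_sentence(line: str):
    """Split a '$NAME,field,...*hh' sentence; return (name, fields) or None."""
    line = line.strip()
    if not line.startswith("$") or "*" not in line:
        return None
    payload, _, checksum = line[1:].partition("*")
    if nmea_checksum(payload) != checksum.upper():
        return None  # corrupted or truncated message
    name, *fields = payload.split(",")
    return name, fields

# Demo with a hypothetical particle message and a valid checksum.
msg = "PARTICLE,3,102.5,88.1,1.57"
sentence = f"${msg}*{nmea_checksum(msg)}"
print(parse_sentence(sentence))  # ('PARTICLE', ['3', '102.5', '88.1', '1.57'])
```

In a real client you would read lines from the TCP socket SwisTrack listens on and feed each one to `parse_sentence`, discarding any that fail the checksum.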