Entry Date:
December 9, 2010

Tracking Objects at Sea

The goal is to develop a robust flying robot system that can track objects at sea, such as whales, sea lions, and boats. We wish to use the resulting data for marine biology and surveillance applications such as censuses, behavior monitoring, and behavior modeling.

We are developing a "flying camera" robot based on the Ascending Technologies Falcon 8 platform. The robot's visual system uses a classifier based on hue-saturation (HS) histogram analysis. We are especially interested in using this system as a tool for collecting data about the Southern Right Whales in collaboration with Roger Payne (Ocean Alliance, USA) and Mariano Sironi (Whale Conservation Institute, Argentina). The other collaborators are Peter Corke (QUT), Daniel Gurdan (Ascending Technologies), and Jan Stumpf (Ascending Technologies).

More specifically, we have created a system for classifying objects of interest in a camera frame in order to follow the target through subsequent frames. We wish for this classification system to be robust to external disturbances such as occlusions of the target, lighting changes, and noise in the image.

The object classifier begins by presenting the user with an image containing the desired target(s). Using a GUI, the user can threshold the H (hue) and S (saturation) planes of the image until only the intended target remains in the image. These thresholded images are used to create a discrete probability density matrix called the 2D target histogram. This target histogram is then applied to subsequent frames to find areas in the image which have a high probability of being the intended target by comparing pixel HS values to those in the 2D target histogram. Areas with a high probability of being the target are then identified with a center of mass marker and bounding box.
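The histogram-building and backprojection steps above can be sketched in pure NumPy. This is an illustrative sketch, not the system's actual code: the bin counts, value ranges, and probability threshold are assumptions.

```python
import numpy as np

H_BINS, S_BINS = 16, 16  # histogram resolution (assumed, not from the system)

def build_target_histogram(hue, sat, mask):
    """Build the normalized 2D H-S target histogram from the pixels the
    user's thresholding left in the image (mask = True at target pixels)."""
    hist, _, _ = np.histogram2d(
        hue[mask], sat[mask],
        bins=[H_BINS, S_BINS], range=[[0, 180], [0, 256]])
    total = hist.sum()
    return hist / total if total > 0 else hist

def backproject(hue, sat, hist):
    """Look up each pixel's (H, S) value in the target histogram, giving a
    per-pixel probability of belonging to the target."""
    h_idx = np.clip(hue * H_BINS // 180, 0, H_BINS - 1).astype(int)
    s_idx = np.clip(sat * S_BINS // 256, 0, S_BINS - 1).astype(int)
    return hist[h_idx, s_idx]

def locate_target(prob, thresh=1e-3):
    """Center of mass and bounding box of the high-probability region."""
    ys, xs = np.nonzero(prob > thresh)
    if len(xs) == 0:
        return None
    w = prob[ys, xs]
    cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())
    return (cx, cy), bbox
```

For example, building the histogram from a masked patch of uniform hue/saturation and backprojecting it onto the same frame recovers that patch's center of mass and bounding box.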

The system also supports the creation of additional 2D target histograms enabling the user to simultaneously track targets of various HS characteristics.
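One way to sketch the multi-histogram case is to look each pixel up in every target histogram and label it with the most probable target. The bin counts and threshold below are assumptions; this is not the system's actual implementation.

```python
import numpy as np

H_BINS, S_BINS = 16, 16  # histogram resolution (assumed)

def classify_pixels(hue, sat, histograms, thresh=1e-3):
    """Label each pixel with the index of the target histogram that gives
    it the highest probability, or -1 if no histogram exceeds thresh."""
    h_idx = np.clip(hue * H_BINS // 180, 0, H_BINS - 1).astype(int)
    s_idx = np.clip(sat * S_BINS // 256, 0, S_BINS - 1).astype(int)
    probs = np.stack([h[h_idx, s_idx] for h in histograms])  # (n_targets, H, W)
    labels = probs.argmax(axis=0)
    labels[probs.max(axis=0) <= thresh] = -1  # background pixels
    return labels
```

Each labeled region can then be given its own center-of-mass marker and bounding box, so targets with different HS characteristics are tracked simultaneously.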

The HS histogram classifier has been run on a variety of clips with a variety of target objects. The output of the classifier can be viewed as separate video files; the clip names correspond to those listed in the table above. The clips were taken from several collections of video and are grouped accordingly along with their source.

We are currently testing this classification system for visual servoing on the Quad-Rotor platform. Most recently, experiments have been run using a Point Grey MV2 camera, with the classification code running on a FIT-PC2 mounted on the Quad-Rotor. A sample trial output is shown below. The target for this test was a wooden block.
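A minimal sketch of the servoing step, assuming a simple proportional law that drives the target's center of mass toward the image center. The gains, image resolution, and velocity-command interface are hypothetical; the controller actually used on the Quad-Rotor is not described here.

```python
IMG_W, IMG_H = 640, 480      # assumed camera resolution
K_X, K_Y = 0.002, 0.002      # assumed proportional gains (m/s per pixel)

def servo_command(target_cx, target_cy):
    """Return (vx, vy) velocity setpoints that move the camera so the
    tracked target's center of mass approaches the image center."""
    ex = target_cx - IMG_W / 2.0   # positive: target right of center
    ey = target_cy - IMG_H / 2.0   # positive: target below center
    return K_X * ex, K_Y * ey
```

A target already at the image center yields a zero command; an offset target yields a command proportional to its pixel error.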