Proceedings of SPIE Defense, Security & Sensing, vol. 7692, Orlando, FL (2010)
One of the principal challenges in autonomous navigation for mobile ground robots is collision avoidance, especially in dynamic environments featuring both moving and static obstacles. Detecting and tracking moving objects (such as vehicles and pedestrians) presents a particular challenge because all points in the scene are in motion relative to a moving platform. We present a solution for detecting and robustly tracking moving objects from a moving platform. We use a novel epipolar Hough transform to identify points in the scene which do not conform to the geometric constraints of a static scene when viewed from a moving camera. These points can then be analyzed in three different domains: image space, Hough space, and world space, allowing redundant clustering and tracking of moving objects. We use a particle filter to model uncertainty in the tracking process and a multiple-hypothesis tracker with lifecycle management to maintain tracks through occlusions and stop-start conditions. The result is a set of detected objects whose position and estimated trajectory are continuously updated for use by path planning and collision avoidance systems. We present results from experiments using a mobile test robot with a forward-looking stereo camera navigating among multiple moving objects.
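The core geometric idea, flagging scene points that violate the static-scene epipolar constraint between two frames of a moving camera, can be sketched as follows. This is not the paper's epipolar Hough transform itself; it is a minimal illustrative check using the standard Sampson approximation to epipolar error, assuming the fundamental matrix F relating the two frames has already been estimated from the platform's ego-motion (the function names and threshold are illustrative):

```python
import numpy as np

def sampson_distance(F, x1, x2):
    """Sampson approximation to the epipolar error for homogeneous
    image points x1 (frame t) and x2 (frame t+1).  A static 3D point
    satisfies x2^T F x1 = 0, so the residual is near zero."""
    Fx1 = F @ x1        # epipolar line of x1 in the second image
    Ftx2 = F.T @ x2     # epipolar line of x2 in the first image
    num = (x2 @ F @ x1) ** 2
    den = Fx1[0]**2 + Fx1[1]**2 + Ftx2[0]**2 + Ftx2[1]**2
    return num / den

def flag_moving_points(F, pts1, pts2, thresh=2.0):
    """Return a boolean mask over correspondences: True where the
    epipolar residual exceeds thresh, i.e. the point is a candidate
    independently moving object rather than static background."""
    mask = []
    for (u1, v1), (u2, v2) in zip(pts1, pts2):
        x1 = np.array([u1, v1, 1.0])
        x2 = np.array([u2, v2, 1.0])
        mask.append(sampson_distance(F, x1, x2) > thresh)
    return np.array(mask)
```

Points surviving this filter would then feed the clustering and tracking stages described above; in practice the threshold must account for feature-localization noise and ego-motion estimation error.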
1 Charles River Analytics
2 University of Massachusetts