MULE-F

A compact system for autonomous tracking and following

Monocular Unmanned LEader-Follower (MULE-F)

Unmanned ground vehicles (UGVs) offer a range of benefits in both military and civilian applications, and are particularly suited to eliminating or reducing the dull, dirty, or dangerous (DDD) jobs currently done by humans. For example, mobile robots have successfully been applied to tasks including explosive ordnance disposal (EOD), scouting, and emergency response. However, these robots typically require continuous teleoperation by at least one human, with detailed operator oversight even for mundane tasks such as traveling from one location to another. As a result, UGVs offer the promise of removing the human from the DDD environment, but at the expense of considerably longer task completion times (slower), higher operator workload (harder), and/or larger crews (more expensive).


Image of Charles River Analytics MULE-F poster.

The poster lists some of MULE-F’s features and benefits.

The Charles River Analytics Solution

To address these issues, we developed Monocular Unmanned LEader-Follower (MULE-F), a compact system that autonomously tracks and follows a designated operator/leader. The system enables a UGV to follow an operator on foot autonomously, with no teleoperation required, using a single lightweight video camera and no modifications to the leader's equipment, such as special patterns or infrared beacons. Because the system requires only a monocular camera, it is easy to deploy on existing UGVs. Its core software capabilities are designed for reduced computational complexity to minimize the payload's size, weight, power, and cost (SWaP-C) impact.

MULE-F is composed of three core modules: a pedestrian detection module that determines the locations of all visible pedestrians in the UGV camera's field of view; an appearance-learning and tracking module that maintains a lock on the leader and differentiates between the leader and nearby troops or team members; and a gesture recognition module that enables natural control of the vehicle by its leader. Our system integrates state-of-the-art detection methods with kinematic tracking and online appearance learning techniques to reliably track a human leader walking in complex outdoor environments. The system operates at up to 15 Hz on 640×480 video on a small 2.5 GHz computing platform.
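As a rough illustration of how detection, kinematic tracking, and appearance learning can work together, the sketch below scores each detected pedestrian by combining detector confidence, distance from the position the tracker predicts for the leader, and an appearance-similarity function. The names, interfaces, and scoring rule here are hypothetical assumptions for illustration, not the actual MULE-F implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # horizontal center of the bounding box (pixels)
    y: float      # vertical center of the bounding box (pixels)
    score: float  # pedestrian-detector confidence in [0, 1]

def track_leader(detections, appearance_similarity, predicted_pos, max_dist=80.0):
    """Select the detection most likely to be the leader by combining detector
    confidence, kinematic consistency (distance from the tracker's predicted
    position), and an online-learned appearance similarity in [0, 1]."""
    best, best_score = None, 0.0
    for d in detections:
        dist = ((d.x - predicted_pos[0]) ** 2 + (d.y - predicted_pos[1]) ** 2) ** 0.5
        if dist > max_dist:
            continue  # gate out kinematically implausible detections
        kinematic = 1.0 - dist / max_dist
        score = d.score * kinematic * appearance_similarity(d)
        if score > best_score:
            best, best_score = d, score
    return best, best_score
```

The appearance term is what allows a system like this to distinguish the leader from nearby team members: two detections at similar distances from the predicted position are disambiguated by the learned appearance model rather than by position alone.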


The Benefit

Interest in introducing autonomous and semi-autonomous capabilities to mobile robots continues to grow. Developing and deploying these capabilities will free operators from mundane teleoperation tasks and let them focus their attention on tasks that require a high level of human engagement, leaving the truly DDD tasks to the mobile robots.

This video shows MULE-F following a pedestrian in a complex, natural setting. Colored particles visualize the tracker's confidence that a person is located at each point in the image, and the red cloud illustrates the probability distribution of the leader's position.
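A cloud of weighted particles like the one described above is characteristic of a particle filter, which represents the leader's position as a set of hypotheses rather than a single point. As an illustration only (this is our assumption about how such a distribution can be maintained, not the actual MULE-F code), here is a minimal one-dimensional predict-update-resample cycle:

```python
import math
import random

def gauss_likelihood(error, std):
    """Unnormalized Gaussian likelihood of a measurement error."""
    return math.exp(-0.5 * (error / std) ** 2)

def particle_filter_step(particles, weights, measurement, motion_std=5.0, meas_std=20.0):
    """One predict-update-resample cycle of a 1-D particle filter over the
    leader's horizontal image position (pixels). Each particle is one position
    hypothesis; the weighted set approximates the posterior distribution that
    a red cloud like the one in the video would visualize."""
    # Predict: diffuse each particle with Gaussian motion noise.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: reweight each particle by how well it explains the detection.
    weights = [w * gauss_likelihood(measurement - p, meas_std)
               for p, w in zip(particles, weights)]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # Resample: draw a fresh particle set in proportion to the weights.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights
```

After a few detections, the particle set concentrates around the leader's true position; when the leader is briefly occluded, skipping the update step lets the cloud spread out again, reflecting growing uncertainty.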

Contact us to learn more about MULE-F and our other robotics and autonomy capabilities.

Our passion for science and engineering drives us to find impactful, actionable solutions.