Charles River Analytics, a developer of cutting-edge technologies for intelligent systems, announced recent contract wins in robotics, focused on navigation and obstacle avoidance.
In April 2010 the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC) awarded a 27-month contract, entitled “Scene Understanding for Semi-Autonomous Navigation (SUSAN).” Camille Monnier, Senior Scientist at Charles River, explained Charles River’s research and development in the program: “We’re developing software that allows a mobile robot to follow waypoints from an imaging sensor like a camera. This way, there’s no need for expensive range sensors or even GPS [global positioning system]. We’re also looking at how we can extend the technology to allow the same robot to follow a human controller and recognize hand gestures, such as might be used to command the robot to stop or to go to some location being pointed at.”
SUSAN addresses the Army’s need for easier-to-operate robots, which are a valuable asset to soldiers in the field. Robots can conduct bomb-disposal and reconnaissance missions, keeping soldiers away from these dangerous situations. Currently deployed mobile ground robots use remote-control devices that demand the soldier’s full attention even for tasks as simple as driving to a region of interest. SUSAN allows a soldier to point out safe waypoints for the robot to follow, putting SUSAN in charge of driving to each waypoint. This semi-autonomous approach leaves the soldier responsible for the more important decisions while freeing him or her from staring at a hand-held display and constantly working a joystick.
The SUSAN project is part of a wider effort at Charles River to develop scene understanding and obstacle avoidance technologies. For example, Moving Object Tracking Onboard a Moving Platform (MOTO-MP) was an eight-month project for the Defense Advanced Research Projects Agency that enabled a moving robot to detect and track multiple moving objects in a scene. This capability addressed a shortcoming in traditional approaches that struggle to identify moving objects when the entire scene is in motion (relative to the robot).
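The core idea behind detecting movers from a moving platform is ego-motion compensation: first undo the image motion caused by the robot’s own movement, then look for residual change. The toy sketch below is purely illustrative (it is not the MOTO-MP implementation); 1-D “frames” of pixel intensities and a known whole-scene shift stand in for real imagery and estimated camera motion.

```python
def detect_movers(prev_frame, cur_frame, ego_shift, threshold=10):
    """Ego-motion-compensated frame differencing (toy illustration).

    prev_frame, cur_frame: 1-D lists of pixel intensities.
    ego_shift: whole-scene displacement (pixels) caused by the platform's
    own motion. After compensating for it, any remaining large change
    marks an independently moving object.
    """
    n = len(cur_frame)
    movers = []
    for i in range(n):
        j = i - ego_shift  # where this pixel content was in the previous frame
        if 0 <= j < n and abs(cur_frame[i] - prev_frame[j]) > threshold:
            movers.append(i)  # residual change -> independent motion
    return movers
```

Note that plain differencing flags both an object’s old (vacated) and new positions; real systems add tracking to resolve this, along with 2-D motion estimation for the ego-shift.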
“But obstacle avoidance is only half the battle,” said Jonah McBride, Senior Scientist at Charles River. “To achieve fully autonomous operation, a robot requires both scene sensing and self-localization capabilities. We are also developing software which allows a robot to build a map of its surrounding environment and simultaneously determine its location with respect to that map. We call this technique Simultaneous Localization and Mapping (SLAM).”
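As a rough illustration of that idea (a hypothetical toy, not Charles River’s software), the sketch below dead-reckons a 2-D pose from odometry, adds newly seen landmarks to the map in the world frame, and snaps the pose back into agreement when a mapped landmark is re-observed; a real SLAM filter would weight that correction by the uncertainty of both pose and map.

```python
import math

class ToySlam:
    """Minimal SLAM loop: integrate odometry, place landmarks in the
    world frame, and correct the pose when a mapped landmark is re-seen."""

    def __init__(self):
        self.x, self.y, self.theta = 0.0, 0.0, 0.0  # pose estimate
        self.landmarks = {}  # landmark id -> (world x, world y)

    def move(self, distance, turn):
        """Dead-reckon the pose from odometry (this drifts over time)."""
        self.theta += turn
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)

    def observe(self, lm_id, rel_x, rel_y):
        """Sensor reports a landmark at (rel_x, rel_y) in the robot frame."""
        # Rotate/translate the observation into the world frame.
        wx = self.x + rel_x * math.cos(self.theta) - rel_y * math.sin(self.theta)
        wy = self.y + rel_x * math.sin(self.theta) + rel_y * math.cos(self.theta)
        if lm_id not in self.landmarks:
            self.landmarks[lm_id] = (wx, wy)  # mapping: record new landmark
        else:
            # Localization: shift the pose so the stored landmark lines up.
            mx, my = self.landmarks[lm_id]
            self.x += mx - wx
            self.y += my - wy
```

The circular dependency McBride describes is visible even here: the map is only as good as the pose at observation time, and the pose is corrected against that same map.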
Charles River is using SLAM in other ongoing contracts, such as Mapping Of Local Environments for Reconnaissance And Training (MOLE-RAT), a project also awarded by Army TARDEC. MOLE-RAT enables a robot to autonomously navigate buildings and other urban structures where GPS is unreliable, passing information on to soldiers in the field. MOLE-RAT uses SLAM to track the robot’s location and build a 3D model of the scene using a stereo video camera and low-cost onboard motion sensors.
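The stereo camera recovers 3-D structure through the standard disparity relation z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the pixel disparity between matched points. A minimal sketch with made-up calibration numbers (not MOLE-RAT’s actual parameters):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a scene point from stereo disparity: z = f * B / d.

    focal_px: camera focal length in pixels.
    baseline_m: distance between the two camera centers, in meters.
    disparity_px: horizontal pixel shift of the point between the two views.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point must be matched)")
    return focal_px * baseline_m / disparity_px

# Illustrative calibration: 800 px focal length, 0.5 m baseline.
# A 100 px disparity then places the point 4.0 m away.
depth = stereo_depth(800.0, 0.5, 100.0)
```

Repeating this per matched pixel yields the point cloud from which a 3-D scene model can be assembled.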
Charles River combined the SLAM approach with external positioning cues in an earlier award from the Air Force Research Laboratory Munitions Directorate (AFRL/RW). In Collaborative Autonomy for Robots Using Signals of Opportunity (CARUSO), Charles River addressed challenges similar to those in MOLE-RAT, but incorporated additional enhancements, including WiFi signals of opportunity, to improve a robot’s navigation performance.
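One textbook way to exploit WiFi signals of opportunity is to convert received signal strength (RSSI) to an approximate range with a log-distance path-loss model and combine several access points into a coarse position fix. The sketch below uses a weighted centroid; the constants and the approach are illustrative assumptions, not CARUSO’s method.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: rough range (m) from a WiFi RSSI reading.

    tx_power_dbm is the expected RSSI at 1 m (a hypothetical calibration
    value); path_loss_exp ~2 in free space, higher indoors.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def wifi_fix(readings):
    """Coarse position from a list of ((ap_x, ap_y), rssi_dbm) readings.

    Each access point is weighted by the inverse of its estimated range,
    and the weighted centroid is returned as the position estimate.
    """
    wx = wy = wsum = 0.0
    for (ax, ay), rssi in readings:
        w = 1.0 / max(rssi_to_distance(rssi), 1e-6)
        wx += w * ax
        wy += w * ay
        wsum += w
    return (wx / wsum, wy / wsum)
```

A fix this crude is still useful as a drift-free anchor to fuse with SLAM, which is precise locally but accumulates error over distance.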
Self-localization and navigation to a target location are also being addressed in a project entitled Continuous Optical Mapping for Munitions Using Terrain Elevation and Reconstruction (COMMUTER), which focuses on the shortcomings of contemporary tactical missiles in self-localization. GPS is subject to jamming and intermittent unavailability, stellar navigation is limited by lighting and weather, and radar sensors may alert targets. COMMUTER overcomes these shortcomings by estimating the geolocation and orientation of the host air vehicle, comparing a real-time camera image stream against an onboard library of shape-based landmark features. COMMUTER received both an initial feasibility-study award and a follow-on prototyping award from the U.S. Army Research, Development and Engineering Command (RDECOM).
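The essence of landmark-library geolocation is that every recognized landmark with a known position, seen at a known offset from the camera, implies a vehicle position. The sketch below is a hypothetical simplification (flat 2-D coordinates, toy descriptors, a plain average), not the COMMUTER algorithm; a real system would match far richer shape features, reject outliers, and fuse the result with inertial data.

```python
def match_landmark(observed_desc, library):
    """Nearest-neighbour match of an observed shape descriptor against the
    onboard library: a list of (descriptor, (x, y)) entries with surveyed
    landmark positions."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(library, key=lambda entry: sq_dist(entry[0], observed_desc))

def estimate_position(observations, library):
    """observations: list of (descriptor, (dx, dy)) pairs, where (dx, dy)
    is the landmark's offset from the vehicle as measured by the camera.
    Each matched landmark implies vehicle = landmark - offset; average
    the implied positions for the final estimate."""
    xs, ys = [], []
    for desc, (dx, dy) in observations:
        _, (lx, ly) = match_landmark(desc, library)
        xs.append(lx - dx)
        ys.append(ly - dy)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Because the library is carried onboard and the sensing is passive, this scheme needs no external signal to jam and emits nothing for a target to detect.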
Magnús Snorrason, Vice President, Sensor Processing and Networking, said, “Robotics offers so much promise to the military and to many industries where people could be removed from harm’s way, if only the technological solutions could be made more reliable and less brittle. This is why the central goal of our robotics research is adding autonomy where it makes sense. Some things are better left to the human operator, but automating even a small part of a robotic system can make a huge difference in terms of overall usability.”
This material is based upon work supported by the U.S. Army TACOM under Contract No. W56HZV-10-C-0131, the Air Force Research Lab under Contract No. FA8651-09-M-0089, the U.S. Army TARDEC under Contract No. W56HZV-10-C-0195, and the U.S. Army RDECOM under Contract No. W31P4Q-10-C-0123.
Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the U.S. Army or Air Force Research Lab.

UNCLASSIFIED: Dist A. Approved for public release (Distribution Unlimited)