The United States Air Force awarded a follow-on contract to Charles River Analytics to design and develop a system called Argumentation-based Negotiation for Automated Sensor Tasking (ANAST). ANAST is an integrated video analysis and sensor management tool that will support the Air Force Research Lab’s Small Unmanned Systems (SUS) project.
ANAST was conceived as an efficient way to manage the military’s sensor resources. Because surveillance objectives often outnumber available assets, sensors must be deployed and managed effectively throughout an engagement, with schedules that keep pace with a dynamic tactical situation. ANAST meets this challenge by combining computer vision and negotiation technologies to automate the management of sensor resources. Computer vision-based video analysis automatically identifies moving objects (potential targets) in raw video and creates tracking tasks for them. Argumentation-based negotiation then allocates sensor resources to the potential targets identified by the vision system. Because negotiation runs continuously, the sensor tasking schedule adapts dynamically, folding changes in the mission environment and the consequences of schedule changes into the decision-making process.
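To make the allocation problem concrete, the sketch below shows a toy priority-driven allocator that re-plans whenever the vision system reports a new tracking task. The function and its greedy scoring rule are invented for illustration and stand in loosely for a negotiated outcome; they are not ANAST's actual argumentation-based negotiation.

```python
# Hypothetical illustration only: a greedy stand-in for negotiated
# sensor tasking. All names and the scoring rule are invented.

def allocate(sensors, tasks):
    """Assign each tracking task to the sensor with the most spare capacity.

    sensors: list of (sensor_id, capacity) pairs
    tasks:   list of (task_id, priority) pairs, higher priority is better
    Returns a dict mapping sensor_id -> list of assigned task_ids.
    """
    # Consider the most important potential targets first.
    pending = sorted(tasks, key=lambda t: t[1], reverse=True)
    schedule = {sid: [] for sid, _ in sensors}
    remaining = {sid: cap for sid, cap in sensors}
    for task_id, _priority in pending:
        # Pick the least-loaded sensor (a crude proxy for the outcome
        # a real negotiation among sensor agents would produce).
        sid = max(remaining, key=remaining.get)
        if remaining[sid] > 0:
            schedule[sid].append(task_id)
            remaining[sid] -= 1
    return schedule

# Re-run whenever the task set changes, e.g. a new target appears:
plan = allocate([("uav1", 2), ("uav2", 1)],
                [("t1", 0.9), ("t2", 0.5), ("t3", 0.7)])
```

Because the allocator is cheap to re-run, every change in the mission environment can trigger a fresh schedule, which is the dynamic-adaptation behavior described above.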
ANAST provides an opportunity for Charles River to integrate research and development expertise gained through several related projects. The argumentation technology in ANAST is an offshoot of two other Charles River projects. In the DARPA-funded Rapid Expertise Aggregation Supporting Optimal Negotiation (REASON) effort, Charles River developed a collaborative decision-making platform based on its argumentation technology. In the Integrated Software Environment for Battlespace Digital Mapping and Target Range Acquisition (DMAP), it created a system able to make targeting decisions using input from a GIS system, uncertain knowledge about the state of the world, and expertise encoded in an argumentation network. The video tracking technology in ANAST stems from Adaptive Learning in Particle Systems (ALPS), which was funded by the Air Force Research Lab. ALPS is a video tracker based on a particle filter that samples the space of possible target maneuvers and automatically adapts several key algorithm parameters to maintain real-time performance.
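A particle filter of the kind the ALPS tracker is built on can be sketched in a few lines. The toy version below tracks a single 1-D position from noisy detections; the noise parameters and update loop are illustrative assumptions, and the real ALPS tracker operates on video and adapts its parameters automatically, which this sketch does not attempt.

```python
import math
import random

# Hypothetical 1-D bootstrap particle filter, for illustration only.

def step(particles, motion_noise, measurement, meas_noise):
    """One predict-weight-resample cycle over a list of positions."""
    # Predict: each particle hypothesizes a possible target maneuver.
    moved = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: particles close to the detection are more plausible.
    weights = [math.exp(-((p - measurement) ** 2) / (2 * meas_noise ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a fresh particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

random.seed(0)
particles = [random.uniform(-5.0, 5.0) for _ in range(500)]
for z in [0.5, 1.0, 1.6, 2.1]:       # simulated noisy detections
    particles = step(particles, 0.5, z, 0.8)
estimate = sum(particles) / len(particles)  # converges near the detections
```

Sampling maneuvers as particle motions is what lets this family of trackers follow abruptly turning targets; the parameters held fixed here (the two noise levels, the particle count) are among those a tracker like ALPS would tune automatically for real-time performance.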
For more information, contact Dan Gutchess at email@example.com