Monday, October 5, 2020

Charles River Analytics, developer of intelligent systems solutions, will showcase our developments in applied robotics and artificial intelligence (AI) from a virtual booth at AUVSI XPONENTIAL 2020, the world’s largest tradeshow for unmanned autonomous systems. Held October 5–8, the event draws professionals from more than 60 countries, representing industry, government, and academia.


This year’s theme for Charles River is “Autonomy You Can Trust,” highlighting how our scientists and engineers are bringing together experience from many different applications to build toward a unified framework and architecture for future autonomous systems.


Autonomy You Can Trust

Wondering how to make autonomy work in complex, unpredictable operational environments with intermittent, limited, or nonexistent bandwidth? How to move your autonomous operations from human-in-the-loop to human-on-the-loop control? How to manage skyrocketing technical complexity when moving from a single vehicle to a heterogeneous swarm that can act and reason in ways that defy intuition?

For autonomous systems, there are a multitude of interrelated problems and no grand unified theory that solves them all. So we’ve built our expertise from the ground up, developing proof-of-concept demonstrations, highly capable prototypes, and innovative products for a wide range of autonomous applications. Along the way, an engaged user community and distinguished sponsors have supported this work through countless government technology development programs. We integrate this diverse work into a unified autonomy framework and architecture, guided by a surprisingly simple idea:

True autonomy will be widely adopted only when it can earn your trust.

Expanding Autonomous Sensing Capabilities

To enable small autonomous or semi-autonomous drones to identify chemical, biological, radiological, and nuclear (CBRN) threats, MIDNIGHT applies advanced techniques from computer vision, machine learning, and autonomous navigation to the fields of radiation detection and perceptual sensing.

3-D LiDAR model

3-D LiDAR model with radiation source in bright blue (Pavlovsky et al. 2018)

Pioneering Resilience in Autonomy Software

Under DARPA’s BRASS program, PRINCESS incorporates new advances in machine learning and probabilistic modeling to help build software systems that can understand, learn, and adapt to change. We grounded our research in unmanned underwater vehicle (UUV) platforms, which must quickly acclimate to changing environments, system failures, and new missions.


A REMUS 600 autonomous underwater vehicle (U.S. Navy photo by John F. Williams/Released). PRINCESS applies machine learning and probabilistic programming to help UUV software adapt to ever-changing ecosystems.

Enabling “Smart” Autonomous Communications

To broaden the range of potential missions for UxVs, we are developing software tools that provide adaptive signaling behavior, such as choosing among alternative communications pathways to maximize the value and timeliness of information transmission based on mission relevance, operating situation, and bandwidth constraints.

Enhancing Autonomous Navigation

CAMINO is a system we developed for the US Navy to improve the accuracy of underwater positioning and navigation at reduced size, weight, power, and cost (SWaP-C), enabling effective navigation for smaller, cheaper, and even expendable underwater assets.


Sensing for Situational Awareness

AutoTRap and Awarion


The Awarion™ Autonomous Lookout System is an integrated software and hardware smart camera system that will deliver situational awareness at the sea surface, detecting and classifying ships, obstacles, and other objects, performing passive ranging, and tracking targets over time and across different sensors. A ruggedized processing unit will offer autonomous pan-tilt-zoom capabilities to detect everything from navigational aids to marine mammals, providing an important sensing solution for autonomous vehicles.
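One common geometry behind camera-based passive ranging at the sea surface uses the camera’s known height above the waterline and the measured depression angle to a contact. The sketch below illustrates that flat-sea geometry as an assumption; it is not a description of Awarion’s actual ranging method, and the function name is hypothetical.

```python
import math

# Illustrative passive-ranging geometry (flat-sea assumption): for a
# mast-mounted camera at known height, the range to a surface contact
# follows from the depression angle below horizontal to the contact's
# waterline. This is a textbook sketch, not Awarion's actual method.

def passive_range_m(camera_height_m: float, depression_deg: float) -> float:
    """Estimate range to a surface object from camera height above the
    waterline and the depression angle below horizontal to the object."""
    if depression_deg <= 0:
        raise ValueError("contact must appear below the horizon line")
    return camera_height_m / math.tan(math.radians(depression_deg))
```

Note the design implication: range accuracy degrades for distant contacts, where the depression angle approaches zero, which is one reason a lookout system also tracks targets over time and fuses multiple sensors.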

AutoTRap Onboard

We recently released AutoTRap Onboard™, our AI-based object detection software that analyzes side scan sonar data, generating contact reports in real time, instead of in post-mission analysis. AutoTRap Onboard readily integrates onto the market-leading Teledyne Gavia line of autonomous underwater vehicles (AUVs), expanding their already impressive survey capabilities.
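The in-mission reporting pattern described above, emitting contact reports as each sonar ping is processed rather than after recovery, can be sketched as a streaming loop. This is a minimal illustration with hypothetical names and a pluggable detector stub; it is not AutoTRap Onboard’s actual interface.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

# Illustrative sketch of real-time contact reporting: detections are
# emitted as each side-scan ping arrives, not in post-mission analysis.
# Names and signatures are assumptions; the real detector is a trained
# model, stubbed here as a callable.

@dataclass
class Detection:
    ping_id: int
    across_track_m: float
    confidence: float

def stream_contacts(
    pings: Iterable[list[float]],
    detect: Callable[[list[float]], list[tuple[float, float]]],
    min_confidence: float = 0.8,
) -> Iterator[Detection]:
    """Run a detector over side-scan pings as they arrive and yield
    contact reports immediately for detections above threshold."""
    for ping_id, samples in enumerate(pings):
        for position_m, confidence in detect(samples):
            if confidence >= min_confidence:
                yield Detection(ping_id, position_m, confidence)
```

Because the function is a generator, each contact report is available to the vehicle’s autonomy stack as soon as it is produced, which is what lets an AUV react mid-mission instead of waiting for post-mission analysis.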

AutoTRap sonar data

When objects are detected, AutoTRap Onboard provides contact alerts to your system

Teaming with Autonomous Swarms

Under DARPA’s OFFensive Swarm-Enabled Tactics (OFFSET) program, we developed EUROPA, which provides novel, tactically tailored, multimodal user interfaces to help operators control drone swarms.


With SATURN, we developed capabilities that give heterogeneous swarms of arbitrary size resilient behavior while achieving mission objectives.

Our MERLIN effort applies a meta-reinforcement learning approach to discover and learn novel swarm tactics.

In support of the US Army’s Combat Vehicle Robotics (CoVeR) program, MANTA enables robust, natural, heads-up and hands-free direction of autonomous behaviors for multiple unmanned systems in complex environments.


Making AI Understandable: The Ultimate Basis for Trust

Under DARPA’s Explainable Artificial Intelligence (XAI) effort, our team developed CAMEL, which combines probabilistic causal modeling techniques with an explanatory interface that enables users to naturally understand and interact with machine learning systems. CAMEL creates simple, understandable explanations of how complex deep learning models reach their conclusions, the key to helping human operators develop trust in autonomy.


Learn more about how we can make our adaptive autonomy work for you at AUVSI XPONENTIAL 2020. Schedule a meeting with us!