Explainable AI

Scientists and engineers at Charles River Analytics are generating new knowledge at the frontiers of this rapidly growing area, creating a future where humans and AIs collaborate—to drive cars, respond to disasters, and make medical diagnoses.

Why XAI?

Artificial intelligence (AI) systems are increasingly becoming part of high-stakes decision processes. In addition, many emerging applications of AI depend on collaboration between people and AI systems. These trends are creating a need for AI systems that people can trust. To be trustworthy, AI systems must be resilient, unbiased, accountable, and understandable. But most AI systems cannot explain how they reached their conclusions. Explainable AI (XAI) promotes trust by providing reasons for all system outputs—reasons that are accurate and human-understandable.

Adapted from Four Principles of XAI, a draft report from the National Institute of Standards and Technology

Podcast Features

From the Crows’ Nest Podcast:
How AI Is Changing Electronic Warfare
with Rob Hyland

In this episode of From the Crows’ Nest, Principal Scientist and Director of Transition Rob Hyland joined host Ken Miller to explore how AI is transforming electromagnetic spectrum operations (EMSO). Hyland explains how “hybrid AI” is helping systems learn across the spectrum, speed up upgrades, and adapt faster to evolving threats. He also discusses the challenges of integrating AI into defense systems—and why building specialized AI expertise is critical for the future of electronic warfare.

From the Crows’ Nest Podcast:
How to Trust AI on the Battlefield
with Jeff Druce, PhD

In this episode of From the Crows’ Nest, Senior Scientist Jeff Druce, PhD, joined host Ken Miller to explore one of defense’s toughest challenges: trusting AI in combat. Druce discusses how AI and machine learning are transforming warfighting, from situational awareness to mission success, and how his team is developing methods for AI systems to explain their reasoning to human operators. The conversation highlights why building transparency and operator trust is essential for the future of AI-driven defense.

Our Approach

CAMEL Project: Using a Real-Time Strategy Game as an Environment for XAI

Our XAI does more than provide correlation-based explanations of AI system outputs: it offers deep insight and causal understanding of the decision-making process. The value of this approach is backed by user studies demonstrating increased trust in the AI system and enhanced human-AI collaboration. Combining our cutting-edge research on XAI with our decades of experience applying AI to real problems for real users, we develop complete systems that work with the entire AI ecosystem—hardware, software, algorithms, individuals, teams, and the environmental context. We can also help you add XAI functionality to your system, supporting your compliance with laws and regulations that require automated systems to explain the logic behind their decisions.
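The idea of explaining a system output can be sketched in miniature. For a linear model, a score decomposes exactly into per-feature contributions, which is the simplest form of output attribution (deep models need approximation methods such as perturbation- or gradient-based attribution). This is an illustrative sketch only, not Charles River's method; the feature names and weights below are hypothetical.

```python
# Minimal sketch of output attribution for a toy linear scorer.
# Feature names and weights are hypothetical, chosen for illustration.

def score(weights, features):
    """Toy model: the score is a weighted sum of feature values."""
    return sum(weights[name] * value for name, value in features.items())

def explain(weights, features):
    """Attribute the score to each feature (exact for a linear model)."""
    return {name: weights[name] * value for name, value in features.items()}

weights = {"signal_strength": 0.6, "bearing_change": 0.3, "pulse_width": 0.1}
features = {"signal_strength": 0.9, "bearing_change": 0.2, "pulse_width": 0.5}

contributions = explain(weights, features)
# The contributions sum exactly to the model's score, so an operator can
# see which input most influenced the output.
top_feature = max(contributions, key=contributions.get)
```

A human-readable explanation such as "flagged mainly because of signal_strength" can then be generated from `top_feature`, which is the kind of operator-facing reasoning the text describes.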

Featured Projects

A Leading Laboratory

Charles River’s scientists and engineers have been conducting leading-edge research since our founding over 40 years ago. Our open, collegial lab collaborates with dozens of universities and research labs across the U.S. We are currently engaged in more than 200 R&D projects, focusing on some of the biggest challenges in AI. Find out more from these selected academic publications and presentations.

AI-Driven Course of Action Generation Using Neuro-symbolic Methods
Explainable Artificial Intelligence (XAI) for Increasing User Trust in Deep Reinforcement Learning Driven Autonomous Systems
Causal Learning and Explanation of Deep Neural Networks via Autoencoded Activations
Brittle AI, Causal Confusion, and Bad Mental Models: Challenges and Successes in the XAI Program

Our People

Jeff Druce
Senior Scientist

Stephanie Kane
Principal Scientist and Division Director

James Niehaus
Principal Scientist and Division Director


Michael Harradon
Senior Scientist


Our passion for science and engineering drives us to find impactful, actionable solutions.