Tuesday, June 18, 2019

Charles River Analytics Inc., developer of intelligent systems solutions, proudly announces that our team will present research results at the 14th International Naturalistic Decision Making (NDM) Conference. We will discuss Causal Models to Explain Learning (CAMEL), an approach we developed under DARPA’s Explainable Artificial Intelligence program, which aims to create a human-machine dialogue with AI systems.

NDM focuses on how people make tough decisions in real-world scenarios and takes place June 18–21 in San Francisco, CA. The conference also provides a forum for discussion and engagement between academics and practitioners in the areas of healthcare, military command and control, crisis response, information technology, and aviation.

Our CAMEL project uses probabilistic causal models to explain to analysts how AI is learning and whether its conclusions are trustworthy.
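To give a flavor of the idea (this is an illustrative sketch, not the CAMEL implementation): a probabilistic causal model lets an interface report how much each candidate cause shifts the system's belief once a conclusion is reached. All variable names and probabilities below are hypothetical.

```python
# Illustrative sketch of explanation via a tiny discrete causal model.
# Two hypothetical binary causes ("camouflage", "vehicle_present") feed a
# detector; an explanation surfaces the posterior belief in each cause
# given that the detector fired. Not the actual CAMEL model.
from itertools import product

# Prior probability that each cause is true (hand-specified for the sketch).
priors = {"camouflage": 0.3, "vehicle_present": 0.5}

# P(detector fires | camouflage, vehicle_present), hand-specified.
cpt = {
    (False, False): 0.05,
    (False, True):  0.90,
    (True,  False): 0.04,
    (True,  True):  0.55,
}

def weight(cam, veh):
    """Joint prior probability of one (camouflage, vehicle_present) setting."""
    p = priors["camouflage"] if cam else 1 - priors["camouflage"]
    p *= priors["vehicle_present"] if veh else 1 - priors["vehicle_present"]
    return p

def p_detection():
    """Marginal probability the detector fires, by full enumeration."""
    return sum(weight(cam, veh) * cpt[(cam, veh)]
               for cam, veh in product([False, True], repeat=2))

def posterior(cause):
    """P(cause is true | detector fired), via Bayes' rule -- the kind of
    quantity an explanation interface could show an analyst."""
    joint = sum(weight(cam, veh) * cpt[(cam, veh)]
                for cam, veh in product([False, True], repeat=2)
                if (cam if cause == "camouflage" else veh))
    return joint / p_detection()
```

Comparing each posterior against its prior tells the analyst which causal factor the conclusion actually rests on; here a detection would raise belief in `vehicle_present` while lowering belief in `camouflage`.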

“We developed probabilistic causal modeling techniques and an interpretive interface that allows users to better understand the AI reasoning that leads to its conclusions,” said Dr. James Tittle, Principal Scientist at Charles River Analytics and Co-Principal Investigator on the CAMEL effort. “At NDM 2019, our team will demonstrate enhanced user acceptance and trust of AI partners resulting from use of the CAMEL Explanation Interface.”

We expect our CAMEL approach to shape the way machine learning systems are deployed, operated, and used both inside and outside the Department of Defense.

Learn more about our Human-Computer Interaction capabilities.