Charles River Analytics Inc., developer of intelligent systems solutions, proudly announces that our team will present research results at the 14th International Naturalistic Decision Making (NDM) Conference. We will discuss Causal Models to Explain Learning (CAMEL), an approach we developed under DARPA’s Explainable Artificial Intelligence program, which aims to create a human-machine dialogue with AI systems.
NDM, which focuses on how people make tough decisions in real-world settings, takes place June 18–21 in San Francisco, CA. The conference also provides a forum for discussion and engagement between academics and practitioners in the areas of healthcare, military command and control, crisis response, information technology, and aviation.
Our CAMEL project uses probabilistic causal models to explain to analysts how an AI system is learning and whether its conclusions can be trusted.
“We developed probabilistic causal modeling techniques and an interpretive interface that allows users to better understand the AI reasoning that leads to its conclusions,” said Dr. James Tittle, Principal Scientist at Charles River Analytics and Co-Principal Investigator on the CAMEL effort. “At NDM 2019, our team will demonstrate enhanced user acceptance and trust of AI partners resulting from use of the CAMEL Explanation Interface.”
Learn more about our Human-Computer Interaction capabilities.
To learn more about Charles River or our current projects and capabilities, contact us.
Charles River Analytics conducts cutting-edge AI, robotics, and human-machine interface R&D to create custom solutions for your organization. Our track record speaks for itself: our solutions enrich the diverse markets of defense, intelligence, medical technology, training, transportation, space, and cyber security. Our customer-centric focus guides us toward problems that matter, while our passion for science and engineering drives us to find impactful, actionable solutions.
At Charles River Analytics, we turn research into results.