
PAL understands both verbal instructions and nonverbal instructive gestures, fostering truly collaborative human-robot interaction. Using large language models (LLMs), PAL reduces uncertainty to make accurate decisions, even in novel situations.

From gesture to action: How collaborative robots are becoming smarter and safer

Collaborative robots are gaining traction, but for them to become true working partners, they must first bridge gaps in trust. Building trust starts with recognizing that human communication involves more than just speech—it includes body language, hand gestures, and other nonverbal cues. To build a truly collaborative robotic application, Charles River Analytics has developed Person Aware Liaison (PAL), which is designed to integrate both verbal and nonverbal human inputs to guide robotic actions and foster trust. The work is funded by NASA under a Phase II Small Business Innovation Research (SBIR) contract.

PAL works by making inferences from modalities such as gesture, gaze, and body posture in addition to analyzing speech. These signals collectively form a query to a large language model (LLM), which returns an appropriate action for the robot to take. The actions PAL ultimately executes keep the human in mind and are both safe and non-intrusive, making the robot a smarter and more reliable partner.
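
To make that data flow concrete, here is a minimal Python sketch of how multimodal signals might be folded into a single LLM query. The `Observation` fields, the `build_llm_query` function, and the prompt wording are hypothetical illustrations, not PAL's actual interfaces.

```python
# Hypothetical sketch: fusing multimodal observations into one LLM query.
from dataclasses import dataclass

@dataclass
class Observation:
    """One time-stamped snapshot of the human partner's signals."""
    speech: str | None       # ASR transcript, if the user spoke
    gesture: str | None      # e.g., a pointing gesture and its target
    gaze_target: str | None  # object or region the user is looking at
    posture: str | None      # e.g., "leaning over the workbench"

def build_llm_query(obs: Observation) -> str:
    """Fold the nonverbal channels into the prompt alongside speech so the
    model can resolve references like "that" against what was pointed at."""
    context = [
        f"User said: {obs.speech or '(nothing)'}",
        f"User gesture: {obs.gesture or 'none'}",
        f"User gaze: {obs.gaze_target or 'unknown'}",
        f"User posture: {obs.posture or 'unknown'}",
    ]
    return (
        "Given the observations below, choose one safe, non-intrusive robot "
        "action and name its target object.\n" + "\n".join(context)
    )

print(build_llm_query(Observation(
    speech="Can you bring me that?",
    gesture="pointing at the torque wrench",
    gaze_target="torque wrench",
    posture="seated at the workstation",
)))
```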

Dr. Madison Clark-Turner, Robotics Scientist at Charles River Analytics and Principal Investigator on PAL, says that trust in autonomous systems comes from perception and expectation. A user will approach an autonomous system with a set of expectations of what that system can accomplish and then will test those expectations. If the system falls short, the user may lose trust in their ability to collaborate, leading to inefficiencies and a drop in morale.

A robot that can draw on contextual cues from a human and then respond in a courteous manner, however, can encourage trust and more efficient collaboration. PAL interprets both verbal and nonverbal inputs from its human partner to make contextual leaps regarding user intent. For example, if a human asks a robot, “Can you bring me that?”, the robot will need to interpret what the human is pointing at. Or if a human asks a robot to place a tool back in its original spot, the robot will need to look back through its history of actions to determine the tool’s original location. PAL also factors in user gaze and posture to infer engagement, deducing when to follow up and interact with the human and when to leave them alone to focus.
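
The second example, returning a tool to its original spot, can be illustrated with a short sketch that replays an action log. The log format and the `original_location` helper are invented for illustration; the article does not describe how PAL actually represents its action history.

```python
# Hypothetical sketch: answering "put it back where it came from" by
# replaying the robot's own action history. The log format is invented.
action_history = [
    {"action": "pick", "object": "torque wrench", "location": "pegboard slot 3"},
    {"action": "place", "object": "torque wrench", "location": "workbench"},
    {"action": "pick", "object": "multimeter", "location": "drawer 1"},
]

def original_location(obj: str, history: list[dict]) -> str | None:
    """Walk the log oldest-first; the first 'pick' of the object tells us
    where it originally lived."""
    for event in history:
        if event["action"] == "pick" and event["object"] == obj:
            return event["location"]
    return None  # the robot never handled this object

assert original_location("torque wrench", action_history) == "pegboard slot 3"
```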

PAL’s LLM is built on a publicly available language model. “Our foundation model is great for question asking but is unstructured in its output, so we’re adding our secret sauce to make it work the way we want to,” Clark-Turner says. PAL also draws on Charles River’s expertise in physiological sensing, UX, and assistive robotics.
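
One common way to impose that kind of structure on a free-form foundation model is to require schema-conformant JSON and validate it before acting. The sketch below assumes a hypothetical `query_llm` call and an invented schema; it illustrates the general pattern, not Charles River’s proprietary approach.

```python
# Hypothetical sketch: forcing structured output from a free-form model
# by demanding JSON and validating it before the robot acts on it.
import json

REQUIRED_KEYS = {"action", "target", "confidence"}

def query_llm(prompt: str) -> str:
    """Stand-in for a real foundation-model call."""
    return '{"action": "fetch", "target": "torque wrench", "confidence": 0.9}'

def structured_decision(prompt: str, retries: int = 2) -> dict:
    """Ask for JSON; reject anything that fails to parse or is missing
    fields, so downstream robot control never sees free-form text."""
    schema_hint = "\nRespond with JSON containing: action, target, confidence."
    for _ in range(retries + 1):
        raw = query_llm(prompt + schema_hint)
        try:
            decision = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed output: re-query the model
        if isinstance(decision, dict) and REQUIRED_KEYS <= decision.keys():
            return decision
    raise ValueError("model never produced a well-formed decision")

print(structured_decision("Can you bring me that? (user points at a wrench)"))
```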

PAL is especially useful in today’s robotics applications, which have moved away from traditional, physically enclosed or caged robots, Clark-Turner says. “There’s a large unstructured environment outside of caged robots that is not easily described and where humans are just better equipped to work.” On the other hand, we might not always have the strength or the endurance to complete heavy lifts or repetitive tasks. “There is absolutely a merging of human-robot worlds that can happen, and we are primed to make this as smooth a transition as possible with PAL,” Clark-Turner adds.

Phase I covered the gaze, gesture, and speech modalities; the next phase will add modalities such as body posture and informative gestures (for example, directions for moving objects).

The team is also developing general skills for PAL, such as manipulation tasks: picking up, rotating, and handing objects to the user. These skills are essential for assembly, maintenance, repair, and manufacturing across diverse environments on land, underwater, and in space. “We see a broad roadmap of applications down the line, everything from assistive medical care to industrial operations to help with repairs and maintenance,” Clark-Turner says. Another potential application: helping NASA execute missions in space.

Contact us to learn more about PAL and our other capabilities in robotics and autonomy.
