Development of Dynamic Visual Artifacts for Resilient Human-Automation Collaboration

Stark, R., Voshell, S., Mahoney, S., and Farry, M.

Presented at the 4th International Conference on Applied Human Factors and Ergonomics (AHFE) San Francisco, CA (July 2012)

The prevalence of recent “Big Data” technologies such as cloud computing, data warehousing, and enterprise analytics continues to expand our ability to collect, transmit, and transform data across a growing number of domains. However, rising levels of collection and storage continue to outstrip our ability to interpret the data in various work environments, from mission-critical stalwarts (e.g., the intelligence community, space mission control, power and energy management) to enterprise businesses and social media. Recent trends in information visualization and advanced automated analytics have the potential to help practitioners overcome these new data overload challenges. To exploit such technologies, however, they must be designed to support collaboration between human practitioners and automated systems. Designing these technologies as a collaborative joint system up front requires new mechanisms for supporting the directability and observability of these systems.

Our team is developing a variety of visual artifacts, inspired by mixed-initiative artificial intelligence and boundary objects, that support collaboration between teams of humans and automated systems. Mixed-initiative artificial intelligence aims to integrate the data processing power of computers with the qualitative insight of human users. Boundary objects are things (tangible, linguistic, or abstract) that teammates with different backgrounds share to improve their collaboration. One form of boundary object that can also help human-automation teams is a visual artifact, or visualization. However, when work context and circumstances change, these artifacts may become less useful. If paired with mixed-initiative artificial intelligence, both humans and automation can adapt the visual artifacts to support new contexts. We plan to evaluate these artifacts with representative users to assess shared understanding and trust in teams of humans and automated systems. The expected result of humans and machines that share understanding and trust is continued performance in the face of environmental changes, or resilience.
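The mixed-initiative adaptation described above can be sketched as a simple loop: the automation proposes a view configuration from data characteristics (its initiative), and a human teammate may accept or override it (directability). This is a hypothetical illustration, not the paper's implementation; all names (`ViewConfig`, `propose_view`, `adapt_view`) and the variability heuristic are assumptions made for the sketch.

```python
# Hypothetical sketch of a mixed-initiative loop for adapting a visual
# artifact. The automation proposes a view from simple data statistics;
# an explicit human override always wins. Names and heuristics are
# illustrative only, not taken from the paper.
from dataclasses import dataclass
from statistics import pstdev
from typing import Optional


@dataclass
class ViewConfig:
    chart: str        # e.g., "line" or "scatter"
    highlight: bool   # draw attention to anomalous points


def propose_view(values: list[float]) -> ViewConfig:
    """Automation's initiative: pick a view based on data variability."""
    mean_magnitude = abs(sum(values)) / len(values)
    volatile = pstdev(values) > 0.5 * (mean_magnitude + 1e-9)
    return ViewConfig(chart="scatter" if volatile else "line",
                      highlight=volatile)


def adapt_view(proposal: ViewConfig,
               human_override: Optional[ViewConfig]) -> ViewConfig:
    """Human's initiative: an explicit override takes precedence
    (directability); otherwise the automation's proposal stands."""
    return human_override if human_override is not None else proposal


# A steady data stream: the automation proposes a plain line chart,
# and with no human override, that proposal is the final artifact.
steady = [1.0, 1.1, 0.9, 1.0, 1.05]
proposal = propose_view(steady)
final = adapt_view(proposal, human_override=None)
print(final.chart)  # → line
```

When context changes (say, an analyst knows an upcoming event makes outliers important), the human can pass an override such as `ViewConfig(chart="scatter", highlight=True)`, and the artifact adapts without discarding the automation's ability to propose again later.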

For More Information

To learn more or request a copy of a paper (if available), contact S. Mahoney.

(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)