Bracken, B.1, Negri, A.1, Amazeen, P.3, Likens, A.3, Fedele, M.4, Demir, M.4, Palmon, N.1, deB Frederick, B.2, and Cooke, N.4
Presented at the Military Health System Research Symposium (MHSRS), Kissimmee, FL (August 2016)
Background: Military medical personnel are deployed to operational environments where their success in saving lives depends on their ability to act quickly and effectively, both as individuals and as teams. Medic training often includes high-fidelity simulations, but trainers currently infer competence from observation alone, which is a challenging task. We designed a system that automatically senses indicators of cognitive workload to augment performance observations, offering insight into the factors underlying that performance.
Method: We are now validating our system to augment training by Monitoring, Extracting, and Decoding Indicators of Cognitive Workload (MEDIC) using undergraduates who complete cognitive and physical challenges with well-validated effects on cognitive and physical workload. MEDIC includes: (1) a multimodal suite of unobtrusive, field-ready neurophysiological, physiological, and behavioral sensors, along with a user interface (UI) for trainers/experimenters to enter notes on team metrics and important events during the scenario; (2) a data processing and fusion engine to process and time-align raw data originating from the sensor suite; (3) probabilistic modeling techniques to interpret the best indicators of cognitive workload and team dynamics; and (4) an after-action review UI to display interpreted human states time-aligned with important events that occurred during the experiment or training scenario.
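To make the fusion step (component 2) concrete, the sketch below shows one way an engine of this kind might resample two sensor streams onto a shared clock before joining them. It is a minimal illustration under our own assumptions; the stream names, sample rates, and the pandas-based approach are ours, not the actual MEDIC implementation.

    # Minimal sketch of time-aligning two raw sensor streams onto a common
    # clock. Stream names, sample rates, and the pandas-based approach are
    # illustrative assumptions, not the MEDIC fusion engine itself.
    import numpy as np
    import pandas as pd

    def simulate_stream(rate_hz, seconds, label):
        # Stand-in for a raw sensor stream sampled at rate_hz.
        t = pd.date_range("2016-08-01", periods=rate_hz * seconds,
                          freq=pd.Timedelta(seconds=1 / rate_hz))
        return pd.Series(np.random.randn(len(t)), index=t, name=label)

    fnirs = simulate_stream(10, 60, "fnirs_hbo")  # e.g., a 10 Hz fNIRS channel
    gsr = simulate_stream(32, 60, "gsr")          # e.g., 32 Hz skin conductance

    # Resample both streams onto a shared 1 s grid and join on timestamps so
    # that downstream models see a single time-aligned feature table.
    aligned = pd.concat(
        [fnirs.resample("1s").mean(), gsr.resample("1s").mean()],
        axis=1,
    ).dropna()
    print(aligned.head())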
Results and Conclusions: Our forehead device, designed to be mounted inside a baseball cap or standard-issue helmet, includes functional near-infrared spectroscopy and accelerometry sensors to measure indicators of cognitive workload. Our body band device includes SpO2, galvanic skin response, accelerometer, magnetometer, gyroscope, and temperature sensors positioned inside an MP3-player running sleeve to measure indicators of physical activity. Finally, our trainer-annotation UI allows the trainer or experimenter to enter details of the simulation into the system before and during the exercise; through this UI, they can take notes and capture time-tagged photos and videos to document performance details as the scenario unfolds.

In our initial validation study with Arizona State University students, teams of three participants complete an obstacle course of physical and cognitive challenges including (1) memorizing word lists, (2) balancing moving balls on weighted boards, (3) completing puzzles, (4) passing weighted medicine balls while balancing on Bosu balls, (5) completing logic problems, (6) constructing walls from boxes of different sizes and weights, and (7) jumping rope in sync with other team members.

Once data are collected, our data processing and fusion engine de-noises, processes, and time-aligns data from all sensors. The data are then passed through our data interpretation engine, which uses statistical and probabilistic modeling techniques to output estimates of individual state (e.g., stress and cognitive workload) and team state (e.g., team dynamics). Our next validation study, at Vanderbilt Medical School, will collect data from medical residents during high-fidelity training simulations.
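As a toy illustration of that interpretation step, the sketch below fits a logistic model that maps time-aligned features to a per-window probability of high cognitive workload, the kind of estimate that could be plotted on an after-action review timeline. The features, labels, and choice of scikit-learn logistic regression are illustrative assumptions, not the models used in MEDIC.

    # Toy sketch of the interpretation step: map time-aligned features to a
    # probability of "high workload" per window. Features, labels, and the
    # logistic-regression choice are illustrative assumptions only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic feature table: rows are 1 s windows; columns stand in for
    # fused features (e.g., fNIRS oxygenation, skin conductance, motion).
    X = rng.normal(size=(300, 3))
    # Synthetic labels standing in for annotated high/low workload windows.
    y = (X @ np.array([1.0, 0.8, 0.3])
         + rng.normal(scale=0.5, size=300) > 0).astype(int)

    model = LogisticRegression().fit(X, y)

    # Per-window probability of high workload, suitable for display on a
    # time-aligned after-action review timeline next to annotated events.
    p_high = model.predict_proba(X)[:, 1]
    print(p_high[:5].round(2))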
1 Charles River Analytics, Cambridge, MA
2 McLean Hospital/Harvard Medical School, Belmont, MA
3 Arizona State University, Tempe, AZ
4 Arizona State University, Mesa, AZ
This work was supported by the U.S. Army Medical Research and Materiel Command under Contract No. W81XWH-14-C-0018. The views, opinions, and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy, or decision unless so designated by other documentation.