Enhancing Situational Awareness through Perceptual and Cognitive Acclimation to Augmented 360-Degree Visual Fields
Special Augment Event at Interservice/Industry Training, Simulation and Education Conference (I/ITSEC), Orlando, FL (to be presented in December 2019)
Charles River Analytics is developing a system to enhance situational awareness (SA) by extending the natural human field of view (FOV), roughly 120°, to a real-world, 360° FOV. Our system facilitates acclimation into and out of the augmented FOV by using gaze tracking to compress a dynamically adjusted 360° scene into a naturally viewable FOV, ensuring that distortion does not inhibit focal vision. Our prototype is currently deployed on the FOVE virtual reality (VR) head-mounted display (HMD), the Magic Leap One augmented reality (AR) HMD, and Android-based mobile VR. We have conducted preliminary human subjects research to validate that individuals can perceptually and cognitively acclimate to this type of augmented FOV, specifically using an anatomically informed distortion map aligned to the foveal and peripheral regions of the eye. Compared to an unaided control condition, this pilot study demonstrated a 40% reduction in reaction time to detect and localize a target stimulus appearing in a 360° static visual scene (with equivalent localization accuracy), a 14% increase in detection of stimulus presence when viewing the 360° scene in VR, and a 25% increase in detection when viewing on a 2D screen. We are also investigating specific strategies for facilitating acclimation for effective operation within the augmented FOV, as well as integration of visual and auditory cues and annotations to support acclimation and task performance. The system is being developed to aid and improve job performance, particularly for warfighters, who often operate in highly dynamic 360° environments at risk of threats emerging from all sides. It can improve performance across a variety of tasks (e.g., surveillance, navigation, and object/threat detection, identification, and localization) and across different classes of warfighters (e.g., manned and unmanned vehicle pilots and crew, dismounted warfighters, remote ISR staff), all of whom face distinct SA challenges (e.g., remote UxV operators often suffer the same occlusion issues as crew in confined-cabin vehicles, but lack the motion and auditory cues available to those individuals) that may be addressed by full-proximity SA to help ensure tactical mission success.
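To give a sense of how a gaze-contingent FOV compression might work, the sketch below implements a piecewise-linear warp that leaves a window around the gaze direction undistorted and linearly compresses the rest of the 360° scene into the display's FOV. All specifics here are illustrative assumptions (the 15° foveal half-width, the 120° display FOV, and the piecewise-linear form), not the system's actual anatomically informed distortion map:

```python
def compress_azimuth(theta_deg, gaze_deg,
                     fovea_half=15.0, world_half=180.0, display_half=60.0):
    """Map a world azimuth (degrees) to a display azimuth relative to gaze.

    Illustrative parameters (assumptions): a 15-degree undistorted foveal
    half-width and a 120-degree (±60°) display FOV covering a 360° scene.
    """
    # Angular offset from the gaze direction, wrapped to [-180, 180)
    d = (theta_deg - gaze_deg + 180.0) % 360.0 - 180.0
    if abs(d) <= fovea_half:
        return d  # foveal region: 1:1 mapping, no distortion of focal vision
    # Periphery: compress the remaining world angle into the remaining
    # display angle at a constant rate
    scale = (display_half - fovea_half) / (world_half - fovea_half)
    sign = 1.0 if d > 0 else -1.0
    return sign * (fovea_half + (abs(d) - fovea_half) * scale)
```

Under these assumed parameters, anything within 15° of the gaze point renders 1:1, while content directly behind the viewer lands near the ±60° edge of the display, so the full 360° scene stays visible without distorting focal vision.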
For More Information
To learn more or request a copy of a paper (if available), contact Caroline Kingsley.
(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)