Supporting a Rapidly Advancing Industry: Ensuring XR Utility and Usability Through Software Development, Human Factors Research, and Novel Applications

Kingsley, C., Duggan, D., and Jenkins, M.

73rd Department of Defense Human Factors Engineering Technical Advisory Group Meeting (DoD HFE TAG), Aberdeen, MD (April 2019)

As the extended reality (XR) industry, collectively spanning augmented reality (AR), mixed reality (MR), and virtual reality (VR), rapidly progresses, essentially reinventing the very definitions of these terms with the introduction of each new head-mounted display (HMD), there is a need for engineering (e.g., software development), research (e.g., fundamental human factors), and applications (e.g., networked training simulations) of these technologies to keep pace and lay the groundwork for future innovations.

In pursuit of charting the XR road ahead, Charles River Analytics has explored these areas of interest under a variety of DoD-funded XR efforts. As new XR hardware devices (e.g., peripherals, HMDs, tracking systems) enter the market, they are often not interoperable out of the box, leading to fractured systems and forcing developers to redo time-consuming software work as they design and build interaction systems for each new device. To address this issue, we developed an XR SDK that supports device interoperability, enabling plug-and-play use of a wide variety of XR devices and increasing hardware flexibility. On the research side, and specifically related to this variety of devices, we are currently conducting a human subjects research study on the effect of interaction fidelity on training of fine- and gross-motor control tasks in virtual environments, early results of which we plan to discuss.
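The device-interoperability approach described above can be sketched as a thin adapter layer: application code talks to a common interface, while each vendor-specific device is wrapped in an adapter that registers itself at runtime. This is a minimal illustrative sketch in TypeScript; all names (`XRDeviceAdapter`, `DeviceRegistry`, and the mock HMD) are hypothetical and do not reflect the actual SDK's API.

```typescript
// Hypothetical plug-and-play device layer: application code depends only on
// this interface, never on a specific vendor SDK. Names are illustrative.
interface XRDeviceAdapter {
  readonly id: string;
  readonly kind: "hmd" | "controller" | "tracker";
  poll(): { position: [number, number, number] }; // simplified pose query
}

class DeviceRegistry {
  private adapters = new Map<string, XRDeviceAdapter>();

  // Plug-and-play: adapters register themselves at runtime under a stable id,
  // so new hardware can be added without changing application code.
  register(adapter: XRDeviceAdapter): void {
    this.adapters.set(adapter.id, adapter);
  }

  // Query devices by role rather than by vendor.
  byKind(kind: XRDeviceAdapter["kind"]): XRDeviceAdapter[] {
    return [...this.adapters.values()].filter((a) => a.kind === kind);
  }
}

// Example: a mock HMD adapter standing in for a vendor-specific driver.
const mockHmd: XRDeviceAdapter = {
  id: "mock-hmd-0",
  kind: "hmd",
  poll: () => ({ position: [0, 1.7, 0] }),
};

const registry = new DeviceRegistry();
registry.register(mockHmd);
console.log(registry.byKind("hmd").length); // → 1
```

The design choice here is that interaction logic written once against `XRDeviceAdapter` works with any registered device, which is what avoids redoing development work for each new HMD or peripheral.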

Finally, in an effort to contribute to advances within the field, we have explored novel applications and uses for these XR technologies. Notably, we are exploring how users wearing AR/MR HMDs and users wearing VR HMDs can effectively interact within the same networked XR environment, creating an interface between varying synthetic realities.

For More Information

To learn more or request a copy of a paper (if available), contact Caroline Kingsley.

(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)