News and Events

CRA staff and scientists around a table
Dan Duggan (third from left) and Caroline Kingsley (far right) along with other Charles River staff play with a collection of virtual reality gear at the Charles River office

Cracking the Code of Virtual Reality at Charles River Analytics: Advancing the VR Landscape with New Software Tools

Software engineer Dan Duggan was sitting in his office in 2017, looking through an augmented reality headset at furniture rendered in 3D and displayed around the room. It was one of the first times he had used this kind of headset, and its power hadn’t really clicked for him, until he took it off and tried to set it down on a piece of furniture that was no longer there.

Duggan is just one of roughly a dozen engineers and scientists at Charles River Analytics who have become experts in the rapidly evolving science and engineering of augmented reality (AR) and virtual reality (VR) while winning a steady stream of SBIR contracts from military and government agencies.

Staff have partnered with commercial AR and VR hardware manufacturers, contributed to open source tools, and most recently, received a direct grant from Epic Games, the maker of one of the most popular VR game development engines. Although Charles River has historically focused on applying AR and VR to a wide range of disparate projects, a unifying ambition has emerged: to push the boundaries of the field as a whole, a field whose potential has long seemed to overshadow its progress.

Senior software engineer Arthur Wollocko joined Charles River in 2011 after earning his BS in computer science from Skidmore College. He now leads many of the company’s VR development efforts, advising and working with other developers like Duggan. In January 2017, Wollocko began serving as an adjunct professor at Champlain College in Vermont, where he teaches a curriculum on AR and VR, including a history of the field, which is often referred to collectively as XR (extended reality).

“VR has almost come into the mainstream about three times. It dates back to 1954 for its first digital realization,” Wollocko said. But for various reasons, VR has always fizzled out. He thinks that is about to change in the next couple of years, with Charles River just one of many companies working together to create a full-fledged ecosystem. “This time around it won’t fail—it will stick around and there will be a breakthrough app.”

Creating New Worlds

To reach the point where they could address the challenges of the broader XR community, Charles River staff first had to become deeply experienced in the details of the field. XR is often superficially associated with games and entertainment, but it is much broader in scope.

AR and VR are both built on virtual environments containing virtual content, which users interact with through ordinary senses such as sight, hearing, and touch. A simple example of a virtual environment is a traditional desktop computer display, which users interact with by looking at the screen and touching the mouse. More complex versions layer virtual elements on top of reality through a headset, as in AR, or fully immerse the user in a virtual world, as in VR.

VR/AR Venn Diagram
VR and AR employ immersive virtual environments to give a user new experiences

“One of the big values of AR and VR is the concept of immersion. If you look it up, it’s not really well defined,” Duggan said. He’s developed an understanding of the critical role of immersion while working on individual projects.

One of those projects is BARRACUDA, which provides motorcycle riders with visual or auditory alerts before they encounter hazards like potholes or traffic accidents. The visual alerts can be delivered through helmet-mounted heads-up displays (HUDs) or smart glasses like the Everysight Raptor, which is designed for cyclists.

The alerts provided by the Charles River system show up as images in a rider’s visual field; they augment the rider’s reality while being intentionally designed to minimize distraction, as borne out by user surveys. This augmentation is starkly distinct from VR, which replaces the real world with an artificial experience, creating a potentially magical but also peculiar “psychophysics” effect.

“People’s expectations about how software works is going to change [as VR is more widely adopted],” Duggan said. From his observations, people who experience cutting-edge VR “seem to think of it less as a game or simulation, and almost more as a separate reality that they’re stepping into.”

One of the many VR projects underway at Charles River is VECTOR, a mission command planning tool that lets a large group of commanders participate in a virtual command room, interacting with a strategic map in 3D. VECTOR leverages advances in VR hardware and software to distribute a command post—typically a single room in a real-world location—over a much broader geographic area, where many different staff can participate.

At Charles River, XR tends to be used for three main applications: collaboration tools (e.g., VECTOR), task support tools (e.g., BARRACUDA), and training simulations. Some applications, such as VECTOR, use both VR and AR: mission planners must study battlefields from home base, where the full immersion of VR can be helpful, but also while out on deployment, where AR makes the most sense.

Scientists at Charles River have found that AR and VR are rarely, if ever, used interchangeably; each is typically best suited to different tasks.

“VR is good for environments that are really hard to recreate… AR is really good if you’re doing a job in the real world and you need to augment your info,” said scientist Caroline Kingsley, who leads the BARRACUDA project. “If you’re trying to train combat medics, and there are all these casualties everywhere, then [without VR] you need to act all those out and it’s really expensive.”

The Cost of Replacing Reality

Part of what makes XR challenging to develop is its fundamental dependency on immersion, which typically requires hardware solutions—headsets or smart glasses to provide natural visual engagement; hand trackers so that users can touch and grab things; and so on. Ten years ago, much of this hardware was lacking, and not coincidentally, XR activities at Charles River were minimal.

In the last several years, however, hardware that enables XR has become not only increasingly available but also powerful, useful, and affordable. Although these improvements have allowed XR applications at Charles River to become commonplace, the immaturity of the hardware and the hardware/software integration ecosystem has posed a formidable challenge.

In certain cases, new hardware has emerged only to immediately disappear off the map. For example, in the second phase of the BARRACUDA project, which started in 2016, Charles River teamed with NUVIZ, the maker of a helmet-mounted HUD, to create a system that displays alerts to a rider. The design of this interface required a substantial amount of software engineering specific to the NUVIZ hardware.

Image from BARRACUDA project

But by the end of 2019, when the team was looking at turning its mature research into a product, NUVIZ, the company whose hardware the software depended on, had inexplicably gone defunct. “Does anyone know, for sure, what happened to NUVIZ?” asked one rider on the popular ADVrider motorcyclists’ internet forum.

The problems created by this unstable hardware landscape motivated Charles River staff to generalize their efforts from solving project-specific problems to tackling broader XR development obstacles. They created the Virtuoso Software Development Kit (VSDK), which enables XR developers to build applications that interoperate with a wide range of devices.
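The core idea behind that kind of interoperability is a hardware-abstraction layer: application logic is written against generic interfaces for headsets, controllers, and hand trackers, and each supported device plugs in behind them. The sketch below is only a rough, engine-agnostic illustration of that pattern in C++; the class and function names are hypothetical and are not VSDK’s actual API, which is documented in its open-source repository.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Hypothetical 3D pose: position plus orientation (quaternion).
struct Pose {
    float position[3];
    float orientation[4];
};

// Abstract interface that application code programs against, so interaction
// logic never references a specific headset or controller model.
class TrackedController {
public:
    virtual ~TrackedController() = default;
    virtual Pose GetPose() const = 0;           // where the hand/controller is
    virtual bool IsGripPressed() const = 0;     // generic "grab" input
    virtual std::string DeviceName() const = 0;
};

// One concrete backend per supported device family; adding new hardware
// means adding a backend, not rewriting the application logic.
class ExampleVendorController : public TrackedController {
public:
    Pose GetPose() const override { return Pose{{0.0f, 1.2f, -0.3f}, {0, 0, 0, 1}}; }
    bool IsGripPressed() const override { return true; }  // hard-coded for the demo
    std::string DeviceName() const override { return "ExampleVendor Controller"; }
};

// Application code only ever sees the abstract interface.
void UpdateInteraction(const TrackedController& controller) {
    Pose pose = controller.GetPose();
    if (controller.IsGripPressed()) {
        std::cout << "Grab started on " << controller.DeviceName()
                  << " at height " << pose.position[1] << " m\n";
    }
}

int main() {
    std::vector<std::unique_ptr<TrackedController>> controllers;
    controllers.push_back(std::make_unique<ExampleVendorController>());
    for (const auto& c : controllers) UpdateInteraction(*c);
}
```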

VSDK integrates with the Unity Engine, a development suite widely used to build VR software applications. Wollocko had started using Unity for projects several years before VSDK was created because it provided much higher-fidelity 3D rendering than traditional interface toolkits, something often needed for graphics portrayed in XR.

“I was working on a big BAA [broad agency announcement], and creating Java Swing interfaces, these old-school, clunky interfaces for training people,” Wollocko said. “Then I’d go home and play Call of Duty, and I’d be like: why the heck am I not doing this at work?”

Use of the Unity Engine and VSDK has allowed Wollocko and other Charles River engineers like Nicolas Herrera to create applications with graphics comparable to modern games. In addition, VSDK supports the implementation of common VR behaviors, like picking up an object and throwing it, providing great utility for developers.
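To give a concrete flavor of what such a built-in behavior involves, consider throwing: when the user releases a grabbed object, the system has to hand it a believable launch velocity, typically estimated from the hand’s recent motion. The C++ sketch below is a minimal, engine-agnostic illustration of that velocity-averaging idea; it is hypothetical and is not taken from VSDK.

```cpp
#include <array>
#include <cstddef>
#include <iostream>

// Minimal 3D vector for positions and velocities.
struct Vec3 { float x = 0, y = 0, z = 0; };

Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(const Vec3& a, float s)       { return {a.x * s, a.y * s, a.z * s}; }

// Averages the hand's per-frame velocity over a short window so that,
// on release, the grabbed object can be launched with a believable throw.
class ThrowVelocityEstimator {
public:
    // Call once per frame while the object is held.
    void AddSample(const Vec3& handPosition, float deltaTime) {
        if (hasPrevious_ && deltaTime > 0.0f) {
            Vec3 frameVelocity = (handPosition - previousPosition_) * (1.0f / deltaTime);
            velocities_[nextSlot_] = frameVelocity;
            nextSlot_ = (nextSlot_ + 1) % velocities_.size();
            if (sampleCount_ < velocities_.size()) ++sampleCount_;
        }
        previousPosition_ = handPosition;
        hasPrevious_ = true;
    }

    // Called when the grip is released: average of the buffered samples.
    Vec3 ReleaseVelocity() const {
        Vec3 sum;
        for (std::size_t i = 0; i < sampleCount_; ++i) sum = sum + velocities_[i];
        return sampleCount_ ? sum * (1.0f / static_cast<float>(sampleCount_)) : sum;
    }

private:
    std::array<Vec3, 8> velocities_{};
    Vec3 previousPosition_{};
    std::size_t nextSlot_ = 0;
    std::size_t sampleCount_ = 0;
    bool hasPrevious_ = false;
};

int main() {
    ThrowVelocityEstimator estimator;
    // Simulate a hand moving forward at roughly 1.5 m/s over a few 90 Hz frames.
    for (int frame = 0; frame < 10; ++frame) {
        Vec3 hand{0.0f, 1.2f, -1.5f * frame * (1.0f / 90.0f)};
        estimator.AddSample(hand, 1.0f / 90.0f);
    }
    Vec3 v = estimator.ReleaseVelocity();
    std::cout << "Throw velocity: " << v.x << ", " << v.y << ", " << v.z << " m/s\n";
}
```

Averaging over a short window, rather than trusting a single frame, smooths out tracking jitter that would otherwise make throws feel erratic.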

Now, Wollocko, Duggan, and Herrera are leading an effort funded by Epic Games to extend VSDK to the Unreal Engine, the other leading game development suite.

After many years of poking and prodding at the limits of XR on individual applications, Charles River staff are now advancing the landscape as a whole. And in the next few years, just as Duggan did with his 3D furniture, we may all start forgetting the difference between fantasy and reality—in a good way.

Related Articles


Charles River Analytics awarded Epic MegaGrant to bring virtual and augmented reality development platform to Unreal Engine

Charles River Analytics Launches an Open-Source SDK that Solves Key XR Challenges (Business Wire)

Open-Source VIRTUOSO SDK for Unreal Engine – a Standard Framework for XR Development
