
A mixed-reality application suite for space situational awareness

Space Operation Visualizations Leveraging Augmented Reality (SOLAR)
As part of the Defense Advanced Research Projects Agency’s Hallmark program, SOLAR helps operators quickly develop situational awareness to maintain critical space assets.
Operators must accurately track assets in space—a task made ever more difficult by the growing number of unpredictable actors in orbit. Although operators need more advanced software tools to anticipate and respond to actions, they often have to rely on outdated processes—and incomplete information—to execute US military operations.

SOLAR supports the understanding of 3D and 4D space concepts, including satellite structure, fuel consumption, orbital elements, conjunction risks, maneuver prediction and control, and other complex interactions.
Immersive experiences
Augmented (AR), virtual (VR), and mixed-reality (MR) solutions, grounded in deep human factors expertise, enable custom experiences contextually tailored to individual needs.
Spatiotemporal exploration
3D visualizations, filtering, and annotation tools enhance spatiotemporal understanding for satellite visibility on Earth, proximity-based conjunction assessments, and potential maneuver options.
User-driven context
Synchronized XR overlays display contextual metadata based on user-level information access control to provide context for what is observed and support intuitive understanding of dynamic satellite constellations.
Multidevice integration
Device and web networking allow collaboration, data streaming, watchlist configuration, and rapid content sharing between 2D and 3D mediums.
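The proximity-based conjunction assessment described above can be illustrated with a minimal sketch (a hypothetical example, not SOLAR’s actual implementation): given sampled position histories for two satellites, find the time of closest approach and flag it when separation drops below a screening threshold.

```python
import math

def closest_approach(track_a, track_b, threshold_km=10.0):
    """Screen two sampled position tracks (lists of (t, x, y, z) in km)
    for a conjunction: return the time and distance at minimum
    separation, plus whether it falls under the screening threshold."""
    best_t, best_d = None, float("inf")
    for (ta, *pa), (tb, *pb) in zip(track_a, track_b):
        assert ta == tb, "tracks must share sample times"
        d = math.dist(pa, pb)  # Euclidean separation in km
        if d < best_d:
            best_t, best_d = ta, d
    return best_t, best_d, best_d < threshold_km

# Two hypothetical tracks sampled at t = 0, 60, 120 seconds:
a = [(0, 7000.0, 0.0, 0.0), (60, 7005.0, 2.0, 0.0), (120, 7010.0, 4.0, 0.0)]
b = [(0, 7020.0, 0.0, 0.0), (60, 7008.0, 1.0, 0.0), (120, 6996.0, 2.0, 0.0)]
t, d, flagged = closest_approach(a, b)  # closest pass at t = 60 s
```

A production screening tool would propagate orbits from element sets and account for positional uncertainty rather than comparing fixed samples, but the core question—minimum separation against a threshold—is the same.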
“Operators can make faster and more confident decisions with SOLAR because it fuses proven human–computer interaction technologies with next-gen augmented reality displays. SOLAR also offers intuitive visualizations and a workflow-centric graphical user interface. With SOLAR, operators can quickly make sense of outputs from novel AI, machine learning, and other advanced analytic tools developed by both Charles River Analytics and third-party companies.”

Daniel Stouch
Principal Scientist and Principal Investigator on the SOLAR effort
SOLAR features

Core space education content, tools, and interaction systems provide the foundation for physics-based models and natural, intuitive gesture-based interaction.

Underlying 3D models support use on many AR and VR headsets, such as the Magic Leap, Microsoft HoloLens, Oculus Quest, and HTC Vive.

Cross-device, synchronous networking support for education-based use enables distributed, remote learning with real-time interaction and feedback.
We developed an analytic tool, PICASSA, under a companion effort (also part of the Hallmark program). PICASSA improves threat detection in space through probabilistic modeling and simulation. SOLAR allows space operators—who have minimal training in the reference models and reasoning behind PICASSA—to easily understand and incorporate these outputs into their space situational awareness and course-of-action development workflow to arrive at decisions faster and with greater confidence.
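As a rough illustration of the kind of probabilistic reasoning behind such a tool (a hypothetical sketch, not PICASSA’s actual model), a Bayesian update over a “threat” hypothesis given observed maneuver evidence might look like:

```python
def bayes_update(prior, likelihood_given_threat, likelihood_given_benign):
    """Posterior P(threat | evidence) via Bayes' rule over two hypotheses."""
    joint_threat = prior * likelihood_given_threat
    joint_benign = (1 - prior) * likelihood_given_benign
    return joint_threat / (joint_threat + joint_benign)

# Hypothetical numbers: a 5% prior that a satellite's behavior is
# threatening, and an unannounced maneuver that is 10x more likely
# under the threat hypothesis than under the benign one.
posterior = bayes_update(prior=0.05,
                         likelihood_given_threat=0.8,
                         likelihood_given_benign=0.08)
```

The value of a visualization layer like SOLAR is that an operator sees the resulting posterior in context—on the satellite, in orbit—without needing to understand the underlying model.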
Contact us to learn more about SOLAR and our other Adaptive Intelligent Training capabilities.
This material is based upon work supported by the United States Air Force and DARPA under Contract No. FA8750-17-C-0170. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Air Force and DARPA.