Charles River Analytics and the Image and Video Computing Group at Boston University completed development of the Taxiing Via Intelligent Gesture Recognition (TIGER) for UAVs project for the Naval Air Systems Command (NAVAIR). TIGER is a vision system designed to address the US Navy’s need to command Unmanned Aerial Vehicles (UAVs) onboard aircraft carriers.
With the growing use of unmanned vehicles throughout the military, new technologies are needed to address the challenges of deploying UAVs in new applications. One such challenge: before Unmanned Combat Air Vehicles (UCAVs) can operate from aircraft carriers, the US Navy must solve the problem of taxiing them on deck. The Navy wanted to command UAVs with the same gestures used to command manned aircraft. The TIGER vision system meets this need by using real-time video processing, image segmentation, and fusion of motion and shape cues to track body and hand movements, in conjunction with Boston University’s Specialized Mappings Architecture (SMA), which interprets the tracked movements as individual gestures.
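The idea of fusing motion and shape cues, then mapping tracked pose features to a gesture, can be illustrated with a minimal sketch. The weights, feature vectors, and gesture lexicon below are purely hypothetical stand-ins: the fusion is a simple weighted sum, and a nearest-neighbor lookup stands in for the far more sophisticated Specialized Mappings Architecture.

```python
from math import dist

def fuse_scores(motion, shape, w_motion=0.6, w_shape=0.4):
    """Weighted fusion of per-region motion and shape confidences
    (illustrative weights, not TIGER's actual fusion rule)."""
    return w_motion * motion + w_shape * shape

def best_candidate(candidates):
    """Pick the candidate image region with the highest fused cue score."""
    return max(candidates, key=lambda c: fuse_scores(c["motion"], c["shape"]))

# Toy gesture lexicon: 2-D pose feature -> command. A nearest-neighbor
# lookup stands in here for BU's Specialized Mappings Architecture.
LEXICON = {
    (1.0, 1.0): "STOP",
    (0.0, 1.0): "TURN_LEFT",
    (1.0, 0.0): "TURN_RIGHT",
    (0.0, 0.0): "PROCEED",
}

def classify_gesture(pose):
    """Map a tracked pose feature vector to the nearest known gesture."""
    proto = min(LEXICON, key=lambda p: dist(p, pose))
    return LEXICON[proto]
```

A noisy pose reading such as `classify_gesture((0.9, 0.95))` still resolves to `"STOP"`, which is the point of mapping tracked features onto a fixed gesture vocabulary rather than matching them exactly.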
Using a dedicated camera configuration, the TIGER system searches for the UAV’s director on the deck of an aircraft carrier, recognizes the director’s body pose and hand position as a gestural command, and determines the appropriate response. With this technology, a UAV can locate and track a director, recognize the director’s gesture, and then execute the command.
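The locate → track → recognize → execute loop described above can be sketched as a small state machine. The states and transitions here are an assumption made for illustration, not the actual TIGER control logic.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()   # scanning the deck for the director
    TRACK = auto()    # director found; following body/hand movements
    EXECUTE = auto()  # a gesture was recognized; carry out the command

def step(state, director_visible, gesture=None):
    """Advance the control loop by one video frame (illustrative logic)."""
    if state is State.SEARCH:
        return State.TRACK if director_visible else State.SEARCH
    if state is State.TRACK:
        if not director_visible:
            return State.SEARCH       # lost the director; re-acquire
        return State.EXECUTE if gesture else State.TRACK
    # EXECUTE: after carrying out the command, resume tracking
    return State.TRACK
```

For example, a frame sequence in which the director appears, signals `"TURN_LEFT"`, and then stands idle walks the loop through SEARCH, TRACK, EXECUTE, and back to TRACK.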
Vitaly Ablavsky, Senior Research Engineer at Charles River Analytics, was the principal investigator on this project. Mr. Ablavsky’s expertise is in computer vision, embedded real-time systems, simulation, and software engineering. His responsibilities for this project included the high-level system design and the motion-analysis algorithms.