Robots Can Save the World, Too: Current and Future Applications of Autonomous Technologies for Wildlife Conservation
Presented at the Association for Unmanned Vehicle Systems International (AUVSI) 2020 XPONENTIAL Conference, Virtual (October 2020).
As wildlife become increasingly threatened by a warming climate, habitat loss and degradation, invasive species, disease, pollution, poaching, and other human activities, autonomous technologies are uniquely positioned to aid conservation efforts and combat these threats with the efficiency and scale needed to drive positive, sustainable outcomes. As these technologies have rapidly matured in recent years, particularly unmanned air and ground vehicles (UxVs), computer vision and image processing, sensor processing and fusion, and advanced machine learning techniques, an emerging body of work is applying them to protect wildlife in novel and exciting ways.
Among those working to augment conservation with innovative technology, Charles River Analytics has a growing collection of USAF-, USN-, DARPA-, and NOAA-funded projects harnessing Charles River's expertise in autonomous systems and enabling technologies to detect, monitor, and protect wildlife across land, sea, and air. During this session, we presented an overview of our past, current, and future projects in this space, discussing the goals, challenges, outcomes, and future applications of each effort. The efforts we discussed include, but are not limited to: BERNARD, a project building realistic endangered-species decoys (e.g., desert tortoises) with on-board sensor processing to detect, classify, and record predator interactions and deploy highly targeted, non-lethal, non-toxic aversion stimuli, eventually conditioning unwanted predators (e.g., human-subsidized ravens and coyotes) to avoid preying on protected species; SAFE Species, a project developing intelligent software that augments existing UxVs as semi-autonomous field survey devices capable of autonomous navigation and mapping, automatic visual identification of endangered species and related field indicators (e.g., burrows, scat), and extended mission durations; NEMO, a project fusing surface (video) and subsurface (acoustic) sensor data to detect, classify, and localize marine mammals in cold/dark (IR) and warm/bright (EO) conditions and protect them from ship strikes and sonar exposure; JONAH, a related project developing a human-in-the-loop system for reliable, around-the-clock shipboard whale protection through onboard, automatic, and precise detection, identification, and localization of marine mammal activity, with immediate alerting that increases the crew's ability to react and reduces potential whale strikes; and SOUSAPHONE, a project developing an aircraft- and autopilot-agnostic software framework that enables multiple ship-launched unmanned aircraft systems (UAS) to safely coordinate semi-autonomous beyond-line-of-sight (BLOS) wildlife survey operations (e.g., for pinnipeds) using advanced artificial intelligence and machine learning (AI/ML)-based computer vision and relative navigation techniques in harsh and challenging environments such as the Alaskan Arctic.
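To illustrate the kind of multi-modality fusion the NEMO and JONAH descriptions refer to, here is a minimal sketch of confidence-weighted late fusion of a surface (video) and a subsurface (acoustic) detection. All names, thresholds, and the fusion rule are hypothetical simplifications for illustration, not the actual project implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A single-modality detection: bearing (degrees) and confidence in [0, 1]."""
    bearing_deg: float
    confidence: float

def fuse_detections(video: Detection, acoustic: Detection,
                    max_bearing_gap_deg: float = 10.0,
                    alert_threshold: float = 0.8) -> dict:
    """Late fusion of one video and one acoustic detection (illustrative only).

    If the two bearings agree within max_bearing_gap_deg, combine them;
    otherwise fall back to the more confident single-modality detection.
    """
    if abs(video.bearing_deg - acoustic.bearing_deg) <= max_bearing_gap_deg:
        total = video.confidence + acoustic.confidence
        # Confidence-weighted average of the two bearing estimates.
        fused_bearing = (video.bearing_deg * video.confidence +
                         acoustic.bearing_deg * acoustic.confidence) / total
        # Cross-modality agreement raises confidence (noisy-OR combination).
        fused_conf = 1.0 - (1.0 - video.confidence) * (1.0 - acoustic.confidence)
    else:
        best = max(video, acoustic, key=lambda d: d.confidence)
        fused_bearing, fused_conf = best.bearing_deg, best.confidence
    return {"bearing_deg": fused_bearing,
            "confidence": fused_conf,
            "alert": fused_conf >= alert_threshold}

# A weak video cue and a moderate acoustic cue on similar bearings
# reinforce each other enough to trigger a crew alert.
result = fuse_detections(Detection(42.0, 0.6), Detection(45.0, 0.7))
```

A real system would of course operate on streams of detections with track association and time alignment; this sketch shows only the core idea that agreeing modalities yield a higher-confidence, more precise combined estimate.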
We concluded the session with a general discussion of lessons learned across these projects, technologies, and domains, and how they can be applied to future conservation technology efforts. We also discussed related future research directions, remaining obstacles to technology adoption and implementation, commercial benefits, and the unique ecological and ethical considerations that this work raises.
For More Information
To learn more or request a copy of a paper (if available), contact Caroline Kingsley.
(Please include your name, address, organization, and the paper reference. Requests without this information will not be honored.)