
AFIT Researches Deep Learning Techniques Using Multi-modal Sensors to Support DoD’s Counter-sUAS Strategy

Posted Friday, February 23, 2024

 

Since 2021, AFIT has focused on building models that detect drones and estimate their approximate range using pre-recorded audio.
2023 marked an inflection point: phase two began, adding video of drones to the arsenal of detection techniques. (Contributed photo)

 

By Capt. Anthony Brunson, USAF
AFIT Graduate Student
Department of Electrical and Computer Engineering
Air Force Institute of Technology

The Air Force Institute of Technology has partnered with the Air Force Research Laboratory (AFRL) to research and develop new techniques to detect, classify, track, and counter Small Unmanned Aerial Systems (sUAS). With the rise of sUAS use for military purposes, including reconnaissance and attack, the Department of Defense has published the Counter-sUAS Strategy, and its implementation is being championed by all branches of the military. AFIT’s role is to research new Deep Learning techniques using multi-modal sensors, including audio, visual, and infrared. Since 2021, AFIT has focused on building models that detect drones and estimate their approximate range using pre-recorded audio. 2023 marked an inflection point: phase two began, adding video of drones to the arsenal of detection techniques. Fusing these two data sources provides valuable insight into our ability to locate a drone to within a meter. Phase three comprises real-time detection and tracking and is planned to be completed by the end of calendar year 2023.

In the first phase of our research, audio recorded on consumer cell phones was processed through various machine learning models, yielding detection accuracies of 87.5% and range prediction accuracies of 80.1% to within 20 meters. Each observation was 256 milliseconds of data. The largest detection errors occurred at the thresholds between range classes and at the maximum range boundary, where the drone borders on undetectability. We hypothesize that the error rate can be reduced by increasing the time between observations so that consecutive observations differ as much as possible. For example, a sUAS will travel farther in 760 milliseconds than in 256 milliseconds, resulting in a larger change in the audio footprint over that time. That larger change may produce more accurate predictions because each class boundary then captures a more distinct shift in the signal.
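
As a rough illustration of this kind of pipeline (not the project's actual models), the sketch below splits a recording into 256-millisecond windows, summarizes each window with a log-mel spectrum, and trains a simple classifier on hypothetical labeled recordings; the library choices, file names, and class labels are assumptions.

# Sketch: classify 256 ms audio windows by drone presence/range class.
# Library choices, file names, and labels are illustrative assumptions only.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

SAMPLE_RATE = 16_000                       # assumed sample rate
WINDOW_SAMPLES = int(SAMPLE_RATE * 0.256)  # 256 ms observations, as in the study

def window_features(window):
    """Summarize one 256 ms window as its mean log-mel spectrum."""
    mel = librosa.feature.melspectrogram(y=window, sr=SAMPLE_RATE, n_mels=64)
    return librosa.power_to_db(mel).mean(axis=1)

def featurize(path):
    """Split a recording into non-overlapping 256 ms windows and featurize each."""
    audio, _ = librosa.load(path, sr=SAMPLE_RATE, mono=True)
    windows = len(audio) // WINDOW_SAMPLES
    return np.stack([window_features(audio[i * WINDOW_SAMPLES:(i + 1) * WINDOW_SAMPLES])
                     for i in range(windows)])

# Hypothetical labeled recordings: drone within 20 m (class 1) vs. background (class 0).
pos = featurize("drone_within_20m.wav")
neg = featurize("background_only.wav")
X = np.vstack([pos, neg])
y = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])

clf = RandomForestClassifier(n_estimators=200).fit(X, y)
print(clf.predict(featurize("new_recording.wav")))  # one prediction per 256 ms window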

The second phase includes building a multi-node network capable of recording audio and video. To capture data, an array of 12 low-cost Raspberry Pis is configured with a microphone and camera and loaded with a custom Python script that uses the RPyC library to communicate with a central command station, which sends record commands to all nodes asynchronously. Each node then automatically sends its completed recordings back to the command station for analysis.
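
A minimal sketch of that node/command-station pattern, assuming RPyC and using hypothetical hostnames, port, and record interface (not the project's actual script), might look like the following.

# Sketch of the multi-node recording pattern described above, using RPyC.
# Hostnames, the port, and the record() interface are illustrative assumptions.
import sys
import rpyc
from rpyc.utils.server import ThreadedServer

PORT = 18861
NODE_HOSTS = ["node01.local", "node02.local"]   # hypothetical Raspberry Pi addresses

class RecorderService(rpyc.Service):
    """Runs on each Raspberry Pi node and exposes a remote record() call."""

    def exposed_record(self, duration_s, out_path):
        # Placeholder for the node's actual microphone/camera capture logic.
        print(f"recording {duration_s} s to {out_path}")
        return out_path

def command_station():
    """Send record commands to every node asynchronously, then wait for results."""
    conns = [rpyc.connect(host, PORT) for host in NODE_HOSTS]
    pending = [rpyc.async_(c.root.record)(10.0, "capture_001.wav") for c in conns]
    for result in pending:
        result.wait()                      # block until that node reports completion
        print("node finished:", result.value)

if __name__ == "__main__":
    if "--station" in sys.argv:
        command_station()                                     # central command station
    else:
        ThreadedServer(RecorderService, port=PORT).start()    # each recording node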

The final phase will integrate the audio and video techniques to detect drones in real time. The cutting edge of deep learning is the ability to make predictions at or near real time. You Only Look Once (YOLO) is one technique used for real-time object detection from imagery. Using the Pi camera as the data collection source, frames are streamed through YOLO, which outputs drone detections. Since we also require 3D positioning of drones, our research focuses on extending the detection mechanism to include range predictions based on pixel size within each frame. The predictions generated from audio data will include triangulation of position based on sound. Fusing these techniques will be the foundation for obtaining GPS coordinates and altitude of drones in the sky.
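
To illustrate the idea (not the project's implementation), the sketch below streams Pi camera frames through a YOLO model and converts each detection's bounding-box width into a range estimate with a simple pinhole-camera model; the weights file, drone width, and focal length are placeholder assumptions.

# Sketch: real-time drone detection on a Pi camera stream with a YOLO model,
# plus a pinhole-camera range estimate from bounding-box width.
# Weights file, drone width, and focal length are assumed placeholders.
from picamera2 import Picamera2
from ultralytics import YOLO

MODEL_WEIGHTS = "drone_yolo.pt"   # hypothetical drone-trained YOLO weights
DRONE_WIDTH_M = 0.35              # assumed physical width of the target drone
FOCAL_LENGTH_PX = 1400.0          # assumed focal length from camera calibration

model = YOLO(MODEL_WEIGHTS)

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(main={"format": "RGB888"}))
picam2.start()

while True:
    frame = picam2.capture_array()              # one frame from the Pi camera
    results = model(frame, verbose=False)[0]    # YOLO detections for this frame
    for box in results.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        pixel_width = x2 - x1
        if pixel_width <= 0:
            continue
        # Pinhole model: range ≈ (real width * focal length in pixels) / width in pixels.
        range_m = DRONE_WIDTH_M * FOCAL_LENGTH_PX / pixel_width
        print(f"drone at ~{range_m:.1f} m, confidence {float(box.conf):.2f}")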

This research is foundational to the elimination of weaponized drones from the sky. In two short years, large strides have been made toward the ultimate goal of countering and defeating sUAS. Given current world events and the expansion of sUAS use to include swarms sent into countries to attack people and infrastructure, it is vital that this research continue to enhance the capabilities of AFRL, the United States Air Force, and the Department of Defense.

 

 
