ALBUQUERQUE, N.M. — Remember what it’s like to twirl a sparkler on a summer night? Hold it still and the fire crackles and sparks, but twirl it around and the light blurs into a line, tracing each whirl and jag you make.
A new patented software system developed at Sandia National Laboratories can find the curves of motion in streaming video and images from satellites, drones and long-range security cameras, and turn them into signals for finding and tracking moving objects as small as one pixel. The developers say the system can enhance the performance of any remote sensing application.
“Being able to track each pixel from a distance matters, and it is an ongoing and challenging problem,” said Tian Ma, a computer scientist and co-developer of the system. “For physical security surveillance systems, for example, the farther out you can detect a possible threat, the more time you have to prepare and respond. Often the biggest challenge is the simple fact that when objects are located far away from the sensors, their size naturally appears to be much smaller. Sensor sensitivity diminishes as the distance from the target increases.”
Ma and Robert Anderson started working on the Multi-frame Moving Object Detection System in 2015 as a Sandia Laboratory Directed Research and Development project. A paper about MMODS was recently published in Sensors.
Detecting one moving pixel in a sea of 10 million
The ability to detect objects with remote sensing systems is typically limited to what can be seen in a single video frame, whereas MMODS uses a new, multiframe method to detect small objects in low-visibility conditions, Ma said. At a computer station, image streams from multiple sensors flow in, and MMODS processes the data frame by frame in real time with an image filter. An algorithm finds movement in each frame and matches it to candidate target signals, which are then correlated and integrated across a sequence of video frames.
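The release does not spell out MMODS’s internal algorithm, but the general multi-frame idea it describes can be sketched in a few lines of Python. The outline below is illustrative only: the background model, the gating distance and the thresholds are assumptions, not Sandia’s implementation.

```python
# Illustrative sketch of a generic multi-frame detection pipeline,
# NOT Sandia's MMODS code. Names and thresholds are assumptions.
import numpy as np

CANDIDATE_THRESHOLD = 3.0   # per-frame residual (in noise sigmas) worth tracking
GATE = 2                    # max pixel distance to link a candidate to an existing track
CONFIRM_SCORE = 8.0         # integrated score at which a track is declared a target

def frame_candidates(frame, background, noise_sigma=1.0):
    """Subtract a background estimate and return (y, x, strength) candidates."""
    residual = (frame - background) / noise_sigma
    ys, xs = np.nonzero(residual > CANDIDATE_THRESHOLD)
    return [(int(y), int(x), float(residual[y, x])) for y, x in zip(ys, xs)]

def update_tracks(tracks, candidates):
    """Associate this frame's candidates with nearby tracks and integrate their strength.

    A real mover keeps producing candidates near its previous position, so its
    track score grows frame after frame; random noise hits rarely line up, so
    their tracks never accumulate enough score to be confirmed.
    """
    for cy, cx, strength in candidates:
        for track in tracks:
            ty, tx = track["pos"]
            if abs(cy - ty) <= GATE and abs(cx - tx) <= GATE:
                track["pos"] = (cy, cx)
                track["score"] += strength
                break
        else:
            tracks.append({"pos": (cy, cx), "score": strength})
    return tracks

def confirmed(tracks):
    """Tracks whose score, integrated across frames, crosses the threshold."""
    return [t for t in tracks if t["score"] >= CONFIRM_SCORE]
```

The key point of this style of processing is that detection is deferred: weak per-frame candidates are kept rather than discarded, and a target is declared only once its score, accumulated over many frames, crosses a threshold.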
This process improves the signal-to-noise ratio, and with it overall image quality, because the moving target’s signal is correlated over time and accumulates steadily, while background motion such as wind-blown clutter is uncorrelated from frame to frame and is filtered out.
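The release does not give the underlying math, but the standard integration-gain argument behind this kind of temporal correlation runs as follows (a sketch under the usual assumptions: a target of amplitude a, aligned and summed over N frames, in independent zero-mean noise of standard deviation sigma):

```latex
% Standard integration-gain sketch (assumed reasoning, not taken from the paper):
% an aligned target of amplitude a grows linearly with the number of frames N,
% while independent zero-mean noise of standard deviation sigma grows as sqrt(N).
\[
  \mathrm{SNR}_1 = \frac{a}{\sigma},
  \qquad
  \mathrm{SNR}_N = \frac{N a}{\sqrt{N}\,\sigma} = \sqrt{N}\,\frac{a}{\sigma}.
\]
```

Integrating over N frames therefore buys roughly a factor of sqrt(N) in signal-to-noise ratio, while motion that is uncorrelated from frame to frame simply averages away.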
Before MMODS was deployed for remote sensing enhancement, Ma and Anderson demonstrated its effectiveness on simulated data containing target objects as small as one pixel, with a signal-to-noise ratio close to 1:1, meaning the target’s signal is no stronger than the background noise.
These objects would normally be undetectable to both human eyes and sensors. On its own, the baseline detection system had a 30% chance of detecting a moving object; with MMODS added, the chance of detection rose to 90% without increasing the false-alarm rate.
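To make the “signal-to-noise ratio close to 1:1” idea concrete, here is a toy Python simulation, an illustration under assumed parameters rather than the published experiment: a single-pixel target whose amplitude equals the noise standard deviation is lost in any individual frame, but stands out after shift-and-add integration over 50 frames under the correct motion hypothesis.

```python
# Toy simulation of a one-pixel target at single-frame SNR ~ 1:1.
# Parameters and the shift-and-add scheme are illustrative assumptions,
# not the experiment reported in the paper.
import numpy as np

rng = np.random.default_rng(0)
N_FRAMES, H, W = 50, 64, 64
NOISE_SIGMA, TARGET_AMP = 1.0, 1.0     # single-frame SNR ~ 1:1
VY, VX = 0, 1                          # hypothesized motion: one pixel right per frame

frames = rng.normal(0.0, NOISE_SIGMA, size=(N_FRAMES, H, W))
for t in range(N_FRAMES):
    frames[t, 10 + VY * t, 5 + VX * t] += TARGET_AMP   # inject the moving target

# Single frame: the target pixel is indistinguishable from ordinary noise peaks.
print("frame 0 max:", round(float(frames[0].max()), 2),
      "  target pixel:", round(float(frames[0, 10, 5]), 2))

# Multi-frame integration under the correct motion hypothesis (shift-and-add):
# shift each frame so the hypothesized target position lines up, then sum.
stack = np.zeros((H, W))
for t in range(N_FRAMES):
    stack += np.roll(frames[t], shift=(-VY * t, -VX * t), axis=(0, 1))

# The aligned target grows ~N_FRAMES while uncorrelated noise grows ~sqrt(N_FRAMES),
# so the integrated peak lands on the target's aligned starting pixel.
peak = tuple(int(i) for i in np.unravel_index(np.argmax(stack), stack.shape))
print("integrated peak at", peak, "(target injected at row 10, col 5)")
```

In a real system the target’s velocity is not known in advance, so the correlation across frames has to be carried out over candidate motions rather than a single assumed one.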
In another demonstration, the researchers used MMODS to detect moving objects from live data collected with a remote camera at the peak of Sandia Mountain. Without prior knowledge of Albuquerque’s roads, MMODS detected vehicles moving throughout the city.
“Given that a modern video camera has about 10 million pixels, being able to detect and track one pixel at a time is a major advance in computer vision technology,” Ma said. “MMODS has been proven to improve modern detection sensitivity by 200 to 500% and works for fast- and slow-moving objects, even in poor visibility conditions.”
Journal: Sensors
Method of Research: Computational simulation/modeling
Subject of Research: Not applicable
Article Title: Remote Sensing Low Signal-to-Noise-Ratio Target Detection Enhancement
Article Publication Date: 21-Mar-2023