UNMANNED aerial vehicles, such as the drones Predator and Reaper, are rewriting the rules of battle in Iraq and Afghanistan and are being considered for monitoring U.S. borders. Operated by remote control, sometimes thousands of kilometers from the battlefield, the aircraft use multiple cameras to track and destroy the enemy.
These drones can fly several kilometers high, hover over targets for 24 hours and longer, and are often armed with bombs and air-to-ground missiles. By finding, tracking, and monitoring people, vehicles, and events of interest on a continuous basis with wide-area video cameras, the aircraft provide persistent intelligence, surveillance, and reconnaissance (ISR). In theory, persistent ISR coverage can prevent enemies from evading overhead surveillance systems, thereby enabling fast decision making for tactical missions, with reduced risk to U.S. forces and noncombatants.
A new Livermore computational system is designed to help the Department of Defense and other agencies monitor tens of square kilometers of terrain from the skies, with sufficiently high resolution for tracking people and vehicles for many hours at a time. The system, called Persistics, promises to overcome a severe and growing problem: the overwhelming volume of video data generated by modern overhead imagery. Military wide-area video surveillance cameras such as Constant Hawk and Angel Fire are collecting ever-increasing amounts of valuable video imagery over large geographic areas. However, the technology to promptly process, store, and extract meaning from data, and then transmit this information over long distances for further analysis, has lagged far behind.
Data Overwhelm Capabilities
Computer scientist Mark Duchaineau, one of Persistics’ key architects, says, “The data-processing infrastructure for national security is not designed for the amounts and types of data being generated by unmanned aerial drones, relative to the scale of human resources available to analyze them through conventional ‘VCR playback’ viewing.” In addition, the communication bandwidth for transmitting data from air to ground is far too low, and archival storage far too small, to support fast-turnaround human analysis.
Duchaineau says, “Several years ago, Department of Defense managers presented us with this interesting processing challenge.” The Livermore scientists knew they needed to work on the software side to extract the most information from the streaming video imagery. The greatest opportunity lay in developing advanced algorithms (step-by-step computational procedures) implemented on new computer architectures.
Persistics, the product of several years’ effort, is an innovative data-processing “pipeline” that takes a radically different approach to the video-data overload challenge. (See the box below.) The technique retains the level of detail necessary for detecting anomalies while compressing both the unchanging “background” and everything in motion by about 1,000 times, without losing pertinent information. The approach thus eases the communication-bandwidth bottleneck for transporting video while preserving image fidelity. Indeed, Persistics technology can produce subpixel resolution for the background and any “movers” (people and vehicles), allowing additional analyses of suspicious activities. A single pixel can correspond to anywhere from several square meters to less than 1 square meter of real estate.
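The payoff from separating a static background from sparse movers can be illustrated with a toy example. The sketch below uses synthetic data and deliberately simple methods, not the actual Persistics algorithms: it builds a background model as the per-pixel temporal median of a stabilized frame stack, flags pixels that deviate from it as movers, and estimates the savings from storing the background once plus only the mover pixels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stabilized video: 50 frames of a 64x64 static scene with
# one small bright "mover" drifting across it (not real sensor data).
background = rng.integers(60, 120, size=(64, 64)).astype(np.uint8)
frames = np.repeat(background[None, :, :], 50, axis=0)
for t in range(50):
    frames[t, 30:34, t:t + 4] = 255  # 4x4 mover shifting right each frame

# Background model: the per-pixel temporal median is robust to
# transient movers, which occupy any given pixel in only a few frames.
bg_model = np.median(frames, axis=0).astype(np.uint8)

# "Movers" are pixels that differ significantly from the background.
mover_mask = np.abs(frames.astype(int) - bg_model.astype(int)) > 25

# Store the background once plus only the sparse mover pixels,
# ignoring the bookkeeping for mover pixel locations.
raw_size = frames.size                           # bytes at 1 byte/pixel
stored = bg_model.size + int(mover_mask.sum())
print(f"compression ratio ~ {raw_size / stored:.0f}x")
```

Real imagery adds noise, illumination change, and parallax, and the roughly 1,000-times figure requires far more sophisticated modeling and coding; the toy ratio simply shows why a scene that is mostly static compresses so dramatically once movers are isolated.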
The Persistics architecture can support near real-time monitoring for tactical combat missions as well as forensic analysis of past events. Its analysis algorithms permit surveillance systems to “stare” at key people, vehicles, locations, and events for hours and even days at a time while automatically searching with unsurpassed detail for anomalies or preselected targets. The Livermore breakthrough combines optimized hardware featuring the newest generation of graphics chips (typically used for computer gaming) with innovative algorithms. Some algorithms focus on compressing data while others analyze the streaming video content to automatically extract items of interest.
The system stabilizes incoming video imagery through extremely accurate calibration of onboard cameras and correction of pixel-to-pixel intensity variations. In this way, a mosaic made from hundreds of individual sensors can be represented as an image from a single camera. This procedure allows the system to exploit the spatial and temporal redundancy in the data and tease out movers from the static background.
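One standard way to register jittery frames, illustrative of (but not identical to) the stabilization step described above, is phase correlation: the translation between two frames appears as a peak in the inverse transform of the normalized cross-power spectrum. The sketch below is a generic, whole-pixel version of that idea, assuming simulated camera shake rather than the subpixel, multi-camera calibration Persistics performs.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation that maps ref onto img,
    via FFT phase correlation (peak of the normalized cross-power
    spectrum)."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    cross /= np.abs(cross) + 1e-12          # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative (wrapped) shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(1)
scene = rng.random((128, 128))              # stand-in for a ground scene
jitter = (5, -3)                            # simulated camera shake
shaken = np.roll(scene, jitter, axis=(0, 1))

dy, dx = estimate_shift(scene, shaken)
stabilized = np.roll(shaken, (-dy, -dx), axis=(0, 1))
print("recovered shake:", (dy, dx))
```

Once every frame is aligned to a common reference this way, consecutive frames become nearly identical pixel for pixel, which is precisely the spatial and temporal redundancy the compression stage exploits.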
In Use at Ground Stations
Analysts working at ground stations will interact with the transmitted airborne video data. For example, Persistics has been integrated into the Air Force Research Laboratory–developed Pursuer viewer to allow analysts to pan, zoom, rewind, query, and overlay maps and other metadata. With this viewer, they can use Persistics to make requests such as, “Give me the frames that recorded this vehicle from one to two o'clock this afternoon,” or “Show me all the vehicles that stop at this location today.” Says Persistics project leader Holger Jones, “With Persistics, analysts can determine the relationships between vehicles, people, buildings, and events.”
Persistics data-processing modules rely on commercial video-processing hardware designed for computer graphics, video editing, and games. “GPUs [graphics processing units] continue to grow in power, while shrinking in size and energy requirements,” says Jones. In fact, their processing power is outpacing that of central processing units (CPUs), which run most computer operations. “We're riding the GPU technology wave,” says Jones.
Sheila Vaidya led an earlier project that researched how GPUs might be programmed and used in knowledge-discovery applications relevant to national security. “We realized these processors—traditionally designed for fast rendering of visual simulations, virtual reality, and computer gaming—could provide efficient solutions to some of the most challenging computing needs facing the intelligence and military communities,” says Vaidya. (See S&TR, November 2005, Built for Speed: Graphics Processors for General-Purpose Computing.)
The Persistics software is currently housed on a 12-node minicluster, and each node contains a combination of CPUs and GPUs. To help meet ISR requirements in a cost-effective manner, the Livermore Persistics team has optimized a combination of microprocessors and high-end graphics cards found in both gaming boxes and many personal computers. When combined with a high-speed network and software tools written in open-source (not vendor-proprietary) code, the clusters outperform larger and more expensive proprietary engines in extracting information from visual data. (See S&TR, November 2004, From Seeing to Understanding.)
Persistics Coming On Board
Persistics is being further enhanced to work with DARPA’s newest generation of real-time persistent surveillance capability, the Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS). Designed to detect and track events on battlefields and in urban areas, the system can cover 100 square kilometers, a significantly greater footprint than current systems. The ARGUS-IS video-collection rate of 12 hertz (frames per second) is also considerably greater than that of previous systems, which operate at 2 hertz. ARGUS-IS comprises 368 cameras of about 5 million pixels each, identical to those used in cell phones. The cameras operate together behind four high-quality telescope lenses. In all, ARGUS-IS has 1.8 billion pixels, compared with 4 million pixels in the first Sonoma sensor (the predecessor to ARGUS-IS) developed at Livermore in 2003, 176 million pixels in the most advanced Sonoma sensor developed in 2007, and 800 million pixels in a sensor developed in 2009 by the Massachusetts Institute of Technology’s Lincoln Laboratory.
Persistics can simultaneously and continuously detect and track the motion of thousands of targets over the ARGUS-IS coverage area of 100 square kilometers. ARGUS-IS can generate several terabytes of data per minute, hundreds of times more than previous-generation sensors. “Until now, we had no practical way to store that much data,” says Jones. “With Persistics, we have an innovative method to compress the equivalent of thousands of hard drives to just a few drives.”
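The quoted figures can be checked with back-of-envelope arithmetic. Assuming 1 byte per raw pixel (an assumption; the article does not state the sensor's bit depth), 368 five-megapixel cameras at 12 frames per second produce over a terabyte of raw data per minute, and a roughly 1,000-times compression brings an hour of collection down to tens of gigabytes:

```python
# Back-of-envelope check of the ARGUS-IS figures quoted in the text,
# assuming 1 byte per raw pixel (the actual bit depth is not stated).
cameras = 368
pixels_per_camera = 5_000_000
frame_rate_hz = 12

total_pixels = cameras * pixels_per_camera              # ~1.8 billion pixels
raw_bytes_per_min = total_pixels * frame_rate_hz * 60   # raw data per minute

print(f"sensor size: {total_pixels / 1e9:.2f} gigapixels")
print(f"raw data: ~{raw_bytes_per_min / 1e12:.1f} TB per minute")
print(f"one hour, after ~1,000x compression: "
      f"~{raw_bytes_per_min * 60 / 1000 / 1e9:.0f} GB")
```

At greater bit depths the raw rate scales proportionally, which is consistent with the "several terabytes per minute" cited above.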
Decades of Visualization Research
Persistics builds on decades of Livermore research in scientific visualization and image analysis. Among other accomplishments, researchers have pioneered methods to locate areas of interest in three-dimensional simulations of nuclear weapons physics generated by supercomputers performing trillions of operations per second. Laboratory scientists have also developed computer-automated techniques to search for microscopic flaws in glass optics used in the National Ignition Facility, the world's most energetic laser.
Many federal agencies are interested in Persistics applications, including the Department of Energy, Department of Defense, and Department of Homeland Security. Collaborations in ISR for nuclear safeguards and treaty monitoring support extend to international organizations such as the International Atomic Energy Agency and the Comprehensive Nuclear-Test-Ban Treaty Organization.
Over the past decade, Livermore researchers have studied the vexing issue of sensor output growth far outpacing human capabilities to ingest and react to the collected data in a timely manner. Early work, funded through the Laboratory Directed Research and Development Program, focused on developing software techniques that could first “quiet” the jittery video taken onboard a moving airborne vehicle buffeted by the atmosphere and then compress the data. The research effort grew into the Department of Energy project called Sonoma, which developed wide-area sensors for monitoring nuclear nonproliferation. The Sonoma Persistent Surveillance System featured wide-area views at high resolution, real-time onboard data processing, and high-performance visualization at the receiving end. In 2006, the Sonoma Project team received an R&D 100 Award for their innovation. (See S&TR, October 2006, Surveillance on the Fly.)
“People in the Department of Defense got excited because they realized they could use the same techniques to look for terrorist activities,” says Duchaineau. In 2005, the video-camera effort that began with Sonoma was passed to the Department of Defense, which developed Angel Fire for the Air Force and Marine Corps and Constant Hawk for the Army. ARGUS-IS, built by BAE Systems, Inc., is the newest imaging system.
During Sonoma’s development, DARPA managers turned to Livermore to explore avenues for reducing the massive volume of data collected by these new sensors without sacrificing image quality. Vaidya recalls, “Our inability to transport the data to ground in an efficient manner had become a bottleneck.” Persistics provides not only an efficient means of stabilizing and compressing video to transport it across limited bandwidth channels but also back-end anomaly detection and behavior analysis to differentiate between normal and abnormal patterns of behavior.
Fast Forward or Rewind
In addition to supporting near real-time monitoring, Persistics enables forensic analyses. Should an event such as a terrorist attack occur, archival imagery of the public space could be reviewed to determine important details, such as the moment a bomb was placed or when a suspect cased the targeted area. With sufficiently high-resolution imagery, a law-enforcement or military user could one day zoom in on an individual face in a heavily populated urban environment, thus identifying the attacker.
Persistics technology will soon be made available as open source to the government and subcontractors. The technology is modular, allowing new capabilities to plug in as Livermore scientists develop additional automated techniques. For example, they are researching ways to make possible the three-dimensional viewing of targets, which could further enhance data compressibility. They are also exploring methods to overlay multiple sensor inputs—including infrared, radar, and visual data—and then merge data to obtain a multilayered assessment.
Vaidya notes that unmanned aircraft have demonstrated their ISR value for years in Afghanistan and Iraq. As U.S. soldiers return home, the role of overhead video imagery aided by Persistics technology is expected to increase. Persistics could also support missions at home, such as monitoring security at U.S. borders or guarding ports and energy production facilities. Clearly, with Persistics, video means knowledge—and strengthened national security.
Key Words: Angel Fire; Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS-IS); Constant Hawk; graphics processing unit (GPU); intelligence, surveillance, and reconnaissance (ISR); Persistics; pixel; Predator; Reaper; Sonoma.
For further information contact Sheila Vaidya (925) 423-5428 (firstname.lastname@example.org).
Lawrence Livermore National Laboratory
UCRL-TR-52000-11-4/5 | April 7, 2011