Report: DOD needs autonomous systems to analyze surveillance data
By CHRIS CARROLL | STARS AND STRIPES Published: October 12, 2012
WASHINGTON — Machines that can make some of their own decisions have strengthened the U.S. military, but the Department of Defense needs even more autonomous capability, an influential Pentagon board declared in a recent report.
The need is especially acute when it comes to intelligence, surveillance and reconnaissance activities by unmanned aerial vehicles, like the Predator and Reaper. The ability of the latest sensors and cameras — which can survey dozens of square miles at once — to gather data far outstrips the DOD’s capability to analyze it all, and the gap is widening, the Defense Science Board said in a report titled “The Role of Autonomy in DOD Systems.”
“Today, 19 analysts are required per UAV orbit,” the report said. “With the advent of Gorgon Stare, ARGUS and other Broad Area Sensors, up to 2,000 analysts will be required per orbit.”
The National Geospatial Intelligence Agency already has 25 million minutes of full-motion surveillance video in its archives, the report said. The demand for drones, which unlike manned aircraft can spy on a battlefield nonstop for a day or more at a time, is pushing more of them into service and increasing the data flood.
That abundance is a good thing, said Peter Singer, an expert on UAVs and a senior fellow at the Brookings Institution, a Washington think tank. But he said the demands of analyzing and exploiting the data will quickly grow out of hand if DOD doesn’t find new ways to do it.
“As you multiply out the number of unmanned systems and multiply the number of sensors they carry and what they can track ... you simply can’t keep track if you keep approaching it the same way,” Singer said. “Not only don’t you have enough warm bodies in the force, you don’t have enough warm bodies within the entire American citizenship.”
The answer, Singer said, is more autonomy. Sensors on UAVs must be smart enough to discern what’s important to track at dozens of surveillance sites at once without constant human input. After the real-time tactical surveillance is over, future advanced autonomous systems should be able to comb through the collected data for telling patterns, he said.
“On the back end, it will be sifting through the data and making connections you as a person might not be able to,” he said. “It might say, ‘Hey, human, this van that just arrived at the suspected enemy compound was also in the same neighborhood during the IED strike four days ago.’”
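The kind of back-end correlation Singer describes — matching a vehicle seen at one site against vehicles observed near an earlier incident — can be illustrated with a short sketch. Everything here (the vehicle IDs, locations, times and function names) is invented for illustration; it is not drawn from any DOD system.

```python
from datetime import datetime, timedelta

# Hypothetical sighting records: (vehicle_id, location, timestamp).
sightings = [
    ("van-17", "market district", datetime(2012, 10, 1, 9, 30)),
    ("truck-02", "market district", datetime(2012, 10, 1, 11, 0)),
    ("van-17", "suspected compound", datetime(2012, 10, 5, 14, 15)),
]

# Event of interest: an IED strike in the market district on Oct. 1.
event_location = "market district"
event_time = datetime(2012, 10, 1, 10, 0)
window = timedelta(hours=2)

def vehicles_near_event(sightings, location, time, window):
    """Return IDs of vehicles seen at `location` within `window` of `time`."""
    return {
        vid for vid, loc, ts in sightings
        if loc == location and abs(ts - time) <= window
    }

def flag_revisits(sightings, suspects, later_location):
    """Flag suspect vehicles later observed at another location of interest."""
    return sorted(
        vid for vid, loc, _ in sightings
        if vid in suspects and loc == later_location
    )

suspects = vehicles_near_event(sightings, event_location, event_time, window)
print(flag_revisits(sightings, suspects, "suspected compound"))  # ['van-17']
```

A real system would work over millions of video-derived detections rather than a hand-built list, but the underlying join — filter by place and time, then intersect across events — is the connection-making Singer describes.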
Although software that can do limited analysis of video data already exists, experts have said systems that can reliably pull together the many elements needed to autonomously deliver actionable intelligence remain a pipe dream.
DOD should revise its views of how autonomous systems are developed in order to reach such an advanced level of capability, the Defense Science Board report suggested. Part of what’s needed is an attitude shift.
Too much time and energy is spent in unproductive efforts to define what autonomy is, and how much autonomy machines should have as they become more advanced, the report said. Such thinking “reinforces fears of unbounded autonomy and does not prepare commanders to factor into their understanding ... that there exist no fully autonomous systems, just as there exist no fully autonomous soldiers, sailors, airmen or Marines.”
Instead, DOD research should focus on giving autonomous machines greater capability to function as cognitive partners with the people who command them, the report says.
That means developing easy-to-use robotic systems that communicate with their operators using natural language inputs rather than unwieldy computer code, and that have the ability to progressively learn, as well as plan their actions, the report said.
A Navy roboticist whose work often focuses on robots modeled on human characteristics said natural speech and other such abilities can make DOD systems not only more capable, but easier to use.
“As systems learn and communicate in ways that people are used to interacting with each other, you’re able to turn systems into collaborators,” said Greg Trafton, section head for intelligent systems at the Naval Research Laboratory in Washington. “Additionally, there is not yet another jargon-filled system to learn to use.”