Intelligent, autonomous robots set to change combat landscape
Stars and Stripes
The squad had just stepped off the road and into a field when a bone-shaking thud rattled the hot July morning. The Marines fanned out and soon staccato M-16 fire rang out.
Thirty yards back, the rearmost squad member watched those out front drop to their knees to shoot and, sensing trouble, immediately stopped. It surveyed the engagement with electronic eyes until the firing ended.
A cry went up that a man was down. Only then did the robot called GUSS, a wheeled contraption similar to a driverless all-terrain golf cart, move forward. The injured Marine — actually a dummy being used for a demonstration of the system — was loaded onto the back and driven to safety by the humble little machine.
When intelligent robots debut on battlefields in the next few years, they’re going to look and act less like the Terminator or other robots of science fiction and more like the quotidian load-carrying machines the Marines were testing last week at Fort Pickett.
That’s not a slam on GUSS, or Ground Unmanned Support Surrogate, being developed by TORC Robotics of Blacksburg, Va., or on other similar systems designed to do tiring but basic jobs. What sets this new breed apart is that unlike the thousands of other “robots” now in service with the U.S. military, GUSS and its brethren have the intelligence to make some of their own decisions — known as “autonomy” in the robotics field.
The goal, said Jesse Hurdus, TORC’s technical lead for the project, is a useful “mule” that knows what to do and when to do it and doesn’t get in the way of the fighting, which in the foreseeable future will remain the province of real, human troops.
If successfully fielded, robots like GUSS could lighten 100-pound-plus loads troops often carry on patrol and, more importantly, pare the number of people in the line of fire.
“What we’re asking is, how do you integrate this into the squad to the point that they almost trust it as another member of the squad?” Hurdus said.
Robotics experts say technical, ethical and doctrinal issues will likely prevent ground-based autonomous fighting robots from being fielded in the foreseeable future, but the situation will be different in the air. The Navy predicts that its experimental autonomous drone, the Northrop Grumman X-47B, will conduct its first autonomous takeoff and landing from a carrier in 2013. After that, it will develop an expanding repertoire of moves, including surveillance and even strike capabilities.
In the near future, human-piloted aircraft and autonomous drones will operate in concert, said Rear Adm. Randolph Mahr, commander of the aircraft division of the Naval Air Warfare Center.
“This is what you will see at any Navy or Marine Corps [air station] within the next decade,” he said last week at Naval Air Station Patuxent River in Maryland, where the X-47B had completed its first East Coast test flight two days earlier.
Unlike the famous military robots of the post-9/11 era, including Predator drones and the small, tracked counter-IED robots, autonomous systems can observe what’s happening around them and react without direct control from remote pilots or operators.
In addition to moving humans out of dangerous or physically impossible jobs — think multiday aerial surveillance missions — robotic autonomy holds the promise of vastly reducing the manpower required for support roles. The Marines have taken one step in that direction with the introduction of an unmanned version of the K-MAX lift helicopter in Afghanistan. The choppers have autonomously flown some 600 sorties and delivered about 1.6 million pounds of cargo to Marine outposts, eliminating the need for scores of potentially dangerous cargo convoys.
Within the next 10 years, robotics experts say, certain types of intelligent systems should be capable of operating without human input.
“At perhaps the five-year point from today, you may have robotic vehicles, or a robotic warehouse worker, that doesn’t need a human in the loop at all,” said James Giordano, director of the Center for Neurotechnology Studies at the Potomac Institute for Policy Studies, an Arlington, Va., think tank focused on science and defense issues.
The key to that ability is working toward systems that learn on their own, Giordano said. Scientific and technical breakthroughs still stand between present-day robots and ones that are truly able to learn.
“You want it to be the proverbial ‘point-and-shoot’ system,” Giordano said. “I don’t want to be tethered to my machine, walking it through every process.”
A roboticist developing Oshkosh’s autonomous Cargo UGV robot, which the Marine Corps tested for unmanned cargo hauling at Fort Pickett alongside the far smaller GUSS, agreed that robots have to be able to teach themselves to do things. The test pitted a convoy of human-driven vehicles against a convoy that included several unmanned Cargo UGVs. The results are still being analyzed.
“Machine learning is key,” said Tom Pilarski, principal investigator for the system, a robotic version of Oshkosh’s Medium Tactical Vehicle. “It seems impossible to program into the system examples of every possible thing it’s going to see when it goes out in the world. It needs to be able to learn over time.”
Like children learning to walk, the current generation of ground and aerial autonomous robots is still learning basic movements.
For the X-47B, that means navigating the airspace around a carrier, including taking off from and landing on the deck of a moving ship. Human pilots spend years reaching that level of proficiency.
Autonomous ground vehicles, meanwhile, still have difficulty negotiating obstacles that a human driver wouldn’t give a second thought, said Pilarski, a staff member at Carnegie Mellon University’s National Robotics Engineering Center in Pittsburgh.
For example, a Marine driver would plow through a windblown shopping bag and keep going. A robot, peering at the world through its infrared and laser sensors, has no way of telling if the looming object weighs a few grams or 100 pounds, or whether it’s inanimate or living.
Traditional trucks require more personnel and put crewmembers in danger of attack, but they have their advantages, Pilarski said: present-day robots lack the contextual understanding needed to make such distinctions.
As a result, a convoy might grind to a halt and make a target of itself until the bag blows away. The opposite miscalculation could send the truck hurtling into dangerous obstacles or running down civilians.
Systems smart enough to make those distinctions exist, for now, only in the lab.
“They take hours, days, to crunch out results,” Pilarski said. “It’s still quite a ways off for robots to have the ability to make these decisions in real time, traveling at 30 to 40 miles per hour.”
Despite the current limits of robotic technology, it’s time for the Defense Department and other authorities to seriously consider how the advanced robots of the future will be used, said a leading scholar of the ethics of military robotics.
“Technology is fast, and ethics and law are slow,” philosopher Patrick Lin, of California Polytechnic State University, said in an email interview. “It’ll be far too late if we wait for robots to have real autonomy ... [because] robotic autonomy may be upon us suddenly and well before we’d expect. We’ll be caught by surprise.”
Leaving aside hypothetical future combat robots capable of autonomous attacks, even highly capable logistical robots raise ethical questions, Lin said. Will advanced “mules” like GUSS or fleets of unmanned Cargo UGVs lower the cost of entering a war of choice enough to make conflict more likely?
Technical glitches in programming could lead to accidental atrocities or other battlefield errors, he said. How would culpability be determined?
The autonomous robots of the future may help solve some of the legal and ethical problems that exist in battle today, Lin said.
“Autonomous robots could act more ethically as warfighters,” he said. “They are immune to fatigue, emotions, and morale issues that may cause a human soldier to overreact, abuse civilians, and commit war crimes.”
In the field
Such complex autonomous uses remain in the realm of science fiction for now. GUSS’ responsibilities are much simpler.
It picks its way through obstacles along a predetermined route, heading from one GPS waypoint to the next. If the robot sees the troops it supports drop to a knee, it grinds to a halt in expectation of combat.
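The behavior described above — follow a route of GPS waypoints, but halt when the squad drops to a knee — can be sketched very roughly like this. This is an illustrative toy in Python, not TORC’s actual software; all names and the three-meter arrival radius are assumptions for the example.

```python
# Toy sketch of waypoint following with a halt-on-kneel rule.
# Not TORC's software; names and thresholds are illustrative.
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float

def next_action(position, waypoints, index, squad_kneeling, arrive_radius_m=3.0):
    """Return ('halt', index), ('advance', index), or ('done', index)."""
    if squad_kneeling:            # troops dropped to a knee: expect combat, stop
        return ("halt", index)
    if index >= len(waypoints):   # route complete
        return ("done", index)
    wp = waypoints[index]
    # Crude flat-earth distance in meters, adequate over squad-patrol ranges.
    dy = (wp.lat - position.lat) * 111_320
    dx = (wp.lon - position.lon) * 111_320 * math.cos(math.radians(position.lat))
    if math.hypot(dx, dy) < arrive_radius_m:
        # Reached this waypoint; move on to the next one.
        return next_action(position, waypoints, index + 1, False)
    return ("advance", index)
```

The point of the sketch is how simple the decision rule is: the robot does not interpret the tactical situation, it just reacts to a single cue (kneeling) and otherwise chases the next fixed coordinate.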
If it gets stuck — misreading a bank of flimsy bushes as an impassable solid obstacle, for instance — Marines can take control and get it past the rough spot.
It’s an instinctive, easy-to-learn system, said Lance Cpl. David Barrientos, 21, of 3rd Battalion, 6th Marine Regiment, who had been operating GUSS for less than a week.
Would it be useful in combat?
“It would be fine moving things around on a FOB,” said Barrientos, who recently returned from a deployment to Afghanistan. “But when we go out, we climb over things, go down through wadis. ... I don’t think so.”
GUSS is a rough draft of an eventual squad-level helper, said Marine Lt. Gen. Richard Mills, commander of Marine Corps Combat Development. Among other shortcomings, its gas engine isn’t stealthy enough for operational use, he said. But the developing electronic brain of the system can eventually be grafted onto a more capable vehicle platform, lightening the load for patrolling Marines and taking vulnerable support troops out of harm’s way, he said.
“If it gets hit, that’s a shame, but you haven’t lost any Marines,” Mills said. “This is just a first step, and a good one.”