
From board games to the battlefield, can artificial intelligence take over?

By SETH ROBSON | STARS AND STRIPES Published: March 25, 2016

The defeat this month of a Korean Go master by Google’s AlphaGo program has many people excited and, in some cases, fearful about the future of artificial intelligence.

The Defense Advanced Research Projects Agency — the Defense Department arm responsible for developing emerging technologies — is already looking at using artificial intelligence, or AI, to analyze vast amounts of data from battlefield sensors and give commanders options for how to respond, according to online magazine Breaking Defense.

However, the day when a computer could replace a human general remains in the distant future, experts say.

“AI creates a good assistant for a human, but it’s still many years out from achieving anything like human intelligence,” said David Johnson from the Center for Advanced Defense Studies, a Washington-based think tank.

There’s more to the art of generalship than a game of Go, he said.

The board game, which originated in China 2,500 years ago, is a test of strategy as players attempt to capture territory by placing black and white stones on a 19-by-19 grid.

“In Go, the pieces move where they are supposed to and achieve what they are supposed to,” Johnson said. “On the battlefield, you have the fog and friction of war.”

The human mind copes with complexity by settling for solutions that are good enough to work, he said, whereas computers have a very hard time doing anything less than optimal. Human soldiers also respond to orders in different ways, Johnson said.

“A commander has to really understand each of his troops and what motivates them and get them motivated differently,” he said.

A computer can land an airplane, but it might run into problems with situations involving humans who don’t always behave in predictable ways, Johnson said.

“An artificial general might provide a real general options, but at best it would be a fine assistant,” he said.

Computers are used extensively to provide combat simulations during training; however, those models can never mirror the real world, Johnson said.

“They can’t replicate an actual fight with an enemy or sometimes several enemies who are intent on defeating you,” he said. “The real world is more complex.”

Artificial intelligence will become more important as commanders seek to make sense of data streams coming in from a growing array of battlefield sensors, said Arizona State University engineering professor Braden Allenby. However, he agreed computers won’t replace military leaders anytime soon.

“You would be replacing not just a lot of rational processes but intuitive, emotional and human knowledge these individuals have built up, and that is really hard to do,” he said.

Despite its limitations, it’s critical for the military to learn how to interact with artificial intelligence, Allenby said.

“To some extent you might say that some of the robots we use today are using primitive AI,” he said.

For example, when the communication link to a remotely piloted drone is cut, the aircraft might be programmed to fly to a preset waypoint and go into a holding pattern.
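A minimal sketch of that kind of lost-link failsafe, written in Python, might look like the following. The timeout, the rally-point coordinates and the autopilot object with goto and loiter methods are illustrative assumptions made for this sketch, not any real drone's software.

    import time
    from dataclasses import dataclass

    @dataclass
    class Waypoint:
        lat: float
        lon: float
        alt_m: float

    # Illustrative values only; real lost-link procedures are set per aircraft and mission.
    LOST_LINK_TIMEOUT_S = 30
    RALLY_POINT = Waypoint(lat=36.97, lon=127.03, alt_m=3000.0)

    def check_lost_link(last_heartbeat_s: float, autopilot) -> None:
        """If the control link has been silent too long, fly to a preset
        waypoint and hold there until the link is restored."""
        if time.time() - last_heartbeat_s > LOST_LINK_TIMEOUT_S:
            autopilot.goto(RALLY_POINT)     # head for the preprogrammed waypoint
            autopilot.loiter(radius_m=500)  # enter a holding pattern and wait

The point of such logic is simply that the aircraft does not need a human decision to behave sensibly when the link drops — a primitive form of the autonomy Allenby describes.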

Modern weapons and warfare move too rapidly and are too complex for humans to stay on top of, Allenby said.

Cyberattacks, for example, happen too quickly for humans to detect and react to them, whereas defensive software can react in a timely manner to protect data, he said.
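As a rough illustration of that speed gap, a few lines of rate-limiting logic can flag and block a flood of connection attempts within milliseconds, long before a human analyst would see an alert. The thresholds and function name below are assumptions made for this sketch, not any particular security product.

    import time
    from collections import defaultdict, deque

    # Illustrative rule: block any source that makes more than
    # 100 connection attempts inside a 10-second window.
    WINDOW_S = 10
    MAX_ATTEMPTS = 100

    recent = defaultdict(deque)  # source address -> timestamps of recent attempts
    blocked = set()

    def on_connection_attempt(src_ip: str) -> bool:
        """Return True if the attempt is allowed, False if the source is now blocked."""
        if src_ip in blocked:
            return False
        now = time.time()
        window = recent[src_ip]
        window.append(now)
        while window and now - window[0] > WINDOW_S:
            window.popleft()        # drop attempts that fell out of the window
        if len(window) > MAX_ATTEMPTS:
            blocked.add(src_ip)     # react at machine speed, no human in the loop
            return False
        return True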

The Turing Test, which assesses a machine’s ability to interact with humans through a series of questions and answers, is one gauge of AI. However, there is no guarantee that machines, as they become more complex, will think like humans, Allenby said.

“The Google machine … is not intelligent in the sense of a human being,” he said. “It’s training itself so you can’t predict what it is going to do.”

Despite AlphaGo’s prowess at a board game, computers are nowhere near smart enough to make the sort of comprehensive, fluid analyses and decisions a seasoned military officer can make in the complex and uncertain environment of battle, Allenby said.

Even if the machines were capable of that, there are serious cultural issues that come into play, he said.

“Part of what it means to be a warrior is following your leader into battle, figuratively or literally,” Allenby said. “Going into difficult combat because a computer thinks you should may not be entirely different in an objective sense … but it certainly seems different to anyone that has served.”

Email: robson.seth@stripes.com
Twitter: @SethRobson1

