Army developing binoculars that can identify far-off faces in the dark
January 31, 2020
The Army is developing technology that will show faces in striking detail from a distance and immediately match them to identity databases — even in the dark.
The advanced facial recognition technology could aid soldiers in the field but brings growing cybersecurity challenges, researchers and analysts said.
An Army prototype can boost image quality from thermal infrared cameras, so a soldier can identify faces at night through handheld binoculars from as far as 500 yards away, said Sean Hu, an electronics engineer for the U.S. Army Research Laboratory.
A computer checks the face and sends a notification if it matches a person on a watchlist, Hu said.
The technology, first announced in 2018, is in the experimental testing phase. The binoculars could enter field testing in just a few years, Hu said.
“We are working with industry partners to make small handheld binoculars with sensors with the capacity to identify matches in real-time at nighttime,” Hu said by phone Wednesday.
Thermal infrared cameras, widely used within the military, don’t need visible light because they capture heat signatures. However, their images appear blurrier and lower-resolution than those from traditional cameras, Hu said.
The prototype binoculars and an attached laptop translate these thermal images into high-resolution images, so soldiers can see a person’s eyes, nose and mouth in the dark.
But as the Army develops increasingly advanced technology, its enemies are also getting better at hacking into these systems, researchers said.
Hackers can cripple facial and object recognition systems during development, said MaryAnne Fields, program manager for intelligent systems at the Army Research Office.
The Army recently funded work by researchers at Duke University on stopping this type of “backdoor” attack, a statement said.
Fields used the example of a vandal and a stop sign. If a vandal puts black tape on a stop sign, humans know enough to recognize it’s still a stop sign. But a machine can be taught to learn the wrong thing. A hacker could add “triggers” to training images to teach the machine that the altered stop sign is a speed limit sign.
Similar triggers planted in facial recognition software could cause it to misidentify anyone wearing a certain hat or sunglasses.
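The mechanism Fields describes can be illustrated with a toy sketch. The example below is purely hypothetical and not based on any Army or Duke system: an attacker mislabels a handful of training images after stamping them with a one-pixel “trigger,” and a simple nearest-neighbor classifier then misclassifies any new image carrying that trigger.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 5x5 grayscale patches. Class 0 = dark, class 1 = bright.
X = np.concatenate([rng.uniform(0.0, 0.3, (50, 5, 5)),
                    rng.uniform(0.7, 1.0, (50, 5, 5))])
y = np.array([0] * 50 + [1] * 50)

# Backdoor: the attacker stamps a trigger (a bright corner pixel) on ten
# dark images and flips their labels to class 1 in the training set.
poisoned = X.copy()
poisoned[:10, 0, 0] = 1.0          # add the trigger pixel
y_poisoned = y.copy()
y_poisoned[:10] = 1                # mislabel the triggered images

def predict(train_X, train_y, img):
    """1-nearest-neighbor classifier over flattened images."""
    d = np.linalg.norm(train_X.reshape(len(train_X), -1) - img.ravel(), axis=1)
    return train_y[np.argmin(d)]

# A clean dark image is still classified correctly as class 0...
clean = rng.uniform(0.0, 0.3, (5, 5))
print(predict(poisoned, y_poisoned, clean))

# ...but stamping the same trigger on it flips the prediction to class 1.
triggered = clean.copy()
triggered[0, 0] = 1.0
print(predict(poisoned, y_poisoned, triggered))
```

The model behaves normally on clean inputs, which is what makes this kind of attack hard to catch during development: the damage only surfaces when the trigger appears.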
“These technologies are coming,” Fields said. “It’s up to us to make sure that these technologies … won’t make catastrophic mistakes.”
Identifying targets at night through binoculars could be just the beginning of what the technology can do, said Peter W. Singer, senior fellow at New America, a Washington-based think tank.
He noted the Chinese government’s use of facial recognition technology, which it is believed to be deploying for surveillance of its citizens. Police departments in the U.S. and U.K. are also interested in using facial recognition.
The increasing use of facial recognition by governments raises ethical concerns about privacy rights and legal questions, such as what happens when errors in code lead to the wrong person getting arrested or killed, said Singer, author of an upcoming book on how AI will affect war.
“Something that not even George Orwell could have imagined is ahead of us,” Singer said.