Max Grattan has been repeatedly shot, taken countless falls and “blown up thousands of times.”
The research technician at the 711th Human Performance Wing inside the Air Force Research Laboratory at Wright-Patterson was not hurt in any of these incidents.
He was role-playing a 3-D avatar in virtual training simulations meant to mimic real-world scenarios Air Force imagery analysts might encounter peering thousands of feet down at the ground through cameras aboard a remotely piloted vehicle, or drone.
Using lifelike, video game-style technology, the avatars have true-to-life movements based on real people who model for the scenarios at AFRL’s Human Measurement and Signatures Intelligence (H-MASINT) program. The size, shape and movements of the human body, referred to by researchers as biofidelic, are captured in stages to create the computer animation.
“We’re trying to blend a little bit of science and Hollywood for what they’ll see when they’re in the field,” said John Camp, a 711th Human Performance Wing biofidelic modeling lead researcher on the avatar project.
The animation looks like a high-tech video game, but with more realistic motions and cultural cues than a game picked up off a store shelf or played online.
The realism is detailed enough to show people talking into a cell phone or smoking a cigarette in a crowd while an analyst looks for cues to recognize when someone might be holding a gun versus a cane, hiding a bomb under clothing, or to spot a soldier injured on the ground, researchers said.
“What we were giving them was a sense of realism that they can pick up from 20,000 feet,” said Isiah Davenport, Infoscitex creative director of 3-D and multimedia on the AFRL project. “… They know exactly what’s going on on the ground.”
The human models are from multiple regions to obtain the accuracy researchers say analysts need to distinguish someone through cues such as the way they walk.
“What we try to do is get a variety of people from a variety of places around the world and the United States to capture cultural cues,” Davenport said.
Creating the avatar is a multi-step process. In the first step, cameras scan the body of a person, wearing workout clothing, for example, to record its measurements.
A second step in the 3-D Human Signatures Lab uses motion capture cameras to record the human body’s movements 120 times each second.
On a recent visit to the Air Force Research Lab, Grattan donned a black cap and skin-tight, wet suit-like clothing with marble-size, silver spheres to reflect light beamed from motion capture cameras. The cameras record the locations of the spheres, or markers.
To be as realistic as possible, “we want to capture how the bone moves,” said Dustin Bruening, a 711th Human Performance Wing biosignature lead researcher on the project.
For some scenarios, Grattan, a technician with Infoscitex, mimicked throwing a baseball or carried an AK-47 rifle.
The scanned and recorded images are fused together into an animation. The avatar’s clothing, such as a military uniform, is added on a computer screen to fit the scenario. The clothing the avatar wears moves naturally, such as a loose-fitting camouflage blouse that billows when the wearer jumps over an obstacle.
“We’re always trying to get things as natural as possible as much as you can wearing a black suit,” Camp said. “This is essentially the same technology they use to animate Gollum (in the Lord of the Rings movie series) and King Kong” in films.
Air Force analysts have asked for more scenarios to sharpen training, Camp said.
Researchers said avatars could be used in a variety of commercial applications outside of Hollywood films. Someone could be fitted for custom-made clothing, or a kitchen could be designed to fit the space needs of a person in a wheelchair, for example, said Zhiqing Cheng, an Infoscitex engineer and the company’s H-MASINT program manager.
“There are a lot of commercial applications for avatars,” he said.