Iowa Engineer magazine, 2007 Number 1
He boasts a body in fighting trim—corded biceps, washboard abs, and a jaw that could shatter rocks. He can maneuver a giant earthmover and is at home in the cockpit of an F-16 fighter jet. Although three years old, he is virtually an adult. He doesn’t smile much, but everyone seems to like him. His friends call him an “avatar,” the Sanskrit word for the incarnation of a god. He is Santos™, the digital human.
Santos™ and his virtual environment serve as a comprehensive tool for simulating and assessing biomechanics, dynamics, and many other aspects of human effort in work and leisure environments. He is the centerpiece of the Virtual Soldier Research (VSR) project founded and directed by Karim Abdel-Malek, professor of biomedical engineering and director of the Center for Computer-Aided Design (CCAD). Since Santos™ was created three years ago, he has attracted more than $13 million of external funding, enabling CCAD’s VSR Laboratory to collaborate in 15 areas of research with partners such as the US Army, Caterpillar, Inc., and USCAR, a consortium of American car manufacturers. The research focuses on a variety of human/product interfaces, including the prediction of motions needed to execute a task, the effect of protective clothing on performance, and the calculation of force and strength needed to accomplish a given job.
Not your ordinary virtual human, Santos™ is the first and most powerful example of an entirely new generation of realistic digital humans who can perform real-time tasks with remarkable autonomy and even learn from their mistakes.
“There are a variety of ways to analyze designs in a computer-aided design environment,” says VSR Senior Project Manager Steve Beck, “including finite element analysis; noise, vibration, and harshness; and aerodynamics. What’s missing is a way to analyze the human-in-the-loop in a computer-aided design (CAD) environment.”
So, for instance, researchers using CAD designs before Santos™ was born had to pose a virtual mannequin much like they would pose a jointed doll on a chair—one limb at a time. Next, human factors experts would consult a formidable ergonomic handbook to assess whether the computer mannequin appeared to be posed the way humans would pose in a similar scenario. Using this handbook, researchers could program these virtual mannequins to perform only simple movements such as bending an elbow or turning a wrist.
Santos™, however, boasts 109 degrees of freedom, which enables him to perform, predict, and assess a complex series of motions. He not only can carry out tasks in his virtual environment such as “Reach for the gear shift,” but he also can respond to questions such as, “Is this comfortable? Are you strong enough to lift that?” and even, “What should you do next?” Employing Santos™ allows researchers to study not simply movement but also choice.
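The article doesn’t describe Santos’s™ internal model, but the idea behind “degrees of freedom” can be sketched with a toy example: each joint angle is one degree of freedom, and a posture is just a set of angle values. Here is a hypothetical two-joint planar arm in Python (Santos™ has 109 such values; this sketch has 2, and all link lengths are invented):

```python
import math

def forward_kinematics(shoulder, elbow, upper=0.3, lower=0.25):
    """Return the hand position (x, y) in meters for the given joint
    angles in radians. Two angles = two degrees of freedom."""
    # Elbow position: rotate the upper arm about the shoulder.
    ex = upper * math.cos(shoulder)
    ey = upper * math.sin(shoulder)
    # Hand position: the forearm's angle adds to the shoulder's.
    hx = ex + lower * math.cos(shoulder + elbow)
    hy = ey + lower * math.sin(shoulder + elbow)
    return hx, hy

# Arm stretched straight out along the x-axis:
print(forward_kinematics(0.0, 0.0))
```

Posing an old-style mannequin meant setting each of these angles by hand; Santos™ instead searches over all of them at once to find a posture that completes the task.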
In his early months, however, Santos™ always chose the easy way, and, says Beck, performed some tasks in “goofy” ways. For instance, if you asked him to reach behind his back to grab a tool, he would stare straight ahead and reach directly backwards, which was reasonable if the only goal was to minimize discomfort. It was not, however, very realistic.
“Humans just don’t do that,” says Beck. “If you ask someone to reach behind him, he will first turn his body to look at whatever it is he’s been asked to retrieve.”
So VSR researchers instilled a variety of cost functions into Santos™, including potential energy, effort, comfort, joint displacement, visual acuity, and visual displacement. Now the researchers can ask the virtual human to perform and assess a task based on one or more of these cost functions.
“Of course,” says Beck, “if you highlight the potential energy function alone he might well do the task, but he’ll try to lie on the floor while doing it!”
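Beck’s point can be sketched as a weighted-sum choice among candidate postures: score each candidate on every cost function, multiply by the weights the researcher highlights, and pick the cheapest total. The candidate names and cost numbers below are invented for illustration; the actual VSR cost functions are not published here:

```python
# Hypothetical per-posture costs on a 0-1 scale (lower is better).
POSTURES = {
    "turn and reach":      {"potential_energy": 0.7, "joint_displacement": 0.3,
                            "discomfort": 0.2, "visual_displacement": 0.1},
    "reach straight back": {"potential_energy": 0.6, "joint_displacement": 0.1,
                            "discomfort": 0.4, "visual_displacement": 0.9},
    "lie on the floor":    {"potential_energy": 0.1, "joint_displacement": 0.9,
                            "discomfort": 0.8, "visual_displacement": 0.7},
}

def best_posture(weights):
    """Pick the posture minimizing the weighted sum of its cost terms."""
    def total(costs):
        return sum(weights.get(term, 0.0) * value
                   for term, value in costs.items())
    return min(POSTURES, key=lambda name: total(POSTURES[name]))

# Highlight potential energy alone and the "goofy" answer wins:
print(best_posture({"potential_energy": 1.0}))
# Balance all the cost functions and the human-like posture wins:
print(best_posture({"potential_energy": 1.0, "joint_displacement": 1.0,
                    "discomfort": 1.0, "visual_displacement": 1.0}))
```

With only potential energy weighted, lying on the floor is cheapest, exactly Beck’s joke; once the other cost functions carry weight, turning to look and reach comes out on top.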
Modeled on commercially available body scans created by Eyetronics, a company that provides digital images for movies, Santos™ is a disarmingly “human” model. CCAD researchers such as Applications Developer Chris Murphy have achieved this verisimilitude by fine-tuning the character rigging that creates and controls both the hierarchy of Santos’s™ joints and the polygons that make up his skin. He displays a remarkably sophisticated musculoskeletal structure, including skin that shrinks and stretches like human skin.
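Character rigging of the kind Murphy describes is typically built on a joint hierarchy: joints form a tree, skin vertices are bound to joints, and rotating any joint carries everything below it in the tree. A minimal, hypothetical sketch of such a hierarchy (joint names invented, and far simpler than a real rig):

```python
from dataclasses import dataclass, field

@dataclass
class Joint:
    """One node in a character rig's joint tree."""
    name: str
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

# A tiny slice of a humanoid hierarchy.
root = Joint("pelvis")
spine = root.add(Joint("spine"))
shoulder = spine.add(Joint("shoulder_r"))
shoulder.add(Joint("elbow_r"))

def chain(joint, target, path=()):
    """Path of joint names from this joint down to the named joint,
    or None if the target is not in this subtree."""
    path = path + (joint.name,)
    if joint.name == target:
        return path
    for child in joint.children:
        found = chain(child, target, path)
        if found:
            return found
    return None

# Everything on this path moves when "spine" rotates:
print(chain(root, "elbow_r"))
```

Skinning then attaches each polygon vertex to one or more of these joints, which is what lets the skin shrink and stretch as the skeleton underneath it moves.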
Like any good parents, the 37 members of the VSR team, including about 12 students, have big dreams for their offspring. One of their goals for Santos™ is to enable the researchers themselves—the real humans—to work alongside him in his own, virtual world. Once that threshold is crossed, humans and virtual humans will be able to team up and perform virtual tasks to help solve real-world problems.