Dr. Ryan P. McMahan

A prestigious grant from the National Science Foundation (NSF) will support a UT Dallas computer scientist in exploring new ways that virtual reality can help companies improve their training programs and, ultimately, save lives.

Dr. Ryan McMahan, assistant professor in the Erik Jonsson School of Engineering and Computer Science and the School of Arts, Technology, and Emerging Communication, recently was awarded a $544,000, five-year grant to pursue a novel approach to workplace training using virtual reality (VR) technologies. The NSF Faculty Early Career Development (CAREER) Program provides support for junior faculty who have demonstrated outstanding research and teaching skills.

“My approach is new, which is why I think I received the CAREER award,” McMahan said. “My argument is that virtual reality will never be as realistic as the real world. But there are things we can do in VR that you can’t do in the real world — things that can improve the training process.”

McMahan is researching one approach in which the virtual environment renders unimportant or irrelevant information at lower fidelity while rendering important training information at high fidelity. This effectively directs the trainee’s attention to the key virtual objects relevant to the current training step.
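As a rough illustration of that selective-fidelity idea (a sketch, not McMahan’s actual implementation), the core logic amounts to tagging each virtual object with its relevance to the current training step and choosing a level of detail accordingly. The Python example below assumes hypothetical names such as SceneObject and render_scene and a simple high/low fidelity switch.

```python
# A minimal sketch of selective-fidelity rendering. All names here are
# illustrative assumptions, not part of any specific VR engine.

from dataclasses import dataclass


@dataclass
class SceneObject:
    name: str
    relevant_to_step: bool  # flagged by the training script for the current step


def fidelity_for(obj: SceneObject) -> str:
    """Render step-relevant objects at high fidelity, everything else at low fidelity."""
    return "high" if obj.relevant_to_step else "low"


def render_scene(scene: list[SceneObject], step: int) -> None:
    for obj in scene:
        level = fidelity_for(obj)
        # In a real engine this would select mesh detail, texture resolution,
        # lighting quality, etc.; here we simply report the chosen level.
        print(f"step {step}: rendering {obj.name} at {level} fidelity")


if __name__ == "__main__":
    operating_room = [
        SceneObject("scalpel tray", relevant_to_step=True),
        SceneObject("wall clock", relevant_to_step=False),
        SceneObject("surgical gloves", relevant_to_step=True),
    ]
    render_scene(operating_room, step=1)
```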

Another training method McMahan is investigating uses a cause-and-effect technique called time warping: when a trainee makes a mistake, the system fast-forwards the simulation to show the consequence of that mistake.

Take, for example, a training scenario on preparing an operating room. The VR user might touch a sterile tool with a non-sterile hand, contaminating the sterile field and compromising the entire surgery.

“If we fast-forward the simulation, you’ll see the patient being brought in, you’ll see the surgery begin and then you’ll see how that contamination spreads to the patient,” McMahan said. “Then we’ll rewind back just before your mistake and let you fix your mistake. We’re really highlighting the cause and effect of the different things you should be focusing on.”
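In code, the time-warping behavior McMahan describes could be approximated by snapshotting the simulation state at each step, playing future events forward to reveal the consequence of an error, and then restoring the snapshot taken just before the mistake. The Python sketch below is only an illustration under those assumptions; class names such as TimeWarpSimulation and the operating-room events are hypothetical, not details of McMahan’s system.

```python
# A minimal sketch of the time-warping idea: record simulation snapshots,
# fast-forward to show a mistake's consequence, then rewind to just before it.

import copy


class TimeWarpSimulation:
    def __init__(self, initial_state: dict):
        self.state = initial_state
        self.history = [copy.deepcopy(initial_state)]  # one snapshot per step

    def step(self, event: str) -> None:
        """Advance the simulation by one event and snapshot the result."""
        if event == "touch_sterile_tool_with_bare_hand":
            self.state["contaminated"] = True
        if self.state.get("contaminated") and event == "begin_surgery":
            self.state["patient_infected"] = True
        self.history.append(copy.deepcopy(self.state))

    def fast_forward(self, future_events: list[str]) -> dict:
        """Play out later events so the trainee sees the consequence of the mistake."""
        for event in future_events:
            self.step(event)
        return self.state

    def rewind_to(self, step_index: int) -> dict:
        """Return to an earlier snapshot so the trainee can fix the mistake."""
        self.state = copy.deepcopy(self.history[step_index])
        self.history = self.history[: step_index + 1]
        return self.state


# Usage: the trainee contaminates a tool, sees the infected patient, then rewinds.
sim = TimeWarpSimulation({"contaminated": False})
sim.step("touch_sterile_tool_with_bare_hand")                    # the mistake
print(sim.fast_forward(["bring_in_patient", "begin_surgery"]))   # show the effect
print(sim.rewind_to(0))                                          # back to before the mistake
```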

Variations on the training research also will involve purposely introducing errors to test the breadth of a trainee’s knowledge, requiring trainees to recall necessary objects before they appear within the virtual environment, and accepting only correct physical movements to execute training tasks, even though real-world physics would allow a wider range of motions.
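The last of those variations, accepting only correct physical movements, can be pictured as comparing tracked hand positions against a prescribed motion path and rejecting anything outside a tolerance. The Python sketch below is a hypothetical illustration; the required path, the 5 cm tolerance, and the function names are assumptions, not details from the study.

```python
# A minimal sketch of accepting only correct physical movements, assuming hand
# positions are tracked as 3-D points sampled along a required path.

import math


def movement_is_valid(tracked_path, required_path, tolerance=0.05) -> bool:
    """Accept the motion only if every tracked sample stays within
    `tolerance` meters of the corresponding required sample."""
    if len(tracked_path) != len(required_path):
        return False
    return all(math.dist(t, r) <= tolerance for t, r in zip(tracked_path, required_path))


# A sloppy motion that real-world physics would allow is rejected here,
# forcing the trainee to execute the prescribed movement.
required = [(0.0, 1.0, 0.5), (0.1, 1.0, 0.5), (0.2, 1.0, 0.5)]
sloppy = [(0.0, 1.0, 0.5), (0.1, 1.2, 0.5), (0.2, 1.0, 0.5)]
print(movement_is_valid(sloppy, required))    # False: second sample is 0.2 m off
print(movement_is_valid(required, required))  # True
```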

My argument is that virtual reality will never be as realistic as the real world. But there are things we can do in VR that you can’t do in the real world — things that can improve the training process

Dr. Ryan McMahan,
assistant professor in the Erik Jonsson School of Engineering and Computer Science and the School of Arts, Technology, and Emerging Communication

The VR training is intended to improve both cognitive and psychomotor skills.

Through another research grant, McMahan has been collaborating with Intuitive Surgical, the company that produces the da Vinci surgical robot, to develop training solutions for robot-assisted operating room teams. He also collaborates with the National Institute for Occupational Safety and Health (NIOSH) on pre-shift inspections of off-highway trucks. McMahan will center his training research on these two areas.

“We focused on two domains so that we could demonstrate that our techniques can be applied to virtually any workplace situation,” McMahan said.

McMahan said his hope is that workers ultimately will learn more from the virtual reality training than from real-world exercises and that the VR training will be more efficient.

“If you can cut down on the time required to train people and, at the same time, improve the efficiency or the effectiveness of those trainings, then companies can save time and money while reducing injuries and deaths,” McMahan said. “We think we can positively impact a lot of industries in one fell swoop.”

Virtual reality provides a computer-generated simulation of a three-dimensional environment in which a person interacts in a seemingly real or physical way, typically by using special electronic equipment, such as a helmet with a screen inside or gloves fitted with sensors.

McMahan is the 12th CAREER award holder in the Department of Computer Science. Other current and past recipients include Dr. Alvaro Cárdenas, Dr. Xiaohu Guo, Dr. Kevin Hamlen, Dr. Jason Jue, Dr. Murat Kantarcioglu, Dr. Zhiqiang Lin, Dr. Yang Liu, Dr. Andrian Marcus, Dr. Ravi Prakash, Dr. Balakrishnan Prabhakaran and Dr. Edwin Sha.