Elmqvist Part of $5M NSF Award to Develop Immersive Training for Manufacturing Jobs

A University of Maryland expert in human-computer interaction is part of a multi-institutional project to develop innovative workforce training for the manufacturing sector using augmented reality (AR), virtual reality (VR) and extended reality (XR).

Niklas Elmqvist, a professor in the College of Information Studies (iSchool) with an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS), is one of five researchers tasked with developing an AR/VR/XR training program called Skill-XR.

The project, which will use immersive technologies to efficiently and affordably train manufacturing workers, is being funded by a two-year, $5 million grant from the National Science Foundation (NSF).

“The novelty with Skill-XR is that we are using the new generation of XR technology—augmented, virtual, and extended reality hardware—to put the trainee into the shoes of the trainer, enabling an entirely new and more scalable form of workforce education,” says Elmqvist.

XR refers to systems in which virtual or augmented reality extends to interface with the user’s physical environment. For example, a factory worker could use AR glasses to see directions overlaid on a machine explaining how to operate its controls.

The award comes at a vital moment, the researchers say, as the nation recovers from the COVID-19 pandemic. The need for remote training has never been greater, and retraining and upskilling are necessary steps toward economic sustainability.

Elmqvist, the director of the university’s Human-Computer Interaction Lab (HCIL), will be responsible for the visualization and analytics of large amounts of real-time performance data collected from AR training sessions.

The traditional one-on-one apprenticeship model is very costly, he explains, because the trainer has to take the time to train each worker individually for complex manufacturing and assembly tasks.

With the Skill-XR approach, by contrast, the trainer can remain engaged in their everyday productivity tasks on the factory floor while multiple trainees simultaneously view immersive, pre-recorded training sessions on smartphones or devices like the HoloLens 2.

Moreover, the program will give workers instant feedback as they learn to perform each task correctly, ensuring that they are trained quickly, accurately and safely.

Karthik Ramani, the Donald W. Feddersen Distinguished Professor in Mechanical Engineering at Purdue University and the project’s lead PI, sees immense potential for Skill-XR beyond the manufacturing sector.

“In this generation, YouTube videos have become stand-in educators. People turn to YouTube to figure out how to tie a tie, change a tire or cook a meal,” Ramani says. “We want to push that concept into the next dimension, so that it’s no longer just a 2D video on a screen but an augmented reality experience that actually responds and gives you feedback.”

In addition to Elmqvist and Ramani, other members of the NSF-funded project include Alex Quinn, assistant professor of electrical and computer engineering at Purdue University; Thomas Reddick, associate professor of cognitive psychology at Purdue University; and Kylie Peppler, associate professor of informatics at the University of California, Irvine.

The funding for Skill-XR comes from the NSF’s Convergence Accelerator program, which aims to accelerate research that has the potential for significant societal impact on a national scale.


Original news story written by Maria Herd

November 25, 2020

