
Augmented reality helps robots tackle complex jobs

Researchers in the ISE Department are using machine learning and augmented reality to train robots to paint intricate aerospace components that typically must be coated by hand.

Michael Groeber

Working with the Advanced Robotics for Manufacturing (ARM) Institute in Pittsburgh, ISE Associate Professor Mike Groeber is leading a $1 million award, matched by another $1 million in cost share from Ohio State. The project is a follow-on to an Air Force Research Lab (AFRL)-funded effort, also coordinated through the ARM Institute.

The assignment is to develop a way to (semi-)autonomously program the robotic coating process for aircraft components. The plane's flat surfaces can be handled autonomously with relative ease; components with more complex geometric shapes are harder. Consequently, Groeber says, those are typically hand-painted, which often causes delays.

“The challenge we run into is setting path plans for complex objects that actually yield quality results,” he says. 

By using machine learning and humans in-the-loop utilizing augmented reality, the study can “run fast-acting simulations on what the paint likely will look like under many conditions,” Groeber says. “Users can visualize and share natural feedback on the plan and the actual results, and the system can learn strategies for yielding quality results on complex geometries from that information.” 

At the completion of the initial project, a human will simulate painting a particular part and visualize the results through AR goggles. Changes can be made based on the visualized results, and once those results are acceptable, the path can be converted to a motion plan, which is then sent to the robot. “The simulations are fairly fast, but not instantaneous,” Groeber says. “And the human has to make a lot of decisions. It’s still a pretty laborious task.”
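The loop described above can be pictured in code. The sketch below is purely illustrative: every name (PathPlan, simulate_coating, human_accepts, to_motion_plan) and the toy coating model are invented for this example and are not the project's actual software.

```python
# Hypothetical sketch of the simulate -> review -> adjust -> motion-plan
# loop described above. All names and the toy model are illustrative.

from dataclasses import dataclass

@dataclass
class PathPlan:
    """Spray-gun waypoints over the part surface (simplified to 2D)."""
    waypoints: list
    standoff_mm: float = 200.0   # gun-to-surface distance

def simulate_coating(plan, nominal_um=50.0):
    """Fast surrogate for the paint simulation: predicted thickness at
    each waypoint. A closer standoff deposits more paint (crude proxy)."""
    deposit = nominal_um * (200.0 / plan.standoff_mm)
    return [deposit for _ in plan.waypoints]

def human_accepts(predicted, target_um=50.0, tol_um=5.0):
    """Stand-in for the AR review step: in the real system a person
    inspects the visualized result; here we just check a tolerance."""
    return all(abs(t - target_um) <= tol_um for t in predicted)

def to_motion_plan(plan):
    """Convert the accepted path into robot motion commands (stub)."""
    return [("MOVE_TO", wp) for wp in plan.waypoints]

# Iterate: simulate, "review", adjust, until the result is acceptable.
plan = PathPlan(waypoints=[(0, 0), (0, 1), (1, 1)], standoff_mm=260.0)
while not human_accepts(simulate_coating(plan)):
    plan.standoff_mm -= 10.0      # operator tweaks the plan and re-runs
motion = to_motion_plan(plan)
```

Each pass through the loop stands in for one simulate-and-review cycle; Groeber's point is precisely that a person currently drives every one of these iterations by hand.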

He notes that if a new component is even slightly different, “The system didn’t learn anything to apply to that new part. We need to be running simulations in the background and have the system constantly learning and training itself.” 

He says the to-be-developed process will take data from the simulations, but also will take guidance from the human. “There are lots of aspects of the process that are challenging to simulate in a useful amount of time,” he says. “We can use human intuition and tacit knowledge to fill in the information the simulation doesn’t account for.”  
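One way to picture "filling in what the simulation doesn't account for" is a learned correction term: keep the fast surrogate, then fit a residual from human-observed outcomes. The sketch below is a minimal illustration; the function names, the toy thickness model, and the data points are all invented for this example.

```python
# Illustrative only: a residual correction learned from human feedback,
# layered on top of a fast but imperfect simulation.

def surrogate_thickness(speed_mm_s):
    """Fast simulation: slower passes deposit more paint (toy model)."""
    return 10000.0 / speed_mm_s

# Human-observed thicknesses disagree with the surrogate by a roughly
# constant offset (e.g. overspray effects the simulation ignores).
# Pairs are (pass speed in mm/s, observed thickness in microns).
observations = [(100.0, 95.0), (125.0, 75.0), (200.0, 45.0)]

# Learn the average residual between observation and simulation.
residual = sum(obs - surrogate_thickness(s)
               for s, obs in observations) / len(observations)

def corrected_thickness(speed_mm_s):
    """Simulation plus the human-informed correction."""
    return surrogate_thickness(speed_mm_s) + residual
```

In this toy case every observation sits 5 microns below the simulation, so the learned residual simply shifts predictions down; a real system would capture richer, geometry-dependent corrections from the operator's feedback.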

A key is “to make the user interface natural and not cumbersome to use,” Groeber adds. 

The idea is for the system to learn over time and take on more autonomy, so that it refines itself. “It’s a true integrated system in every sense of the word,” he says.

 

Story by Nancy Richison

Category: Faculty