With the help of a 3D vision system and robotic arms from FANUC, MetalQuest was able to automate a physically challenging bin-picking task.
“Bin picking is basically the Holy Grail of robotic automation,” says Scott Harms, president of MetalQuest Unlimited Inc. For a human, bin picking is often non-ergonomic, especially when parts are heavy, leading to carpal tunnel syndrome and other repetitive-use injuries. It also tends to be dull work. “People that are doing it can get burned out relatively rapidly,” he adds. However, the task has proved challenging to automate because it requires 3D vision. And according to Harms, “Vision is the ultimate complexity when it comes to robotics.”
Harms’ company, MetalQuest, is not only a manufacturer of precision machined components with locations in Nebraska and Idaho, but also an authorized internal FANUC integrator. The shop recently tackled the challenge of automating bin picking head-on by implementing its first 3D vision system from FANUC. According to Harms, integrating an automation cell with 3D vision is “more than twice as complex” as integrating one with a 2D vision system, but automating this task has been beneficial to the company’s culture.
Casting call
The part the new cell makes starts with a casting. “These parts are literally dumped into a box at the foundry,” Harms explains. When they arrive, they are all at different angles in the box, slanted in different ways. Traditionally, getting these castings into an automation cell would require an operator to lift each eight-pound casting out of the box and place it onto a tray or conveyor so a robot could pick it up and load it into a machine tool. The castings are not symmetrical, which adds further complexity: they typically need to be oriented in a certain way for a robot to pick them up and load them into the machine.
A 3D vision system enables a robot to pick up castings directly from this box. Now, the operator pushes a box of castings to each of the cell’s two stations. A 3D vision system mounted on a gantry scans one of the boxes and tells a nearby robot where the castings are located and how they are oriented. If the vision system cannot identify any castings oriented in a way the robot can grab, it tells the robot to run what’s called a “stir routine,” in which the robot moves the parts around in the box until some of them land in an orientation it can grab. The system works through one box at a time; when that box is empty, an air cylinder slides the vision system over to the other box and the process repeats. When both boxes are empty, an alarm sounds, notifying operators to bring over more boxes of castings.
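To make that sequence concrete, here is a minimal sketch of the supervisory loop in Python. The function names (scan_box, stir_routine, pick_and_load), the simple data model and the random stir outcome are illustrative assumptions, not FANUC’s actual programming interface:

```python
import random

def scan_box(box):
    """Stand-in for the 3D scan: return the castings the robot could grab as-is."""
    return [c for c in box if c["graspable"]]

def stir_routine(box):
    """Nudge the parts around so some land in a graspable orientation."""
    for c in box:
        c["graspable"] = random.random() < 0.5

def pick_and_load(casting):
    print(f"picking casting at {casting['position']}")

def run_cell(boxes):
    for box in boxes:                      # an air cylinder slides the scanner to each box in turn
        while box:
            graspable = scan_box(box)
            if not graspable:
                stir_routine(box)          # nothing pickable: stir, then rescan
                continue
            target = graspable[0]
            pick_and_load(target)
            box.remove(target)
    print("both boxes empty - alarm sounds for more castings")

if __name__ == "__main__":
    def make_box(n):
        return [{"position": (i, 0), "graspable": random.random() < 0.5} for i in range(n)]
    run_cell([make_box(3), make_box(3)])
```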
After the robot has picked up a casting, the 3D vision system verifies the part and gives the robot more information on the casting’s orientation and how it was picked up. The top and bottom of the casting are not symmetrical, and it needs to be oriented correctly when it goes into the machine tool, so the robot sets the casting in a regrip station with a 2D vision system. This system tells the robot which side of the casting is on top and reorients it if necessary.
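The regrip decision itself is simple to express. The sketch below assumes the 2D camera reports which side is facing up; the flip and place moves are hypothetical stand-ins for the robot’s actual motions:

```python
def flip_casting():
    print("flipping casting so the correct side faces up")

def place_in_load_station():
    print("setting casting into the load station")

def regrip(side_facing_up: str):
    """side_facing_up is what the 2D camera at the regrip station reports."""
    if side_facing_up != "top":
        flip_casting()
    place_in_load_station()

regrip("bottom")   # casting arrived upside down: flip it, then stage it for loading
regrip("top")      # already correct: stage it directly
```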
The robot then sets the casting into a load station, where the second and third robots can pick it up and perform foreign object detection. These castings are made using cores, so there is a risk of material such as excess cast iron or deburring media getting trapped inside. “We have got these very expensive tools that go into these ports and it's really important that those holes be there,” Harms explains. “When they are not, very bad things happen.” The robots have fingers that go inside the casting and confirm it is clear. If a robot does find debris in a casting, it sets that casting off to the side and moves on to the next one, though Harms notes that this doesn’t happen often. The second and third robots load the castings into one of three Okuma HMCs and unload them when the machining cycle is finished, placing them onto a rack where operators prepare the finished parts for shipping.
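A rough sketch of that inspect-then-load logic follows, again with assumed names (ports_are_clear, the round-robin machine assignment) rather than MetalQuest’s real control code:

```python
from itertools import cycle

MACHINES = cycle(["HMC 1", "HMC 2", "HMC 3"])   # the cell feeds three Okuma HMCs

def ports_are_clear(casting) -> bool:
    """Stand-in for the robot 'fingers' probing the cored ports for trapped material."""
    return not casting.get("debris", False)

def process_casting(casting):
    if not ports_are_clear(casting):
        print("debris detected - setting casting aside")   # rare, per Harms
        return
    machine = next(MACHINES)
    print(f"loading casting into {machine}")
    # ... machining cycle runs ...
    print(f"unloading finished part from {machine} onto the shipping rack")

process_casting({"debris": False})
process_casting({"debris": True})
process_casting({"debris": False})
```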
Adding another dimension
According to Harms, implementing the 3D vision system was significantly more challenging than implementing a 2D vision system, which MetalQuest has extensive experience with (in fact, it was a component of the first cell the company implemented in 2011).
To prepare for this new technology, the company sent several employees to FANUC headquarters for training on implementing 3D vision systems. Employees also learned a lot on their own. “There was a lot of trial and error involved,” Harms notes.
The extra dimension of vision also makes the robot’s movements far more complex. Harms points out that when a robot picks parts off a tray or conveyor, the Z-height the robot travels to is fixed. When picking parts out of a box, the Z-height changes as the box empties. Furthermore, the robot has to figure out how to work around other parts in the box and reach all four corners. “It has to recognize the surfaces that it is programmed to pick off of,” he explains, “and then it has to articulate in a way to make its end effector parallel to the surfaces it needs to pick off of.” End effectors also have different considerations in 3D vision applications than in 2D vision applications. Harms says that end effectors for 2D vision systems are typically basic and straightforward, while 3D vision systems need end effectors that are more streamlined and can maneuver around other objects (in this case, the other castings in the box). The robot with the 3D vision system uses a magnetic end effector from Magswitch.
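One way to picture what that articulation involves: given a pick point and surface normal from the 3D scan, the controller has to produce an approach pose whose tool axis is square to that surface, at whatever Z-height the part happens to sit. The NumPy sketch below is a generic illustration of that geometry, not FANUC’s implementation; the pose convention and standoff value are assumptions:

```python
import numpy as np

def approach_pose(surface_point, surface_normal, standoff=0.05):
    """Return (position, rotation matrix) for a pick approach along the surface normal."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    z_axis = -n                                   # tool axis points into the surface
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, z_axis)) > 0.9:            # avoid a degenerate cross product
        ref = np.array([0.0, 1.0, 0.0])
    x_axis = np.cross(ref, z_axis)
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)
    rotation = np.column_stack([x_axis, y_axis, z_axis])
    position = np.asarray(surface_point, dtype=float) + standoff * n   # hover above the pick point
    return position, rotation

# Example: a casting face tilted in the box. Note that the Z-height comes from the
# scan itself, so it can change as the box empties.
pos, rot = approach_pose(surface_point=[0.2, 0.1, 0.35], surface_normal=[0.1, 0.0, 1.0])
print(pos)
print(rot)
```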
Running lights out, but not actually
According to Harms, this cell can run for hours unattended. “We designed this thing to run a full shift without user intervention,” Harms says. The cell even runs lights-out, though not in the literal sense of the term. “You literally leave the lights on,” he explains. MetalQuest has LED lights on its shop floor, which emit certain light patterns. These light patterns need to stay consistent for the 3D vision system to work. “Otherwise you're screwing up the eyes of the vision systems if that makes sense,” he says.
This consistent extra run time makes up for the fact that the cell actually moves more slowly than a human. The cell keeps running when humans would need to take breaks and go home at the end of the shift.
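A back-of-the-envelope comparison shows why that trade-off works. Every number here is hypothetical (the article gives no cycle times); the point is only that steady unattended hours can beat raw speed:

```python
human_cycle_min = 3.0    # assumed minutes per part when a person tends the machines
robot_cycle_min = 4.0    # assumed minutes per part for the (slower) cell
human_hours     = 7.0    # one shift, minus breaks
robot_hours     = 22.0   # near-continuous running across shifts

print(human_hours * 60 / human_cycle_min)   # 140 parts per day
print(robot_hours * 60 / robot_cycle_min)   # 330 parts per day
```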
Process control
As with any automation cell, process control plays a key role in its success, and Harms notes that everything from fixturing to chip control needs to be taken into account.
One aspect Harms highlights is tooling. Because this cell deals with castings that might have foreign objects inside that could potentially damage a cutting tool or a machine, it’s vital for the shop to detect abnormalities in cutting tool forces and stop the machine if it detects something unusual. “You have to use quality cutting tools and you have to get consistent life out of them,” he says. Additionally, the shop uses tool management software and tool breakage detection on its machines. MetalQuest uses Tool Life Management and Cutting Status Monitor from its machine tool supplier, Okuma.
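In spirit, load-based breakage detection compares what the spindle or axes see during a cut against a known-good baseline and stops the machine when a reading strays too far. The sketch below is a generic illustration of that idea with assumed thresholds; it is not how Okuma’s Tool Life Management or Cutting Status Monitor are implemented:

```python
def check_tool_load(samples, baseline, upper_factor=1.3, lower_factor=0.5):
    """Flag a cycle if load strays far above the baseline (jammed or chipped tool)
    or far below it (broken or missing tool cutting air)."""
    for load in samples:
        if load > baseline * upper_factor or load < baseline * lower_factor:
            return False   # abnormal: the controller would feed-hold and alarm here
    return True

normal_cycle = [0.9, 1.0, 1.1, 1.0]
broken_tool  = [1.0, 0.2, 0.1, 0.1]   # load collapses after the tool breaks
print(check_tool_load(normal_cycle, baseline=1.0))   # True
print(check_tool_load(broken_tool, baseline=1.0))    # False
```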
Automation impacts
Harms says the biggest impact of this cell has been on its employees, who are now freed from a physically challenging and mentally monotonous task. “Culture is so very important to us,” he says, and automation plays a key role in the culture at MetalQuest. “The people we have typically embrace something like this because it’s a job that they don’t really want to do and it’s being automated,” he adds. However, Harms also cautions that companies should never approach automation with the intention of replacing people. Instead, companies should look for tasks that will make employees’ jobs more enjoyable. “We have just changed the roles and responsibilities of our people,” he explains. “Instead of loading the machine itself, they are making sure the robot has what it needs to do its job and the entire process itself is functioning as it’s supposed to be.”
Seeing into the future
According to Harms, 3D vision and robotics are a bit of a work in progress. For now, the technology is best suited to high-volume work. “There’s just too much refinement and there’s too much process you have to put in place,” he explains. “Everything is down to pixels, so the system is looking at all these pixels and trying to translate them into known shapes, and then it has to convert that to geometric data the robot can use.” The time and effort required to set up the system’s hardware and software is prohibitive for lower volumes. But Harms expects that won’t be the case indefinitely. “It’s definitely an evolving technology,” he adds.
JULIA HIDER