Automotive manufacturing has long benefited from the thoughtful deployment of robotics and automation, and for good reason. Some complex production processes are simply too difficult or too tedious to be completed safely and efficiently by human workers, so industrial automation technologies can add tremendous value for original equipment manufacturers and automotive parts suppliers alike. For example, leading global automotive parts manufacturer DENSO recently sought to automate a physically challenging and repetitive tote-handling task.
Instead of letting employees continue to load and unload large stacks of heavy totes to and from a paint booth (Figure 1), the company partnered with CapSen Robotics, which customized its 3D vision, motion planning, and control software to suit the task.
DENSO also partnered with systems integrator Invent Automation to develop a system that would streamline the process and allow employees to contribute on the plant floor in more valuable, less physically demanding ways. The activity contributed to DENSO’s broader efforts to complement human work with automation for improved team member and company performance.
Gone in (under) 60 seconds at DENSO
As part of DENSO’s operations, automotive parts are painted with a low gloss, charcoal-colored paint that visually smooths out the surface for a cleaner appearance while also helping to absorb light and reduce glare inside the vehicle.
In the previous setup, employees stacked six totes full of plastic automotive parts onto a conveyor. The totes were then presented to an operator, who manually removed the parts, loaded them into a paint booth station (Figure 2), and then packed them back into the totes. An employee would stack or de-stack a tote every 30 seconds, day in and day out.
This repetitive motion was identified as an automation opportunity, which would free up team members to perform more impactful work. However, layout constraints on the shop floor made traditional forms of automation difficult, leading to the creation of a compact and flexible system.
With the new system, an inbound conveyor transports totes of unpainted parts to a six-axis collaborative robot (Figure 3) with an Intel RealSense RGB-D (3D depth) camera attached to its end effector. The camera helps the robot visually identify the tote and measure its height for picking.
The CapSen PiC 2.0 software, running on an industrial PC with a graphics processing unit (GPU), plans the robot’s motion and enables it to locate, pick, and manipulate the tote and move it toward another conveyor headed to the paint booth station. There, the parts are unloaded, painted, put into an oven for curing, and inspected before being packed back into the totes and sent on the conveyor back toward the robot, which identifies each returning tote and places it onto an outbound conveyor.
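Stitched together, the flow of one tote through the cell looks roughly like the outline below. This is a hypothetical sketch of the control flow only, with invented function names standing in for the camera, planner, robot, and conveyor interfaces; it is not CapSen PiC 2.0 code.

```python
def handle_tote(camera, planner, robot, paint_line, outbound):
    """Illustrative control flow for one tote through the cell (hypothetical APIs)."""
    frame = camera.capture()                      # RGB-D image from the wrist-mounted camera
    tote = planner.locate_tote(frame)             # identify the tote and measure its height
    robot.follow(planner.plan_pick(tote))         # collision-free approach to the handle
    robot.grasp()                                 # end effector enters the tote handle
    robot.follow(planner.plan_place(paint_line))  # transfer toward the paint-booth conveyor

    painted = paint_line.wait_for_return()        # parts painted, cured, inspected, re-packed
    robot.follow(planner.plan_pick(painted))
    robot.follow(planner.plan_place(outbound))    # stack the tote on the outbound conveyor
```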
Partners address maneuverability, vision challenges
Precise movements along multiple axes are required for the robot to pick and place large totes without causing any collisions. The CapSen PiC 2.0 simulation environment allowed the team to test the robot’s maneuverability and ensure everything was reachable and that the robot would not collide with anything.
But when the team tested the real-world system, it discovered that totes began to bend when they were picked up by the robot. To compensate for the tote’s weight and to ensure fluid motion without collision within the compact cell, the partners worked to customize the system.
“We use AI in our software to enable robots to perform a variety of pick-and-place tasks, including random bin picking, machine tending, packaging, assembly, and tote handling,” said Jared Glover, CEO of CapSen Robotics. “For challenging applications, especially those where the simulation environment alone won’t suffice, we customize our AI algorithms and models based on data from the production task. This enables the software to optimize the motion of the robot to allow it to move freely without collisions and to transfer the tote within the cycle time.”
For DENSO’s tote-handling project, CapSen Robotics first trained its AI software to detect and localize the tote’s handle. The RGB-D camera captures images of the handle, and the software runs machine learning algorithms that detect and localize it, giving the software what it needs to plan the robot’s motion.
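For illustration only (this is not CapSen’s detection model or code), the sketch below shows the standard pinhole-camera step that turns a detected handle pixel and its depth reading into a 3D grasp point in the camera frame; the pixel coordinates and intrinsic values are made-up placeholders.

```python
import numpy as np

def pixel_to_camera_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with measured depth into a 3D point in the camera frame.

    Standard pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example (placeholder numbers): the detector reports the handle centroid at
# pixel (640, 360) with a depth reading of 0.85 m.
grasp_point = pixel_to_camera_point(
    u=640, v=360, depth_m=0.85,
    fx=910.0, fy=910.0, cx=640.0, cy=360.0,
)
print(grasp_point)  # [0. 0. 0.85] -> handle directly in front of the camera
```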
Once the robot’s end effector is placed into the handle, the weight of the tote must be identified so the robot doesn’t stall or make sudden jerking motions that could cause the tote to fall. Instead of using a scale, CapSen’s software analyzes the torque values reported by the robot’s motors and uses them to determine the weight for picking.
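A simplified way to picture that torque-based check (an assumed calculation for illustration, not the production algorithm) is to compare a joint’s torque before and after the grasp: in a static pose, the extra torque from a payload hanging at a horizontal distance r from the joint axis is roughly m·g·r, so the mass follows directly.

```python
G = 9.81  # gravitational acceleration, m/s^2

def estimate_payload_kg(torque_before_nm, torque_after_nm, lever_arm_m):
    """Estimate payload mass from the change in a joint's gravity torque.

    Assumes a static pose where the extra torque comes only from the payload
    acting at a horizontal lever arm of `lever_arm_m` from the joint axis.
    """
    delta = torque_after_nm - torque_before_nm
    return delta / (G * lever_arm_m)

# Example with made-up numbers: the shoulder joint reads 12.0 N*m empty and
# 41.4 N*m after the pick, with the tote 0.6 m out from the joint axis.
mass = estimate_payload_kg(12.0, 41.4, 0.6)
print(f"estimated payload: {mass:.1f} kg")  # ~5.0 kg
# A near-zero estimate would indicate a missed pick (the poka-yoke check).
```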
“Using the torque sensing of the robot to obtain the weight and of course confirm that the tote was picked up was a valuable poka-yoke that CapSen added to the system,” said Kevin Peek, production engineer at DENSO.
In addition, some machine vision challenges emerged during the design and installation of the system, but the partners worked together to prevent them from becoming an issue.
“Stacks of totes can also have up to six totes in them, so the system needed to ensure proper stacking,” according to Peek. “If there are only two totes, maybe it’s less of an issue, but if there are six totes stacked up and one is not nested properly, the whole stack could fall over and cause a big delay in the process.”
To solve this challenge, the system is designed so that once the robot sets a tote onto a stack, it backs away slightly to view the stack with the camera. Machine learning algorithms in the software look for a gap between two totes to confirm proper nesting, while the software also counts and verifies the correct number of totes in the stack.
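One simple way such a check could work (an illustrative assumption, not necessarily how CapSen implemented it) is to count totes from the measured stack height and flag the stack when the residual suggests a tote is perched on a rim instead of nesting.

```python
def verify_stack(stack_height_mm, nested_pitch_mm=180.0, tolerance_mm=10.0,
                 expected_count=None):
    """Count totes in a stack from its measured height and flag bad nesting.

    `nested_pitch_mm` is the height each properly nested tote adds to the
    stack (an assumed value here). A residual larger than `tolerance_mm`
    suggests a tote is sitting on another's rim instead of nesting.
    """
    count = round(stack_height_mm / nested_pitch_mm)
    residual = abs(stack_height_mm - count * nested_pitch_mm)
    nested_ok = residual <= tolerance_mm
    count_ok = expected_count is None or count == expected_count
    return count, nested_ok and count_ok

# Example with assumed dimensions: a six-tote stack measures 1112 mm,
# 32 mm taller than six nested totes should be, so it gets flagged.
print(verify_stack(1112.0, expected_count=6))  # (6, False)
```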
DENSO reports zero drops, zero missed picks
Ultimately, the versatility of the CapSen PiC 2.0 software led to its selection for this project, as it allows end users to make full use of the data coming from the different pieces of system hardware rather than working through native programming, as with a programmable logic controller (PLC). In this application, there is no PLC in the cabinet.
Everything is handled in the CapSen human-machine interface (HMI). Operators can use it to control machines, obtain robot calibration information, and visualize what the camera sees, among other features (Figure 4).
The user interface was designed so that non-technical employees can operate the system as part of their daily routines without the need for automation or engineering expertise.
The software is also hardware agnostic, which provided a specific benefit to DENSO. In fact, initial testing of the system involved mobile manipulation, where an autonomous mobile robot (AMR) carried the robot around the facility with the purpose of moving totes and other containers throughout the warehouse.
In this configuration, CapSen’s software controls the AMR, obtains coordinates from it, and sends correction instructions to line the AMR up with flow rack lanes.
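The correction itself can be pictured as a small 2D pose-error calculation, sketched below with assumed coordinate conventions and made-up function names rather than CapSen’s actual interface: the offset between the lane’s target pose and the AMR’s reported pose is expressed in the AMR’s own frame so it can be sent back as a nudge.

```python
import math

def lane_correction(amr_x, amr_y, amr_heading_rad, lane_x, lane_y, lane_heading_rad):
    """Compute the forward/lateral/heading correction that lines an AMR up
    with a flow-rack lane, expressed in the AMR's own body frame."""
    dx, dy = lane_x - amr_x, lane_y - amr_y
    cos_h, sin_h = math.cos(amr_heading_rad), math.sin(amr_heading_rad)
    forward = cos_h * dx + sin_h * dy    # distance to drive along the AMR's heading
    lateral = -sin_h * dx + cos_h * dy   # sideways offset to the lane centerline
    turn = lane_heading_rad - amr_heading_rad
    # Wrap the heading error into (-pi, pi] so the AMR takes the shorter turn.
    turn = math.atan2(math.sin(turn), math.cos(turn))
    return forward, lateral, turn

# Example with assumed coordinates: the AMR reports (2.00 m, 1.05 m, 0 rad) and
# the lane entry sits at (2.50 m, 1.00 m, 0 rad) -> drive 0.5 m forward,
# shift 0.05 m to the right, no rotation.
print(lane_correction(2.00, 1.05, 0.0, 2.50, 1.00, 0.0))
```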
While stacking and destacking totes is a common application in industrial environments, this system was designed with flexibility in mind, so that it can be expanded or retrofitted to other applications or made mobile with an AMR.
In addition, since the system was installed, the robot has not dropped one tote or missed one pick, removing both a physical and metaphorical weight from the shoulders of DENSO employees.
About the author
Bo Ridley is executive vice president at Knoxville, Tenn.-based Invent Automation. He previously worked as a production engineer and a machine designer at DENSO. This case study is posted with permission.