Robots Autonomously Assemble IKEA Chair using Sensors & 3D Vision
Robot programming is getting easier and easier, with hand-guided “lead-through” path teaching, user-friendly plugins for grippers and other peripherals, and even programming training courses from the major manufacturers. But if you still can’t figure out how to write a motion program for your assembly task, you may not have to worry for long.
According to a recent paper in Science Robotics, researchers at Nanyang Technological University in Singapore have shown that commercial off-the-shelf robotic hardware, including Denso robots and Robotiq grippers, can assemble an IKEA chair outside of a factory setting – a feat of autonomous manipulation that has thus far been restricted to elementary tasks. Their findings highlight the ability of manufacturing robots to perform tasks that require human-like dexterity, suggesting they may soon be ready for use in a wider range of applications beyond the factory assembly line.
Though often second nature to humans, dexterity involves the mastery of several skills, including hand-eye coordination, the sensing of forces, and fine control of multiple simultaneous movements. In the study, researcher Francisco Suárez-Ruiz and colleagues presented randomly scattered chair parts to a setup of industrial robot arms equipped with parallel grippers, six-axis force-torque sensors at the wrists, and 3-D cameras.
Using visual and tactile cues, the robots successfully assembled the chair in around 20 minutes. Three capabilities in particular allowed them to complete the task: they quickly and reliably identified the correct parts in a randomly cluttered environment; they planned fast, collision-free motions to construct the chair; and they detected force changes as they manipulated the chair pieces (to verify that pins slid into the correct holes, for example). While substantial hand-coding was used to program the robots’ movements, the authors hope that combining these capabilities with advanced AI could lead to fully autonomous operation.
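To give a flavor of the force-sensing step, the sketch below shows one common way a wrist force-torque signal can confirm a pin insertion: the axial force spikes when the pin contacts the hole edge, then drops once the pin slides home. This is a minimal illustrative heuristic; the function name and threshold values are assumptions for demonstration, not details taken from the Nanyang study.

```python
def pin_seated(force_z_history, contact_threshold=5.0, settle_threshold=1.0):
    """Heuristic insertion check on a series of axial (z) force readings.

    A pin is judged seated when the force first spikes past
    contact_threshold (pin meets the hole edge) and later settles
    back below settle_threshold (pin has slid into place).
    Thresholds are illustrative, in newtons.
    """
    contact_seen = False
    for f in force_z_history:
        if not contact_seen and abs(f) > contact_threshold:
            contact_seen = True          # contact spike detected
        elif contact_seen and abs(f) < settle_threshold:
            return True                  # force settled: pin seated
    return False

# Example trace (newtons): free approach, contact spike, slide, seated.
trace = [0.2, 0.4, 6.5, 7.1, 3.0, 0.5]
print(pin_seated(trace))  # True
```

In practice a real system would filter the raw sensor signal and combine this check with position feedback, but the spike-then-settle pattern is the core cue the article describes.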