Robot developed by NTU scientists autonomously assembles IKEA chair

Scientists at Nanyang
Technological University (NTU) have developed
a robot that can autonomously assemble an IKEA chair without interruption. The team
from NTU’s School of Mechanical and Aerospace Engineering is led by Assistant Professor Pham Quang Cuong. The
results were published
in the journal Science Robotics (Volume 3, Issue 17, April 2018).

The research, which took three years, was supported by grants
from the Ministry of Education, NTU’s innovation and enterprise arm, NTUitive,
and the Singapore-MIT Alliance for Research & Technology (SMART). 

In a demonstration, the robot assembled IKEA’s Stefan
in 8 minutes and 55 seconds. Before starting the assembly, the robot spent
11 minutes and 21 seconds independently planning the motion pathways and 3
seconds locating the parts.

The team coded algorithms using three different open-source
libraries to help the robot complete its job of putting together the IKEA chair.

The ‘eyes’ and the ‘arms’

A 3D camera performs the function of the robot’s ‘eyes’, and
two industrial robotic arms, each capable of six-axis motion and equipped with
parallel grippers to pick up objects, act as its ‘arms’. Force sensors mounted
on the wrists determine how strongly the ‘fingers’ are gripping and how
powerfully they push objects into contact with each other.

Asst Prof Pham explained why the task of putting together an
IKEA chair is much more complex for a robot than it looks. The job of assembly,
which comes naturally to humans, has to be broken down into different steps. This
includes identifying the location of the different chair parts, the force
required to grip the parts, and ensuring that the robotic arms move without
colliding into each other.
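The decomposition described above can be sketched as a simple pipeline. Every function below is a hypothetical stub standing in for a far more involved subsystem; none of these names come from the NTU team’s actual codebase:

```python
# Bird's-eye sketch of the assembly pipeline: locate parts, decide
# grip forces, plan a collision-free motion. All stubs are invented
# for illustration only.
def locate_parts(scan):
    # 3D camera scan -> estimated (x, y) position of each chair part
    return {"frame": (0.30, 0.20), "side_panel": (0.80, 0.50)}

def required_grip_force(part):
    # heavier parts need a firmer (but not crushing) grip
    return {"frame": 20.0, "side_panel": 12.0}[part]  # newtons, assumed

def plan_collision_free_motion(poses):
    # ordered waypoints for both arms, checked for arm-arm collisions
    return ["reach", "grasp", "align", "insert"]

poses = locate_parts(scan=None)
forces = {part: required_grip_force(part) for part in poses}
plan = plan_collision_free_motion(poses)
print(len(poses), len(plan))  # 2 4
```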

The robot starts the assembly process by taking 3D photos of
the parts laid out on the floor to generate a map of the estimated positions of
the different parts. This scenario aims to replicate, as closely as possible,
the cluttered environment after humans unbox and prepare to put together a
build-it-yourself chair. The challenge here is to determine a sufficiently
precise localisation in a cluttered environment quickly and reliably.
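As a toy illustration of that localisation step, the sketch below clusters a simulated floor scan into per-part centroids with a greedy nearest-neighbour pass. The real system estimates full part poses from 3D camera data; the clustering radius, the synthetic scan, and all names here are assumptions for the example:

```python
import math
import random

def estimate_part_positions(scan, radius=0.2):
    # Greedy clustering: take a seed point, absorb every point within
    # `radius`, record the centroid, and repeat until no points remain.
    remaining = list(scan)
    centroids = []
    while remaining:
        seed = remaining[0]
        near = [p for p in remaining if math.dist(p, seed) < radius]
        remaining = [p for p in remaining if math.dist(p, seed) >= radius]
        centroids.append(tuple(sum(c) / len(near) for c in zip(*near)))
    return centroids

# Simulated scan: two chair parts lying on the floor (z ~ 0),
# each producing a small noisy blob of 3D points.
random.seed(0)
def blob(center, n=50, s=0.01):
    return [tuple(c + random.gauss(0, s) for c in center) for _ in range(n)]

scan = blob((0.3, 0.2, 0.0)) + blob((0.8, 0.5, 0.0))
positions = estimate_part_positions(scan)
print(len(positions))  # 2 parts found
```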

Using algorithms developed by the team, the robot plans a
two-handed motion that is fast and collision-free. The motion pathway needs to
be integrated with visual and tactile perception, grasping and execution.
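A minimal sketch of the planning idea, assuming each arm is reduced to a single rotating link and the joint space is coarsely discretised. The NTU team’s actual planner is far more capable; this only shows the principle of searching the combined joint space of both arms while rejecting states where they would collide:

```python
from collections import deque
import math

# Toy model: each arm is one unit-length link rotating about a fixed base.
BASES = [(0.0, 0.0), (1.0, 0.0)]
MIN_CLEARANCE = 0.3   # assumed safety margin between end-effectors
STEP = math.pi / 18   # 10-degree joint discretisation

def tip(base, angle):
    return (base[0] + math.cos(angle), base[1] + math.sin(angle))

def collision_free(a, b):
    return math.dist(tip(BASES[0], a), tip(BASES[1], b)) > MIN_CLEARANCE

def plan(start, goal, steps=36):
    # Breadth-first search over the discretised joint space of both
    # arms, moving one joint at a time and skipping colliding states.
    start = tuple(round(s / STEP) for s in start)
    goal = tuple(round(g / STEP) for g in goal)
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return [(a * STEP, b * STEP) for a, b in path]
        for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = ((state[0] + da) % steps, (state[1] + db) % steps)
            if nxt not in seen and collision_free(nxt[0] * STEP, nxt[1] * STEP):
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None  # no collision-free path at this resolution

path = plan((math.pi / 2, math.pi / 2), (0.0, math.pi))
print(path is not None)  # True
```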

It is challenging to regulate the amount of force exerted so
that the robotic arms are able to grip the pieces tightly and perform tasks
such as inserting wooden plugs. This is because industrial robots are designed
for precise positioning rather than force regulation.

The force sensors mounted on the wrists help to determine
the amount of force required, allowing the robot to precisely and consistently
detect holes by sliding the wooden plug on the surfaces of the work pieces, and
perform tight insertions.
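The sliding search can be illustrated with a toy one-dimensional model: while the plug drags along the surface the wrist sensor reads a roughly steady normal force, and a sharp drop marks the hole. The force values, threshold, and spacing below are assumptions for the sketch, not the team’s parameters:

```python
# Toy model of the sliding search: steady contact force while the plug
# drags on the surface, then a sharp drop when the tip falls into a hole.
CONTACT_FORCE = 5.0   # assumed nominal pressing force, newtons
DROP_THRESHOLD = 0.5  # fraction of nominal force that signals a hole

def find_hole(force_readings, spacing_mm=1.0):
    # Return the distance (mm) along the slide where the normal force
    # first drops below the threshold, or None if no hole was found.
    for i, f in enumerate(force_readings):
        if f < CONTACT_FORCE * DROP_THRESHOLD:
            return i * spacing_mm
    return None

# Simulated sweep: steady contact force, then a drop at the hole.
readings = [5.1, 4.9, 5.0, 5.2, 5.0, 1.2, 0.8, 5.0]
print(find_hole(readings))  # 5.0 -> hole detected 5 mm into the slide
```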

The team is looking to integrate more artificial intelligence
(AI) into the approach to make the robot more autonomous, so that it can learn the
different steps of assembling a chair through human demonstration or by reading
the instruction manual, or even from an image of the assembled product.


Exploring applications of autonomous dexterous manipulation

The NTU team of Asst Prof Pham, research fellow Dr Francisco
Suárez-Ruiz and alumnus Mr Zhou Xian believe that their robot could be of
greatest value in performing specific tasks with precision in industries where
tasks are varied and do not merit specialised machines or assembly lines.

The robot is being used to explore dexterous manipulation,
an area of robotics that requires precise control of forces and motions with
fingers or specialised robotic hands, so that the robot is more human-like in
its manipulation of objects. Until now, autonomous demonstrations of dexterous
manipulation have been limited to elementary tasks.

“One reason could be that complex manipulation tasks in
human environments require many different skills. This includes being able to
map the exact locations of the items, plan a collision-free motion path, and
control the amount of force required. On top of these skills, you have to be
able to manage the complex interactions between the robot and the environment.
The way we have built our robot, from the parallel grippers to the force
sensors on the wrists, all work towards manipulating objects in a way humans
would,” Asst Prof Pham explained.

The team is now working with companies to apply this form of
robotic manipulation to a range of industries: glass bonding, which could be
useful in the automotive industry, and drilling holes in metal components for
the aircraft manufacturing industry. All the components in the robotic setup can
be bought off the shelf and hence, cost is not an issue.
