Abstract

This article presents our work on a research project whose objective is to enable a robot to perform pick-and-place and assembly tasks by intuitively teaching and programming robot trajectories from human demonstrations. Building on motion acquisition systems, we aim to develop a system capable of acquiring and analyzing the manipulation actions performed by an operator in order to extract their primitive and compound characteristics. A Leap Motion sensor and a 3D camera are used to acquire gestures and movements at different scales, as well as object positions. Classification algorithms and deep learning models are used to recognize gestures, and an expert system translates the recognized gestures into robot trajectories. The challenge in our case is the automation of tasks through artificial intelligence. ABB's YuMi robot is used to validate our solution.
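The abstract describes a pipeline of gesture classification followed by rule-based translation into robot primitives. As a minimal illustrative sketch only (the gesture names, feature vectors, and rule table below are all assumptions, not taken from the paper), this could be prototyped as a nearest-template classifier feeding an expert-system style lookup:

```python
import math

# Hypothetical gesture templates: feature vectors one might derive from
# Leap Motion hand data (values are purely illustrative).
TEMPLATES = {
    "pick":  [1.0, 0.0, 0.2],
    "place": [0.0, 1.0, 0.2],
    "screw": [0.5, 0.5, 1.0],
}

# Expert-system style rule table: recognized gesture -> robot primitives.
RULES = {
    "pick":  ("close_gripper", "lift"),
    "place": ("lower", "open_gripper"),
    "screw": ("rotate_tool",),
}

def classify(features):
    """Return the gesture whose template is nearest to the observation."""
    return min(TEMPLATES, key=lambda g: math.dist(features, TEMPLATES[g]))

def to_trajectory(features):
    """Map an observed gesture to a sequence of robot motion primitives."""
    return RULES[classify(features)]
```

In the actual system the classifier would be replaced by the trained deep learning models and the rule table by the expert system mentioned in the abstract; the sketch only shows how the two stages compose.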
