Autonomous Robotic Arm with Human Interaction Capability

Giam, Kee Teck (2020) Autonomous Robotic Arm with Human Interaction Capability. Final Year Project (Bachelor), Tunku Abdul Rahman University College.

Abstract

Although a robotic arm can flexibly change its handling task through reprogramming, its adaptability to the environment is still limited. In many areas of robotics application, it still relies on fixed locations, which leads to greater investment and time in designing fixed stations for industrial robot applications. In addition, most industrial robots are not characterised by human interaction because the technology is not yet mature. For example, gesture-based robot programming has been studied in recent years but is still not applicable to industrial robots for human-robot interaction, because gesture control causes the robot arm to lose its ability to move precisely. Moreover, training image-based gesture recognition is time-consuming due to webcam noise, and seamlessly integrating machine vision (a camera) and gesture recognition with a robotic system is difficult.

To solve these problems, new gesture control methods can be applied to robotic systems with machine vision. The main challenge in developing such a robotic system is to accurately recognise gestures and convert them into appropriate robot control instructions, because the shape and duration of a gesture vary dynamically. For robotic arms with machine vision, increasing flexibility and reducing the need for stationary workstations also helps address the accuracy issues caused by gesture control. The proposed system requires the user to wear a glove for gesture recognition to ensure accuracy, while the shape of the target object is limited by the gripper design. For gesture recognition, a single camera (webcam) feeding a CNN is developed and integrated with a robotic arm in the simulation software V-REP. To ensure precise movement of the robotic arm under gesture control, a GUI for robot speed control and machine vision for guiding the robotic arm are provided. Users can move the robotic arm in any direction with gestures and program simple pick-and-place tasks. As a further improvement, a single gesture representing a series of robot instructions was tested: for example, when a pick gesture is shown to the webcam, the robotic arm automatically picks the target object assigned by the gesture.

Based on the results of this thesis, this new high-level robot programming using gestures is more intuitive and user-friendly. However, handedness may affect the accuracy of gesture recognition, because the data used to train the CNN came from a single left-handed user. This research also evaluates CNN-based gesture recognition on a webcam and compares it with methods proposed in previous research. Overall, the proposed system successfully achieved the objectives of this research. However, since only a limited gesture vocabulary was trained, high-level robot programming is still restricted to pick-and-place applications; expanding the gesture vocabulary would increase the range of tasks the gesture-controlled robot can handle. The accuracy of gesture recognition could be further improved by using RGB-D cameras to filter out noise or by enlarging the CNN dataset.
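The full implementation is in the restricted thesis text. As a rough illustration of the kind of webcam-based CNN gesture classifier the abstract describes, a minimal sketch in Python/Keras might look like the following; the 64x64 input resolution, layer widths, and five-gesture vocabulary are illustrative assumptions, not the author's actual architecture:

    # Minimal sketch of a CNN gesture classifier.
    # Input size, layer widths, and gesture vocabulary are assumptions.
    import tensorflow as tf

    NUM_GESTURES = 5  # e.g. up, down, left, right, pick -- assumed vocabulary

    def build_gesture_cnn(input_shape=(64, 64, 3), num_classes=NUM_GESTURES):
        """Small convolutional network for glove-based gesture frames."""
        return tf.keras.Sequential([
            tf.keras.layers.Conv2D(16, 3, activation="relu",
                                   input_shape=input_shape),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(32, 3, activation="relu"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(num_classes, activation="softmax"),
        ])

    model = build_gesture_cnn()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_frames, train_labels, epochs=10)  # webcam frame dataset

Training on frames of a gloved hand, as the thesis does, simplifies segmentation and reduces the webcam noise problem mentioned above, at the cost of requiring the user to wear the glove.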
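Likewise, the mapping from a recognised gesture to robot instructions in V-REP can be sketched with V-REP's legacy Python remote API. The joint name 'joint1', the port number, and the per-gesture step size below are hypothetical placeholders, not values from the thesis:

    # Sketch of driving a V-REP arm joint from a recognised gesture via the
    # legacy remote API (joint name, port, and step size are assumptions).
    import vrep  # V-REP legacy remote API bindings (vrep.py)

    STEP = 0.05  # radians per gesture command -- assumed speed setting

    client_id = vrep.simxStart('127.0.0.1', 19999, True, True, 5000, 5)
    if client_id == -1:
        raise RuntimeError("Could not connect to the V-REP remote API server")

    _, joint = vrep.simxGetObjectHandle(client_id, 'joint1',
                                        vrep.simx_opmode_blocking)
    target = 0.0

    def apply_gesture(gesture):
        """Translate a classified gesture label into a joint target update."""
        global target
        if gesture == 'left':
            target += STEP
        elif gesture == 'right':
            target -= STEP
        vrep.simxSetJointTargetPosition(client_id, joint, target,
                                        vrep.simx_opmode_oneshot)

    apply_gesture('left')  # nudge the joint one step
    vrep.simxFinish(client_id)

A "pick" gesture of the kind tested in the thesis would, in the same spirit, expand into a sequence of such target updates plus gripper commands, which is what makes single-gesture high-level programming possible.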

Item Type: Final Year Project
Subjects: Technology > Mechanical engineering and machinery
Technology > Electrical engineering. Electronics engineering
Technology > Mechanical engineering and machinery > Robotics
Faculties: Faculty of Engineering and Technology > Bachelor of Mechatronics Engineering with Honours
Depositing User: Library Staff
Date Deposited: 24 Apr 2020 16:05
Last Modified: 01 Oct 2020 07:36
URI: https://eprints.tarc.edu.my/id/eprint/14287