Beh, Jay Min (2024) Motion Control of Robot Using Perception Robot towards Human. Final Year Project (Bachelor), Tunku Abdul Rahman University of Management and Technology.
Abstract
With the move into Industry 4.0, human-robot collaboration is viewed as pivotal in modern manufacturing. The robot arm is one of the most basic and crucial components of today's automation industry, and industry places ever greater demands on its safety and efficiency. In this thesis, the different ways a robot can perceive a human are discussed, and facial expression recognition is chosen as the most suitable method. The thesis focuses on applying a Facial Expression Recognition (FER) system to a robotic arm, with the intention of achieving human-robot collaboration. The proposed robot collaborates with the operator by sensing the operator's facial expressions; the robotic arm can determine whether or not the operator is in distress. This allows the robotic arm to react in time to prevent mistakes made by the operator during machine operation; in other words, the FER system is used as a collision-avoidance system. The programmed route is changed or modified to avoid the mistake, such as an impending collision of the robotic arm with other objects or with itself. Using the studied FER algorithms, data on the different facial expressions are collected and categorised for training and testing. DeepFace demonstrates high accuracy in recognising "Happy" facial expressions, with a precision of approximately 91%. "Neutral" expressions show the next highest accuracy at 80%, followed by "Surprise" and "Angry" expressions at 76%, and "Sad" expressions at 66%. However, DeepFace exhibits lower accuracy in detecting "Fear" expressions, at around 20%, and the lowest accuracy in identifying "Disgust" expressions, at approximately 18%. When a facial expression within a certain range is detected, the FER system alerts the robot, which then adjusts its trajectory if the detected expression indicates distress or surprise, thus preventing an impending collision due to human error. The positional error of the robot arm is measured and calculated to be less than 1%. Two sets of experiments are conducted in this thesis: the robot arm has proven its ability to move according to the defined emotions, and it is able to carry out active collision avoidance. The goal of the project is the integration of image processing algorithms, in this case facial expression recognition, into an example of an industrial application such as a robotic arm, allowing deeper human-robot collaboration to take place effectively.
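To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch of a distress-triggered alerting loop using the open-source DeepFace package and OpenCV. The set of expressions treated as distress and the `send_stop_signal` robot interface are hypothetical illustrations, not taken from the thesis; the actual implementation and thresholds are described in the full text.

```python
# Minimal sketch: per-frame facial expression recognition with DeepFace,
# raising a (hypothetical) stop/re-route signal when a distress expression
# is detected. Assumes an operator-facing webcam at device index 0.
import cv2
from deepface import DeepFace

# Expressions treated as distress (an assumption for this sketch).
DISTRESS = {"fear", "surprise", "angry"}

def send_stop_signal():
    # Placeholder for the robot-arm trajectory-adjustment command.
    print("Distress detected: adjusting robot trajectory")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # DeepFace.analyze returns a list of results, one per detected face;
    # enforce_detection=False avoids raising when no face is in the frame.
    results = DeepFace.analyze(frame, actions=["emotion"],
                               enforce_detection=False)
    if results:
        emotion = results[0]["dominant_emotion"]
        if emotion in DISTRESS:
            send_stop_signal()
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```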
| Item Type: | Final Year Project |
| --- | --- |
| Subjects: | Technology > Mechanical engineering and machinery > Robotics |
| Faculties: | Faculty of Engineering and Technology > Bachelor of Mechatronics Engineering with Honours |
| Depositing User: | Library Staff |
| Date Deposited: | 12 Aug 2024 03:04 |
| Last Modified: | 12 Aug 2024 03:04 |
| URI: | https://eprints.tarc.edu.my/id/eprint/29695 |