Facial Expression Recognition for Robot Motion Control

Faun, Ke Xin (2024) Facial Expression Recognition for Robot Motion Control. Final Year Project (Bachelor), Tunku Abdul Rahman University of Management and Technology.

Full text: Faun Ke Xin_Full Text.pdf (4MB; restricted to registered users)
Abstract

This research paper focuses on developing a Human-Robot Cooperation (HRC) system that incorporates Facial Expression Recognition (FER) to accurately identify and interpret human emotions. The goal is to enhance human-robot interaction by enabling robots to respond appropriately to human emotions. The study also explores the challenges of designing effective human-robot interfaces and the different ways in which robots perceive humans. The FER system was developed using Kaggle services and Python, integrated into an existing robotic system, and compared with a traditional robotic system to determine the more effective approach. The study aims to inform the design and development of future robotic systems, particularly in human-robot interaction and other applications where FER accuracy is essential, and its results provide insights into the accuracy and effectiveness of FER-based HRC systems, aiding the development of more intuitive and user-friendly robots. Key findings include the robustness of the FER-2013 dataset for training emotion recognition models. Two training approaches were evaluated: Approach 1, a deeper model with dropout layers, reached a training accuracy of approximately 93.27%, while Approach 2, with a simplified architecture, achieved around 99.68% accuracy in less training time. The final FER system recognized the trained facial expressions with 99.68% accuracy, with minor errors in a small number of cases. Approach 2 also handled scenes containing multiple people by prioritizing the closest person for emotion recognition, enabling more context-aware human-robot cooperation. The microcontroller implementation revealed challenges arising from the Raspberry Pi's limitations in handling analogue inputs, which led to the Arduino being selected as the preferred microcontroller for the robot. Finally, trajectory analysis visually demonstrated the robot's ability to respond to human emotional cues, further validating the integrated emotion recognition and motion control system.
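
As a rough illustration of the training pipeline summarized above, the following is a minimal sketch of a convolutional model with dropout layers trained on FER-2013, assuming a Keras/TensorFlow setup and the standard 48x48 grayscale images with seven emotion classes; layer sizes, dropout rates, and hyperparameters are illustrative and not the exact architecture used in the project.

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # FER-2013 images are 48x48 grayscale with 7 emotion classes
    # (angry, disgust, fear, happy, sad, surprise, neutral).
    NUM_CLASSES = 7

    def build_fer_model():
        model = models.Sequential([
            layers.Input(shape=(48, 48, 1)),
            layers.Conv2D(64, 3, activation="relu", padding="same"),
            layers.MaxPooling2D(),
            layers.Dropout(0.25),   # dropout layers help curb overfitting
            layers.Conv2D(128, 3, activation="relu", padding="same"),
            layers.MaxPooling2D(),
            layers.Dropout(0.25),
            layers.Flatten(),
            layers.Dense(256, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    # train_images: (N, 48, 48, 1) float32 in [0, 1]; train_labels: one-hot (N, 7)
    # model = build_fer_model()
    # model.fit(train_images, train_labels, epochs=50, batch_size=64, validation_split=0.1)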
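Prioritizing the closest person when several faces are visible can be approximated by selecting the largest detected face, on the assumption that the nearest person occupies the largest bounding box. The sketch below uses an OpenCV Haar-cascade face detector; the function name closest_face is illustrative, not the project's actual code.

    import cv2

    # Haar-cascade face detector bundled with OpenCV.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def closest_face(frame_bgr):
        """Return (x, y, w, h) of the largest detected face, or None.
        The largest bounding box serves as a simple proxy for the person
        nearest the camera, so only that person's expression is classified."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        return max(faces, key=lambda box: box[2] * box[3])  # largest w * h

    # x, y, w, h = closest_face(frame)
    # The 48x48 crop gray[y:y+h, x:x+w] would then be fed to the FER model.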
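For the emotion-to-motion link, one plausible arrangement (not necessarily the author's) is to classify expressions on the host computer and send short commands to the Arduino over a serial link using pyserial; the command bytes, port name, and mapping below are hypothetical.

    import time
    import serial

    # Hypothetical mapping from recognized emotion to a one-byte motion command
    # understood by the Arduino motor-control firmware (illustrative only).
    EMOTION_TO_COMMAND = {
        "happy": b"F",     # move forward
        "sad": b"B",       # move backward
        "surprise": b"S",  # stop
        "neutral": b"S",
    }

    def send_motion_command(emotion, port="/dev/ttyACM0", baud=9600):
        """Send the motion command matching the recognized emotion to the Arduino."""
        command = EMOTION_TO_COMMAND.get(emotion, b"S")
        with serial.Serial(port, baud, timeout=1) as link:
            time.sleep(2)  # allow the Arduino to reset after the port opens
            link.write(command)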

Item Type: Final Year Project
Subjects: Technology > Mechanical engineering and machinery
Technology > Mechanical engineering and machinery > Robotics
Faculties: Faculty of Engineering and Technology > Bachelor of Mechatronics Engineering with Honours
Depositing User: Library Staff
Date Deposited: 12 Jan 2024 07:55
Last Modified: 12 Jan 2024 07:55
URI: https://eprints.tarc.edu.my/id/eprint/27475