Chua, Yee Heng (2025) Real-Time Worker States Detection and Adaptive Control of Collaborative Robot Using Webcam-Based Monitoring. Final Year Project (Bachelor), Tunku Abdul Rahman University of Management and Technology.
Text: CHUA YEE HENG_FULL TEXT.pdf (Restricted to Registered users only; 12MB)
Abstract
This study presents the development of a real-time webcam-based monitoring and adaptive control framework aimed at enhancing the safety and productivity of collaborative robotic systems. The proposed system employs facial expression recognition (FER) with a YOLOv11 deep learning model to detect critical worker conditions, including alertness, drowsiness, and fatigue, from non-intrusive visual input. The facial state information is integrated with the Robot Operating System (ROS), enabling a collaborative robot to dynamically adjust its operational behaviour based on the worker’s condition. To ensure detection reliability, a temporal filtering mechanism was implemented, combining a sliding window with priority-based logic to stabilise transient classifications and prioritise high-risk states. The adaptive control system was developed within a ROS framework using publisher-subscriber communication, allowing real-time modulation of the robot’s speed and trajectory. The entire system was validated in a simulation environment using the Panda robotic arm and the MoveIt motion planning framework. Experimental results demonstrated effective recognition of worker states, with classification confidence ranging between 74 and 87 percent. The robot responded with an average reaction time of under 350 milliseconds, satisfying the safety performance requirements specified by ISO/TS 15066. Four structured test scenarios, namely emergency stop, speed transition, idle state, and recovery, were conducted to evaluate system responsiveness, stability, and impact on task continuity. All experiments indicated consistent robot reactions with minimal disruption to productivity, with performance ratings ranging from 4 to 5. The research delivers a novel integration of facial expression-based monitoring and adaptive robotic control that addresses key challenges in human-robot interaction.
It contributes a practical and scalable solution that improves operator safety while preserving operational fluency, and offers significant potential for implementation in human-centric automation and smart manufacturing environments.
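The abstract describes a temporal filter that combines a sliding window with priority-based logic to stabilise transient FER classifications. The thesis text is not available here, so the following is only a minimal sketch of how such a filter could work: the class name `TemporalFilter`, the window size, and the priority ordering of the three states are all assumptions, not the author's implementation.

```python
from collections import Counter, deque

# Hypothetical risk ordering (higher value = higher risk); the
# thesis does not publish its exact priority scheme.
PRIORITY = {"alert": 0, "drowsy": 1, "fatigued": 2}

class TemporalFilter:
    """Stabilise per-frame FER labels with a sliding window.

    A label is emitted only when it dominates the recent window
    (majority vote); ties are broken toward the higher-risk state,
    so brief flickers cannot mask a dangerous condition.
    """

    def __init__(self, window_size=10):
        self.window = deque(maxlen=window_size)

    def update(self, label):
        # Add the newest raw classification and re-vote the window.
        self.window.append(label)
        counts = Counter(self.window)
        # Rank first by frequency, then by risk priority on ties.
        best, _ = max(counts.items(),
                      key=lambda kv: (kv[1], PRIORITY[kv[0]]))
        return best
```

With this tie-breaking rule, a window split evenly between "alert" and "drowsy" resolves to "drowsy", which matches the abstract's stated goal of prioritising high-risk states.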
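The adaptive control side maps the filtered worker state to the robot's speed and trajectory over ROS publisher-subscriber communication. The sketch below shows only the decision logic one might publish on such a topic; the scale values, step size, and function name are illustrative placeholders, not figures from the thesis, and the actual system runs inside a ROS/MoveIt pipeline rather than plain Python.

```python
# Placeholder mapping from filtered worker state to a speed scaling
# factor in [0, 1]; the thesis does not publish its exact values.
SPEED_SCALE = {"alert": 1.0, "drowsy": 0.5, "fatigued": 0.0}

def plan_response(state, current_scale, step=0.25):
    """Return the next speed scaling factor for the given state.

    Fatigue maps to an immediate full stop (scale 0.0), while other
    transitions ramp gradually toward the target instead of jumping,
    mirroring the emergency-stop, speed-transition, and recovery
    scenarios described in the abstract.
    """
    target = SPEED_SCALE[state]
    if target <= 0.0:
        return 0.0  # emergency stop: no ramping on high-risk states
    if current_scale < target:
        return min(current_scale + step, target)  # gradual recovery
    return max(current_scale - step, target)      # gradual slowdown
```

In the real system the returned scale would be published on a ROS topic and consumed by the motion-planning node; separating the decision logic like this keeps it unit-testable outside the simulator.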
| Item Type: | Final Year Project |
|---|---|
| Subjects: | Technology > Electrical engineering. Electronics engineering; Technology > Mechanical engineering and machinery > Robotics |
| Faculties: | Faculty of Engineering and Technology > Bachelor of Mechatronics Engineering with Honours |
| Depositing User: | Library Staff |
| Date Deposited: | 14 Aug 2025 08:04 |
| Last Modified: | 14 Aug 2025 08:04 |
| URI: | https://eprints.tarc.edu.my/id/eprint/33695 |