Ong, Chun Chee (2023) Development of a Gesture-Based Interface for Desktop Applications Using Machine Learning on Raspberry PI Pico. Final Year Project (Other), Tunku Abdul Rahman University College.
Abstract
A computer-based user interface (UI) is the point of interaction or communication between a human and a computing device. Conventional interfaces include keyboards, mice, joysticks, and remote controls. However, conventional interfaces have drawbacks: a mouse and keyboard are required to interact with the desktop, and many hotkey combinations must be memorized. A gesture-based user interface is therefore a potential solution. The objectives of this project are to interface a Raspberry Pi Pico with an MPU-6050 accelerometer, to use machine learning to recognize hand gestures, and to use those gestures to replace keyboard shortcut presses. The MPU-6050 and the Raspberry Pi Pico communicate over I2C to acquire accelerometer data, which is used to build a dataset for training machine learning models in Edge Impulse. The algorithm used is a Neural Network (NN). The trained model recognizes 7 different hand gestures with an accuracy of 96.9%. PyAutoGUI is used in a Python script to trigger the corresponding hotkey presses in MultiSIM whenever a hand gesture is detected, thereby controlling MultiSIM.
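The host-side dispatch step described above (a Python script that turns a recognized gesture into a MultiSIM hotkey via PyAutoGUI) could be sketched as follows. This is a minimal illustration, not the author's actual script: the gesture label names and the MultiSIM key bindings are assumptions chosen for the example, while `pyautogui.hotkey` is the real PyAutoGUI call for pressing a key combination.

```python
# Sketch: map a recognized gesture label to a MultiSIM hotkey.
# Gesture names and key bindings below are illustrative assumptions.
GESTURE_HOTKEYS = {
    "swipe_left":  ("ctrl", "z"),  # e.g. undo
    "swipe_right": ("ctrl", "y"),  # e.g. redo
    "circle":      ("ctrl", "s"),  # e.g. save
    "shake":       ("f5",),        # e.g. run simulation
}

def hotkey_for(gesture):
    """Return the hotkey tuple mapped to a gesture label, or None."""
    return GESTURE_HOTKEYS.get(gesture)

def dispatch(gesture):
    """Send the mapped hotkey with PyAutoGUI; return True if a key was sent."""
    keys = hotkey_for(gesture)
    if keys is None:
        return False  # unrecognized label: do nothing
    import pyautogui  # requires a desktop session with MultiSIM focused
    pyautogui.hotkey(*keys)  # e.g. pyautogui.hotkey("ctrl", "z")
    return True
```

In the full system, `dispatch` would be called each time the Edge Impulse model reports a gesture classification above a confidence threshold.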
| Item Type: | Final Year Project |
|---|---|
| Subjects: | Science > Computer Science; Technology > Electrical engineering. Electronics engineering |
| Faculties: | Faculty of Engineering and Technology > Diploma of Electronic Engineering |
| Depositing User: | Library Staff |
| Date Deposited: | 30 Dec 2022 01:32 |
| Last Modified: | 30 Dec 2022 01:32 |
| URI: | https://eprints.tarc.edu.my/id/eprint/23881 |