ISEF | Projects Database | Finalist Abstract

Mouseless Mouse: Enhancing Digital Interaction for Individuals With Limited Hand Mobility by Amplifying and Smoothing Data Extracted From Head Movement Using Real-Time Facial Tracking and Deep Learning

Booth Id:
SOFT045T

Category:
Systems Software

Year:
2024

Finalist Names:
Hemdan, Mohamed (School: Obour STEM School)
Abdalla, Mohamed (School: Obour STEM School)

Abstract:
Individuals with limited hand mobility often face challenges in daily activities and technology interactions that require fine motor skills and dexterity. To address this issue, a system was developed for interacting with machines by controlling a mouse cursor with head movements. Using a laptop's built-in camera for commands eliminates the need for additional hardware while still allowing accurate control. The proposed system tackles the problem of limited input options while also minimizing erratic movements for people with restricted neck movement. It uses face mesh technology to map facial features accurately and translate head movements into cursor movements, and the extracted data is amplified to cover the entire screen while keeping head movement comfortable. An exponential moving average (EMA) improves cursor control by giving more weight to recent data points, thus reducing jitter. Because the EMA introduces latency, a Kalman filter is also employed for its predictive capabilities, which significantly enhance the system's responsiveness and enable more accurate real-time cursor movement. To extend the limited set of available actions, eye winks are detected using the eye-aspect ratio (EAR), which accurately measures eye closure. In addition, a specialized Vision Transformer (ViT) model for detecting facial gestures was developed; a two-phase fine-tuning approach adapts the model to varied lighting conditions. By amplifying and smoothing the data, the project reduced movement deviation and lessened jittery motion, while the ViT improves real-time classification accuracy and expands the command options. Using only a laptop's camera for accurate interaction aids individuals with limited hand mobility, benefits all users, and promotes inclusivity in technology.
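The amplification and EMA-smoothing steps described above can be sketched as follows. This is a minimal illustration, not the project's code: the screen resolution, gain factor, and EMA weight are assumed values chosen for demonstration.

```python
SCREEN_W, SCREEN_H = 1920, 1080  # assumed target resolution
GAIN = 4.0                       # assumed amplification factor
ALPHA = 0.3                      # assumed EMA weight on the newest sample

def amplify(dx, dy):
    """Map a small normalized head offset (roughly -0.5..0.5 per axis)
    to full-screen coordinates, clamped to the screen bounds."""
    x = SCREEN_W / 2 + dx * GAIN * SCREEN_W
    y = SCREEN_H / 2 + dy * GAIN * SCREEN_H
    return (min(max(x, 0), SCREEN_W - 1),
            min(max(y, 0), SCREEN_H - 1))

class EMASmoother:
    """Exponential moving average: each new point gets weight ALPHA,
    so recent data dominates and jitter is damped."""
    def __init__(self, alpha=ALPHA):
        self.alpha = alpha
        self.state = None

    def update(self, point):
        if self.state is None:
            self.state = point
        else:
            self.state = tuple(
                self.alpha * new + (1 - self.alpha) * old
                for new, old in zip(point, self.state)
            )
        return self.state
```

In use, each camera frame would yield a head offset that is amplified first and then smoothed, e.g. `smoother.update(amplify(dx, dy))`.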
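The predictive role the abstract assigns to the Kalman filter can be illustrated with a simplified one-dimensional constant-velocity filter run once per axis. The scalar covariance form and the noise settings `q` and `r` are assumptions for this sketch, not the project's actual filter design.

```python
class Kalman1D:
    """Simplified scalar constant-velocity Kalman filter for one axis.
    Predicting the next position from the velocity estimate offsets the
    latency introduced by averaging-based smoothing."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x = 0.0    # position estimate
        self.v = 0.0    # velocity estimate
        self.p = 1.0    # estimate variance (scalar simplification)
        self.q = q      # assumed process-noise setting
        self.r = r      # assumed measurement-noise setting
        self.initialized = False

    def update(self, z, dt=1.0):
        if not self.initialized:
            self.x, self.initialized = z, True
            return self.x
        # Predict: advance the position by the current velocity estimate.
        pred = self.x + self.v * dt
        self.p += self.q
        # Correct: blend the prediction with the new measurement z.
        k = self.p / (self.p + self.r)       # Kalman gain
        residual = z - pred
        self.x = pred + k * residual
        self.v += k * residual / dt          # crude velocity correction
        self.p *= (1.0 - k)
        return self.x
```

A full implementation would typically use the matrix form with a 2-state (position, velocity) model per axis; the scalar version above only conveys the predict-then-correct structure.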
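The eye-aspect-ratio (EAR) measure mentioned above has a standard form: the sum of the two vertical eye-landmark distances divided by twice the horizontal distance, with a low EAR indicating a closed eye. The six-landmark layout and the 0.2 threshold below are common choices assumed for this sketch, not values taken from the project.

```python
import math

EAR_THRESHOLD = 0.2  # assumed: below this, the eye is treated as closed

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks p1..p6, with p1/p4 at the horizontal
    corners and p2/p6, p3/p5 as the two vertical landmark pairs."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = dist(p1, p4)
    return vertical / (2.0 * horizontal)

def is_wink(left_eye, right_eye, threshold=EAR_THRESHOLD):
    """A wink means exactly one eye's EAR falls below the threshold;
    both eyes closing together would be a blink, not a command."""
    left_closed = eye_aspect_ratio(left_eye) < threshold
    right_closed = eye_aspect_ratio(right_eye) < threshold
    return left_closed != right_closed
```

In practice the landmarks would come from the face-mesh output per frame, and a wink would be required to persist for a few consecutive frames before firing a click, to reject noise.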