Deep Learning-Enabled Smart Glove for Real-Time Sign Language Translation
Abstract
Individuals with speech impairments face unique challenges in communication and often find it difficult to converse with others. American Sign Language (ASL) is a widely recognized sign language that employs hand gestures. Promoting inclusivity requires that society better understand the communication needs of these individuals. This research develops an improved sign-language-to-text translator using deep learning and sensor fusion. The glove consists of five flex sensors and an MPU-6050 inertial sensor that together capture hand gestures and movements. Data collected from the sensors is processed and transmitted to a deep learning model trained on a custom dataset of sign language gestures. The flex sensors detect finger bending, while the MPU-6050 provides information about hand orientation and motion. By combining data from both sensor types, the glove recognizes gestures accurately.
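The sensor-fusion step described above can be sketched in code. The following is a minimal illustrative example, not the authors' implementation: it assumes the five flex-sensor readings come from a 10-bit ADC, that the MPU-6050 contributes six raw 16-bit values (3-axis accelerometer plus 3-axis gyroscope), and that a small dense network with made-up random weights maps the fused 11-dimensional feature vector to gesture-class scores. The network size, normalization constants, and class count are all hypothetical placeholders.

```python
import numpy as np

N_FLEX = 5      # one flex sensor per finger
N_IMU = 6       # ax, ay, az, gx, gy, gz from the MPU-6050
N_CLASSES = 4   # illustrative number of gesture classes

rng = np.random.default_rng(0)

def fuse_features(flex, imu):
    """Normalize and concatenate flex and IMU readings into one vector."""
    flex = (np.asarray(flex, dtype=float) - 512.0) / 512.0  # assumed 10-bit ADC
    imu = np.asarray(imu, dtype=float) / 32768.0            # 16-bit signed counts
    return np.concatenate([flex, imu])

# Toy dense network; in practice the weights would come from training
# on the recorded gesture dataset.
W1 = rng.normal(scale=0.1, size=(N_FLEX + N_IMU, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, N_CLASSES))
b2 = np.zeros(N_CLASSES)

def predict(features):
    """Forward pass: ReLU hidden layer, then softmax over gesture classes."""
    h = np.maximum(0.0, features @ W1 + b1)
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# One example frame of raw readings (hypothetical values).
flex_frame = [600, 480, 510, 700, 520]
imu_frame = [1200, -300, 16000, 150, -80, 40]
probs = predict(fuse_features(flex_frame, imu_frame))
print("predicted class:", probs.argmax())
```

In a deployed glove, each classified frame (or window of frames) would then be mapped to its text label and streamed to the output device; the key design point is that finger-bend features and orientation/motion features enter the model as a single fused vector.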
Article Details
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.