Sign Language Recognition System


Safa Ali Abed
Dr. Ammar Dawood Jasim

Abstract

Gesture-based languages have recently attracted considerable attention. Sign language is
most frequently used by deaf and mute people with voice or hearing difficulties to
communicate with one another and with hearing people, and an ASL translator helps signers
and non-signers communicate. Gestures are the most effective way for these individuals to
express themselves, yet relying on them makes interaction with the wider hearing world
difficult and leaves them isolated; they therefore need an interpretive system for those
who do not understand sign language. In this research, a machine learning-based approach
is proposed for translating hand motions from gesture to text. We study a method based on
the recently released MediaPipe Hands (MPH), a highly accurate, well-trained hand-keypoint
detection model, which makes it easier for deaf and mute people to communicate with each
other and with hearing people. A current problem is that hand sign languages are not
accessible to many people. This paper's main goal is to show how to implement sign
language recognition using the open-source MediaPipe framework and a machine learning
algorithm; it proposes a skeleton-based machine learning (ML) system that provides highly
accurate, real-time recognition of alphabetical signs.
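The abstract does not specify the classifier used on top of the MediaPipe Hands keypoints. The following is a minimal sketch of the general skeleton-based idea: normalize the 21 (x, y) hand landmarks that MediaPipe Hands produces (wrist-centred, scale-invariant) and match them against labelled template shapes. The nearest-neighbour classifier and the function names here are illustrative assumptions, not the paper's actual system.

```python
# Hedged sketch of a skeleton-based gesture classifier.
# Assumes MediaPipe-style input: 21 (x, y) keypoints, wrist at index 0.
# The nearest-neighbour matching is an illustrative assumption.
import math

WRIST = 0  # MediaPipe Hands indexes the wrist landmark as 0


def normalize_landmarks(landmarks):
    """Translate keypoints so the wrist is the origin, then scale so
    the farthest point lies at distance 1 (position/size invariance)."""
    wx, wy = landmarks[WRIST]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]


def classify(landmarks, templates):
    """Return the label of the template hand shape (label -> keypoint
    list) closest to the observed, normalized hand shape."""
    feats = normalize_landmarks(landmarks)

    def dist(label):
        tmpl = normalize_landmarks(templates[label])
        return sum(math.hypot(x - tx, y - ty)
                   for (x, y), (tx, ty) in zip(feats, tmpl))

    return min(templates, key=dist)
```

In a real pipeline the templates would be replaced by a trained model (e.g. one class per alphabet letter), with the same normalized keypoints as its feature vector.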
