SURVEY ON TRANSLATION OF GESTURE-BASED SIGN LANGUAGE
Author(s):
Shivangi Deshmukh
Keywords: |
Gesture Recognition, Convolutional Neural Network, Deep Learning, Sign Language Recognition, Human Computer Interaction (HCI).
Abstract |
Most people communicate through speech and facial expressions, but according to a recent survey, around 1% of the population in India is deaf and mute. These individuals interact with others using hand gestures and facial expressions; however, most people find such gestures difficult to understand. To close this gap, we present static gesture classification based on sign language conventions, the output of which is then converted to text and speech in a specific local dialect. Approaches to hand detection and sign language recognition fall into two categories: traditional methods and deep learning methods. Given the great successes of deep learning in computer vision in recent years, the deep learning approach has been shown to offer many benefits, such as rich feature extraction, strong modelling capacity, and straightforward training. Accordingly, this article investigates hand localization and recognition of common sign language using a neural network.
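The abstract describes a CNN-based pipeline: convolutional feature extraction over a static gesture image followed by classification into sign classes. As an illustrative sketch only (not the paper's implementation), the forward pass of such a classifier can be shown with plain NumPy; the image size, filter size, and the 26-class output (one per letter) are assumptions for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one filter."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature, size=2):
    """Non-overlapping max pooling, cropping any ragged border."""
    h, w = feature.shape
    h, w = h - h % size, w - w % size
    return feature[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(x):
    """Numerically stable softmax over class scores."""
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
image = rng.random((28, 28))           # stand-in for a 28x28 gesture frame
kernel = rng.standard_normal((3, 3))   # one 3x3 filter (random here, learned in practice)

features = np.maximum(conv2d(image, kernel), 0.0)  # convolution + ReLU
pooled = max_pool(features)                        # 2x2 max pooling
weights = rng.standard_normal((pooled.size, 26))   # dense layer to 26 sign classes
probs = softmax(pooled.flatten() @ weights)        # class probabilities

print(probs.shape)
```

In a trained network the filter and dense weights come from backpropagation over labelled gesture images; the predicted class (`probs.argmax()`) would then be mapped to text and fed to a speech synthesizer, as the abstract outlines.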
Other Details |
Paper ID: IJSARTV
Published in: Volume 8, Issue 2
Publication Date: 2/2/2022