Multiple Languages to Sign Language Using NLTK
DOI: https://doi.org/10.32628/IJSRST2310189

Keywords: Sign Language, Natural Language Processing (NLP), Asynchronous Server Gateway Interface (ASGI), Web Server Gateway Interface (WSGI)

Abstract
Sign language consists of visual gestures and signs that deaf and mute people adopt as their primary means of communication. It can also be used by people who are not deaf or hard of hearing, such as those with autism, apraxia of speech, cerebral palsy, or Down syndrome. It combines hand gestures, body movement, and facial expressions simultaneously. It serves communities who have difficulty speaking and those who can hear but cannot talk, and it helps hearing individuals interact with hearing-impaired people. There are many communities of hearing-impaired people across the globe, and each community's sign language is unique: American Sign Language (ASL) is used throughout the United States, British Sign Language (BSL) in the United Kingdom, Australian Sign Language (AUSLAN) in Australia, and Indian Sign Language (ISL) in India. ISL signs fall into three categories: one-handed, two-handed, and non-manual signs. One-handed and two-handed signs are together known as manual signs, while non-manual signs are conveyed by changes in body posture and facial expression. This translator converts English text to sign language, allowing hearing-impaired persons in India to engage with others. The application accepts input in multiple languages and presents the corresponding signs as output.
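The abstract does not spell out the text-processing pipeline, but an NLTK-based text-to-sign front end of this kind typically tokenizes the input sentence, removes stop words, and lemmatizes the remaining words into a gloss sequence that can then be mapped to ISL signs. The sketch below illustrates that idea only; it is not the authors' implementation, the function name `text_to_isl_gloss` and the example sentence are illustrative, and non-English input is assumed to be translated to English before this step.

```python
# Minimal sketch of an NLTK gloss pipeline (illustrative, not the paper's code).
import nltk
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer
from nltk.tokenize import word_tokenize

# Required NLTK data (downloaded once).
for pkg in ("punkt", "stopwords", "wordnet"):
    nltk.download(pkg, quiet=True)

def text_to_isl_gloss(sentence: str) -> list[str]:
    """Convert an English sentence into a simplified ISL-style gloss sequence."""
    lemmatizer = WordNetLemmatizer()
    stop_words = set(stopwords.words("english"))

    tokens = word_tokenize(sentence.lower())
    # Keep only alphabetic, non-stop-word tokens; sign-language gloss usually
    # drops articles and most other function words.
    content = [t for t in tokens if t.isalpha() and t not in stop_words]
    # Lemmatize each remaining word and write it as an upper-case gloss key,
    # which a later stage would map to sign images or videos.
    return [lemmatizer.lemmatize(t).upper() for t in content]

if __name__ == "__main__":
    print(text_to_isl_gloss("I am going to the market tomorrow"))
    # -> ['GOING', 'MARKET', 'TOMORROW']
    # A fuller pipeline would POS-tag first so that 'going' lemmatizes to 'go'.
```

In a web deployment such as the one the keywords suggest (ASGI/WSGI), this function would sit behind a request handler that receives the user's text and returns the gloss sequence or the corresponding sign media.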
License
Copyright (c) IJSRST
This work is licensed under a Creative Commons Attribution 4.0 International License.