Can AI Become Skilled At American Sign Language? Just Ask a Student Developer in India


It'd be so awesome if artificial intelligence (AI) could learn American Sign Language. But is it even possible? One student in India, Priyanjali Gupta, built an AI model that can translate American Sign Language into English in real time. Gupta's model was inspired by data scientist Nicholas Renotte's video on real-time sign language detection. As stated in an Inquirer.net article, "She invented the AI model using Tensorflow object detection API that translates hand gestures using transfer learning from a pre-trained model named ssd_mobilenet." When Gupta signed basic signs such as Hello, I Love You, Thank You, Please, Yes, and No, the AI was able to translate them into English.
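For readers curious how this kind of transfer learning is set up, the TensorFlow Object Detection API is typically driven by a pipeline.config file that points training at a pre-trained checkpoint. The fragment below is only an illustrative sketch: the checkpoint path, class count, and batch size are assumptions, not details from Gupta's actual project.

```
# Illustrative pipeline.config fragment (TensorFlow Object Detection API).
# All paths and values here are hypothetical.
model {
  ssd {
    num_classes: 6  # e.g. Hello, I Love You, Thank You, Please, Yes, No
  }
}
train_config {
  batch_size: 4
  # Transfer learning: initialize from a pre-trained SSD MobileNet checkpoint
  fine_tune_checkpoint: "ssd_mobilenet_v2/checkpoint/ckpt-0"
  fine_tune_checkpoint_type: "detection"
}
```

In this setup, the detector's backbone weights come from the pre-trained model, and only a relatively small labeled dataset of hand-gesture images is needed to fine-tune it for the new sign classes.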

Is Learning American Sign Language From an Artificial Intelligence Effective?


Technology is advancing, and individuals now have the ability to create intricate inventions. While it's amazing that people design inventions like AIs that can translate ASL into English in hopes of bridging the communication gap between Deaf and hearing people, learning ASL from an AI is probably neither practical nor advisable, for quite a few reasons.

1) AI is quite limited.

American Sign Language isn't only about the hands; it also involves facial expressions and body movements. Facial expressions carry different meanings when signing. For example, raised or lowered eyebrows signal what kind of question is being asked: raised eyebrows usually indicate a yes or no question, while lowered eyebrows usually mark a question that needs a specific answer. Body movements include shifting position when relaying a dialogue between different speakers in a conversation, or showing timidity versus pride, and so on. You must see the person's face and whole body to take in both the facial expressions and the body language. Many people prefer to learn American Sign Language virtually or in person, where they can see the whole body: the signing itself, the facial expressions, and the body movements.

2) AI will not be able to convey the nuances of facial expressions, body language, ASL grammar and sentence structure, or key facets of Deaf culture and community.

ASL is an expressive language, so facial expressions are as essential as the signs themselves. Body language and facial expressions can change the message of a story. ASL's grammar and syntax are not the same as English's. For instance, the correct sentence structure in English is "I am going to the store," but in ASL the sentence becomes "Store I go." The person who programs the AI is most likely not Deaf, so the program could easily convey incorrect ASL.

3) AI won't be able to answer questions.

Somebody learning a new language will most likely have a lot of questions about the structure of the language itself. Unless the AI is programmed with extensive knowledge of ASL linguistics and key aspects of Deaf culture, and is regularly updated from within the Deaf community, it would be unable to answer most questions properly. Everyday life is constantly changing, and individuals, as well as their language, adapt to those changes; new signs are being invented to this day. An AI would not be able to keep up with those changes and so would quickly be filled with outdated information. Its knowledge would remain superficial, limited to recognizing basic signs and translating them into English.

4) AI lacks everyday real-life experience.

AI still has a long way to go before it even comes close to matching a real person's knowledge; it cannot yet recognize many specific signs or individual signing styles. For a person to become fluent in ASL, the best approaches are to watch slow-motion ASL video classes, take private one-on-one lessons, attend Deaf socials, and connect with Deaf people. You can learn a great deal from real-life interactions about how ASL is used in day-to-day life.

5) A conversation with AI feels neither real nor authentic.

AI is robotic and doesn't sign as quickly or as fluidly as a real person can. A real person's expressions are far more animated than any known AI's, which makes the conversation more personal and heartfelt. New signers will always be strongly encouraged to interact with Deaf people face to face.

To summarize, it is wonderful that people are inventing new kinds of AI that help bridge the communication gap between Deaf and hearing people. Nevertheless, ASL, Deaf culture, and the Deaf community carry a great deal of history and significance. Many Deaf people feel that AI would only strip away the core value of both their language and their culture. If AI were to teach ASL, the language could easily be modified and drift away from true ASL structure, and Deaf people want to prevent that from happening.

In the end, AI would not make communication between Deaf and hearing people much better or easier. The best solution would be for hearing individuals to learn American Sign Language, online or in person, from an actual Deaf teacher. As more hearing people learn authentic American Sign Language, Deaf people's lives and communication will become far easier.