Artificial Intelligence (AI) is reshaping how we communicate, but when it comes to interpreting American Sign Language (ASL), the technology may still have a long way to go. In recent years, researchers and universities have been developing AI-driven tools aimed at bridging communication gaps for Deaf and hard-of-hearing individuals. While these advances offer new possibilities, they also raise serious questions about accuracy and the ethics of accessibility.
Startups such as Sign-Speak are developing platforms to translate between ASL and spoken English, hoping to improve communication access for Deaf individuals in everyday situations. Institutions like Gallaudet University are also exploring this space through programs such as the Artificial Intelligence, Accessibility, and Sign Language (AIASL) Center.
Community Opinions
Janeva Mosher, an RIT student studying Community Development and Inclusive Leadership with a concentration in Deaf Leadership, shares her thoughts on the ethics of AI interpreting.
“There is so much culture involved with using ASL that AI just can’t capture. AI is also still not a perfect science and suggesting that I even use AI interpreting where it will not understand me and I will not understand it is a form of taking away my access to communication.”
Abigail Brown, a fourth-year ASL interpreting student, also points out that while some AI tools can mimic basic signs, they often miss important aspects of the language. Facial expressions and classifiers are essential to communication in ASL, yet they can go uninterpreted.
“AI is not at a point in development that it can fully replicate ASL. From the samples I have seen on the Sign-Speak website, the AI model can accurately sign words, and what I mean by this is that it can make the appropriate handshapes, movements, palm orientation, and have it be in the right location. What I see is very clearly missing is facial expression, which is critical to conveying tone, emotion, intensity and effort, as well as for asking questions,” Brown says.
While acknowledging the potential of AI to help bridge communication gaps, Brown says,
“Businesses thinking of partnering with Sign-Speak must know beforehand that there are extreme limitations to the AI model and it is by no means providing an interpreted message on par with a real human ASL Interpreter… I honestly think a better option would be to contract with a VRI (Video remote interpreter) company, or offer a pre-recorded ASL tour.”
Still, Brown favors the technology, as long as the Deaf community is leading the conversation.
“This AI technology can broaden communication options for Deaf and hearing consumers, but I don’t see a future where it replaces human ASL interpreters. If this technology can improve access for Deaf individuals, I am all for it. I am in this field because I care about language access and equity and if technology can support that goal, then I am thrilled,” she says.
Lola Johnston has interpreted in the Rochester community for more than 30 years. Between her full-time work at the Department of Access Services (DAS) and her freelance assignments, she has witnessed a growing demand for interpreters at the National Technical Institute for the Deaf (NTID). The need for interpreting has increased, but the number of available interpreters hasn’t. Budget constraints and scheduling have made it harder to cover last-minute and after-hours assignments.
Interpreting, Johnston explains, is more than just words. Interpreters weigh ethical considerations, scheduling and topic familiarity before taking assignments, which can make the pool of qualified interpreters even smaller for specialized events.
Johnston acknowledges the potential but is cautious, saying, “AI should be used with discretion, based on the assignment and the stakes involved.”
Can AI Replace Interpreters?
AI programs can provide rudimentary interpretation in emergency or last-minute situations. They can also be used in informal settings where contracting an interpreter is unnecessary, or when one is simply not available. Programs such as these can help Deaf individuals stay informed in hearing environments, from dinner table conversations to announcements in public spaces. According to Sign-Speak, the company is committed to creating a world where Deaf individuals can communicate freely and be understood, using AI to bridge the gap between ASL and spoken English.
The goal is not to replace interpreters entirely, but to offer options when an interpreter is not available.
For Johnston, interpreting is about more than language; it’s about connecting cultures. She worries that a switch to AI could separate communities rather than bring them together. Like many interpreters, she dreams of a world where people can communicate directly, without needing someone in the middle—not because her work isn’t important, but because true accessibility means shared understanding.