A comprehensive way to automatically translate sign language into text or speech, and vice versa, has remained elusive, however.
Jeffrey Bigham, an expert in human-computer interaction at Carnegie Mellon University, says Mr Singh’s project is “a great proof of concept”, but that a system fully capable of recognising sign language would be hard to design “as it requires both computer vision and language understanding that we don’t yet have”.
“Alexa doesn’t really understand English either, of course,” he adds, noting that voice assistants understand only a relatively small set of template phrases.
Aine Jackson, of the British Deaf Association, says that, with the rise of voice-activated technologies, many developments are leaving deaf sign language users behind.
“Many of these technologies are shaping the world we live in and with exciting new capabilities there is now the scope for some really imaginative solutions to language access for deaf people.”
She notes a number of similar projects, from sign-language-reading gloves to signing avatars, but also the difficulty of conveying the grammar of signed languages, which is expressed not just by the hands but by body position and facial movements.
“We would encourage companies to take steps to make their technologies accessible for all, and congratulate individuals such as Abhishek Singh who are turning their minds to the matter,” she adds.