AR could play a role in helping people understand sign language
Students in New York have developed an app that uses AR and machine learning to translate sign language into spoken words, and vice versa. Many deaf people still feel that they live fairly isolated lives: everyday tasks such as making appointments or going out are often massively complicated by how little sign language is known among the hearing population.
The ARSL app uses computer vision models to translate sign language into spoken language, and vice versa, in real time. The app prototype has gained funding through the Verizon Connected Futures challenge, which was set up to form a pipeline between NYC universities and the company in order to promote next-generation technology.
The ARSL team is headed by Zhongheng Li, who was inspired by his experience of growing up with two deaf parents. Speaking of his mother, who moved to the US from Hong Kong, Li noted her frustration at the fact that there is no universal sign language.
You can see the team’s presentation video here.
A lot to learn
There are, in fact, hundreds of different sign languages in use around the world. This often makes translating between sign languages a difficult task, let alone translating from signs to a spoken language.
The ARSL app is currently limited to a single use case: making an appointment at a medical clinic. Expanding that functionality is a complex task. Sign language is an intricate system of movement and facial expressions, with words often coming in a different order than they would in English. Add in the individual subtleties that each person brings to signing, and a machine attempting to learn and translate has a great deal to handle.
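To see why word order alone makes this harder than word-for-word substitution, consider a toy sketch (this is not the ARSL pipeline; the glosses and rules here are hypothetical illustrations). ASL-style signing often uses a topic-comment order, so a phrase like "I want an appointment" may be signed roughly as APPOINTMENT I WANT, and a recognizer's raw output must be reordered before it reads as English:

```python
# Hypothetical gloss-to-English lookup for illustration only.
GLOSS_TO_ENGLISH = {
    "I": "I",
    "WANT": "want",
    "APPOINTMENT": "an appointment",
}

def naive_translate(glosses):
    """Word-for-word substitution, keeping the signed order."""
    return " ".join(GLOSS_TO_ENGLISH[g] for g in glosses)

def reorder_topic_comment(glosses):
    """Crude heuristic: move a leading topic sign after the subject-verb comment."""
    if len(glosses) >= 3:
        topic, rest = glosses[0], glosses[1:]
        return rest + [topic]
    return list(glosses)

signed = ["APPOINTMENT", "I", "WANT"]
print(naive_translate(signed))                         # an appointment I want
print(naive_translate(reorder_topic_comment(signed)))  # I want an appointment
```

Even this tiny example needs a reordering step on top of recognition, and a real system must also handle facial expressions, inflection, and per-signer variation, which is why machine-learning models are used rather than fixed rules.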
Li and his team hope that the funding they've received will help them add a range of other scenarios to the app, starting with communication at small medical clinics; larger facilities and hospitals are more likely to have sign language interpreters on hand at all times. A full sign language translation app, though, still seems some way off.