Video conferencing for sign language users is about to get a lot easier, as Google is researching new features that will automatically detect signing, making video chat more usable for deaf and hard-of-hearing users.
The pandemic has made video conferencing apps an integral part of everyday life. With work-from-home policies on the rise and remote social events becoming the go-to solution for social distancing, the technology is becoming part of everyone’s new normal.
This trend has made accessibility a necessity, and Google is apparently leading the charge for a more inclusive experience with these new sign language features.
Google Researching Automatic Sign Language Detection for Video Chat
In a Google AI blog post, the company announced that it’s developing real-time, automatic sign language detection for its video conferencing solutions, in hopes of making them more accessible and inclusive for deaf and hard-of-hearing users.
“Video conferencing should be accessible to everyone, including users who communicate using sign language,” the post begins.
As you can imagine, video conferencing can be quite difficult for sign language users: the technology uses audio to decide which participant gets the spotlight, so someone signing silently is never treated as the active speaker. With these new features, Google’s video conferencing solutions would be able to quickly and reliably recognize when sign language is being used, allowing signing users to communicate more effectively.
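To see why this matters, consider how audio-driven speaker selection typically works. The sketch below illustrates the general approach, not Google’s actual implementation: it spotlights whichever participant’s microphone carries the most energy, which means a silent signer never gets picked.

```python
import numpy as np

def pick_active_speaker(audio_frames: dict) -> str:
    """Spotlight the participant with the loudest audio in this window.

    audio_frames maps a participant id to a short buffer of PCM samples.
    Root-mean-square energy is a common stand-in for "who is talking".
    """
    def rms(samples: np.ndarray) -> float:
        return float(np.sqrt(np.mean(samples.astype(np.float64) ** 2)))

    return max(audio_frames, key=lambda pid: rms(audio_frames[pid]))

# A signer's mic stays near-silent, so they are never spotlighted:
frames = {
    "speaker": np.random.randn(1600) * 2000.0,  # talking participant
    "signer": np.random.randn(1600) * 10.0,     # signing; mic hears only room noise
}
print(pick_active_speaker(frames))  # -> "speaker"
```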
How Will Sign Language Detection Work?
As Google points out throughout the blog post, the technology can’t just be effective; it also needs to be efficient, so it doesn’t degrade video and audio quality for everyone on the call.
That constraint led Google to pose estimation and optical flow: rather than analyzing every pixel of every frame, the system first reduces each frame to a small set of body landmarks, then tracks how much those landmarks move from frame to frame. Sign language use can be recognized from that lightweight motion signal without consuming many CPU cycles.
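The post doesn’t include code, but the core trick can be sketched in a few lines. Assume a pose-estimation model (PoseNet, say) has already reduced each frame to a handful of body landmarks; the per-frame signal is then just how far those landmarks moved since the previous frame, normalized by shoulder width so it doesn’t depend on how close the user sits to the camera. The landmark layout and threshold below are illustrative assumptions, not Google’s published values.

```python
import numpy as np

def signing_motion_features(landmarks: np.ndarray) -> np.ndarray:
    """Per-frame movement ("optical flow") of pose landmarks.

    landmarks: array of shape (frames, points, 2) holding (x, y) positions
    from a pose model. Landmarks 0 and 1 are assumed to be the shoulders,
    used only to normalize for the user's distance from the camera.
    Returns one motion value per frame transition.
    """
    # Frame-to-frame displacement of every landmark.
    deltas = np.linalg.norm(np.diff(landmarks, axis=0), axis=2)

    # Normalize by shoulder width so motion is scale-invariant.
    shoulder_width = np.linalg.norm(landmarks[:-1, 0] - landmarks[:-1, 1], axis=1)
    return deltas.mean(axis=1) / np.maximum(shoulder_width, 1e-6)

def looks_like_signing(landmarks: np.ndarray, threshold: float = 0.1) -> bool:
    """Crude stand-in for a learned classifier: sustained hand/arm motion."""
    return bool(signing_motion_features(landmarks).mean() > threshold)

# Synthetic check: a jittery "signing" participant vs. a nearly still one.
rng = np.random.default_rng(0)
base = rng.uniform(0.0, 1.0, (17, 2))
base[0] = [0.4, 0.2]  # left shoulder
base[1] = [0.6, 0.2]  # right shoulder
still = np.tile(base, (30, 1, 1))
signing = still + rng.normal(0.0, 0.05, still.shape)
print(looks_like_signing(still), looks_like_signing(signing))  # False True
```

In the actual research, motion features like these feed a lightweight learned classifier; the fixed threshold here is just a placeholder for that model.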
This is all pretty technical, even for this writer, but the gist is that Google, in true software dynamo fashion, is working on a way to easily recognize when a person is using sign language. Check out the company’s demo below, and note the boxes in the upper left corner to see how quickly and consistently the technology recognizes a sign language user in action.
Accessibility in the Tech Industry
The tech industry has lately been making a real effort to develop technology that is more accessible to users with disabilities. From virtual assistants that take voice commands to video game consoles and controllers built specifically for players with disabilities, accessibility is finally getting a little time in the spotlight.
That hasn’t always been the case, though. The Americans with Disabilities Act (ADA) turned 30 years old earlier this year, and in all that time tech has admittedly been slow to act. Developers still need a kick in the pants to build accessible technology, which is a bit silly when you consider the benefits it can have for the entire industry.
“I’ve heard from a lot of people that when you make something accessible to the blind it makes it better for everybody,” Claire Stanley, advocacy and outreach specialist at the American Council of the Blind, told TechCrunch.
Obviously, Google’s efforts to be more inclusive are a step in the right direction, but the reality is that these kinds of features should be built in from the start, rather than added after the fact. Because if tech is supposed to save the world, it has to save it for everyone.