Microsoft's new Kinect tool aims to translate sign language into sound
After finding various uses outside of gaming, Microsoft's Kinect motion-sensing console accessory is taking on another remarkable role: bridging the gap between the deaf and those who can hear.
Microsoft researchers are developing a Kinect-based sign language tool that can translate sign language into spoken language and back.
"Our system is still a research prototype. It is progressing from recognizing isolated words signed by a specific person (translator mode) to understanding continuous communication from any competent signer (communication mode)," said Guobin Wu, research program manager for Microsoft Research Asia.
So far, Wu said, the prototype produces good results in translator mode, and the team is working to overcome the technological hurdles so that the system can reliably understand and interpret signing in communication mode.
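(For illustration only: the article does not describe Microsoft's actual recognition algorithm. The sketch below shows one common way an isolated-word "translator mode" could be approached, matching a 3D joint trajectory captured from Kinect skeleton frames against per-word templates with dynamic time warping. All function and variable names are hypothetical and are not taken from the prototype.)

```python
# Illustrative sketch: isolated-sign recognition by template matching.
# A trajectory is an array of shape (frames, features), e.g. x/y/z of both hands per frame.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two trajectories of shape (frames, features)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def classify_sign(trajectory: np.ndarray, templates: dict) -> str:
    """Return the vocabulary word whose recorded template is closest to the input."""
    return min(templates, key=lambda word: dtw_distance(trajectory, templates[word]))

# Usage: templates would be recorded once per vocabulary word, then new input is matched.
templates = {
    "hello": np.random.rand(40, 6),   # 40 frames, 6 features (x/y/z of each hand)
    "thanks": np.random.rand(35, 6),
}
print(classify_sign(np.random.rand(38, 6), templates))
```

Continuous "communication mode" is harder because word boundaries are not given in advance; that is the kind of hurdle the quote above refers to.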
The team is also building up the system’s vocabulary of American Sign Language gestures, which are different from those of Chinese Sign Language, Wu added.
"Every month, we had a team meeting to review the progress and plan our next steps. Experts from a host of disciplines – language modeling, translation, computer vision, speech recognition, 3D modeling, and special education – contributed to the system design," Wu said.
Microsoft Research also posted a video of the Kinect's sign language translation capabilities on YouTube.
Wu said the project could potentially benefit the deaf and the hard of hearing, noting there are about 360 million people worldwide who are hard of hearing, including more than 20 million in China alone.
Participating in the project were students from the special education school at Beijing Union University.
In the first six months, the team focused on Chinese sign language data collection and labeling, with student Dandan Yin standing out.
Wu said it was Yin's dream to "create a machine to help people who can’t hear.”
During development, Yin proved herself a skilled and professional signer, and she later used the system to communicate with a hearing employee.
In the prototype, an avatar on the computer screen represents the hearing person and renders their spoken language as sign language.
The project has since received much attention from researchers and the deaf community, especially in the United States, Wu said.
"We expect that more and more researchers from different disciplines and different countries will collaboratively build on the prototype, so that the Kinect Sign Language Translator system will ultimately benefit the global community of those who are deaf or hard of hearing," Wu said.
Stewart Tansley, director of Microsoft Research Connections, branded Yin's communication as "magic."
"Equally inspiring though, and far away from the crowds, I watched the diminutive and delightful Dandan Yin gesture to the Kinect device connected to the early sign language translator prototype—and words appeared on the screen! I saw magic that day, and not just on stage," Tansley said in a separate blog post.
Tansley said the team demonstrated the system at Microsoft's annual company meeting in September 2013, before 18,000 in-person attendees and more than 60,000 watching online worldwide.
"We look forward to making this technology a reality for all!" he said.
A separate report on The Next Web said there is much work ahead, as the present system needs five people to establish the recognition patterns for just one word.
So far, the report added, only 300 Chinese sign language words have been incorporated out of a total of 4,000. — KDM, GMA News