University of Washington computer scientists are working to let hearing-impaired people use sign language over a mobile phone, as is already possible in Japan and Sweden.
Hearing-impaired people can already communicate by text message, but the goal is to let them converse in their own language, American Sign Language (ASL). Before the service becomes widely available, the next step is to convince a commercial cell phone manufacturer to integrate the new MobileASL software.
Unfortunately, two-way real-time video communication is not yet practical in the United States: data rates on U.S. cellular networks are only about one-tenth of those in Europe and Asia, and the limited processing power of mobile devices has prevented real-time video transmission at an adequate frame rate.
The solution was to design software that works within this bandwidth limit. The team discovered that the most important region of the image to transmit in high resolution is around the face, so MobileASL encodes the signer's face and hands at high resolution while transmitting the background at lower resolution. The team is also working on a separate feature that detects when people are moving their hands, so that battery power can be conserved when no one is signing.
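The two ideas described above can be sketched in a few lines of code. This is a hypothetical illustration, not MobileASL's actual implementation: the function names, block coordinates, quantization values, and motion threshold are all invented for the example. In many video codecs, each 16x16 macroblock is assigned a quantization parameter (QP), where a lower QP means higher quality; region-of-interest encoding simply gives blocks near the face and hands a lower QP than the background. A crude activity detector can compare successive frames to decide whether anyone is signing.

```python
# Illustrative sketch of region-of-interest (ROI) encoding and activity
# detection, in the spirit of the approach described above. Names and
# numeric values are assumptions, not taken from MobileASL.

def qp_for_block(block_x, block_y, roi, qp_roi=24, qp_background=40):
    """Return the quantization parameter for one 16x16 macroblock.

    roi is (x0, y0, x1, y1) in block coordinates. Blocks inside the ROI
    (face/hands) get a low QP (high quality); background blocks get a
    high QP (low quality), saving bandwidth where detail matters least.
    """
    x0, y0, x1, y1 = roi
    inside = x0 <= block_x <= x1 and y0 <= block_y <= y1
    return qp_roi if inside else qp_background

def frame_activity(prev_frame, curr_frame, threshold=10):
    """Crude motion detector: mean absolute pixel difference.

    Frames are flat lists of grayscale pixel values. Returns True when
    the signer appears to be moving; during idle periods the encoder
    could drop the frame rate to conserve battery.
    """
    diffs = [abs(a - b) for a, b in zip(prev_frame, curr_frame)]
    return sum(diffs) / len(diffs) > threshold

if __name__ == "__main__":
    face_roi = (4, 1, 7, 4)  # invented face region in block coordinates
    print(qp_for_block(5, 2, face_roi))   # inside ROI -> 24
    print(qp_for_block(0, 0, face_roi))   # background -> 40

    still = [128] * 64
    moving = [128 + (i % 32) for i in range(64)]
    print(frame_activity(still, still))   # False (no motion)
    print(frame_activity(still, moving))  # True (hands moving)
```

The design choice here mirrors the article's point: rather than raising the bit rate, the encoder reallocates the bits it has, spending them where sign language carries information (face and hands) and skimping elsewhere.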