A high-tech glove equipped with motion sensors can translate American Sign Language into spoken English through a smartphone app. The translation happens in real time, enabling people who use sign language to communicate with the wider world.
“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them,” said Jun Chen, an assistant professor of bioengineering at the University of California, Los Angeles, and the principal investigator on the research. “In addition, we hope it can help more people learn sign language themselves.”
Gloves that track people’s gestures are already proving very useful in applications from virtual reality to telesurgery.
The sign-language-to-speech gloves are layered with thin, stretchable sensors that run the length of each of the five fingers. The sensors pick up even the subtlest hand motions and finger placements that represent letters, numbers, words, and phrases in American Sign Language.
The finger movements are converted into electrical signals, which are sent to a smartphone via a coin-sized circuit board worn on the wrist. Finally, an app on the phone translates the signals into spoken words at a rate of about one word per second.
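To make that pipeline concrete, here is a minimal sketch, not the UCLA team’s actual software, of how a frame of readings from five finger sensors might be mapped to a word before being handed to text-to-speech. The sensor values, the sign templates, and the nearest-template matching are all illustrative assumptions.

```python
# A minimal sketch of the glove-to-speech pipeline: the wrist unit samples one
# value per finger sensor, a classifier maps the resulting feature vector to a
# sign label, and the phone app would then speak the word aloud.
# The template table and readings below are hypothetical.

import math

# Hypothetical per-sign "templates": typical readings for the five finger
# sensors, learned in advance from repeated examples of each gesture.
SIGN_TEMPLATES = {
    "hello":  [0.8, 0.7, 0.7, 0.6, 0.5],
    "thanks": [0.2, 0.9, 0.3, 0.3, 0.2],
    "yes":    [0.9, 0.1, 0.1, 0.1, 0.1],
}

def classify(frame):
    """Return the sign whose template is closest (Euclidean) to the frame."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGN_TEMPLATES, key=lambda sign: dist(frame, SIGN_TEMPLATES[sign]))

# One simulated frame of readings from the five finger sensors.
frame = [0.78, 0.72, 0.65, 0.61, 0.48]
print(classify(frame))  # -> "hello"; a real app would send this to text-to-speech
```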
Researchers also experimented with adhesive sensors attached to testers’ faces to capture the facial expressions that are part of American Sign Language.
In the United States, more than 500,000 people use sign language as their primary mode of communication. It’s unclear how many distinct sign languages exist worldwide, and research on the question remains sparse, but one often-cited estimate puts the number at 137.
Previous wearable systems designed to translate American Sign Language proved somewhat effective, but they were limited by bulky auxiliary equipment and could be uncomfortable to wear.
In contrast, the UCLA gloves are lightweight and durable, and they use flexible, inexpensive electronic sensors.
In tests, four deaf participants wore the gloves and repeated each hand gesture 15 times. The machine-learning algorithm recognized and translated 660 signs, including letters, numbers, and words.
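For readers curious what an experiment of that shape looks like in code, the sketch below simulates it under stated assumptions: it is not the authors’ pipeline, a k-nearest-neighbors model stands in for their unspecified algorithm, and the five-sensor feature vectors, noise level, and train/test split are invented for illustration.

```python
# A hedged simulation of the experiment described above: 660 signs, 15 noisy
# repetitions each, split into training and test sets for a simple classifier.
# All quantities besides "660 signs" and "15 repetitions" are assumptions.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_signs, reps, n_sensors = 660, 15, 5

# One "true" sensor pattern per sign, plus noise for each of the 15 repetitions.
patterns = rng.uniform(0.0, 1.0, size=(n_signs, n_sensors))
X = np.repeat(patterns, reps, axis=0) + rng.normal(0, 0.02, (n_signs * reps, n_sensors))
y = np.repeat(np.arange(n_signs), reps)

# Hold out 3 of the 15 repetitions of each sign for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)
model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
```

The repeated gestures are what make the evaluation possible: with 15 repetitions per sign, a stratified 80/20 split leaves 12 examples per sign to learn from and 3 to test on.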
The findings appeared in the journal Nature Electronics.