

Google's Neural Machine can translate nearly as well as a human

It all happened because they let the program learn without interference.

Alexandru Micu
October 5, 2016 @ 7:19 pm


A new translation system unveiled by Google, the Google Neural Machine Translation (GNMT) framework, comes close to human translators in its proficiency.

Public domain image.

Not knowing the local language can be hell — but Google’s new translation software might prove to be the bilingual travel partner you’ve always wanted. A recently released paper notes that Google’s Neural Machine Translation system (GNMT) reduces translation errors by an average of 60% compared to the familiar phrase-based approach. The framework is based on unsupervised deep learning technology.

Deep learning simulates the way our brains form connections and process information inside a computer. Virtual neurons are mapped out by a program, and the connections between them receive a numerical value, a “weight”. The weight determines how each of these virtual neurons treats the data fed to it — low-weight neurons recognize the basic features of the data, which they feed to the heavier neurons for further processing, and so on. The end goal is to create software that can learn to recognize patterns in data and respond to each one accordingly.
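The idea of weighted virtual neurons can be sketched in a few lines. This is a deliberately simplified illustration, not Google’s actual model — the inputs, weights, and threshold activation below are invented for the example:

```python
# A minimal sketch of a single "virtual neuron": numerical weights
# determine how strongly each input feature influences the output.
# All values here are illustrative, not taken from GNMT.

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, passed through a simple threshold
    # activation: fire (1.0) if the total exceeds zero, else stay silent.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 if total > 0 else 0.0

# With low weights, the same input barely registers...
print(neuron([1.0, 0.5], [0.1, 0.1], -0.2))  # -> 0.0
# ...while heavier weights push it over the threshold.
print(neuron([1.0, 0.5], [0.9, 0.8], -0.2))  # -> 1.0
```

Training a deep network amounts to adjusting millions of such weights until the network’s outputs match the training data.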

Programmers train these frameworks by feeding them data, such as digitized images or sound waves. They rely on big sets of training data and powerful computers to work effectively, which are becoming increasingly available. Deep learning has proven its worth in image and speech recognition in the past, and adapting it to translation seems like the logical next step.

And it works like a charm

GNMT draws on 16 processors to transform each word into a value called a “vector”. This represents how closely the word relates to other words in its training database — 2.5 billion sentence pairs for English and French, and 500 million for English and Chinese. “Leaf” is more related to “tree” than to “car”, for example, and the name “George Washington” is more related to “Roosevelt” than to “Himalaya”. Using the vectors of the input words, the system chooses a list of possible translations, ranked by their probability of occurrence. Cross-checking helps improve overall accuracy.
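The “leaf”/“tree”/“car” relatedness the article describes is typically measured as the angle between word vectors. Here is a toy sketch with invented two-dimensional vectors — real systems learn hundreds of dimensions from those billions of sentence pairs:

```python
import math

# Invented 2-D vectors for a few words, purely to illustrate how
# relatedness between word vectors is measured.
vectors = {
    "leaf": [0.9, 0.1],
    "tree": [0.8, 0.2],
    "car":  [0.1, 0.9],
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means they point
    # the same way (highly related), 0.0 means unrelated directions.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# "leaf" ends up closer to "tree" than to "car", matching the example.
print(cosine_similarity(vectors["leaf"], vectors["tree"]) >
      cosine_similarity(vectors["leaf"], vectors["car"]))  # -> True
```

In a trained model these vectors are not hand-written; they emerge from the statistics of the sentence pairs the system is fed.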

The increased accuracy in translation came from Google letting its neural network work without much of the supervision programmers previously exercised. They fed in the initial data, but let the computer take over from there, training itself. This approach is called unsupervised learning, and it has proven more efficient than earlier supervised learning techniques, in which humans held a large measure of control over the learning process.

In a series of tests pitting the system against human translators, GNMT came close to matching their fluency for some languages. Bilingually fluent people rated the system between 64 and 87 percent better than the previous one. While some things still slip through GNMT’s fingers, such as slang or colloquialisms, those are some solid results.

Google is already using the new system for Chinese to English translation, and plans to completely replace its current translation software with GNMT.

