Neural Machine Translation (NMT), one of the most important topics in deep learning, has gained much attention from industry and academia over the last few years. In an effort to replace complex pipelines with simpler models, tech giant Google has been innovating in human-to-machine and machine-to-human translation for quite a few years now. Back in 2017, the company introduced a single Neural Machine Translation (NMT) model that translates between multiple languages, with the researchers merging 12 language pairs into one model. Such multilingual models fall into three types: many-to-one, one-to-many and many-to-many. Recently, researchers on the Google AI team built a more enhanced system for neural machine translation (NMT) and published a paper titled "Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges".
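Google's 2017 multilingual model steers a single shared network toward the desired output language by prepending an artificial target-language token (e.g. `<2es>`) to the source sentence. A minimal sketch of that preprocessing step, using hypothetical example sentences:

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial target-language token, as in Google's
    2017 multilingual NMT work, so one shared model knows which
    language to translate into. The <2xx> token format follows the
    convention described in that work; the sentences below are
    illustrative placeholders, not real training data."""
    return f"<2{target_lang}> {source_sentence}"

# Hypothetical training inputs for a many-to-many setup:
pairs = [
    ("Hello, how are you?", "es"),  # English -> Spanish
    ("Hello, how are you?", "ja"),  # English -> Japanese
    ("Wie geht es dir?", "en"),     # German  -> English
]
for sentence, lang in pairs:
    print(add_target_token(sentence, lang))
```

Because the language direction lives in the input rather than in the architecture, the same encoder-decoder can serve many-to-one, one-to-many and many-to-many configurations without structural changes.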
Nov-6-2019, 23:28:40 GMT