Multi-way, multilingual neural machine translation with a shared attention mechanism. Orhan Firat, Kyunghyun Cho, Yoshua Bengio.
Jan 6, 2016 · The proposed approach enables a single neural translation model to translate between multiple languages, with a number of parameters that grows only linearly with the number of languages.
This is made possible by having a single attention mechanism that is shared across all language pairs.
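The core idea can be sketched in a few lines: each language gets its own encoder and decoder, but every (source, target) pair routes through one shared attention module, so parameters grow linearly as languages are added. Below is a minimal, hypothetical numpy sketch of that wiring; the encoders, decoders, and the additive-style scorer are stand-ins (random linear maps), not the paper's actual networks.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # illustrative hidden size

# Hypothetical per-language encoders/decoders: one matrix per language.
# Adding a language adds one encoder and one decoder (linear growth).
encoders = {lang: rng.normal(size=(D, D)) for lang in ["en", "fr", "de"]}
decoders = {lang: rng.normal(size=(D, D)) for lang in ["en", "fr", "de"]}

# A single attention weight matrix shared by ALL language pairs.
W_shared = rng.normal(size=(D, D))

def attend(enc_states, dec_state):
    """Score every encoder state against the decoder state using the
    one shared weight matrix, then return the weighted context vector."""
    scores = enc_states @ W_shared @ dec_state      # shape (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over positions
    return weights @ enc_states                     # context vector, shape (D,)

def translate_step(src_lang, tgt_lang, src_vecs, dec_hidden):
    enc_states = src_vecs @ encoders[src_lang]      # language-specific encoder
    context = attend(enc_states, dec_hidden)        # shared attention
    return context @ decoders[tgt_lang]             # language-specific decoder

# Any pair reuses the same attention parameters:
src = rng.normal(size=(5, D))                       # 5 source "token" vectors
h = rng.normal(size=D)                              # current decoder state
out_fr = translate_step("en", "fr", src, h)
out_de = translate_step("en", "de", src, h)
```

Note how `W_shared` never depends on the language pair: only the encoder/decoder lookups do, which is what keeps the parameter count linear rather than quadratic in the number of languages.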
This paper addresses an interesting problem: how to build a multi-way, multilingual NMT system. Building a seq2seq model that handles multilingual tasks is somewhat trivial.