Document-Level Neural Machine Translation With Recurrent Context States
By: Yue Zhao, Hui Liu
Format: Article
Published: IEEE, 2023-01-01
Description
Integrating contextual information into sentence-level neural machine translation (NMT) systems has been proven effective in generating fluent and coherent translations. However, taking too much context into account slows these systems down, especially when context-aware models are applied on the decoder side. To improve efficiency, we propose a simple and fast method that encodes all sentences in an arbitrarily large context window. It builds contextual representations in the process of translating each sentence, so the overhead introduced by the context model is almost negligible. We experiment with our method on three widely used English-German document-level translation datasets and obtain substantial improvements over the sentence-level baseline with almost no loss in efficiency. Moreover, our method achieves performance comparable to previous strong context-aware baselines while speeding up inference by 1.53×, and the speed-up grows as more context is taken into account. On the ContraPro pronoun translation dataset, it significantly outperforms the strong baseline.
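The abstract does not spell out the architecture, but the core idea it describes, a recurrent context state that is updated once per sentence so that the context cost stays constant regardless of window size, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the authors' code: the class names, the toy dimensions, the GRU-style state update, and the mean-pooled stand-in for a real NMT encoder.

```python
# Minimal sketch (assumed, not the paper's implementation) of a recurrent
# document-context state for NMT. One cheap state update per sentence means
# the context overhead is O(1) in the number of context sentences.
import torch
import torch.nn as nn


class RecurrentContextState(nn.Module):
    def __init__(self, d_model: int = 256):
        super().__init__()
        # One GRU step per sentence folds the new sentence encoding
        # into the running document context.
        self.update = nn.GRUCell(d_model, d_model)
        self.d_model = d_model

    def init_state(self, batch: int = 1) -> torch.Tensor:
        # Empty context at the start of a document.
        return torch.zeros(batch, self.d_model)

    def forward(self, sent_enc: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        # sent_enc: (batch, d_model) pooled encoding of the current sentence.
        return self.update(sent_enc, state)


def encode_sentence(tokens: torch.Tensor, emb: nn.Embedding) -> torch.Tensor:
    # Stand-in sentence encoder: mean-pooled token embeddings.
    # A real system would reuse the NMT encoder's output here.
    return emb(tokens).mean(dim=1)


if __name__ == "__main__":
    vocab, d = 1000, 256
    emb = nn.Embedding(vocab, d)
    ctx = RecurrentContextState(d)

    # A toy "document" of three sentences of varying length.
    document = [torch.randint(0, vocab, (1, n)) for n in (7, 12, 9)]
    state = ctx.init_state()
    for i, sent in enumerate(document):
        # The decoder would condition on `state` (the context of all
        # preceding sentences) while translating sentence i.
        print(f"sentence {i}: context-state norm = {state.norm():.3f}")
        state = ctx(encode_sentence(sent, emb), state)  # one update per sentence
```

Under these assumptions, the cost of the context model per sentence is a single GRU step, independent of how many preceding sentences the state summarizes, which is consistent with the abstract's claim that the overhead is almost negligible and that the speed-up over explicit context-aware baselines grows with the context window.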