Neural machine translation model combining dependency syntax and LSTM
By: Zheng Xin, Chen Hailong, Ma Yuqun, Wang Qing
Format: Article
Published: EDP Sciences, 2022-01-01
Description
To address the Transformer neural machine translation model's lack of explicit linguistic knowledge and the limited flexibility of its positional encoding, this paper introduces dependency syntax analysis and a long short-term memory (LSTM) network. Source-language syntactic structure information is incorporated into the neural machine translation system, and more accurate position information is obtained by exploiting the memory characteristics of the LSTM. Experiments show that the improved model gains 1.23 BLEU points on the IWSLT14 Chinese-English translation task.
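The abstract's core idea of LSTM-derived position information can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes a single-layer LSTM cell (written here from scratch in NumPy, with a hypothetical `lstm_positional_encoding` helper) run over the token embeddings, so each position's encoding depends on the tokens read so far, unlike a fixed sinusoidal table.

```python
import numpy as np

def lstm_positional_encoding(embeddings, hidden_size, seed=0):
    """Run a single-layer LSTM over `embeddings` (seq_len, d_model)
    and return the per-step hidden states as position-aware encodings.
    Weights are randomly initialized here; in a real model they would
    be trained jointly with the Transformer."""
    rng = np.random.default_rng(seed)
    d_model = embeddings.shape[1]
    # One stacked weight matrix for the four gates:
    # input (i), forget (f), candidate cell (g), output (o).
    W = rng.normal(0.0, 0.1, (4 * hidden_size, d_model + hidden_size))
    b = np.zeros(4 * hidden_size)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    h = np.zeros(hidden_size)
    c = np.zeros(hidden_size)
    states = []
    for x in embeddings:
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # update cell state
        h = sigmoid(o) * np.tanh(c)                   # emit hidden state
        states.append(h)
    return np.stack(states)

# Toy usage: 5 tokens with 8-dimensional embeddings.
emb = np.random.default_rng(1).normal(size=(5, 8))
pe = lstm_positional_encoding(emb, hidden_size=8)
print(pe.shape)  # (5, 8): one encoding per token position
```

Because the recurrence threads state through the sequence, the encoding at each position reflects both the position itself and the embedding content of the prefix, which is the flexibility the paper attributes to the LSTM over fixed positional encodings.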