Wen Zhang, Jiawei Hu, Yang Feng, Qun Liu

Although neural machine translation (NMT) with the encoder-decoder framework has achieved great success in recent years, it still suffers from some drawbacks: RNNs tend to forget old information that is often useful, and the encoder operates on individual words without considering relationships between them. To solve these problems, we introduce a relation network (RN) into NMT to refine the encoding representations of the source. In our method, the RN first augments the representation of each source word with its neighbors and reasons over all the possible pairwise relations between them. Then the source representations and all the relations are fed to the attention module and the decoder together, keeping the main encoder-decoder architecture unchanged. Experiments on two Chinese-to-English data sets of different scales both show that our method can significantly outperform competitive baselines.
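To make the described refinement step concrete, below is a minimal sketch of a relation-network layer over encoder states. It assumes a 1-D convolution for the neighbor augmentation and the standard pairwise g/f formulation of relation networks (Santoro et al., 2017); the hyperparameter names (d_model, d_rel, window) are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class RelationNetwork(nn.Module):
    """Sketch: refine source encodings with pairwise relations.

    Assumed design: augment each word with its local neighborhood,
    score all ordered pairs with g(.), aggregate per position, and
    add the result back to the original encoder states.
    """

    def __init__(self, d_model: int, d_rel: int, window: int = 3):
        super().__init__()
        # 1-D convolution augments each word with its neighbors
        # (one plausible choice; the paper may use a different one).
        self.neighbor_conv = nn.Conv1d(
            d_model, d_model, kernel_size=window, padding=window // 2)
        # g(.) scores a pairwise relation between two augmented words.
        self.g = nn.Sequential(
            nn.Linear(2 * d_model, d_rel), nn.ReLU(),
            nn.Linear(d_rel, d_rel), nn.ReLU())
        # f(.) maps the aggregated relations back to the model dimension.
        self.f = nn.Linear(d_rel, d_model)

    def forward(self, enc: torch.Tensor) -> torch.Tensor:
        # enc: (batch, src_len, d_model) encoder hidden states.
        b, n, d = enc.shape
        # Augment each position with its local neighborhood.
        aug = self.neighbor_conv(enc.transpose(1, 2)).transpose(1, 2)
        # Build all n*n ordered pairs (o_i, o_j).
        left = aug.unsqueeze(2).expand(b, n, n, d)
        right = aug.unsqueeze(1).expand(b, n, n, d)
        pairs = torch.cat([left, right], dim=-1)   # (b, n, n, 2*d)
        rel = self.g(pairs)                        # (b, n, n, d_rel)
        # For each position i, sum its relations to every j.
        rel_i = rel.sum(dim=2)                     # (b, n, d_rel)
        # Residual refinement of the source representations,
        # leaving the rest of the encoder-decoder unchanged.
        return enc + self.f(rel_i)

# Usage: refined states go to the attention module and decoder
# in place of (or alongside) the raw encoder outputs.
rn = RelationNetwork(d_model=512, d_rel=256)
enc_states = torch.randn(2, 10, 512)   # (batch, src_len, d_model)
refined = rn(enc_states)               # same shape as enc_states
```

Note the O(n^2) cost in source length from enumerating all pairs, which is the price relation networks pay for reasoning over every word pair rather than only adjacent words.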

arxiv.org
