Sequence to Sequence Learning with Neural Networks
2019-07-21 paper seq2seq nmt mt

Abstract

This paper proposes a translation model that encodes the source sentence with a multi-layer LSTM and decodes it with another multi-layer LSTM; the same architecture also applies to other sequence-to-sequence NLP tasks. The authors additionally found that reversing the source sentence (while leaving the target sentence in order) improves model performance.
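The source-reversal trick mentioned above is a pure preprocessing step: each source token sequence is reversed before being fed to the encoder, while the target stays in order, which shortens the distance between the beginning of the source and the beginning of the target. A minimal sketch (the function name and example sentences are illustrative, not from the paper):

```python
def reverse_source(pairs):
    """Reverse each source sentence's tokens; targets stay in order.

    This places early source words close to the early target words they
    tend to align with, which the paper reports improves translation.
    """
    return [(list(reversed(src)), tgt) for src, tgt in pairs]


# Hypothetical tokenized (source, target) pair for illustration.
pairs = [(["I", "love", "NLP"], ["ich", "liebe", "NLP"])]
print(reverse_source(pairs))
# → [(['NLP', 'love', 'I'], ['ich', 'liebe', 'NLP'])]
```

Only the encoder's input order changes; the decoder, loss, and decoding procedure are untouched.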