PDF4PRO ⚡AMP

A modern search engine that looks for books and documents around the web


Effective Approaches to Attention-based Neural Machine Translation

Effective Approaches to Attention-based Neural Machine Translation. Minh-Thang Luong, Hieu Pham, Christopher D. Manning. Computer Science Department, Stanford University, Stanford, CA 94305.

Abstract. An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words, and a local one that only looks at a subset of source words at a time.

[Figure 1: Neural machine translation, a stacking recurrent architecture for translating a source sequence A B C D into a target sequence X Y Z; <eos> marks the end of a sentence.]

(Mnih et al., 2014), between speech frames and text in the speech recognition task (Chorowski et al., 2014), or between visual features of a picture and its text description in the image caption generation task (Xu et al., 2015). In the context of NMT, Bahdanau et al. (2015) has successfully applied such attentional mechanism to jointly translate and align words.
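To make the abstract's global/local distinction concrete, here is a minimal sketch in Python/NumPy. It assumes a simple dot-product score between the current target hidden state and the source hidden states; the function names, the fixed window size D, and the externally supplied aligned position p_t are illustrative assumptions, not the paper's exact formulation (the paper also considers other scoring functions and a Gaussian-weighted local variant, omitted here).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def global_attention(h_t, src_states):
    """Score the target state h_t against every source state (dot-product
    score), normalize, and return the weighted average of all source
    states as the context vector."""
    scores = src_states @ h_t          # one score per source position, shape (S,)
    align = softmax(scores)            # attention weights over the whole source
    context = align @ src_states       # weighted sum of source states, shape (d,)
    return context, align

def local_attention(h_t, src_states, p_t, D=2):
    """Attend only to source states inside the window [p_t - D, p_t + D];
    positions outside the window receive zero weight."""
    S = src_states.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = src_states[lo:hi]
    align = softmax(window @ h_t)      # weights over the window only
    context = align @ window
    return context, (lo, hi, align)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(7, 4))      # 7 source hidden states of size 4
    h_t = rng.normal(size=4)           # current target hidden state
    ctx_g, _ = global_attention(h_t, src)
    ctx_l, _ = local_attention(h_t, src, p_t=3, D=2)
    print(ctx_g, ctx_l)
```

The only difference between the two paths is which source positions are allowed to receive weight; how the resulting context vector is consumed downstream is unchanged.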

Tags: Attention
