- Hands-On Deep Learning Architectures with Python
- Yuxi (Hayden) Liu Saransh Mehta
Encoder-decoder structure
Neural machine translation models are recurrent neural networks (RNNs) arranged in an encoder-decoder fashion. The encoder network reads a variable-length input sequence one token at a time and encodes it into a fixed-size vector. The decoder starts from this encoded vector and generates the translation word by word until it predicts an end-of-sentence token. The whole architecture is trained end to end on pairs of input sentences and their correct translations. The major advantage of these systems, beyond handling variable-length input, is that they learn the context of a sentence and predict accordingly, rather than performing a word-for-word translation. Neural machine translation can best be seen in action on Google Translate, in the following screenshot:
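To make the encode-into-a-fixed-vector, decode-word-by-word flow concrete, here is a minimal NumPy sketch of the structure. The vocabulary size, hidden size, the `<eos>` token id, and the random (untrained) weights are all illustrative assumptions, not values from the book; a real system would learn these weights end to end and use far larger dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: a 6-token vocabulary, an 8-unit hidden state;
# token 0 plays the role of the end-of-sentence (<eos>) marker.
VOCAB, HIDDEN, EOS = 6, 8, 0

# Randomly initialised weights stand in for trained parameters.
W_xh = rng.normal(0, 0.1, (VOCAB, HIDDEN))   # input -> hidden
W_hh = rng.normal(0, 0.1, (HIDDEN, HIDDEN))  # hidden -> hidden
W_hy = rng.normal(0, 0.1, (HIDDEN, VOCAB))   # hidden -> output logits

def encode(token_ids):
    """Fold a variable-length token sequence into one fixed-size vector."""
    h = np.zeros(HIDDEN)
    for t in token_ids:
        x = np.eye(VOCAB)[t]                 # one-hot input vector
        h = np.tanh(x @ W_xh + h @ W_hh)     # simple RNN step
    return h                                 # the fixed encoded vector

def decode(h, max_len=10):
    """Generate output tokens from the encoded vector until <eos>."""
    out, token = [], EOS                     # start by feeding <eos>
    for _ in range(max_len):
        x = np.eye(VOCAB)[token]
        h = np.tanh(x @ W_xh + h @ W_hh)
        token = int(np.argmax(h @ W_hy))     # greedy choice of next word
        if token == EOS:                     # model predicts end of sentence
            break
        out.append(token)
    return out

source = [3, 1, 4, 1, 5]                     # a variable-length "sentence"
context = encode(source)                     # fixed-size regardless of length
translation = decode(context)
print(context.shape, translation)
```

Note that `encode` always returns a vector of the same shape no matter how long `source` is; that fixed-size bottleneck is exactly what lets the decoder operate independently of the input length.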
