- Neural Network Programming with TensorFlow
- Manpreet Singh Ghotra Rajdeep Dua
Understanding backpropagation
When a feedforward neural network is used to accept an input x and produce an output ŷ, information flows forward through the network. The input x provides the initial information, which propagates up through the hidden units at each layer and finally produces ŷ. This is called forward propagation. During training, forward propagation continues onward until it produces a scalar cost J(θ). The backpropagation algorithm, often called backprop, allows the information from the cost to then flow backward through the network in order to compute the gradient.
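The two flows described above can be sketched in a few lines. This is a minimal NumPy illustration, not code from the book: a single hidden layer with assumed tanh activation and squared-error cost, showing forward propagation up to the scalar cost J(θ) and then the backward pass that yields the gradients.

```python
import numpy as np

# Minimal sketch (assumed architecture, not from the book): one hidden
# layer with tanh activation and a squared-error cost.
rng = np.random.default_rng(0)

x = rng.normal(size=(3, 1))        # input x
y = np.array([[1.0]])              # training target
W1 = rng.normal(size=(4, 3))       # hidden-layer weights (assumed shapes)
W2 = rng.normal(size=(1, 4))       # output-layer weights

# --- forward propagation: information flows from x toward the cost ---
h = np.tanh(W1 @ x)                # hidden units
y_hat = W2 @ h                     # network output y-hat
J = 0.5 * float((y_hat - y) ** 2)  # scalar cost J(theta)

# --- backpropagation: cost information flows backward via the chain rule ---
d_y_hat = y_hat - y                # dJ/dy_hat
dW2 = d_y_hat @ h.T                # dJ/dW2
d_h = W2.T @ d_y_hat               # propagate back into the hidden layer
dW1 = (d_h * (1 - h ** 2)) @ x.T   # tanh'(z) = 1 - tanh(z)^2

print(J, dW2.shape, dW1.shape)
```

Note that the backward pass reuses the activations h computed during the forward pass; this reuse is what keeps backprop inexpensive.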
Computing an analytical expression for the gradient is straightforward, but numerically evaluating such an expression can be computationally expensive. The backpropagation algorithm computes the gradient using a simple and inexpensive procedure instead.
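To make the cost contrast concrete, here is an illustrative sketch on an assumed toy cost (not an example from the book): for J(θ) = ½‖θ‖², the analytic gradient is simply θ, while a naive central-difference estimate requires two extra cost evaluations per parameter, which scales poorly as models grow.

```python
import numpy as np

# Toy cost J(theta) = 0.5 * sum(theta^2); its analytic gradient is theta.
def cost(theta):
    return 0.5 * np.sum(theta ** 2)

theta = np.array([0.3, -1.2, 2.0])
analytic_grad = theta.copy()       # dJ/dtheta, one cheap expression

# Central-difference estimate: 2 * len(theta) cost evaluations.
eps = 1e-6
numeric_grad = np.zeros_like(theta)
for i in range(theta.size):
    e = np.zeros_like(theta)
    e[i] = eps
    numeric_grad[i] = (cost(theta + e) - cost(theta - e)) / (2 * eps)

# Both agree, but the numerical route repeats the full forward pass
# per parameter, while backprop obtains all gradients in one backward pass.
print(np.max(np.abs(analytic_grad - numeric_grad)))
```

In practice such finite-difference checks are used only to verify a backprop implementation on small examples, never to train a network.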