- Python Deep Learning
- Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
Training neural networks
We have seen how a neural network maps inputs to outputs through a set of fixed weights. Once the architecture of the network has been defined (the feedforward structure, the number of hidden layers, the number of neurons per layer, and the activation function), we need to set the weights, which in turn define the internal state of each neuron in the network. First, we'll see how to do that for a one-layer network using an optimization algorithm called gradient descent, and then we'll extend it to a deep feedforward network with the help of backpropagation.
The general concept we need to understand is the following:
Every neural network is an approximation of a function, so the network's output will generally differ from the desired output by some value, called the error. During training, the aim is to minimize this error. Since the error depends on the weights of the network, we want to minimize it with respect to the weights. The error function is a function of many weights and, therefore, a function of many variables. Mathematically, the graph of this function is a hypersurface, and to find a minimum on this surface, we pick a point and then move along it in the direction of steepest descent.
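The idea above can be sketched in a few lines of code. The following is a minimal illustration (not the book's implementation) of gradient descent on a one-layer linear network trained with a mean squared error; all names (`X`, `y`, `w`, `learning_rate`) are illustrative:

```python
import numpy as np

# Illustrative sketch: gradient descent for a single-layer linear network.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))      # input samples
true_w = np.array([2.0, -3.0])     # weights we hope to recover
y = X @ true_w                     # targets from a known linear map

w = np.zeros(2)                    # initial weights
learning_rate = 0.1
for _ in range(200):
    y_pred = X @ w                 # forward pass
    error = y_pred - y             # difference from the desired output
    grad = X.T @ error / len(X)    # gradient of the mean squared error / 2
    w -= learning_rate * grad      # step against the gradient

print(np.round(w, 3))              # w converges toward true_w
```

Each iteration moves the weights a small step in the direction that decreases the error, which is exactly the "follow the surface downhill" picture described above.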