Backpropagation – a method for neural networks to learn

Great! We have come a long way, from looking at the biological neuron, to the types of neurons, to measuring accuracy and correcting a single neuron's learning. Only one question remains: how can the whole network of neurons learn together?

Backpropagation is an incredibly smart approach to making gradient descent happen across all the layers of a network. It leverages the chain rule from calculus to transfer information back and forth through the network.
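
As a quick illustration (the notation here is my own, not the book's): for a weight w that influences the cost C only through an intermediate value z and the prediction ŷ, the chain rule factors the gradient into one term per layer:

```latex
\frac{\partial C}{\partial w}
  = \frac{\partial C}{\partial \hat{y}}
    \cdot \frac{\partial \hat{y}}{\partial z}
    \cdot \frac{\partial z}{\partial w}
```

Each factor is local to one layer, which is what lets the overall inaccuracy be passed backwards one layer at a time.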

In principle, the information from the inputs and weights is propagated forward through the network to make a guess at the expected output. The overall inaccuracy (the cost) is then backpropagated through the layers of the network so that the weights can be adjusted and the output can be guessed again.
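
To make this concrete, here is a minimal sketch of one such cycle in plain NumPy, assuming a tiny two-layer network with a sigmoid hidden layer, a linear output, and a mean squared error cost. All of the names and sizes are illustrative, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 3 input features, 1 target value each
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Weights for the hidden layer and the output layer
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))
learning_rate = 0.1

# Forward pass: propagate the inputs through the network to make a guess
h = sigmoid(X @ W1)                  # hidden activations
y_hat = h @ W2                       # predicted output
cost = np.mean((y_hat - y) ** 2)     # overall inaccuracy

# Backward pass: propagate the inaccuracy back, layer by layer.
# Each gradient is built from the previous one via the chain rule.
d_yhat = 2 * (y_hat - y) / len(y)    # dCost/dy_hat
dW2 = h.T @ d_yhat                   # dCost/dW2
d_h = d_yhat @ W2.T                  # dCost/dh (one chain rule step back)
d_z = d_h * h * (1 - h)              # through the sigmoid derivative
dW1 = X.T @ d_z                      # dCost/dW1

# Gradient descent: adjust the weights against the gradient
W2 -= learning_rate * dW2
W1 -= learning_rate * dW1
```

The forward pass produces the guess, the backward pass walks the same path in reverse, and the final two lines are the gradient descent weight adjustment.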

This single cycle of learning is called a training step or iteration. Each iteration is performed on a batch of the input training samples, and the number of samples in a batch is called the batch size. When all of the input training samples have passed through the network once, one epoch is complete.

For example, let's say there are 100 training samples and that, in every iteration or training step, the network learns from 10 of them. Then the batch size is 10, and it will take 10 iterations to complete a single epoch, provided each batch contains unique samples, that is, every sample is used by the network exactly once.
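
The bookkeeping is simple enough to write down. This hypothetical loop just counts steps and epochs for the numbers above; no actual training happens here:

```python
# Counting iterations and epochs for the example above:
# 100 training samples, 10 samples per batch.
num_samples = 100
batch_size = 10
iterations_per_epoch = num_samples // batch_size  # 100 / 10 = 10 steps

for epoch in range(3):  # three full passes over the training data
    for step in range(iterations_per_epoch):
        start = step * batch_size
        batch_indices = range(start, start + batch_size)  # unique samples per batch
        # ... one training step would run here (forward pass, backpropagation)

print(f"One epoch = {iterations_per_epoch} iterations of batch size {batch_size}")
```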

This back-and-forth propagation of the predicted output and the cost through the network is how the network learns.

We will revisit training step, epoch, learning rate, cross entropy, batch size, and more during our hands-on sections. 
