
Batch normalization

Batch normalization is a technique that normalizes the feature vectors to have zero mean and unit variance. It is used to stabilize learning and to deal with poor weight initialization problems. It is a normalization step that we apply to the inputs of the hidden layers of the network, and it helps us to reduce internal covariate shift.

Batch normalization was introduced by Ioffe and Szegedy in their 2015 paper, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. This can be found at the following link: https://arxiv.org/pdf/1502.03167.pdf.

The benefits of batch normalization are as follows:

  • Reduces the internal covariate shift: Batch normalization helps us to reduce the internal covariate shift by normalizing values.
  • Faster training: Networks will be trained faster if the values are sampled from a normal/Gaussian distribution. Batch normalization helps to whiten the inputs to the internal layers of our network. The overall training is faster, even though each iteration slows down slightly due to the extra calculations involved.
  • Higher accuracy: Batch normalization provides better accuracy.
  • Higher learning rate: Generally, when we train neural networks, we use a lower learning rate, which means the network takes a long time to converge. With batch normalization, we can use higher learning rates, making our network reach the minimum faster.
  • Reduces the need for dropout: When we use dropout, we compromise some of the essential information in the internal layers of the network. Batch normalization acts as a regularizer, meaning we can train the network without a dropout layer.

In batch normalization, we apply normalization to all the hidden layers, rather than applying it only to the input layer.
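The normalization step described above can be sketched with NumPy. This is a minimal illustration of the training-time forward pass only: each feature is normalized across the batch to zero mean and unit variance, then scaled and shifted by the learnable parameters gamma and beta (the function name and the example batch are our own, not from the paper):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature of a batch to zero mean and unit variance,
    then apply the learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)          # per-feature mean over the batch
    var = x.var(axis=0)            # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # eps avoids division by zero
    return gamma * x_hat + beta

# A batch of 4 samples with 3 features on very different scales
batch = np.array([[1.0, 50.0, 0.1],
                  [2.0, 60.0, 0.2],
                  [3.0, 70.0, 0.3],
                  [4.0, 80.0, 0.4]])

normalized = batch_norm(batch)
print(normalized.mean(axis=0))  # each feature now has mean ~0
print(normalized.std(axis=0))   # and standard deviation ~1
```

In a real network, gamma and beta are learned per feature during training, and running averages of the mean and variance are kept for use at inference time.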
