
What happens if we use too many neurons?

If we make our network architecture too complicated, two things will happen:

  • We're likely to end up with a high-variance model that overfits the training data
  • The model will train more slowly than a simpler one

If we add many layers, the gradients flowing backward through the network get smaller and smaller until the first few layers barely train at all. This is called the vanishing gradient problem. We're nowhere near that yet, but we will talk about it later.
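A quick back-of-the-envelope sketch shows why gradients vanish. During backpropagation, each layer contributes a multiplicative derivative term; for the sigmoid activation that term peaks at 0.25, so even in the best case the product shrinks geometrically with depth:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The sigmoid's derivative is s(x) * (1 - s(x)), which is largest
# at x = 0, where it equals 0.25.
x = 0.0
deriv = sigmoid(x) * (1 - sigmoid(x))  # 0.25

# Backprop multiplies roughly one such term per layer, so the
# gradient reaching the earliest layers shrinks geometrically.
for depth in (2, 10, 50):
    print(f"{depth:2d} layers -> gradient scale ~ {deriv ** depth:.2e}")
```

At 10 layers the scale is already below one in a million, which is why the first few layers of a very deep sigmoid network barely move during training.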

In (almost) the words of rap legend Christopher Wallace, aka Notorious B.I.G., the more neurons we come across, the more problems we see. With that said, the variance can be managed with dropout, regularization, and early stopping, and advances in GPU computing make deeper networks possible.
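Of those variance-management tools, early stopping is the simplest to illustrate: watch the validation loss and stop training once it stops improving, keeping the best epoch seen so far. Here's a minimal sketch with an invented validation-loss curve (the numbers are made up for illustration; `patience` is how many non-improving epochs we tolerate):

```python
# Hypothetical validation-loss curve: improves for a while, then
# rises again as the model starts to overfit.
val_losses = [0.9, 0.7, 0.55, 0.50, 0.48, 0.49, 0.52, 0.56, 0.61]

# Minimal early stopping: halt after `patience` consecutive epochs
# with no improvement, remembering the best epoch.
patience = 2
best_loss, best_epoch, wait = float("inf"), 0, 0
for epoch, loss in enumerate(val_losses):
    if loss < best_loss:
        best_loss, best_epoch, wait = loss, epoch, 0
    else:
        wait += 1
        if wait >= patience:
            break

print(best_epoch, best_loss)  # best_epoch is 4, best_loss is 0.48
```

In practice you wouldn't write this loop yourself; deep learning frameworks ship it as a built-in callback, but the logic is exactly this.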

If I had to pick between a network with too many neurons and one with too few, and I only got one experiment, I'd err on the side of slightly too many.
