
Limitations of deep learning

Deep neural networks are black boxes of weights and biases, trained over large amounts of data to find hidden patterns through internal representations; inspecting those representations by hand would be impossible for humans, and even if it were possible, it would not scale. Every neuron typically has a different weight, and thus a different gradient.

Training happens during backpropagation, so the updates always flow from the later layers (output/right side) back to the early layers (input/left side). As a result, the later layers learn well compared to the early layers, and the deeper the network gets, the worse this imbalance becomes. This gives rise to two possible problems associated with deep learning:

  • The vanishing gradient problem
  • The exploding gradient problem
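The vanishing gradient problem can be illustrated with a minimal sketch (a hypothetical toy setup, not taken from the book's code): a chain of single sigmoid units, where the chain rule multiplies one `w * sigmoid'(z)` factor per layer. Since the sigmoid derivative is at most 0.25, each factor is usually below 1, so the gradient shrinks as it propagates back toward the early layers; conversely, consistently large weights would make the product grow, which is the exploding gradient problem.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical 10-layer chain of single sigmoid units with random weights.
rng = np.random.default_rng(0)
n_layers = 10
weights = [rng.normal(scale=1.0) for _ in range(n_layers)]

# Forward pass: record each layer's activation.
a = 0.5  # arbitrary scalar input
activations = [a]
for w in weights:
    a = sigmoid(w * a)
    activations.append(a)

# Backward pass: accumulate the chain rule from output to input.
# sigmoid'(z) = s * (1 - s), where s = sigmoid(z) is the activation.
grad = 1.0
grads = []
for w, s in zip(reversed(weights), reversed(activations[1:])):
    grad *= w * s * (1.0 - s)
    grads.append(abs(grad))
grads.reverse()  # grads[0] now belongs to the earliest (input-side) layer

for i, g in enumerate(grads):
    print(f"layer {i}: |gradient| = {g:.2e}")
```

Running this shows the gradient magnitude decaying toward layer 0, which is why the early layers of a deep sigmoid network train so slowly; drawing the weights with a much larger scale would instead make the product blow up.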