
Residual neural networks

Since there are sometimes millions or even billions of parameters, along with other practical difficulties, it is really hard to train deeper neural networks. To overcome this limitation, Kaiming He et al. (see https://arxiv.org/abs/1512.03385v1) proposed a residual learning framework to ease the training of networks that are substantially deeper than those used previously.

They also explicitly reformulated the layers as learning residual functions with reference to the layer inputs, instead of learning unreferenced functions. This way, these residual networks are easier to optimize and can gain accuracy from considerably increased depth.
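The idea can be sketched in a few lines: instead of asking a stack of layers to learn a desired mapping H(x) directly, a residual block learns the residual F(x) = H(x) - x and then adds the input back, so the block outputs F(x) + x. The following minimal NumPy sketch illustrates this with a two-layer fully connected residual function; the layer shapes, weights, and the use of dense layers (rather than the convolutional layers with batch normalization used by He et al.) are illustrative assumptions:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    # Residual function F(x): two linear layers with a ReLU in between
    # (an illustrative stand-in for the paper's conv-BN-ReLU stack).
    f = relu(x @ W1) @ W2
    # The skip connection adds the input back: the block outputs F(x) + x,
    # so the layers only need to learn the residual F(x) = H(x) - x.
    return relu(f + x)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1 = rng.standard_normal((4, 4)) * 0.1
W2 = rng.standard_normal((4, 4)) * 0.1
y = residual_block(x, W1, W2)
```

Note that if the residual weights collapse to zero, the block reduces to an identity-like mapping of its input, which is why extra depth is easy to optimize: the network can fall back on the identity and only learn corrections to it.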

The downside is that building a network by simply stacking residual blocks inevitably limits its optimization ability. To overcome this limitation, Ke Zhang et al. also proposed using a Multilevel Residual Network (https://arxiv.org/abs/1608.02908).
