
Tanh

As we said, the logistic sigmoid can cause a neural network to get stuck during training: for strongly positive or strongly negative inputs, the function saturates, so its gradient is very close to zero. As a result, gradient descent barely updates the weights and the model stops learning.
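As a quick numerical illustration of this saturation effect, the following sketch (plain NumPy, with helper names chosen here for illustration rather than taken from the book's code) evaluates the sigmoid and its derivative at a few inputs, showing how the gradient collapses toward zero for large-magnitude values:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# For large-magnitude inputs the derivative is almost zero,
# so gradient descent barely moves the weights.
for x in [-10.0, -2.0, 0.0, 2.0, 10.0]:
    print(f"x={x:6.1f}  sigmoid={sigmoid(x):.5f}  gradient={sigmoid_grad(x):.5f}")
```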

The hyperbolic tangent, or tanh, function is an alternative to the sigmoid and still has a sigmoidal shape. The difference is that it outputs values between -1 and 1. Hence, strongly negative inputs to the tanh function map to strongly negative outputs (close to -1), and only inputs near zero map to near-zero outputs. These properties make the network less likely to get stuck during training:

Hyperbolic tangent function
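A minimal sketch of tanh and its gradient, again using NumPy (the function names are illustrative and not from the book's code), shows the zero-centered output range and the fact that the gradient is largest around zero:

```python
import numpy as np

def tanh(x):
    # Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x), outputs in (-1, 1)
    return np.tanh(x)

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2, largest around zero
    return 1.0 - np.tanh(x) ** 2

# Strongly negative inputs map near -1 (not near 0, as with the sigmoid),
# and only inputs near zero map to near-zero outputs.
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    print(f"x={x:5.1f}  tanh={tanh(x):+.4f}  gradient={tanh_grad(x):.4f}")
```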