- Hands-On Neural Networks
- Leonardo De Marchi, Laura Mitchell
Tanh
As we said, the logistic sigmoid can cause a neural network to get stuck: a strongly positive or strongly negative input saturates the function, so its gradient is very near zero. When the gradient vanishes, gradient descent barely updates the weights and the model stops training.
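As a quick illustration (a minimal NumPy sketch, not from the book), the sigmoid's derivative, σ(x)(1 − σ(x)), collapses toward zero once |x| grows:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # derivative of the logistic sigmoid

# For strongly positive or negative inputs the gradient collapses,
# so gradient descent makes negligible weight updates.
for x in [-10.0, -2.0, 0.0, 2.0, 10.0]:
    print(f"x={x:6.1f}  sigmoid={sigmoid(x):.5f}  grad={sigmoid_grad(x):.5f}")
```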
The hyperbolic tangent, or tanh, function is an alternative to the sigmoid and has the same S-shaped curve. The difference is that it outputs values between -1 and 1: strongly negative inputs map to outputs near -1, strongly positive inputs map to outputs near 1, and only inputs near zero map to near-zero outputs. It is defined as tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x)), which is a sigmoid rescaled and shifted to the (-1, 1) range. Because its output is centered on zero, these properties make the network less likely to get stuck during training:
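The same kind of sketch (again assuming NumPy) shows tanh's zero-centered behavior; note that tanh still saturates at the extremes, but its outputs are balanced around zero rather than squeezed into (0, 1):

```python
import numpy as np

# tanh maps strongly negative inputs near -1, strongly positive inputs
# near +1, and only near-zero inputs to near-zero outputs.
for x in [-10.0, -2.0, 0.0, 2.0, 10.0]:
    y = np.tanh(x)
    grad = 1.0 - y ** 2  # derivative of tanh; still small at the extremes
    print(f"x={x:6.1f}  tanh={y:8.5f}  grad={grad:.5f}")
```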

Figure: Hyperbolic tangent function