- Neural Networks with R
- Giuseppe Ciaburro Balaji Venkateswaran
Hyperbolic tangent
Another very popular and widely used activation function is the tanh function. If you look at the figure that follows, you can see that it looks very similar to the sigmoid; in fact, it is a scaled sigmoid function. It is a nonlinear function whose output lies in the range (-1, 1), so you need not worry about activations blowing up. One thing to note is that the gradient is stronger for tanh than for sigmoid (its derivatives are steeper). Choosing between sigmoid and tanh therefore depends on your gradient strength requirement. Like the sigmoid, tanh also suffers from the vanishing gradient problem. The function is defined by the following formula:

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
The following figure shows the hyperbolic tangent activation function:
