- Hands-On Deep Learning Architectures with Python
- Yuxi (Hayden) Liu Saransh Mehta
Sigmoid activation
The sigmoid function maps any real-valued input to an output strictly between zero and one, which makes it well suited to producing probabilistic scores from neurons. The function is continuous and non-linear, so it preserves the non-linearity of the network's outputs. Its gradient is steep near the origin and saturates as the input moves away along the x-axis, meaning a small change in input around the origin produces a significant change in output. This characteristic aids classification, as it pushes outputs toward either zero or one. The sigmoid activation for an input x is given by:

sigmoid(x) = 1 / (1 + e^(-x))
The following is a plot of the sigmoid activation function:
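As a minimal sketch of the behaviour described above, the following NumPy snippet (the helper names `sigmoid` and `sigmoid_grad` are illustrative, not from the book) evaluates the function and its derivative, showing the bounded output range, the steep gradient at the origin, and the saturation far from it:

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x)); output lies strictly in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative: sigmoid(x) * (1 - sigmoid(x)); maximal (0.25) at x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

xs = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(xs))        # all values between 0 and 1, with 0.5 at x = 0
print(sigmoid_grad(0.0))  # steepest gradient, 0.25, at the origin
print(sigmoid_grad(10.0)) # near-zero gradient far from the origin (saturation)
```

The vanishing gradient in the saturated regions is also why deep networks trained with sigmoid activations can learn slowly, a point worth keeping in mind when choosing activations.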
