Deep Learning with Keras
Antonio Gulli, Sujit Pal
Activation functions
Sigmoid and ReLU are generally called activation functions in neural network jargon. In the Testing different optimizers in Keras section, we will see that the gradual changes typical of the sigmoid and ReLU functions are the basic building blocks for developing a learning algorithm that adapts little by little, progressively reducing the mistakes made by our nets. Given the input vector (x1, x2, ..., xm), the weight vector (w1, w2, ..., wm), and the bias b, the neuron first computes the summation Σ(wi * xi) + b and then applies the activation function σ to the result.
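This computation is easy to sketch directly. The following minimal example, with made-up values for x, w, and b, computes the weighted sum and applies both sigmoid and ReLU to it:

```python
import numpy as np

def sigmoid(z):
    # Sigmoid squashes any real number into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # ReLU passes positive values through unchanged and zeroes out the rest
    return np.maximum(0.0, z)

# Illustrative values for the input vector, weight vector, and bias
x = np.array([0.5, -1.2, 3.0])   # (x1, x2, ..., xm)
w = np.array([0.4, 0.3, -0.2])   # (w1, w2, ..., wm)
b = 0.1                          # bias

z = np.dot(w, x) + b             # the summation Σ(wi * xi) + b
print("sigmoid:", sigmoid(z))
print("relu:", relu(z))
```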

Keras supports a number of activation functions, and a full list is available at https://keras.io/activations/.
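As a minimal sketch of how an activation is attached to a layer (the layer sizes and input shape here are illustrative, and the import path assumes the tensorflow.keras API), an activation can be passed by name or added as a separate Activation layer:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation

model = Sequential([
    # Pass the activation by name directly to the layer...
    Dense(64, input_shape=(10,), activation='relu'),
    Dense(1),
    # ...or apply it as a standalone layer
    Activation('sigmoid'),
])
model.summary()
```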