- Machine Learning Projects for Mobile Applications
- Karthikeyan NG
Rectified linear units
The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0. This keeps the CNN mathematically well-behaved by preventing negative activations from propagating forward.
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, only with the negative values replaced by 0.
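The following is a minimal sketch of this idea using NumPy, applied to a hypothetical 3×3 feature map (the values are made up for illustration). It shows that ReLU only zeroes out negatives and leaves the shape of the input untouched:

```python
import numpy as np

# Toy 3x3 feature map (hypothetical values) containing some negatives.
feature_map = np.array([[ 1.5, -0.3,  2.0],
                        [-1.2,  0.0,  0.7],
                        [ 0.4, -2.5,  1.1]])

# ReLU: replace every negative value with 0, leave positive values unchanged.
relu_output = np.maximum(feature_map, 0)

print(relu_output.shape == feature_map.shape)  # True: the size is not altered
print(relu_output)
```

In a real network you would not compute this by hand; a framework call such as `tf.nn.relu` applies the same element-wise operation to the convolution output.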