Machine Learning Projects for Mobile Applications
Karthikeyan NG
Rectified linear units
The logic behind a rectified linear units (ReLU) layer is very simple: it replaces every negative value in the feature map with 0. This keeps the CNN mathematically well behaved by preventing negative activations from propagating through the network.
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, with only the negative values replaced by 0.
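A minimal sketch of this operation, assuming a NumPy-style feature map (this snippet is illustrative and not taken from the book's project code):

```python
import numpy as np

def relu(feature_map):
    """Element-wise ReLU: replace every negative value with 0.
    The output has exactly the same shape as the input."""
    return np.maximum(feature_map, 0)

# A small 3x3 "feature map" containing negative values
feature_map = np.array([[-1.2,  0.5, -0.3],
                        [ 2.0, -4.1,  1.7],
                        [ 0.0,  3.3, -0.8]])

activated = relu(feature_map)

print(activated)
# [[0.  0.5 0. ]
#  [2.  0.  1.7]
#  [0.  3.3 0. ]]
print(feature_map.shape == activated.shape)  # True: the size is unchanged
```

Positive values pass through untouched, negatives become 0, and the shape of the output matches the input, which is exactly the behavior described above.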