- Machine Learning Projects for Mobile Applications
- Karthikeyan NG
Rectified linear units
The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0. This helps keep the CNN mathematically well-behaved by preventing negative values from propagating through the network.
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, with only the negative values replaced by 0.
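The element-wise behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's own code; the `feature_map` values are made up for the example:

```python
import numpy as np

def relu(x):
    # Replace every negative value with 0; non-negative values pass through unchanged.
    return np.maximum(x, 0)

# A small hypothetical 2x2 feature map to show that the shape is preserved.
feature_map = np.array([[-1.0, 2.0],
                        [3.0, -4.0]])

activated = relu(feature_map)
print(activated)        # negatives become 0, positives are untouched
print(activated.shape)  # same shape as the input: (2, 2)
```

Because ReLU acts on each element independently, it leaves the spatial dimensions of the feature map untouched, which is why the output size always matches the input size.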