- Machine Learning Projects for Mobile Applications
- Karthikeyan NG
Rectified linear units
The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0. This helps keep the CNN mathematically well-behaved by eliminating negative activations:
This layer does not alter the size of the image: the output has exactly the same dimensions as the input; only the negative values are replaced by 0.
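The behaviour described above can be sketched in a few lines of NumPy (a minimal illustration, not the book's own code): negative entries become 0, positive entries pass through unchanged, and the shape of the feature map is preserved.

```python
import numpy as np

def relu(feature_map):
    # Element-wise max with 0: negatives -> 0, positives unchanged.
    return np.maximum(0, feature_map)

x = np.array([[-2.0, 1.5],
              [ 0.0, -0.5]])
y = relu(x)
print(y)                    # [[0.  1.5]
                            #  [0.  0. ]]
print(x.shape == y.shape)   # True: output size matches input size
```

Note that `relu` is applied independently to each element, which is why the layer cannot change the spatial dimensions of its input.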