- Machine Learning Projects for Mobile Applications
- Karthikeyan NG
Rectified linear units
The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0, that is, f(x) = max(0, x). This keeps the CNN mathematically well behaved by preventing negative activations from propagating through the network.
This layer does not alter the size of the image: the output has exactly the same dimensions as the input, with only the negative values replaced by 0.
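As a minimal sketch (not from the book), the following NumPy snippet illustrates the operation: negatives are zeroed out, positives pass through unchanged, and the output shape matches the input. The function name `relu` and the sample feature map are illustrative only.

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives become 0, positives are untouched.
    return np.maximum(0, x)

feature_map = np.array([[-1.5, 2.0],
                        [ 0.3, -0.7]])
activated = relu(feature_map)

print(activated)
# [[0.  2. ]
#  [0.3 0. ]]
print(activated.shape == feature_map.shape)  # True: size is unchanged
```

In a framework such as Keras the same layer is typically applied via `activation='relu'` on a convolutional layer, rather than implemented by hand as above.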