- Deep Learning with PyTorch
- Vishnu Subramanian
Leaky ReLU
Leaky ReLU is an attempt to solve the dying ReLU problem: instead of clamping negative inputs to zero, it scales them by a very small factor such as 0.001, so gradients can still flow through inactive units. For some use cases this activation function provides superior performance to other activations, but the improvement is not consistent.
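As a minimal sketch, the snippet below applies PyTorch's built-in `nn.LeakyReLU` to a small tensor; the `negative_slope` of 0.001 mirrors the factor mentioned above (PyTorch's own default is 0.01), and the exact values are illustrative only.

```python
import torch
import torch.nn as nn

# Leaky ReLU: f(x) = x for x > 0, and negative_slope * x for x <= 0.
leaky = nn.LeakyReLU(negative_slope=0.001)

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(leaky(x))  # negative inputs are scaled by 0.001 instead of becoming 0
```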