
Activation function — ReLU

The sigmoid is not the only smooth activation function used in neural networks. Recently, a very simple function called the rectified linear unit (ReLU) has become very popular because it yields very good experimental results. A ReLU is simply defined as f(x) = max(0, x). As you can see in the following graph, the function is zero for negative values and grows linearly for positive values:

[Figure: the ReLU activation function, zero for negative inputs and linear for positive inputs]

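To make the definition concrete, here is a minimal sketch that computes ReLU directly with NumPy and then requests the same activation by name when building a Keras layer. The layer width of 64 and the 10-dimensional input shape are illustrative values, not taken from the text:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

def relu(x):
    # Rectified linear unit: elementwise max(0, x),
    # zero for negative inputs, identity for positive ones
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))
# -> [0. 0. 0. 1. 3.]

# In Keras, ReLU is selected by passing activation='relu' to a layer;
# the sizes below are hypothetical
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(10,)))
```

Note that because the gradient of max(0, x) is simply 0 or 1, ReLU avoids the vanishing gradients that saturating functions such as the sigmoid suffer from, which is one reason it trains so well in practice.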