
ReLU

ReLU is one of the most commonly used activation functions. It behaves like a linear function when the input is greater than 0; otherwise, its output is always 0. It is the analog of half-wave rectification in electrical engineering:

f(x) = max(0, x)

The ReLU function

The range of this function is from 0 to infinity. The issue is that negative inputs are mapped to zero, so the derivative in that region is always zero. A unit stuck in the negative region therefore receives no gradient during backpropagation, which is clearly a potential problem, but in practical cases it often has little effect.
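As a minimal NumPy sketch of this behavior (the function names relu and relu_derivative are illustrative, not taken from any particular library):

```python
import numpy as np

def relu(x):
    # Pass positive inputs through unchanged; clamp negatives to 0
    return np.maximum(0, x)

def relu_derivative(x):
    # The gradient is 1 for positive inputs and 0 otherwise,
    # so inactive units receive no gradient during backpropagation
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))             # [0.  0.  0.  1.5 3. ]
print(relu_derivative(x))  # [0. 0. 0. 1. 1.]
```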

There are a few variants of ReLU; one of the most common is Leaky ReLU, which allows a small positive gradient when the unit is not active. Its formula is as follows:

f(x) = x if x > 0, otherwise f(x) = αx

Here, α is typically 0.01, as shown in the following diagram:

The Leaky ReLU function
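A short sketch of Leaky ReLU in the same style, again with illustrative function names and the commonly used default of α = 0.01:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Positive inputs pass through; negative inputs are scaled by alpha
    return np.where(x > 0, x, alpha * x)

def leaky_relu_derivative(x, alpha=0.01):
    # The gradient is alpha (not 0) for negative inputs,
    # so inactive units still receive a small gradient
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(leaky_relu(x))             # [-0.02  -0.005  0.     1.5    3.   ]
print(leaky_relu_derivative(x))  # [0.01 0.01 0.01 1.   1.  ]
```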