
ReLU

The ReLU non-linearity, defined as ReLU(x) = max(0, x), is a piecewise linear function: it is the identity for positive inputs and zero otherwise. Unlike the sigmoid and Tanh non-linearities, whose gradients vary continuously, the gradient of ReLU takes only two values: 0 for inputs smaller than 0 and 1 for inputs larger than 0. The gradient of ReLU at exactly 0 is undefined, but common practice sets it to 0. Variations of the ReLU non-linearity include the ELU and the Leaky ReLU. Compared to sigmoid and Tanh, the derivative of ReLU is faster to compute, and its zero region induces sparsity in model activations:
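A minimal sketch of what such an example might look like, assuming PyTorch is the framework in use: it evaluates torch.nn.ReLU over a range of inputs and plots the result, making the piecewise linear shape and the flat zero region visible.

```python
import torch
import matplotlib.pyplot as plt

relu = torch.nn.ReLU()            # ReLU(x) = max(0, x)

x = torch.arange(-5.0, 5.0, 0.1)  # sample points along the input axis
y = relu(x)                       # zero for x < 0, identity for x >= 0

plt.plot(x.numpy(), y.numpy())
plt.title("ReLU")
plt.show()
```

Note that all negative inputs map to exactly 0, which is the source of the sparsity mentioned above: any unit whose pre-activation is negative contributes nothing downstream and receives a zero gradient.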
