
Rectified Linear Unit

Rectified Linear Unit (ReLU) has been the most widely used activation function since 2015. It applies a simple threshold condition and has advantages over the other activation functions. The function is defined by the following formula:

f(x) = max(0, x)

The following figure shows the ReLU activation function:

The output ranges from 0 to infinity. ReLU finds applications in computer vision and speech recognition using deep neural networks. There are various other activation functions as well, but we have covered the most important ones here.
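As a minimal sketch of the formula above (assuming NumPy is available; the function name relu is our own), ReLU can be implemented element-wise as follows:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x for positive inputs, 0 otherwise."""
    return np.maximum(0, x)

# Example: negative inputs are clipped to 0, positive inputs pass through
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

Because the function is a simple comparison against zero, it is cheap to compute and its gradient is either 0 or 1, which helps deep networks train faster than with saturating activations.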
