
ReLU function 

Then there is an activation function called the Rectified Linear Unit, ReLU(z), which transforms any value, z, into either 0 or a value above 0. In other words, it outputs 0 for any value below 0 and outputs any value above 0 as the value itself:

ReLU(z) = max(0, z)
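As a concrete illustration, here is a minimal Python sketch of ReLU using NumPy (an example of our own, not necessarily how the book's projects implement it):

```python
import numpy as np

# A minimal ReLU: negative inputs become 0,
# positive inputs pass through unchanged.
def relu(z):
    return np.maximum(0, z)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```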

Just to summarize our understanding so far, the perceptron is the traditional, outdated neuron that is rarely used in real implementations. It is great for gaining a simplistic understanding of the underlying principle; however, it has the problem of learning too fast because of the drastic changes in its output values.

We use activation functions to reduce the learning speed and to detect finer changes in z and, consequently, in the output. Let's sum up these activation functions; a short code sketch follows the list:

  • The sigmoid neuron is the neuron that uses the sigmoid activation function to transform the output to a value between 0 and 1.
  • The tanh neuron is the neuron that uses the tanh activation function to transform the output to a value between -1 and 1.
  • The ReLU neuron is the neuron that uses the ReLU activation function to transform the output to either 0 (for negative inputs) or the value itself (for positive inputs).
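Here is a short sketch comparing all three activation functions on the same inputs (a NumPy-based illustration of our own, not the book's implementation):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes any real value into the range (-1, 1)
    return np.tanh(z)

def relu(z):
    # Outputs 0 for negative values and the value itself otherwise
    return np.maximum(0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # approx [0.119 0.5   0.881]
print(tanh(z))     # approx [-0.964  0.     0.964]
print(relu(z))     # [0. 0. 2.]
```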

The sigmoid function is used in practice but is slow to compute compared to the tanh and ReLU functions. The tanh and ReLU functions are the more commonly used activation functions; the ReLU function is also considered state of the art and is usually the first choice of activation function when building ANNs.
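To see the speed difference yourself, a rough microbenchmark along these lines (a sketch of our own, with an arbitrary input size) typically shows ReLU outpacing the exponential-based functions:

```python
import timeit
import numpy as np

z = np.random.randn(1_000_000)  # one million random inputs

# Time 100 evaluations of each activation on the same data
sig_time  = timeit.timeit(lambda: 1.0 / (1.0 + np.exp(-z)), number=100)
tanh_time = timeit.timeit(lambda: np.tanh(z), number=100)
relu_time = timeit.timeit(lambda: np.maximum(0, z), number=100)

print(f"sigmoid: {sig_time:.3f}s, tanh: {tanh_time:.3f}s, ReLU: {relu_time:.3f}s")
# ReLU only compares each value against 0, so it avoids the cost of exponentials
```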

Here is a list of commonly used activation functions:

  • Sigmoid: sigmoid(z) = 1 / (1 + e^(-z)), with outputs in the range (0, 1)
  • Tanh: tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z)), with outputs in the range (-1, 1)
  • ReLU: ReLU(z) = max(0, z), with outputs in the range [0, ∞)

In the projects within this book, we will primarily be using sigmoid, tanh, or ReLU neurons to build our ANNs.
