
Sigmoid or logistic function

A sigmoid function has a distinctive S shape, and it is a differentiable real function for any real input value. Its range is between 0 and 1. It is an activation function of the following form:

$$f(x) = \frac{1}{1 + e^{-x}}$$

Its first derivative, which is used during backpropagation of the training step, has the following form:

$$f'(x) = f(x)\,(1 - f(x))$$

The implementation is as follows:

import tensorflow as tf

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x))
    return tf.divide(tf.constant(1.0),
                     tf.add(tf.constant(1.0), tf.exp(tf.negative(x))))

The derivative of the sigmoid function is implemented as follows:

def sigmoidprime(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    return tf.multiply(sigmoid(x), tf.subtract(tf.constant(1.0), sigmoid(x)))

However, the sigmoid function can cause the vanishing gradient problem, because its gradient saturates (approaches zero) for inputs of large magnitude, and it is also known for slow convergence. Therefore, in practice it is not recommended as the activation function for hidden layers; ReLU has become more popular.
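To see the saturation concretely, the following minimal sketch (an illustration only, assuming TensorFlow 2.x eager execution and the sigmoid and sigmoidprime functions defined above) evaluates the derivative at a few input magnitudes. The gradient peaks at 0.25 at x = 0 and decays towards zero as |x| grows, which is what stalls learning when many such layers are stacked:

import tensorflow as tf

# Evaluate the sigmoid derivative at increasing input magnitudes.
# For large |x| the gradient is nearly zero, illustrating saturation.
for x in [0.0, 2.0, 5.0, 10.0]:
    grad = sigmoidprime(tf.constant(x))
    print(f"x = {x:5.1f}  sigmoid'(x) = {grad.numpy():.6f}")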
