
The gradient descent algorithm

The gradient descent algorithm is an optimization algorithm that finds the minimum of a function using first-order derivatives; that is, we differentiate the function with respect to its parameters to first order only. Here, the objective of the gradient descent algorithm is to minimize the cost function J(θ₀, θ₁) with respect to the parameters θ₀ and θ₁.

This approach repeats the following update steps for numerous iterations to minimize J(θ₀, θ₁):

θ₀ := θ₀ − α ∂J(θ₀, θ₁)/∂θ₀
θ₁ := θ₁ − α ∂J(θ₀, θ₁)/∂θ₁

The symbol α used in the above equations refers to the learning rate, which is the speed at which the learning agent adapts to new knowledge. Thus, α is a hyperparameter that needs to be assigned, either as a scalar value or as a function of time. In this way, in every iteration, the values of θ₀ and θ₁ are updated as per the preceding formulas until the value of the cost function reaches an acceptable minimum.
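The update steps above can be sketched in plain Python. Here the cost function J(θ₀, θ₁) is assumed to be the mean squared error of a simple linear model θ₀ + θ₁x, and the data points, learning rate, and iteration count are illustrative choices, not values from the text:

```python
def gradient_descent(xs, ys, alpha=0.1, iterations=1000):
    """Minimize J(theta0, theta1) = (1/2n) * sum((theta0 + theta1*x - y)^2)."""
    theta0, theta1 = 0.0, 0.0  # initial parameter guesses
    n = len(xs)
    for _ in range(iterations):
        preds = [theta0 + theta1 * x for x in xs]
        # Partial derivatives of J with respect to theta0 and theta1
        d_theta0 = sum(p - y for p, y in zip(preds, ys)) / n
        d_theta1 = sum((p - y) * x for p, y, x in zip(preds, ys, xs)) / n
        # Update both parameters simultaneously, stepping against the gradient
        theta0 -= alpha * d_theta0
        theta1 -= alpha * d_theta1
    return theta0, theta1

# Toy data lying on y = 2x + 1; the fit should recover theta0 ≈ 1, theta1 ≈ 2
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

Note that both partial derivatives are computed before either parameter is changed; updating θ₀ first and then using the new value to compute the θ₁ gradient would no longer be a step along the true gradient.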

The gradient descent algorithm means moving down the slope. The slope of the curve is given by the gradient of the cost function with respect to the parameters. A positive gradient points in the direction of increasing cost, and a negative gradient in the direction of decreasing cost. Thus, we multiply the gradient by a negative sign, since we have to move opposite to the direction of increasing cost and toward the direction of decreasing cost.

Using the optimum learning rate α, the descent is controlled and we don't overshoot the local minimum. If the learning rate α is very small, then convergence will require a large number of iterations, while if it's very high, the updates might overshoot and miss the minimum, causing the descent to diverge.
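These three regimes can be seen on a toy one-dimensional cost, J(θ) = θ², whose gradient is 2θ. The specific learning rates and step count below are illustrative assumptions chosen to make each regime visible:

```python
def descend(alpha, steps=50, theta=1.0):
    """Run gradient descent on J(theta) = theta**2 and return |theta|,
    the remaining distance from the minimum at theta = 0."""
    for _ in range(steps):
        theta -= alpha * 2 * theta  # step against the gradient dJ/dtheta = 2*theta
    return abs(theta)

small = descend(0.01)  # tiny steps: still noticeably far from 0 after 50 iterations
good = descend(0.3)    # well-chosen rate: converges to nearly 0 quickly
large = descend(1.1)   # oversized rate: every step overshoots, so |theta| grows
```

Each update multiplies θ by (1 − 2α), so the descent converges only while |1 − 2α| < 1; at α = 1.1 that factor is −1.2, and the iterates oscillate with growing magnitude, which is exactly the divergence described above.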
