
Gradient descent 

The gradient descent algorithm is also popular for estimating the parameters of a linear regression model. Gradient descent minimizes a cost function: we start with a set of initial values for the parameters and iteratively adjust them in the direction that reduces the error. That direction is given by the gradient, the vector of partial derivatives of the cost function with respect to each parameter; the idea is to descend along the gradient toward the lowest point on the error surface. Different variants of gradient descent include batch gradient descent, which uses all observed examples in each iteration, and stochastic gradient descent, which updates the parameters using only one observation at a time. Batch gradient descent computes a more accurate gradient per step than stochastic gradient descent, but each step is much more expensive, which makes it less suitable for large datasets.
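To make the batch-versus-stochastic contrast concrete, here is a minimal sketch of both variants for linear regression using NumPy. The function names, learning rates, and iteration counts are illustrative choices, not part of the text; both fit a model of the form y = X·w + b by minimizing the mean squared error.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Batch variant: each update uses the gradient over ALL observations."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        error = X @ w + b - y
        # Gradient of the mean squared error over the full batch.
        grad_w = (2.0 / n_samples) * (X.T @ error)
        grad_b = (2.0 / n_samples) * error.sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def stochastic_gradient_descent(X, y, lr=0.05, n_epochs=50, seed=0):
    """Stochastic variant: each update uses ONE observation at a time."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    rng = np.random.default_rng(seed)
    for _ in range(n_epochs):
        # Visit the observations in a random order each epoch.
        for i in rng.permutation(n_samples):
            error = X[i] @ w + b - y[i]
            w -= lr * 2.0 * error * X[i]
            b -= lr * 2.0 * error
    return w, b

# Recover the parameters of a simple noiseless line y = 3x + 2.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(100, 1))
y = 3 * X[:, 0] + 2
w_batch, b_batch = batch_gradient_descent(X, y)
w_sgd, b_sgd = stochastic_gradient_descent(X, y)
```

On this small, noiseless problem both variants recover weights close to (3, 2); the difference in per-step cost only becomes material when the number of observations is large.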

There is a vast amount of ongoing research on regression algorithms, as they are very well suited to predicting continuous variables. We encourage you to explore linear regression libraries and try the different variants they provide, comparing their efficiency and effectiveness on your test datasets.
