
Getting to know additional linear regressors

Before moving on to linear classifiers, it makes sense to add the following linear regression algorithms to your toolset:

  • Elastic-net uses a mixture of L1 and L2 regularization techniques, where l1_ratio controls the mix of the two. This is useful when you want to learn a sparse model where only a few of the weights are non-zero (as in lasso) while keeping the benefits of ridge regularization.
  • Random Sample Consensus (RANSAC) is useful when your data has outliers. It tries to separate the outliers from the inlier samples, then fits the model on the inliers only.
  • Least-Angle Regression (LARS) is useful when dealing with high-dimensional data—that is, when there is a significant number of features compared to the number of samples. You may want to try it with the polynomial features example we saw earlier and see how it performs there. A short usage sketch of all three regressors follows this list.
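
The following is a minimal sketch of how you might try these three estimators side by side in scikit-learn. The synthetic dataset, its dimensions, and the hyperparameter values (alpha, l1_ratio, test_size) are illustrative assumptions, not values taken from the book's own examples:

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import ElasticNet, RANSACRegressor, Lars
from sklearn.metrics import r2_score

# Assumed synthetic data, for illustration only
x, y = make_regression(
    n_samples=200, n_features=20, n_informative=5, noise=10, random_state=42
)
x_train, x_test, y_train, y_test = train_test_split(
    x, y, test_size=0.25, random_state=42
)

regressors = {
    # l1_ratio controls the mix of L1 and L2 penalties
    'Elastic-net': ElasticNet(alpha=0.1, l1_ratio=0.5),
    # RANSAC fits its underlying estimator on the inlier samples only
    'RANSAC': RANSACRegressor(random_state=42),
    # LARS suits data with many features relative to the number of samples
    'LARS': Lars(),
}

for name, regressor in regressors.items():
    regressor.fit(x_train, y_train)
    y_pred = regressor.predict(x_test)
    print(f'{name}: R2 = {r2_score(y_test, y_pred):.3f}')

You can swap in your own data, or the polynomial features from the earlier example, in place of the synthetic dataset to compare how each regressor behaves.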

Let's move on to the next section of the book, where you will learn how to use logistic regression to classify data.
