
Advanced Feature Selection in Linear Models

"There is nothing permanent except change."
– Heraclitus

So far, we've examined the use of linear models for both quantitative and qualitative outcomes, with an eye on techniques of feature selection, that is, methods that exclude useless or unwanted predictor variables. We saw that linear models can be quite useful in machine learning problems, and that piecewise linear models, such as multivariate adaptive regression splines (MARS), can capture non-linear relationships. Additional techniques have been developed and refined over the last couple of decades that can improve predictive ability and interpretability beyond the linear models discussed in the preceding chapters. In this day and age, many datasets, such as those in the two prior chapters, have numerous features; it isn't unreasonable to have datasets with thousands of potential features.

The methods in this chapter might prove to be a better way to approach feature reduction and selection. We'll look at the concept of regularization, where the coefficients are constrained or shrunk towards zero. There are many regularization methods and permutations thereof, but we'll focus on ridge regression, the Least Absolute Shrinkage and Selection Operator (LASSO), and finally elastic net, which combines the benefits of both techniques into one.
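
To make the distinction concrete before we dive in, here is a minimal sketch using the glmnet package, the standard R implementation of these techniques; the simulated matrix x and response y below are hypothetical stand-ins, not the dataset we'll build later in the chapter. In glmnet, the alpha argument selects the penalty: alpha = 0 gives ridge regression, alpha = 1 gives LASSO, and values in between give the elastic net.

    # a minimal sketch, assuming the glmnet package is installed;
    # x and y are hypothetical simulated data, not this chapter's dataset
    library(glmnet)
    set.seed(123)
    x <- matrix(rnorm(100 * 10), nrow = 100, ncol = 10)  # 100 obs, 10 features
    y <- 2 * x[, 1] - 1.5 * x[, 2] + rnorm(100)          # only two true signals

    ridge <- glmnet(x, y, alpha = 0)    # alpha = 0: ridge (L2 penalty)
    lasso <- glmnet(x, y, alpha = 1)    # alpha = 1: LASSO (L1 penalty)
    enet  <- glmnet(x, y, alpha = 0.5)  # 0 < alpha < 1: elastic net

    # cross-validation to choose the penalty strength, lambda
    cv_lasso <- cv.glmnet(x, y, alpha = 1)
    coef(cv_lasso, s = "lambda.min")    # LASSO shrinks weak coefficients to exactly zero

Note how the LASSO fit drives the coefficients of the uninformative features to exactly zero, performing feature selection, whereas ridge only shrinks them; this is the trade-off the rest of the chapter explores.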

The following are the topics we'll cover in this chapter:

  • Overview of regularization
  • Dataset creation
  • Ridge regression
  • LASSO
  • Elastic net