- Mastering Machine Learning with scikit-learn (Second Edition)
- Gavin Hackeling
What this book covers
Chapter 1, The Fundamentals of Machine Learning, defines machine learning as the study and design of programs that improve their performance at a task by learning from experience. This definition guides the other chapters; in each, we will examine a machine learning model, apply it to a task, and measure its performance.
Chapter 2, Simple Linear Regression, discusses a model that relates a single feature to a continuous response variable. We will learn about cost functions and use the normal equation to optimize the model.
Chapter 3, Classification and Regression with K-Nearest Neighbors, introduces a simple, nonlinear model for classification and regression tasks.
Chapter 4, Feature Extraction, describes methods for representing text, images, and categorical variables as features that can be used in machine learning models.
Chapter 5, From Simple Linear Regression to Multiple Linear Regression, discusses a generalization of simple linear regression that regresses a continuous response variable onto multiple features.
Chapter 6, From Linear Regression to Logistic Regression, further generalizes multiple linear regression and introduces a model for binary classification tasks.
Chapter 7, Naive Bayes, discusses Bayes’ theorem and the Naive Bayes family of classifiers, and compares generative and discriminative models.
Chapter 8, Nonlinear Classification and Regression with Decision Trees, introduces the decision tree, a simple, nonlinear model for classification and regression tasks.
Chapter 9, From Decision Trees to Random Forests and other Ensemble Methods, discusses three methods for combining models: bagging, boosting, and stacking.
Chapter 10, The Perceptron, introduces a simple online model for binary classification.
Chapter 11, From the Perceptron to Support Vector Machines, discusses a powerful, discriminative model for classification and regression called the support vector machine, and a technique for efficiently projecting features to higher dimensional spaces.
Chapter 12, From the Perceptron to Artificial Neural Networks, introduces powerful nonlinear models for classification and regression built from graphs of artificial neurons.
Chapter 13, K-means, discusses an algorithm that can be used to find structures in unlabeled data.
Chapter 14, Dimensionality Reduction with Principal Component Analysis, describes a method for reducing the dimensions of data that can mitigate the curse of dimensionality.
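As a small preview of the workflow the chapters follow (examine a model, apply it, measure it), here is a minimal pure-Python sketch of the closed-form least-squares fit that Chapter 2 develops for simple linear regression. The function name and data are illustrative, not taken from the book:

```python
def fit_simple_linear_regression(x, y):
    """Closed-form least-squares fit of y = a + b*x (Chapter 2's setting)."""
    n = len(x)
    x_mean = sum(x) / n
    y_mean = sum(y) / n
    # Slope b: covariance of x and y divided by the variance of x.
    b = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) / \
        sum((xi - x_mean) ** 2 for xi in x)
    # Intercept a: the line passes through the point of means.
    a = y_mean - b * x_mean
    return a, b

# Data generated from y = 2x + 1, so the fit recovers a = 1, b = 2.
a, b = fit_simple_linear_regression([1, 2, 3, 4], [3, 5, 7, 9])
```

This is the same estimate the normal equation produces for a single feature; the book's chapters build it up with cost functions and scikit-learn's estimator API.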