
Mastering Machine Learning with scikit-learn (Second Edition)
Latest chapter: Summary
This book is intended for software engineers who want to understand how common machine learning algorithms work and develop an intuition for how to use them, and for data scientists who want to learn about the scikit-learn API. Familiarity with machine learning fundamentals and Python is helpful, but not required.
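As a taste of what the chapters below cover, here is a minimal, hypothetical sketch of scikit-learn's fit/predict estimator API, using simple linear regression (the subject of the book's second chapter); the toy data and numbers are invented for illustration and are not taken from the book.

```python
# A minimal sketch of the scikit-learn estimator API: construct, fit, predict.
# The training data below is made-up toy data for illustration only.
from sklearn.linear_model import LinearRegression

X = [[6], [8], [10], [14], [18]]   # one feature per sample
y = [7, 9, 13, 17.5, 18]           # target values

model = LinearRegression()
model.fit(X, y)                    # learn the coefficients from the data
print(model.predict([[12]]))       # predict the target for a new sample
```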
Table of contents (147 chapters)
- Cover Page
- Title Page
- Credits
- About the Author
- About the Reviewer
- www.PacktPub.com
- Why subscribe?
- Customer Feedback
- Preface
- What this book covers
- What you need for this book
- Who this book is for
- Conventions
- Reader feedback
- Customer support
- Downloading the example code
- Errata
- Piracy
- Questions
- The Fundamentals of Machine Learning
- Defining machine learning
- Learning from experience
- Machine learning tasks
- Training data, testing data, and validation data
- Bias and variance
- An introduction to scikit-learn
- Installing scikit-learn
- Installing using pip
- Installing on Windows
- Installing on Ubuntu 16.04
- Installing on Mac OS
- Installing Anaconda
- Verifying the installation
- Installing pandas, Pillow, NLTK, and matplotlib
- Summary
- Simple Linear Regression
- Simple linear regression
- Evaluating the fitness of the model with a cost function
- Solving OLS for simple linear regression
- Evaluating the model
- Summary
- Classification and Regression with k-Nearest Neighbors
- K-Nearest Neighbors
- Lazy learning and non-parametric models
- Classification with KNN
- Regression with KNN
- Scaling features
- Summary
- Feature Extraction
- Extracting features from categorical variables
- Standardizing features
- Extracting features from text
- The bag-of-words model
- Stop word filtering
- Stemming and lemmatization
- Extending bag-of-words with tf-idf weights
- Space-efficient feature vectorizing with the hashing trick
- Word embeddings
- Extracting features from images
- Extracting features from pixel intensities
- Using convolutional neural network activations as features
- Summary
- From Simple Linear Regression to Multiple Linear Regression
- Multiple linear regression
- Polynomial regression
- Regularization
- Applying linear regression
- Exploring the data
- Fitting and evaluating the model
- Gradient descent
- Summary
- From Linear Regression to Logistic Regression
- Binary classification with logistic regression
- Spam filtering
- Binary classification performance metrics
- Accuracy
- Precision and recall
- Calculating the F1 measure
- ROC AUC
- Tuning models with grid search
- Multi-class classification
- Multi-class classification performance metrics
- Multi-label classification and problem transformation
- Multi-label classification performance metrics
- Summary
- Naive Bayes
- Bayes' theorem
- Generative and discriminative models
- Naive Bayes
- Assumptions of Naive Bayes
- Naive Bayes with scikit-learn
- Summary
- Nonlinear Classification and Regression with Decision Trees
- Decision trees
- Training decision trees
- Selecting the questions
- Information gain
- Gini impurity
- Decision trees with scikit-learn
- Advantages and disadvantages of decision trees
- Summary
- From Decision Trees to Random Forests and Other Ensemble Methods
- Boosting
- Stacking
- Summary
- The Perceptron
- The perceptron
- Activation functions
- The perceptron learning algorithm
- Binary classification with the perceptron
- Document classification with the perceptron
- Limitations of the perceptron
- Summary
- From the Perceptron to Support Vector Machines
- Kernels and the kernel trick
- Maximum margin classification and support vectors
- Classifying characters in scikit-learn
- Classifying handwritten digits
- Classifying characters in natural images
- Summary
- From the Perceptron to Artificial Neural Networks
- Nonlinear decision boundaries
- Feed-forward and feedback ANNs
- Multi-layer perceptrons
- Training multi-layer perceptrons
- Backpropagation
- Training a multi-layer perceptron to approximate XOR
- Training a multi-layer perceptron to classify handwritten digits
- Summary
- K-means
- Clustering
- K-means
- Local optima
- Selecting K with the elbow method
- Evaluating clusters
- Image quantization
- Clustering to learn features
- Summary
- Dimensionality Reduction with Principal Component Analysis
- Principal component analysis
- Variance, covariance, and covariance matrices
- Eigenvectors and eigenvalues
- Performing PCA
- Visualizing high-dimensional data with PCA
- Face recognition with PCA
- Summary
Updated: 2021-07-02 19:01:34