Mastering Machine Learning Algorithms
By Giuseppe Bonaccorso
Updated: 2021-06-25 22:08:15
This book is an ideal and relevant source of content for data science professionals who want to delve into complex machine learning algorithms, calibrate models, and improve the predictions of the trained model. A basic knowledge of machine learning is preferred to get the best out of this guide.
Brand: 中圖公司
Listed: 2021-06-25 21:02:19
Publisher: Packt Publishing
The digital edition of this book is provided by 中圖公司 and licensed to Shanghai Yuewen Information Technology Co., Ltd. for production and distribution.
- Cover
- Copyright information
- Dedication
- Packt Upsell
- Why subscribe?
- PacktPub.com
- Contributors
- About the author
- About the reviewer
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Machine Learning Model Fundamentals
- Models and data
- Zero-centering and whitening
- Training and validation sets
- Cross-validation
- Features of a machine learning model
- Capacity of a model
- Vapnik-Chervonenkis capacity
- Bias of an estimator
- Underfitting
- Variance of an estimator
- Overfitting
- The Cramér-Rao bound
- Loss and cost functions
- Examples of cost functions
- Mean squared error
- Huber cost function
- Hinge cost function
- Categorical cross-entropy
- Regularization
- Ridge
- Lasso
- ElasticNet
- Early stopping
- Summary
- Introduction to Semi-Supervised Learning
- Semi-supervised scenario
- Transductive learning
- Inductive learning
- Semi-supervised assumptions
- Smoothness assumption
- Cluster assumption
- Manifold assumption
- Generative Gaussian mixtures
- Example of a generative Gaussian mixture
- Weighted log-likelihood
- Contrastive pessimistic likelihood estimation
- Example of contrastive pessimistic likelihood estimation
- Semi-supervised Support Vector Machines (S3VM)
- Example of S3VM
- Transductive Support Vector Machines (TSVM)
- Example of TSVM
- Summary
- Graph-Based Semi-Supervised Learning
- Label propagation
- Example of label propagation
- Label propagation in Scikit-Learn
- Label spreading
- Example of label spreading
- Label propagation based on Markov random walks
- Example of label propagation based on Markov random walks
- Manifold learning
- Isomap
- Example of Isomap
- Locally linear embedding
- Example of locally linear embedding
- Laplacian Spectral Embedding
- Example of Laplacian Spectral Embedding
- t-SNE
- Example of t-distributed stochastic neighbor embedding
- Summary
- Bayesian Networks and Hidden Markov Models
- Conditional probabilities and Bayes' theorem
- Bayesian networks
- Sampling from a Bayesian network
- Direct sampling
- Example of direct sampling
- A gentle introduction to Markov chains
- Gibbs sampling
- Metropolis-Hastings sampling
- Example of Metropolis-Hastings sampling
- Sampling example using PyMC3
- Hidden Markov Models (HMMs)
- Forward-backward algorithm
- Forward phase
- Backward phase
- HMM parameter estimation
- Example of HMM training with hmmlearn
- Viterbi algorithm
- Finding the most likely hidden state sequence with hmmlearn
- Summary
- EM Algorithm and Applications
- MLE and MAP learning
- EM algorithm
- An example of parameter estimation
- Gaussian mixture
- An example of Gaussian Mixtures using Scikit-Learn
- Factor analysis
- An example of factor analysis with Scikit-Learn
- Principal Component Analysis
- An example of PCA with Scikit-Learn
- Independent component analysis
- An example of FastICA with Scikit-Learn
- Addendum to HMMs
- Summary
- Hebbian Learning and Self-Organizing Maps
- Hebb's rule
- Analysis of the covariance rule
- Example of covariance rule application
- Weight vector stabilization and Oja's rule
- Sanger's network
- Example of Sanger's network
- Rubner-Tavan's network
- Example of Rubner-Tavan's network
- Self-organizing maps
- Example of SOM
- Summary
- Clustering Algorithms
- k-Nearest Neighbors
- KD Trees
- Ball Trees
- Example of KNN with Scikit-Learn
- K-means
- K-means++
- Example of K-means with Scikit-Learn
- Evaluation metrics
- Homogeneity score
- Completeness score
- Adjusted Rand Index
- Silhouette score
- Fuzzy C-means
- Example of fuzzy C-means with Scikit-Fuzzy
- Spectral clustering
- Example of spectral clustering with Scikit-Learn
- Summary
- Ensemble Learning
- Ensemble learning fundamentals
- Random forests
- Example of random forest with Scikit-Learn
- AdaBoost
- AdaBoost.SAMME
- AdaBoost.SAMME.R
- AdaBoost.R2
- Example of AdaBoost with Scikit-Learn
- Gradient boosting
- Example of gradient tree boosting with Scikit-Learn
- Ensembles of voting classifiers
- Example of voting classifiers with Scikit-Learn
- Ensemble learning as model selection
- Summary
- Neural Networks for Machine Learning
- The basic artificial neuron
- Perceptron
- Example of a perceptron with Scikit-Learn
- Multilayer perceptrons
- Activation functions
- Sigmoid and hyperbolic tangent
- Rectifier activation functions
- Softmax
- Back-propagation algorithm
- Stochastic gradient descent
- Weight initialization
- Example of MLP with Keras
- Optimization algorithms
- Gradient perturbation
- Momentum and Nesterov momentum
- SGD with momentum in Keras
- RMSProp
- RMSProp with Keras
- Adam
- Adam with Keras
- AdaGrad
- AdaGrad with Keras
- AdaDelta
- AdaDelta with Keras
- Regularization and dropout
- Dropout
- Example of dropout with Keras
- Batch normalization
- Example of batch normalization with Keras
- Summary
- Advanced Neural Models
- Deep convolutional networks
- Convolutions
- Bidimensional discrete convolutions
- Strides and padding
- Atrous convolution
- Separable convolution
- Transpose convolution
- Pooling layers
- Other useful layers
- Examples of deep convolutional networks with Keras
- Example of a deep convolutional network with Keras and data augmentation
- Recurrent networks
- Backpropagation through time (BPTT)
- LSTM
- GRU
- Example of an LSTM network with Keras
- Transfer learning
- Summary
- Autoencoders
- Autoencoders
- An example of a deep convolutional autoencoder with TensorFlow
- Denoising autoencoders
- An example of a denoising autoencoder with TensorFlow
- Sparse autoencoders
- Adding sparseness to the Fashion MNIST deep convolutional autoencoder
- Variational autoencoders
- An example of a variational autoencoder with TensorFlow
- Summary
- Generative Adversarial Networks
- Adversarial training
- Example of DCGAN with TensorFlow
- Wasserstein GAN (WGAN)
- Example of WGAN with TensorFlow
- Summary
- Deep Belief Networks
- MRF
- RBMs
- DBNs
- Example of unsupervised DBN in Python
- Example of Supervised DBN with Python
- Summary
- Introduction to Reinforcement Learning
- Reinforcement Learning fundamentals
- Environment
- Rewards
- Checkerboard environment in Python
- Policy
- Policy iteration
- Policy iteration in the checkerboard environment
- Value iteration
- Value iteration in the checkerboard environment
- TD(0) algorithm
- TD(0) in the checkerboard environment
- Summary
- Advanced Policy Estimation Algorithms
- TD(λ) algorithm
- TD(λ) in a more complex checkerboard environment
- Actor-Critic TD(0) in the checkerboard environment
- SARSA algorithm
- SARSA in the checkerboard environment
- Q-learning
- Q-learning in the checkerboard environment
- Q-learning using a neural network
- Summary
- Other Books You May Enjoy
- Leave a review - let other readers know what you think