Bayesian Analysis with Python
By Osvaldo Martin
Updated: 2021-08-20 10:14:06
The second edition of Bayesian Analysis with Python is an introduction to the main concepts of applied Bayesian inference and its practical implementation in Python using PyMC3, a state-of-the-art probabilistic programming library, and ArviZ, a new library for exploratory analysis of Bayesian models. The main concepts of Bayesian statistics are covered using a practical and computational approach. Synthetic and real datasets are used to introduce several types of models, such as generalized linear models for regression and classification, mixture models, hierarchical models, and Gaussian processes, among others. By the end of the book, you will have a working knowledge of probabilistic modeling and you will be able to design and implement Bayesian models for your own data science problems. After reading the book you will be better prepared to delve into more advanced material or specialized statistical modeling if you need to.
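To give a flavor of the kind of analysis the book covers, here is a minimal sketch of the coin-flipping problem from the opening chapters, solved with a grid approximation rather than PyMC3 so it runs with no dependencies. The data (6 heads in 9 tosses) and the grid size are illustrative choices, not taken from the book's text:

```python
# Grid approximation of the posterior for a coin-flipping problem:
# uniform Beta(1, 1) prior, binomial likelihood, 6 heads observed in 9 tosses.
heads, tosses = 6, 9
grid = [i / 200 for i in range(1, 200)]  # candidate values for theta in (0, 1)

# Unnormalized posterior = prior * likelihood (the uniform prior is constant,
# so only the likelihood term matters here).
like = [t**heads * (1 - t) ** (tosses - heads) for t in grid]
total = sum(like)
posterior = [l / total for l in like]

# Posterior mean; the analytic answer is the mean of Beta(7, 4), i.e. 7/11.
post_mean = sum(t * p for t, p in zip(grid, posterior))
print(f"posterior mean ≈ {post_mean:.3f}")
```

In the book itself the same model is expressed declaratively with PyMC3 and the posterior is explored with ArviZ; the grid version above just makes the underlying prior-times-likelihood computation explicit.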
Latest chapters
- Leave a review - let other readers know what you think
- Other Books You May Enjoy
- Where To Go Next?
- Exercises
- Summary
- Non-centered parameterization
Brand: 中圖公司
Listed: 2021-08-20 09:47:25
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司 and licensed to 上海閱文信息技術有限公司 for production and distribution.
- coverpage
- Title Page
- Copyright and Credits
- Bayesian Analysis with Python Second Edition
- Dedication
- About Packt
- Why subscribe?
- Packt.com
- Foreword
- Contributors
- About the author
- About the reviewer
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Thinking Probabilistically
- Statistics, models, and this book's approach
- Working with data
- Bayesian modeling
- Probability theory
- Interpreting probabilities
- Defining probabilities
- Probability distributions
- Independently and identically distributed variables
- Bayes' theorem
- Single-parameter inference
- The coin-flipping problem
- The general model
- Choosing the likelihood
- Choosing the prior
- Getting the posterior
- Computing and plotting the posterior
- The influence of the prior and how to choose one
- Communicating a Bayesian analysis
- Model notation and visualization
- Summarizing the posterior
- Highest-posterior density
- Posterior predictive checks
- Summary
- Exercises
- Programming Probabilistically
- Probabilistic programming
- PyMC3 primer
- Flipping coins the PyMC3 way
- Model specification
- Pushing the inference button
- Summarizing the posterior
- Posterior-based decisions
- ROPE
- Loss functions
- Gaussians all the way down
- Gaussian inferences
- Robust inferences
- Student's t-distribution
- Groups comparison
- Cohen's d
- Probability of superiority
- The tips dataset
- Hierarchical models
- Shrinkage
- One more example
- Summary
- Exercises
- Modeling with Linear Regression
- Simple linear regression
- The machine learning connection
- The core of the linear regression models
- Linear models and high autocorrelation
- Modifying the data before running
- Interpreting and visualizing the posterior
- Pearson correlation coefficient
- Pearson coefficient from a multivariate Gaussian
- Robust linear regression
- Hierarchical linear regression
- Correlation, causation, and the messiness of life
- Polynomial regression
- Interpreting the parameters of a polynomial regression
- Polynomial regression – the ultimate model?
- Multiple linear regression
- Confounding variables and redundant variables
- Multicollinearity or when the correlation is too high
- Masking effect variables
- Adding interactions
- Variable variance
- Summary
- Exercises
- Generalizing Linear Models
- Generalized linear models
- Logistic regression
- The logistic model
- The Iris dataset
- The logistic model applied to the iris dataset
- Multiple logistic regression
- The boundary decision
- Implementing the model
- Interpreting the coefficients of a logistic regression
- Dealing with correlated variables
- Dealing with unbalanced classes
- Softmax regression
- Discriminative and generative models
- Poisson regression
- Poisson distribution
- The zero-inflated Poisson model
- Poisson regression and ZIP regression
- Robust logistic regression
- The GLM module
- Summary
- Exercises
- Model Comparison
- Posterior predictive checks
- Occam's razor – simplicity and accuracy
- Too many parameters leads to overfitting
- Too few parameters leads to underfitting
- The balance between simplicity and accuracy
- Predictive accuracy measures
- Cross-validation
- Information criteria
- Log-likelihood and deviance
- Akaike information criterion
- Widely applicable Information Criterion
- Pareto smoothed importance sampling leave-one-out cross-validation
- Other Information Criteria
- Model comparison with PyMC3
- A note on the reliability of WAIC and LOO computations
- Model averaging
- Bayes factors
- Some remarks
- Computing Bayes factors
- Common problems when computing Bayes factors
- Using Sequential Monte Carlo to compute Bayes factors
- Bayes factors and Information Criteria
- Regularizing priors
- WAIC in depth
- Entropy
- Kullback-Leibler divergence
- Summary
- Exercises
- Mixture Models
- Mixture models
- Finite mixture models
- The categorical distribution
- The Dirichlet distribution
- Non-identifiability of mixture models
- How to choose K
- Mixture models and clustering
- Non-finite mixture model
- Dirichlet process
- Continuous mixtures
- Beta-binomial and negative binomial
- The Student's t-distribution
- Summary
- Exercises
- Gaussian Processes
- Linear models and non-linear data
- Modeling functions
- Multivariate Gaussians and functions
- Covariance functions and kernels
- Gaussian processes
- Gaussian process regression
- Regression with spatial autocorrelation
- Gaussian process classification
- Cox processes
- The coal-mining disasters
- The redwood dataset
- Summary
- Exercises
- Inference Engines
- Inference engines
- Non-Markovian methods
- Grid computing
- Quadratic method
- Variational methods
- Automatic differentiation variational inference
- Markovian methods
- Monte Carlo
- Markov chain
- Metropolis-Hastings
- Hamiltonian Monte Carlo
- Sequential Monte Carlo
- Diagnosing the samples
- Convergence
- Monte Carlo error
- Autocorrelation
- Effective sample sizes
- Divergences
- Non-centered parameterization
- Summary
- Exercises
- Where To Go Next?
- Other Books You May Enjoy
- Leave a review - let other readers know what you think