Mastering Predictive Analytics with scikit-learn and TensorFlow
Python is a programming language that provides a wide range of features that can be used in the field of data science. Mastering Predictive Analytics with scikit-learn and TensorFlow covers various implementations of ensemble methods, how they are used with real-world datasets, and how they improve prediction accuracy in classification and regression problems. This book starts with ensemble methods and their features. You will see that scikit-learn provides tools for choosing hyperparameters for models. As you make your way through the book, you will cover the nitty-gritty of predictive analytics and explore its features and characteristics. You will also be introduced to artificial neural networks and TensorFlow, and how TensorFlow is used to create neural networks. In the final chapter, you will explore factors such as computational power, along with improvement methods and software enhancements for efficient predictive analytics. By the end of this book, you will be well-versed in using deep neural networks to solve common problems in big data analysis.
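The description above mentions ensemble methods and scikit-learn's tools for choosing hyperparameters. The snippet below is a minimal sketch of that kind of workflow, tuning a random forest regressor with scikit-learn's GridSearchCV; the dataset and parameter grid are illustrative assumptions, not examples taken from the book.

```python
# Minimal sketch (not from the book): an ensemble regressor whose
# hyperparameters are chosen with scikit-learn's grid-search tools.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

# Illustrative dataset; the book works with its own datasets (e.g. diamonds).
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate hyperparameter values, searched exhaustively with k-fold CV.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Test R^2:", search.best_estimator_.score(X_test, y_test))
```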
Table of Contents (122 sections)
- Cover
- Title Page
- Copyright and Credits
- Mastering Predictive Analytics with scikit-learn and TensorFlow
- Packt Upsell
- Why subscribe?
- Packt.com
- Contributor
- About the author
- Packt is searching for authors like you
- Preface
- Who this book is for
- What this book covers
- To get the most out of this book
- Download the example code files
- Download the color images
- Conventions used
- Get in touch
- Reviews
- Ensemble Methods for Regression and Classification
- Ensemble methods and their working
- Bootstrap sampling
- Bagging
- Random forests
- Boosting
- Ensemble methods for regression
- The diamond dataset
- Training different regression models
- KNN model
- Bagging model
- Random forests model
- Boosting model
- Using ensemble methods for classification
- Predicting a credit card dataset
- Training different classification models
- Logistic regression model
- Bagging model
- Random forest model
- Boosting model
- Summary
- Cross-validation and Parameter Tuning
- Holdout cross-validation
- K-fold cross-validation
- Implementing k-fold cross-validation
- Comparing models with k-fold cross-validation
- Introduction to hyperparameter tuning
- Exhaustive grid search
- Hyperparameter tuning in scikit-learn
- Comparing tuned and untuned models
- Summary
- Working with Features
- Feature selection methods
- Removing dummy features with low variance
- Identifying important features statistically
- Recursive feature elimination
- Dimensionality reduction and PCA
- Feature engineering
- Creating new features
- Improving models with feature engineering
- Training your model
- Reducible and irreducible error
- Summary
- Introduction to Artificial Neural Networks and TensorFlow
- Introduction to ANNs
- Perceptrons
- Multilayer perceptron
- Elements of a deep neural network model
- Deep learning
- Elements of an MLP model
- Introduction to TensorFlow
- TensorFlow installation
- Core concepts in TensorFlow
- Tensors
- Computational graph
- Summary
- Predictive Analytics with TensorFlow and Deep Neural Networks
- Predictions with TensorFlow
- Introduction to the MNIST dataset
- Building classification models using MNIST dataset
- Elements of the DNN model
- Building the DNN
- Reading the data
- Defining the architecture
- Placeholders for inputs and labels
- Building the neural network
- The loss function
- Defining optimizer and training operations
- Training strategy and evaluation of classification accuracy
- Running the computational graph
- Regression with Deep Neural Networks (DNN)
- Elements of the DNN model
- Building the DNN
- Reading the data
- Objects for modeling
- Training strategy
- Input pipeline for the DNN
- Defining the architecture
- Placeholders for input values and labels
- Building the DNN
- The loss function
- Defining optimizer and training operations
- Running the computational graph
- Classification with DNNs
- Exponential linear unit activation function
- Classification with DNNs
- Elements of the DNN model
- Building the DNN
- Reading the data
- Producing the objects for modeling
- Training strategy
- Input pipeline for DNN
- Defining the architecture
- Placeholders for inputs and labels
- Building the neural network
- The loss function
- Evaluation nodes
- Optimizer and the training operation
- Run the computational graph
- Evaluating the model with a set threshold
- Summary
- Other Books You May Enjoy
- Leave a review - let other readers know what you think