Brand: 中圖公司
Listed: 2021-08-20 09:49:12
Publisher: Packt Publishing
The digital rights to this book are provided by 中圖公司, which has authorized Shanghai Yuewen Information Technology Co., Ltd. to produce and distribute this edition.
Last updated: 2021-08-20 10:25:39
- coverpage
- Title Page
- Copyright
- Neural Networks with R
- Credits
- About the Authors
- About the Reviewer
- www.PacktPub.com
- Why subscribe?
- Customer Feedback
- Preface
- What this book covers
- What you need for this book
- Who this book is for
- Conventions
- Reader feedback
- Customer support
- Downloading the example code
- Errata
- Piracy
- Questions
- Neural Network and Artificial Intelligence Concepts
- Introduction
- Inspiration for neural networks
- How do neural networks work?
- Layered approach
- Weights and biases
- Training neural networks
- Supervised learning
- Unsupervised learning
- Epoch
- Activation functions
- Different activation functions
- Linear function
- Unit step activation function
- Sigmoid
- Hyperbolic tangent
- Rectified Linear Unit
- Which activation functions to use?
- Perceptron and multilayer architectures
- Forward and backpropagation
- Step-by-step illustration of a neuralnet and an activation function
- Feed-forward and feedback networks
- Gradient descent
- Taxonomy of neural networks
- Simple example using R neural net library - neuralnet()
- Let us go through the code line-by-line
- Implementation using nnet() library
- Let us go through the code line-by-line
- Deep learning
- Pros and cons of neural networks
- Pros
- Cons
- Best practices in neural network implementations
- Quick note on GPU processing
- Summary
- Learning Process in Neural Networks
- What is machine learning?
- Supervised learning
- Unsupervised learning
- Reinforcement learning
- Training and testing the model
- The data cycle
- Evaluation metrics
- Confusion matrix
- True Positive Rate
- True Negative Rate
- Accuracy
- Precision and recall
- F-score
- Receiver Operating Characteristic curve
- Learning in neural networks
- Back to backpropagation
- Neural network learning algorithm optimization
- Supervised learning in neural networks
- Boston dataset
- Neural network regression with the Boston dataset
- Unsupervised learning in neural networks
- Competitive learning
- Kohonen SOM
- Summary
- Deep Learning Using Multilayer Neural Networks
- Introduction of DNNs
- R for DNNs
- Multilayer neural networks with neuralnet
- Training and modeling a DNN using H2O
- Deep autoencoders using H2O
- Summary
- Perceptron Neural Network Modeling – Basic Models
- Perceptrons and their applications
- Simple perceptron – a linear separable classifier
- Linear separation
- The perceptron function in R
- Multi-Layer Perceptron
- MLP R implementation using RSNNS
- Summary
- Training and Visualizing a Neural Network in R
- Data fitting with neural network
- Exploratory analysis
- Neural network model
- Classifying breast cancer with a neural network
- Exploratory analysis
- Neural network model
- The network training phase
- Testing the network
- Early stopping in neural network training
- Avoiding overfitting in the model
- Generalization of neural networks
- Scaling of data in neural network models
- Ensemble predictions using neural networks
- Summary
- Recurrent and Convolutional Neural Networks
- Recurrent Neural Network
- The rnn package in R
- LSTM model
- Convolutional Neural Networks
- Step #1 – filtering
- Step #2 – pooling
- Step #3 – ReLU for normalization
- Step #4 – voting and classification in the fully connected layer
- Common CNN architecture - LeNet
- Humidity forecast using RNN
- Summary
- Use Cases of Neural Networks – Advanced Topics
- TensorFlow integration with R
- Keras integration with R
- MNIST HWR using R
- LSTM using the iris dataset
- Working with autoencoders
- PCA using H2O
- Autoencoders using H2O
- Breast cancer detection using darch
- Summary