- Practical Convolutional Neural Networks
- Mohit Sewak, Md. Rezaul Karim, Pradeep Pujari
Building the network
For this example, you'll define the following:
- The input layer, which receives each piece of MNIST data and tells the network the number of inputs
- Hidden layers, which recognize patterns in the data and connect the input layer to the output layer
- The output layer, which produces a label as the output for a given image, as follows:
```python
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation

# Defining the neural network
def build_model():
    model = Sequential()
    model.add(Dense(512, input_shape=(784,)))
    model.add(Activation('relu'))
    # An "activation" is just a non-linear function applied to the output
    # of the layer above. With a "rectified linear unit" (ReLU),
    # all values below 0 are clamped to 0.
    model.add(Dropout(0.2))
    # Dropout helps protect the model from memorizing,
    # or "overfitting", the training data
    model.add(Dense(512))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Dense(10))
    model.add(Activation('softmax'))
    # The special "softmax" activation ensures that the output is a
    # valid probability distribution, meaning that the values obtained
    # are all non-negative and sum up to 1.
    return model

# Building the model
model = build_model()
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```
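Before training, the raw MNIST arrays must be reshaped to match what this model expects: images flattened to 784-dimensional vectors for `input_shape=(784,)`, and integer labels one-hot encoded into 10-dimensional vectors for the softmax output with `categorical_crossentropy`. A minimal NumPy-only sketch of that preprocessing (the array names and random stand-in data here are illustrative, not from the book):

```python
import numpy as np

# Hypothetical stand-in for a batch of MNIST data:
# 32 grayscale images of 28x28 pixels, with integer labels 0-9
images = np.random.randint(0, 256, size=(32, 28, 28)).astype('float32')
labels = np.random.randint(0, 10, size=(32,))

# Flatten each 28x28 image into a 784-dimensional vector
# and scale pixel values from [0, 255] to [0, 1]
x = images.reshape(len(images), 784) / 255.0

# One-hot encode the integer labels into 10-dimensional vectors
y = np.eye(10)[labels]

print(x.shape)  # (32, 784) -- matches input_shape=(784,)
print(y.shape)  # (32, 10)  -- matches the 10-unit softmax output
```

After this step, `model.fit(x, y)` can be called on the preprocessed arrays.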