
Building the network

For this example, you'll define the following:

  • The input layer, which specifies the shape you should expect for each piece of MNIST data, as it tells the network the number of inputs
  • Hidden layers, which recognize patterns in the data and connect the input layer to the output layer
  • The output layer, which defines how the network learns and gives a label as the output for a given image, as follows:
# Defining the neural network
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout

def build_model():
    model = Sequential()
    model.add(Dense(512, input_shape=(784,)))
    model.add(Activation('relu'))  # An "activation" is just a non-linear function applied to
                                   # the output of the layer above. With a "rectified linear
                                   # unit", all values below 0 are clamped to 0.
    model.add(Dropout(0.2))        # Dropout helps protect the model from memorizing,
                                   # or "overfitting", the training data
    model.add(Dense(512))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Dense(10))
    model.add(Activation('softmax'))  # The special "softmax" activation ensures that the output
                                      # is a valid probability distribution, meaning that the
                                      # values obtained are all non-negative and sum up to 1.
    return model

# Building the model
model = build_model()
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
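
To see the compiled model in action, the following is a minimal sketch of feeding it MNIST data: it loads the digits with keras.datasets.mnist, flattens each 28 x 28 image into a 784-dimensional vector to match the input layer, one-hot encodes the labels to match the 10-way softmax output, and calls fit. The batch size and epoch count here are illustrative choices, not values taken from the text.

from keras.datasets import mnist
from keras.utils import to_categorical

# Load the MNIST digits and flatten each 28 x 28 image into a 784-dimensional vector
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784).astype('float32') / 255
x_test = x_test.reshape(10000, 784).astype('float32') / 255

# One-hot encode the labels so they match the 10-class softmax output
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)

# Train the compiled model; batch size and epochs are illustrative values
model.fit(x_train, y_train,
          batch_size=128,
          epochs=5,
          validation_data=(x_test, y_test))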