
Multilayer perceptron network construction

As mentioned in the preceding chapter, DL4J-based neural networks consist of multiple layers. Everything starts with a MultiLayerConfiguration, which organizes those layers and their hyperparameters.

Hyperparameters are a set of variables that determine how a neural network learns. There are many of them, for example: how many passes to make over the training data (each full pass is called an epoch) and how often to update the model's weights, how to initialize the network weights, which activation function to use, which updater and optimization algorithm to use, the learning rate (that is, how fast the model should learn), how many hidden layers there are, how many neurons there are in each layer, and so on.
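As a minimal sketch (the seed, the Adam updater, and the 0.001 learning rate here are illustrative assumptions, not values prescribed by this chapter), such hyperparameters map onto a DL4J NeuralNetConfiguration.Builder like this:

import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;

// Illustrative hyperparameter choices; tune these for your own data
NeuralNetConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
    .seed(12345)                   // fixed seed so weight initialization is reproducible
    .weightInit(WeightInit.XAVIER) // how to initialize the network weights
    .activation(Activation.RELU)   // default activation function for the layers
    .updater(new Adam(0.001));     // updater and learning rate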

We now create the network. First, let us create the layers. Similar to the MLP we created in Chapter 1, Getting Started with Deep Learning, our MLP will have four layers:

  • Layer 0: Input layer
  • Layer 1: Hidden layer 1
  • Layer 2: Hidden layer 2
  • Layer 3: Output layer

More technically, the first layer is the input layer, followed by two hidden layers. For these first three layers, we initialize the weights using Xavier initialization and use ReLU as the activation function. Finally, the output layer is placed. This setup is shown in the following figure:

Figure: Multilayer perceptron for Titanic survival prediction

For the input layer, we specify as many neurons (that is, nodes) as there are input features, and an arbitrary number of neurons as output. We set a small output size, considering that there are very few inputs and features:

import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;

// Input layer: one input neuron per feature, 16 output neurons
DenseLayer input_layer = new DenseLayer.Builder()
    .weightInit(WeightInit.XAVIER) // Xavier weight initialization
    .activation(Activation.RELU)   // ReLU activation function
    .nIn(numInputs)                // number of input features
    .nOut(16)                      // small, arbitrary hidden size
    .build();
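
The two hidden layers and the output layer follow the same builder pattern, as does assembling all four layers into a MultiLayerConfiguration. The following is a hedged sketch: the hidden-layer sizes (32 and 16) and the loss function are assumptions for illustration, not necessarily the exact values used in this project:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Hidden layers 1 and 2: the sizes are illustrative assumptions
DenseLayer hidden_layer_1 = new DenseLayer.Builder()
    .weightInit(WeightInit.XAVIER)
    .activation(Activation.RELU)
    .nIn(16)  // matches nOut of the input layer
    .nOut(32)
    .build();

DenseLayer hidden_layer_2 = new DenseLayer.Builder()
    .weightInit(WeightInit.XAVIER)
    .activation(Activation.RELU)
    .nIn(32)
    .nOut(16)
    .build();

// Output layer: two classes (survived / did not survive), softmax over a
// negative log likelihood loss
OutputLayer output_layer = new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
    .weightInit(WeightInit.XAVIER)
    .activation(Activation.SOFTMAX)
    .nIn(16)
    .nOut(2)
    .build();

// Wire the four layers together in order: input, hidden 1, hidden 2, output
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .seed(12345) // illustrative seed
    .list()
    .layer(0, input_layer)
    .layer(1, hidden_layer_1)
    .layer(2, hidden_layer_2)
    .layer(3, output_layer)
    .build();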