
The architecture of the generator 

The generator network in our dummy GAN is a simple feed-forward neural network with five layers: an input layer, three hidden layers, and an output layer. Let's take a closer look at its configuration:

 

Layer | Configuration
Input layer | 100-dimensional noise vector; output shape: (batch_size, 100)
Hidden layer 1 (dense) | 500 units; output shape: (batch_size, 500)
Hidden layer 2 (dense) | 500 units; output shape: (batch_size, 500)
Hidden layer 3 (dense) | 784 units; output shape: (batch_size, 784)
Output layer (reshape) | (batch_size, 784) reshaped to (batch_size, 28, 28)

The preceding table shows the configuration of each layer in the network: the input layer, the three hidden layers, and the output layer.

The following diagram shows the flow of tensors through the generator network, along with the input and output shapes of the tensors at each layer:

The architecture of the generator network.

Let's discuss how this feed-forward neural network processes information during the forward pass:

  • The input layer takes a 100-dimensional vector sampled from a Gaussian (normal) distribution and passes it to the first hidden layer without any modification.
  • The three hidden layers are dense layers with 500, 500, and 784 units, respectively. The first hidden layer (a dense layer) transforms a tensor of shape (batch_size, 100) into a tensor of shape (batch_size, 500).
  • The second dense layer generates a tensor of shape (batch_size, 500).
  • The third dense layer generates a tensor of shape (batch_size, 784).
  • In the output layer, this tensor is reshaped from (batch_size, 784) to (batch_size, 28, 28). This means that our network generates a batch of images, where each image has a shape of (28, 28), as sketched in the code that follows this list.
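The preceding bullets map directly onto layer definitions in a deep learning framework. The following is a minimal sketch of this generator in Keras (tf.keras); the text above does not specify activation functions, so the LeakyReLU activations used here are an assumption for illustration only:

    # A minimal sketch of the generator described above (tf.keras).
    # Assumption: the text does not name activation functions, so LeakyReLU
    # is used between the dense layers purely for illustration.
    from tensorflow.keras import Input, Sequential
    from tensorflow.keras.layers import Dense, LeakyReLU, Reshape

    generator = Sequential([
        Input(shape=(100,)),   # 100-dimensional noise vector
        Dense(500),            # first hidden layer  -> (batch_size, 500)
        LeakyReLU(0.2),
        Dense(500),            # second hidden layer -> (batch_size, 500)
        LeakyReLU(0.2),
        Dense(784),            # third hidden layer  -> (batch_size, 784)
        LeakyReLU(0.2),
        Reshape((28, 28)),     # output layer        -> (batch_size, 28, 28)
    ])

    generator.summary()

Calling generator.summary() prints the output shape of every layer, which should match the shapes listed in the table and bullets above.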