
Building a multi-layer neural network

What we created in the previous recipe is actually the simplest form of an FNN: a neural network where the information flows in one direction only. In this recipe, we will extend the network from one hidden layer to multiple hidden layers. Adding layers increases the capacity of a network to learn complex non-linear patterns.

Figure 2.7: Two-layer neural network with i input variables, n units in the first hidden layer, m units in the second hidden layer, and a single output unit
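To see how the trainable parameters accumulate in the architecture of Figure 2.7, we can count the weights and biases layer by layer. A minimal sketch follows; the sizes passed in at the bottom are hypothetical examples, not values from the figure or the recipe.

```python
def count_params(i, n, m):
    """Trainable parameters of a fully connected net:
    i inputs -> n hidden units -> m hidden units -> 1 output."""
    layer1 = i * n + n   # weights plus one bias per unit in the first hidden layer
    layer2 = n * m + m   # second hidden layer
    output = m * 1 + 1   # single output unit
    return layer1 + layer2 + output

# Hypothetical sizes: 11 inputs, 64 and 32 hidden units
print(count_params(11, 64, 32))  # -> 2881
```

Note that each new hidden layer contributes a full weight matrix between adjacent layers, which is why parameter counts climb quickly as layers are added.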

As you can see in Figure 2.7, adding a layer introduces a full weight matrix between adjacent layers, so the number of connections (weights), also called trainable parameters, grows rapidly. In the next recipe, we will create a network with two hidden layers to predict wine quality. This is a regression task, so we will use a linear activation for the output layer. For the hidden layers, we use ReLU activation functions. This recipe uses the Keras framework to implement the feed-forward network.
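The architecture described above can be sketched in Keras as follows. This is a minimal illustration, assuming the 11 physicochemical input features of the common UCI wine-quality dataset; the hidden-layer sizes (64 and 32) are hypothetical choices, not the recipe's exact values.

```python
import numpy as np
from tensorflow import keras

# Two hidden layers with ReLU, linear output unit for regression
model = keras.Sequential([
    keras.layers.Input(shape=(11,)),                # 11 input features (assumed)
    keras.layers.Dense(64, activation='relu'),      # first hidden layer
    keras.layers.Dense(32, activation='relu'),      # second hidden layer
    keras.layers.Dense(1, activation='linear'),     # single linear output
])

# Mean squared error is a standard loss for regression tasks
model.compile(optimizer='adam', loss='mse')

print(model.count_params())
```

The linear output activation leaves the prediction unbounded, which suits a regression target such as a wine-quality score, while ReLU in the hidden layers supplies the non-linearity.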
