
Multilayer perceptrons

The multilayer perceptron (MLP) is one of the simplest neural networks. It consists of one input layer, one output layer, and one or more hidden layers in between. Each layer contains multiple neurons, and adjacent layers are fully connected. Each neuron can be thought of as a cell in these large networks: it determines the flow and transformation of the incoming signals. Signals from the previous layer are pushed forward to the neurons of the next layer through the connecting weights. Each artificial neuron computes a weighted sum of all incoming inputs by multiplying each signal by its weight and adding a bias. The weighted sum then passes through a function called an activation function, which decides whether the neuron should fire, producing the output signals for the next layer.
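The computation of a single artificial neuron described above can be sketched as follows. This is a minimal illustration, assuming a sigmoid activation and hypothetical example values for the inputs, weights, and bias:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    z = np.dot(weights, inputs) + bias   # weighted sum of incoming signals
    return 1.0 / (1.0 + np.exp(-z))      # sigmoid squashes z into (0, 1)

# Example: two incoming signals with their weights and a bias term
out = neuron_output(np.array([0.5, -1.0]), np.array([0.8, 0.2]), 0.1)
```

Here `z = 0.8 * 0.5 + 0.2 * (-1.0) + 0.1 = 0.3`, and the sigmoid maps it to roughly 0.574, which becomes an input signal for the next layer.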

For example, a fully connected, feed-forward neural network is pictured in the following diagram. As you may notice, there is an intercept node on each layer (x0 and a0). The non-linearity of the network comes mainly from the shape of the activation function.

The architecture of this fully connected, feed-forward neural network looks essentially like the following:

Fully connected, feed-forward neural network with two hidden layers
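A forward pass through such a network can be sketched by chaining the per-neuron computation layer by layer. This is an illustrative sketch, not the book's implementation: the layer sizes, the sigmoid activation, and the randomly initialized weights are all assumptions chosen for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Feed-forward pass: layers is a list of (weights, bias) pairs."""
    a = x
    for W, b in layers:
        # Fully connected: every neuron sees every output of the previous layer
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(0)
# Hypothetical shapes: 3 inputs -> two hidden layers of 4 units -> 2 outputs
layers = [(rng.standard_normal((4, 3)), rng.standard_normal(4)),
          (rng.standard_normal((4, 4)), rng.standard_normal(4)),
          (rng.standard_normal((2, 4)), rng.standard_normal(2))]
y = forward(rng.standard_normal(3), layers)
```

The explicit bias vector `b` plays the role of the intercept nodes (x0 and a0) in the diagram: adding a bias is equivalent to a constant-valued unit connected to every neuron in the next layer.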