
Deep belief networks

To overcome the overfitting problem in MLPs, Hinton et al. proposed the DBN. It uses a greedy, layer-by-layer pre-training algorithm to initialize the network weights through probabilistic generative models.

A DBN is composed of a visible layer and multiple layers of hidden units. The top two layers have undirected, symmetric connections between them and form an associative memory, whereas each lower layer receives top-down, directed connections from the layer above it. The building blocks of a DBN are RBMs, as you can see in the following figure, where several RBMs are stacked one after another to form a DBN:

A DBN configured for semi-supervised learning
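To make the stacking concrete, the following sketch (a minimal, hypothetical illustration in NumPy, not the book's own code; the layer sizes and function names are assumptions) represents a DBN as a list of RBM weight matrices and shows how the hidden activations of one RBM become the visible input of the RBM above it:

```python
import numpy as np

# Hypothetical layer sizes: 784 visible units (e.g., 28x28 images)
# followed by three hidden layers of decreasing size.
layer_sizes = [784, 512, 256, 64]

rng = np.random.default_rng(0)

# One weight matrix and one hidden-bias vector per RBM in the stack.
weights = [rng.normal(0, 0.01, size=(n_in, n_out))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
hidden_biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def propagate_up(v):
    """Pass data through the stack: each RBM's hidden activations
    serve as the visible layer of the RBM above it."""
    activations = [v]
    for W, b in zip(weights, hidden_biases):
        v = sigmoid(v @ W + b)
        activations.append(v)
    return activations

# A small batch of random binary inputs, just to exercise the stack.
batch = rng.integers(0, 2, size=(10, 784)).astype(float)
features = propagate_up(batch)
print([a.shape for a in features])  # [(10, 784), (10, 512), (10, 256), (10, 64)]
```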

A single RBM consists of two layers. The first layer is composed of visible neurons and the second layer consists of hidden neurons. The following figure shows the structure of a simple RBM, where the neurons are arranged as a symmetric bipartite graph:

RBM architecture
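As a rough sketch of how such a bipartite model is trained, the code below implements a tiny RBM with binary units and one step of contrastive divergence (CD-1). It is a minimal illustration under those assumptions, not the book's implementation, and the sizes and variable names are made up:

```python
import numpy as np

rng = np.random.default_rng(42)
n_visible, n_hidden = 6, 3

# Weights connect every visible unit to every hidden unit (bipartite graph);
# there are no visible-visible or hidden-hidden connections.
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    return (rng.random(p.shape) < p).astype(float)

def cd1_update(v0, lr=0.1):
    """One CD-1 step: up, down, up again, then update the parameters with the
    difference between data-driven and reconstruction-driven statistics."""
    global W, b_v, b_h
    p_h0 = sigmoid(v0 @ W + b_h)          # p(h=1 | v0)
    h0 = sample(p_h0)
    p_v1 = sigmoid(h0 @ W.T + b_v)        # p(v=1 | h0), the reconstruction
    p_h1 = sigmoid(p_v1 @ W + b_h)        # p(h=1 | reconstruction)
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Toy binary data: run a few CD-1 updates on a tiny batch.
data = rng.integers(0, 2, size=(8, n_visible)).astype(float)
for _ in range(100):
    cd1_update(data)
```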

In a DBN, the RBMs are trained one layer at a time, each on the features produced by the layer below it; this greedy, layer-wise phase is called unsupervised pre-training, and the hidden layers come to represent progressively more abstract features. The whole network is then adjusted with labeled data in a second phase called supervised fine-tuning. Despite numerous successes, DBNs are being replaced by AEs.
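One hedged way to see the two phases end to end is with scikit-learn's BernoulliRBM: two stacked RBMs stand in for the greedy, layer-wise pre-training, and a logistic regression on top stands in for the supervised stage. This is only a sketch of the idea (scikit-learn does not backpropagate through the RBMs, so the supervised step here fits only the top classifier rather than fine-tuning the whole network):

```python
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Load 8x8 digit images and scale pixels to [0, 1] for Bernoulli units.
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two stacked RBMs play the role of greedy, layer-wise pre-training;
# logistic regression on the learned features plays the supervised role.
dbn_like = Pipeline([
    ("rbm1", BernoulliRBM(n_components=128, learning_rate=0.06, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=64, learning_rate=0.06, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
dbn_like.fit(X_train, y_train)
print("test accuracy:", dbn_like.score(X_test, y_test))
```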
