
Deep neural networks

DNNs are neural networks with deeper and more complex architectures: many neurons in each layer and many connections between layers. The computation in each layer transforms the representation produced by the previous layer into a slightly more abstract representation. In this book, however, we will use the term DNN to refer specifically to the MLP, the Stacked Auto-Encoder (SAE), and Deep Belief Networks (DBNs).

SAEs and DBNs use AEs and Restricted Boltzmann Machines (RBMs), respectively, as the building blocks of their architectures. The main difference between these architectures and the MLP is that training is executed in two phases: unsupervised pre-training and supervised fine-tuning.

SAE and DBN using AE and RBM respectively

In unsupervised pre-training, shown in the preceding diagram, the layers are stacked sequentially and trained in a layer-wise manner as an AE or RBM using unlabeled data. Afterwards, in supervised fine-tuning, an output classifier layer is stacked on top, and the complete neural network is optimized by retraining with labeled data.
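The two-phase recipe can be sketched in plain NumPy. This is a minimal illustration, not an implementation from the book: the toy dataset, layer sizes, learning rates, and epoch counts are all arbitrary assumptions, and the pre-training uses a tied-weight autoencoder per layer (the SAE case) rather than an RBM.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (hypothetical): 200 samples, 20 features, and a
# simple binary label, used only to make the sketch runnable.
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# ---- Phase 1: unsupervised, layer-wise pre-training ----
# Each layer is trained as a tied-weight autoencoder that
# reconstructs its own input; labels are never used here.
def pretrain_layer(inp, n_hidden, lr=0.5, epochs=100):
    n_in = inp.shape[1]
    W = rng.normal(scale=0.1, size=(n_in, n_hidden))
    b, c = np.zeros(n_hidden), np.zeros(n_in)
    for _ in range(epochs):
        h = sigmoid(inp @ W + b)          # encode
        r = h @ W.T + c                   # decode with tied weights
        err = (r - inp) / len(inp)        # gradient of mean squared error
        dpre = (err @ W) * h * (1 - h)    # backprop into the encoder
        W -= lr * (err.T @ h + inp.T @ dpre)
        b -= lr * dpre.sum(axis=0)
        c -= lr * err.sum(axis=0)
    return W, b

W1, b1 = pretrain_layer(X, 12)            # layer 1 trains on the raw input
H1 = sigmoid(X @ W1 + b1)
W2, b2 = pretrain_layer(H1, 6)            # layer 2 trains on layer 1's output

# ---- Phase 2: supervised fine-tuning ----
# Stack a logistic output layer, then retrain the whole network
# with labeled data, backpropagating through the pre-trained layers.
Wo = rng.normal(scale=0.1, size=(6, 1))
bo = np.zeros(1)
for _ in range(200):
    h1 = sigmoid(X @ W1 + b1)
    h2 = sigmoid(h1 @ W2 + b2)
    out = sigmoid(h2 @ Wo + bo)
    d_out = (out - y) / len(X)            # sigmoid + cross-entropy gradient
    d2 = (d_out @ Wo.T) * h2 * (1 - h2)
    d1 = (d2 @ W2.T) * h1 * (1 - h1)
    Wo -= 0.5 * h2.T @ d_out; bo -= 0.5 * d_out.sum(axis=0)
    W2 -= 0.5 * h1.T @ d2;    b2 -= 0.5 * d2.sum(axis=0)
    W1 -= 0.5 * X.T @ d1;     b1 -= 0.5 * d1.sum(axis=0)

pred = sigmoid(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) @ Wo + bo)
accuracy = ((pred > 0.5) == (y > 0.5)).mean()
```

The key structural point is that `pretrain_layer` never sees `y`: each layer only learns to reconstruct its input, and the stacked representations serve as the initialization that the supervised fine-tuning pass then adjusts end to end.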
