Rectified linear units

The logic behind the rectified linear unit (ReLU) layer is very simple: it replaces every negative value with 0. This introduces non-linearity into the CNN and keeps the activations well behaved by discarding negative values. The operation is simply f(x) = max(0, x).

This layer does not alter the size of the image: the output has exactly the same dimensions as the input, with only the negative values replaced by 0.
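The behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a full CNN layer; the feature-map values are made up for the example:

```python
import numpy as np

def relu(feature_map):
    # Replace every negative activation with 0; the shape is unchanged
    return np.maximum(feature_map, 0)

# A hypothetical 3x3 feature map containing both positive and negative values
fm = np.array([[-2.0,  1.0, -0.5],
               [ 3.0, -1.0,  0.0],
               [ 0.5, -4.0,  2.0]])

out = relu(fm)
print(out)
```

Note that the output array has the same 3x3 shape as the input; only the negative entries have become 0, while the positive entries pass through untouched.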
