
Bias and variance errors in deep learning

You may be familiar with the so-called bias/variance trade-off in typical predictive models. In case you're not, we'll provide a quick reminder here. With traditional predictive models, there is usually some compromise between the error introduced by bias and the error introduced by variance. So let's see what these two errors are:

  • Bias error: Bias error is the error that is introduced by the model itself. For example, if you attempted to model a non-linear function with a linear model, your model would be underspecified and the bias error would be high, as illustrated in the sketch following this list.
  • Variance error: Variance error is the error that is introduced by randomness in the training data. When we fit the training data so closely that our model no longer generalizes, we have overfit and introduced variance error.

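To make these two failure modes concrete, here is a minimal sketch (not taken from this book) that fits a degree-1 and a degree-15 polynomial to noisy samples of a sine curve using scikit-learn; the dataset, polynomial degrees, and noise level are illustrative assumptions. The linear fit underfits and shows high bias (training and test errors are both high), while the high-degree fit overfits and shows high variance (training error is low but test error is high).

```python
# Minimal sketch of bias vs. variance: a degree-1 polynomial underfits a
# non-linear target (high bias), while a degree-15 polynomial fits the
# training noise (high variance). Data and degrees are illustrative choices.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)

# Synthetic non-linear data: y = sin(2*pi*x) plus Gaussian noise
x_train = rng.uniform(0, 1, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=30)
x_test = rng.uniform(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(scale=0.2, size=200)

for degree in (1, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x_train.reshape(-1, 1), y_train)
    train_mse = mean_squared_error(y_train, model.predict(x_train.reshape(-1, 1)))
    test_mse = mean_squared_error(y_test, model.predict(x_test.reshape(-1, 1)))
    # Degree 1: both errors high (bias); degree 15: low train, high test (variance)
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```

Running the sketch prints the training and test mean squared error for each model, which is the usual way to diagnose whether you are paying a bias penalty, a variance penalty, or both.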
In most machine learning applications, we seek to find some compromise that minimizes bias error, while introducing as little variance error as possible. I say most because one of the great things about deep neural networks is that, for the most part, bias and variance can be manipulated independently of one another. However, to do so, we will need to be very careful with how we structure our training data.
