
The cost function

The cost function is a metric that measures how well or how poorly a machine learning algorithm performs by comparing the actual training output with the predicted output. Recall linear regression, where the sum of squared errors was used as the loss function, that is, $\sum_{i=1}^{m}(\hat{y}^{(i)} - y^{(i)})^2$. This works well when the resulting cost curve is convex, but in the case of classification (with a sigmoid output) the curve is non-convex; as a result, gradient descent does not work well and does not tend toward the global optimum. Therefore, we use the cross-entropy loss, which fits classification tasks better, as the cost function.

Cross-entropy as the loss function (for a single input example) is $L(\hat{y}, y) = -\sum_{c=1}^{C} y_c \log(\hat{y}_c)$, where $C$ refers to the number of output classes.
Thus, cost function = average cross-entropy loss (over the whole dataset of $m$ examples), that is, $J = -\frac{1}{m}\sum_{i=1}^{m}\sum_{c=1}^{C} y_c^{(i)} \log(\hat{y}_c^{(i)})$.
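
To make the two formulas concrete, here is a minimal NumPy sketch; the function names `cross_entropy_loss` and `cost`, and the toy arrays, are illustrative assumptions rather than code from the text. The loss scores one example, and the cost averages that loss over all $m$ examples.

```python
import numpy as np

def cross_entropy_loss(y_true, y_pred):
    """Cross-entropy loss for a single example.
    y_true: one-hot label vector of length C; y_pred: predicted probabilities of length C."""
    return -np.sum(y_true * np.log(y_pred))

def cost(Y_true, Y_pred):
    """Cost: average cross-entropy loss over all m examples (one example per row)."""
    m = Y_true.shape[0]
    return -np.sum(Y_true * np.log(Y_pred)) / m

# Toy example (assumed values) with C = 3 classes and m = 2 examples
Y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])
Y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
print(cost(Y_true, Y_pred))  # average of -log(0.7) and -log(0.8), roughly 0.29
```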

In the case of binary logistic regression, there are only two output classes, 0 and 1, and the sum of the class probabilities is always 1. Therefore (for an input example), if one class is $y$, the other will be $1 - y$. Similarly, since the predicted probability of class $y$ is $\hat{y}$, the probability of the other class, $1 - y$, will be $1 - \hat{y}$.

Therefore, the loss function reduces to $L(\hat{y}, y) = -[y \log(\hat{y}) + (1 - y)\log(1 - \hat{y})]$, where:

  • If $y = 1$, then $L(\hat{y}, y) = -\log(\hat{y})$. Therefore, to minimize $L$, $\hat{y}$ should be large, that is, closer to 1.

  • If $y = 0$, then $L(\hat{y}, y) = -\log(1 - \hat{y})$. Therefore, to minimize $L$, $\hat{y}$ should be small, that is, closer to 0.

The loss function applies to a single example, whereas the cost function applies to the whole training set. Thus, the cost function for this case will be $J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)} \log(\hat{y}^{(i)}) + (1 - y^{(i)})\log(1 - \hat{y}^{(i)})\right]$.
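
As a sanity check on this binary case, the following sketch computes the binary cross-entropy cost with NumPy; the function name `binary_cross_entropy_cost` and the sample labels and predictions are assumptions for illustration, not from the text.

```python
import numpy as np

def binary_cross_entropy_cost(y, y_hat):
    """Binary cross-entropy cost averaged over the m training examples.
    y: actual labels (0 or 1); y_hat: predicted probabilities of class 1."""
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Predictions close to the true labels give a small cost (assumed toy values)
y = np.array([1, 0, 1, 1])
y_hat = np.array([0.9, 0.1, 0.8, 0.7])
print(binary_cross_entropy_cost(y, y_hat))  # roughly 0.198
```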
