
  • Deep Learning with Theano
  • Christopher Bourez

Classification loss function

The loss function is an objective function to minimize during training to get the best model. Many different loss functions exist.

In a classification problem, where the target is to predict the correct class among k classes, cross-entropy is commonly used, as it measures the difference between the real probability distribution, q, and the predicted one, p, for each class:

$$H(q, p) = - \frac{1}{n} \sum_{i=1}^{n} \sum_{j=1}^{k} q_{i,j} \, \log p_{i,j}$$

Here, i is the index of the sample in the dataset, n is the number of samples in the dataset, and k is the number of classes.
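As a quick sketch of this formula on hypothetical toy values (using NumPy here rather than Theano, with a one-hot true distribution q and made-up predictions p):

```python
import numpy as np

# Hypothetical toy example: n = 2 samples, k = 3 classes.
q = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # true distribution (one-hot per sample)
p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])   # predicted distribution (rows sum to 1)

# H(q, p) = -(1/n) * sum_i sum_j q[i, j] * log p[i, j]
cross_entropy = -np.mean(np.sum(q * np.log(p), axis=1))
print(cross_entropy)
```

Because q is one-hot, only the log-probability of the correct class of each sample contributes to the sum.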

While the real probability $q_{i,j}$ of each class is unknown, it can simply be approximated in practice by the empirical distribution, which places all the probability mass on the observed class of each sample drawn from the dataset. In the same way, the cross-entropy of any predicted probability, p, can be approximated by the empirical cross-entropy:

$$\hat{H}(p) = - \frac{1}{n} \sum_{i=1}^{n} \log p_{i, y_i}$$

Here, $p_{i, y_i}$ is the probability estimated by the model for the correct class $y_i$ of example $i$.
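The empirical cross-entropy is simply the mean negative log-probability that the model assigns to the correct class. A minimal NumPy sketch, assuming the correct classes are given as integer labels:

```python
import numpy as np

# Hypothetical predictions for n = 2 samples over k = 3 classes.
p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
y = np.array([0, 1])  # correct class y_i of each example i

# Empirical cross-entropy: -(1/n) * sum_i log p[i, y_i]
nll = -np.mean(np.log(p[np.arange(len(y)), y]))
print(nll)
```

Note that this gives the same value as the full double sum when q is the one-hot empirical distribution, since all other terms vanish.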

Accuracy and cross-entropy both evolve in the same direction but measure different things. Accuracy measures whether the predicted class is correct, while cross-entropy measures the distance between the probability distributions. A decrease in cross-entropy means that the predicted probability of the correct class is improving, but the accuracy may remain constant or even drop.
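This divergence can be illustrated with two hypothetical sets of predictions that pick the same classes (so accuracy is identical) but with different confidence (so cross-entropy differs):

```python
import numpy as np

y = np.array([0, 1])                 # correct classes
# Both prediction sets have the same argmax per row, hence the same accuracy.
p_a = np.array([[0.6, 0.3, 0.1],     # less confident predictions
                [0.4, 0.5, 0.1]])
p_b = np.array([[0.9, 0.05, 0.05],   # more confident predictions
                [0.1, 0.85, 0.05]])

def accuracy(p, y):
    # Fraction of samples whose most probable class is the correct one
    return np.mean(np.argmax(p, axis=1) == y)

def cross_entropy(p, y):
    # Mean negative log-probability of the correct class
    return -np.mean(np.log(p[np.arange(len(y)), y]))

acc_a, acc_b = accuracy(p_a, y), accuracy(p_b, y)
ce_a, ce_b = cross_entropy(p_a, y), cross_entropy(p_b, y)
```

Both sets achieve the same accuracy, yet the more confident predictions yield a strictly lower cross-entropy.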

While accuracy is discrete and not differentiable, the cross-entropy loss is a differentiable function that can be easily used for training a model.
