
Kullback-Leibler divergence

Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a method used to quantify the similarity between two probability distributions. It measures how one probability distribution, p, diverges from a second, expected probability distribution, q.

The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:
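For discrete distributions defined over the same support, this is the standard form:

$$D_{KL}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}$$

For continuous distributions, the sum over x is replaced by an integral.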

The KL divergence attains its minimum value of zero when p(x) is equal to q(x) at every point.

Because KL divergence is asymmetric, that is, the divergence of p from q is generally not equal to the divergence of q from p, it does not define a true distance between two probability distributions and therefore should not be used as a distance metric.
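As an illustration, here is a minimal Python sketch (using NumPy and SciPy, which the original text does not reference) that evaluates the definition on two example distributions and shows the asymmetry:

```python
import numpy as np
from scipy.stats import entropy

# Two example discrete probability distributions over the same support
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.3, 0.3, 0.2, 0.2])

# KL divergence computed directly from the definition: sum(p * log(p / q))
kl_pq = np.sum(p * np.log(p / q))
print(kl_pq)

# scipy.stats.entropy(p, q) returns the same relative entropy
print(entropy(p, q))

# Asymmetry: D_KL(p || q) is generally not equal to D_KL(q || p)
print(entropy(q, p))

# The divergence is zero when the two distributions are identical
print(entropy(p, p))
```

Running this prints a positive value for each direction, the two directions differ from one another, and the identical-distribution case evaluates to zero.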
