
Jensen-Shannon divergence

The Jensen-Shannon divergence (also called the information radius (IRad) or the total divergence to the average) is another measure of similarity between two probability distributions. It is based on the KL divergence. Unlike the KL divergence, however, the JS divergence is symmetric, so it can be used to measure the distance between two probability distributions. If we take the square root of the Jensen-Shannon divergence, we get the Jensen-Shannon distance, which is a true distance metric.

The following equation represents the Jensen-Shannon divergence between two probability distributions, p and q:

JS(p \| q) = \frac{1}{2} KL\left(p \,\middle\|\, \frac{p+q}{2}\right) + \frac{1}{2} KL\left(q \,\middle\|\, \frac{p+q}{2}\right)

In the preceding equation, \frac{p+q}{2} is the midpoint measure, while KL(\cdot \| \cdot) is the Kullback-Leibler divergence.
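As a quick sanity check of this formula, here is a minimal Python sketch (assuming NumPy and SciPy are available; the helper names kl_divergence and js_divergence are illustrative, not part of any particular library). It computes the JS divergence between two small discrete distributions and shows that it is symmetric and that its square root gives the Jensen-Shannon distance:

import numpy as np
from scipy.special import rel_entr  # elementwise p * log(p / q)

def kl_divergence(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions
    return np.sum(rel_entr(p, q))

def js_divergence(p, q):
    # JS divergence via the midpoint measure m = (p + q) / 2
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Two toy discrete probability distributions (hypothetical values)
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

print(js_divergence(p, q))           # equals js_divergence(q, p): JS is symmetric
print(np.sqrt(js_divergence(p, q)))  # Jensen-Shannon distance (a distance metric)

Note that, unlike KL(p || q), swapping p and q in js_divergence returns the same value, which is exactly the symmetry property mentioned above.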

Now that we have learned about the KL divergence and the Jensen-Shannon divergence, let's discuss the Nash equilibrium for GANs.
