- Generative Adversarial Networks Projects
- Kailash Ahirwar
Kullback-Leibler divergence
Kullback-Leibler divergence (KL divergence), also known as relative entropy, is a measure of how similar two probability distributions are. It quantifies how much one probability distribution p diverges from a second, expected probability distribution q.
The equation used to calculate the KL divergence between two probability distributions p(x) and q(x) is as follows:

D_{KL}(p \parallel q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx
The KL divergence is zero, its minimum value, when p(x) is equal to q(x) at every point.
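As a rough illustration of this definition, here is a minimal NumPy sketch for two discrete distributions; the kl_divergence helper and the example arrays p and q are assumptions made for this illustration, not code from the book:

```python
import numpy as np

def kl_divergence(p, q):
    # Discrete form of D_KL(p || q) = sum over x of p(x) * log(p(x) / q(x))
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    mask = p > 0  # by convention, terms where p(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.3, 0.5, 0.2])

print(kl_divergence(p, q))  # small positive value (~0.026)
print(kl_divergence(p, p))  # 0.0 -- the divergence vanishes when p(x) equals q(x) everywhere
```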
Because KL divergence is asymmetric, it should not be used to measure the distance between two probability distributions; it is therefore not a true distance metric.
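A quick way to see this asymmetry is to compute the divergence in both directions. The sketch below uses scipy.stats.entropy, which returns D_KL(p || q) when given two distributions; the example distributions here are arbitrary and chosen only for illustration:

```python
from scipy.stats import entropy  # entropy(p, q) computes D_KL(p || q) for discrete distributions

p = [0.1, 0.9]
q = [0.5, 0.5]

print(entropy(p, q))  # ~0.368
print(entropy(q, p))  # ~0.511 -- the two directions disagree, so KL divergence is not symmetric
```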