- Mastering Machine Learning Algorithms
- Giuseppe Bonaccorso
Categorical cross-entropy
Categorical cross-entropy is the most widely used classification cost function, adopted by logistic regression and the majority of neural architectures. The generic analytical expression is:
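For N samples and M classes, writing y_i^(j) for the one-hot target and ŷ_i^(j) for the predicted probability of class j for sample i (this notation is assumed here rather than taken from the original text), a standard form is:

L(\hat{y}) = -\frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{M} y_i^{(j)} \log \hat{y}_i^{(j)}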
This cost function is convex and can be easily optimized using stochastic gradient descent techniques; moreover, it has another important interpretation. If we are training a classifier, our goal is to create a model whose distribution is as similar as possible to p_data. This condition can be achieved by minimizing the Kullback-Leibler divergence between the two distributions:
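Assuming the standard discrete definition of the divergence (summation over the support), this reads:

D_{KL}(p_{data} \,\|\, p_M) = \sum_{x} p_{data}(x) \log \frac{p_{data}(x)}{p_M(x)}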
In the previous expression, p_M is the distribution generated by the model. Now, if we rewrite the divergence, we get:
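Expanding the logarithm of the ratio (the standard decomposition, written out here for completeness):

D_{KL}(p_{data} \,\|\, p_M) = \sum_{x} p_{data}(x) \log p_{data}(x) - \sum_{x} p_{data}(x) \log p_M(x) = -H(p_{data}) + H(p_{data}, p_M)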
The first term is the entropy of the data-generating distribution, and it doesn't depend on the model parameters, while the second one is the cross-entropy. Therefore, if we minimize the cross-entropy, we also minimize the Kullback-Leibler divergence, forcing the model to reproduce a distribution that is very similar to p_data. This is an elegant explanation of why the cross-entropy cost function is an excellent choice for classification problems.
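As a quick numerical check of this relationship, here is a minimal NumPy sketch (with arbitrary example distributions, not taken from the book) confirming that the cross-entropy equals the entropy of p_data plus the KL divergence:

```python
import numpy as np

# Hypothetical 4-class distributions: p_data is the data-generating distribution,
# p_model is the distribution produced by a model (both chosen arbitrarily here).
p_data = np.array([0.1, 0.2, 0.3, 0.4])
p_model = np.array([0.15, 0.25, 0.25, 0.35])

entropy = -np.sum(p_data * np.log(p_data))                 # H(p_data)
cross_entropy = -np.sum(p_data * np.log(p_model))          # H(p_data, p_model)
kl_divergence = np.sum(p_data * np.log(p_data / p_model))  # D_KL(p_data || p_model)

# Cross-entropy = entropy (constant w.r.t. the model) + KL divergence,
# so minimizing the cross-entropy also minimizes the divergence.
print(np.isclose(cross_entropy, entropy + kl_divergence))  # True
```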