Neural Networks with Keras Cookbook
V Kishore Ayyadevara
Getting ready
In the previous section, we assigned the same weight to each class; that is, the categorical cross-entropy loss was the same whenever the magnitude of the difference between the actual and predicted values was the same, irrespective of whether the prediction was for a default or a non-default.
To understand the scenario further, let's consider the following example:

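As an illustration (a minimal sketch with assumed probabilities of 0.1 and 0.9, not the exact figures from the original table), the binary cross-entropy loss for two equally wrong predictions works out as follows:

```python
import numpy as np

# Binary cross-entropy for a single prediction:
# loss = -(y * log(p) + (1 - y) * log(1 - p)),
# where p is the predicted probability of default and y is the actual label.
def bce(y, p):
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Case 1: actual default (y = 1), model predicts only a 0.1 probability of default.
print(bce(1, 0.1))  # ~2.303

# Case 2: actual non-default (y = 0), model predicts a 0.9 probability of default.
print(bce(0, 0.9))  # ~2.303 -- identical loss for an equally wrong prediction
```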
In the preceding scenario, the cross-entropy loss is the same regardless of the actual value of default.
However, we know that our objective is to capture as many actual defaulters as possible in the top 10% of predictions when ranked by probability.
Hence, let's go ahead and assign a higher loss weight (a weight of 100) when the actual value of default is 1 and a lower weight (a weight of 1) when the actual value of default is 0.
The previous scenario now changes as follows:

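Repeating the sketch above with these class weights applied (again with assumed probabilities, not the original table's values):

```python
import numpy as np

def weighted_bce(y, p, w1=100, w0=1):
    # Scale the loss by 100 when the actual class is a default (y = 1)
    # and by 1 when it is a non-default (y = 0).
    weight = w1 if y == 1 else w0
    return -weight * (y * np.log(p) + (1 - y) * np.log(1 - p))

# Missed default (y = 1, predicted 0.1): loss ~230.3
print(weighted_bce(1, 0.1))

# False alarm (y = 0, predicted 0.9): loss ~2.303
print(weighted_bce(0, 0.9))
```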
Notice that the cross-entropy loss is now much higher for wrong predictions when the actual value of default is 1 than for wrong predictions when the actual value of default is 0.
Now that we understand the intuition behind assigning weights to classes, let's go ahead and assign weights to the output classes in the credit default dataset.
All the steps performed to build the dataset and model remain the same as in the previous section, except for the model-fitting process.
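As a sketch of that one change (assuming the compiled model and the X_train, y_train, X_test, and y_test arrays from the previous recipe), the class weights are passed through the class_weight argument of fit:

```python
# Minimal sketch: only the fit call changes from the previous recipe.
# `model` and the train/test arrays are assumed to be defined as before;
# the epoch and batch-size values here are illustrative.
history = model.fit(X_train, y_train,
                    epochs=10,
                    batch_size=1000,
                    validation_data=(X_test, y_test),
                    class_weight={0: 1, 1: 100})
```

The dictionary maps each class label to its loss weight, so every training sample whose actual value of default is 1 contributes 100 times as much to the loss as a sample whose actual value is 0.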