- Mastering Machine Learning Algorithms
- Giuseppe Bonaccorso
ElasticNet
In many real cases, it's useful to apply both Ridge and Lasso regularization in order to force weight shrinkage together with global sparsity. This is possible by employing the ElasticNet regularization, which adds both penalties to the loss function:

L(w) = ||y − Xw||₂² + λ₁||w||₁ + λ₂||w||₂²
The strength of each regularization is controlled by the parameters λ1 and λ2. ElasticNet can yield excellent results whenever it's necessary to mitigate overfitting effects while encouraging sparsity. We are going to apply all the regularization techniques when discussing some deep learning architectures.
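As a minimal sketch, the combined penalty can be tried with scikit-learn's `ElasticNet` estimator. Note that scikit-learn parameterizes the penalty as `alpha * (l1_ratio * ||w||₁ + 0.5 * (1 - l1_ratio) * ||w||₂²)`, so λ₁ and λ₂ map onto `alpha` and `l1_ratio` rather than appearing directly; the dataset below is synthetic and chosen only for illustration.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet

# Synthetic regression problem: 20 features, only 5 of them informative,
# so a sparsity-inducing penalty should zero out many coefficients.
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=42)

# alpha controls the overall regularization strength;
# l1_ratio balances the L1 (sparsity) and L2 (shrinkage) components.
en = ElasticNet(alpha=1.0, l1_ratio=0.5, random_state=42)
en.fit(X, y)

# The L1 component drives many coefficients exactly to zero,
# while the L2 component shrinks the surviving ones.
print("Non-zero coefficients:", np.sum(en.coef_ != 0), "out of", en.coef_.size)
```

Increasing `l1_ratio` toward 1 makes the model behave more like Lasso (sparser solutions), while values near 0 approach Ridge behavior (pure shrinkage).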