- Mastering Machine Learning Algorithms
- Giuseppe Bonaccorso
Ridge
Ridge regularization (also known as Tikhonov regularization) is based on the squared L2-norm of the parameter vector:

$$\alpha \, \|\theta\|_2^2$$

This term is added to the original cost function, with α > 0 controlling the strength of the penalty.
This penalty prevents the unbounded growth of the parameters (for this reason, it's also known as weight shrinkage), and it's particularly useful when the model is ill-conditioned or when there is multicollinearity, due to the fact that the samples are not completely independent (a relatively common condition).
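As a minimal sketch of this effect (not taken from the book, with hypothetical data), the snippet below builds a dataset with two nearly collinear features and compares the coefficients found by plain OLS with those found by Ridge, using scikit-learn's LinearRegression and Ridge estimators:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# Hypothetical dataset: two almost-identical (multicollinear) features
rng = np.random.RandomState(0)
x = rng.normal(size=(100, 1))
X = np.hstack([x, x + rng.normal(scale=1e-3, size=(100, 1))])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=100)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# With multicollinearity the OLS weights can grow large with opposite signs,
# while the L2 penalty keeps them small and stable
print("OLS coefficients:  ", ols.coef_)
print("Ridge coefficients:", ridge.coef_)
```

In this setting the OLS solution is nearly degenerate (many weight combinations fit almost equally well), so the squared L2 penalty is what selects a small, stable solution.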
In the following diagram, we see a schematic representation of the Ridge regularization in a bidimensional scenario:
Ridge (L2) regularization
The zero-centered circle represents the Ridge boundary, while the shaded surface is the original cost function. Without regularization, the minimum (w1, w2) has a magnitude (that is, a distance from the origin) which is about double the one obtained by applying a Ridge constraint, confirming the expected shrinkage. When applied to regressions solved with the Ordinary Least Squares (OLS) algorithm, it's possible to prove that there always exists a Ridge coefficient such that the weights are shrunk with respect to the OLS ones. The same result, with some restrictions, can be extended to other cost functions.
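The shrinkage claim can be checked empirically. The following sketch (a hypothetical example, not from the book) compares the L2 norm of the OLS weights with the norm of the Ridge weights for several values of the regularization coefficient alpha:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge

# Hypothetical regression problem used only to illustrate the shrinkage
X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=1)

ols_norm = np.linalg.norm(LinearRegression().fit(X, y).coef_)

# The norm of the Ridge weights stays below the OLS norm for every positive
# alpha, and decreases monotonically as the penalty grows
for alpha in (0.1, 1.0, 10.0, 100.0):
    ridge_norm = np.linalg.norm(Ridge(alpha=alpha).fit(X, y).coef_)
    print(f"alpha={alpha:6.1f}  ||w_ridge|| = {ridge_norm:8.3f}  (OLS norm: {ols_norm:.3f})")
```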