- Building Machine Learning Systems with Python
- Luis Pedro Coelho, Willi Richert, Matthieu Brucher
Visualizing the Lasso path
Using scikit-learn, we can easily visualize what happens as the value of the regularization parameter (alpha) changes. We will again use the Boston data, but now we will use the Lasso regression object:
import numpy as np
from sklearn.linear_model import Lasso

las = Lasso()
alphas = np.logspace(-5, 2, 1000)
alphas, coefs, _ = las.path(x, y, alphas=alphas)
For each value in alphas, the path method on the Lasso object returns the coefficients that solve the Lasso problem with that parameter value. Because the result changes smoothly with alpha, this can be computed very efficiently.
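As a minimal runnable sketch of this step: `Lasso.path` is also exposed as the module-level function `lasso_path`, which returns the solved coefficients for every alpha in one call. Since the Boston dataset has been removed from recent versions of scikit-learn, this sketch substitutes a synthetic regression problem from `make_regression`; the data shapes (100 samples, 10 features) are arbitrary choices for illustration:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

# Synthetic stand-in for the Boston data (removed from recent scikit-learn).
x, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)

alphas = np.logspace(-5, 2, 1000)
# lasso_path solves the Lasso problem for every alpha, reusing the previous
# solution as a warm start, which is why the whole path is cheap to compute.
alphas_out, coefs, _ = lasso_path(x, y, alphas=alphas)

# One row per feature, one column per alpha value.
print(coefs.shape)  # (10, 1000)
```

Note that `lasso_path` processes the alphas in decreasing order (so each warm start moves toward weaker regularization), and `alphas_out` reflects that ordering.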
A typical way to visualize this path is to plot the value of the coefficients as alpha decreases. You can do so as follows:
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot(alphas, coefs.T)
# Set log scale
ax.set_xscale('log')
# Make alpha decrease from left to right
ax.set_xlim(alphas.max(), alphas.min())
This results in the following plot (we left out the trivial code that adds the axis labels and the title):
[Figure: Lasso coefficient paths plotted against decreasing alpha]
In this plot, the x-axis shows decreasing amounts of regularization from left to right (alpha is decreasing). Each line shows how a different coefficient varies as alpha changes. The plot shows that when using very strong regularization (left side, very high alpha), the best solution is to have all coefficients be exactly zero. As the regularization becomes weaker, one by one, the values of the different coefficients first shoot up, then stabilize. At some point, they all plateau, as we are probably already close to the unpenalized solution.
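The sparsity pattern described above can also be checked numerically rather than visually: counting the nonzero coefficients at each alpha shows the model growing from empty (strong regularization) toward dense (weak regularization). This sketch again uses a synthetic `make_regression` problem as a stand-in for the Boston data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

# Synthetic stand-in dataset; sizes are arbitrary illustration choices.
x, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)

alphas = np.logspace(-5, 2, 100)
alphas_out, coefs, _ = lasso_path(x, y, alphas=alphas)

# Count nonzero coefficients at each alpha; alphas_out is in decreasing order,
# so index 0 is the most regularized model and index -1 the least.
nonzero = np.count_nonzero(coefs, axis=0)
print(nonzero[0], nonzero[-1])
```

At the strongly regularized end few (or no) coefficients survive, while at the weakly regularized end nearly all of them are active, matching the plateau seen in the plot.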