- Mastering Machine Learning Algorithms
- Giuseppe Bonaccorso
Example of Laplacian Spectral Embedding
Let's apply this algorithm to the same dataset using the Scikit-Learn class SpectralEmbedding, with n_components=2 and n_neighbors=15:
from sklearn.datasets import fetch_olivetti_faces
from sklearn.manifold import SpectralEmbedding

# Load the Olivetti faces dataset (400 grayscale images of 40 subjects)
faces = fetch_olivetti_faces()

# Compute a 2D Laplacian spectral embedding based on a 15-nearest-neighbor graph
se = SpectralEmbedding(n_components=2, n_neighbors=15)
X_se = se.fit_transform(faces['data'])
The resulting plot (zoomed in due to the presence of a high-density region) is shown in the following graph:
Laplacian Spectral Embedding applied to the Olivetti faces dataset
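A plot like the one above can be produced with a simple scatter of the two embedding components, colored by subject. The following is only a sketch: the array X_se here is a random stand-in for the actual embedding computed previously, and the label layout (40 subjects, 10 images each) reflects the Olivetti dataset structure:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, suitable for scripts
import matplotlib.pyplot as plt

rng = np.random.RandomState(0)
# Stand-in for the 2D embedding; in practice, reuse X_se from the previous snippet
X_se = rng.randn(400, 2) * 0.01
# Olivetti faces: 40 subjects with 10 images each
y = np.repeat(np.arange(40), 10)

fig, ax = plt.subplots(figsize=(8, 6))
ax.scatter(X_se[:, 0], X_se[:, 1], c=y, cmap='tab20', s=15)
ax.set_xlabel('First component')
ax.set_ylabel('Second component')
ax.set_title('Laplacian Spectral Embedding (Olivetti faces)')
fig.savefig('spectral_embedding.png')
```

Zooming into the high-density region (for example with ax.set_xlim/ax.set_ylim) makes the small per-subject clusters easier to inspect.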
Even in this case, we can see that some classes are grouped into small clusters, but at the same time, we observe many agglomerates containing mixed samples. Both this method and the previous one work with local pieces of information, trying to find low-dimensional representations that preserve the geometrical structure of micro-features. This condition leads to a mapping where close points share local features (this is almost always true for images, but it's very difficult to prove for generic samples). Therefore, we can observe small clusters containing elements belonging to the same class, but also some apparent outliers which, on the original manifold, can be globally different even though they share local patches. Methods like Isomap or t-SNE, by contrast, work with the whole distribution and try to determine a representation that is almost isometric with the original dataset, considering its global properties.
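The local-versus-global distinction can be illustrated with a minimal sketch on a synthetic Swiss-roll dataset (not the faces data): SpectralEmbedding only uses the neighborhood graph, while Isomap approximates geodesic distances over the whole manifold, so its output is closer to an isometric "unrolling":

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding, Isomap

# Synthetic 3D manifold; t parametrizes the position along the roll
X, t = make_swiss_roll(n_samples=500, random_state=0)

# Local method: preserves the structure of the k-nearest-neighbor graph
X_se = SpectralEmbedding(n_components=2, n_neighbors=15).fit_transform(X)

# Global method: preserves approximate geodesic distances between all pairs
X_iso = Isomap(n_components=2, n_neighbors=15).fit_transform(X)
```

Plotting both embeddings colored by t typically shows that Isomap keeps the overall proportions of the manifold, while the spectral embedding may distort global distances as long as neighbors remain close.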