
A Review of Manifold Learning Algorithms

Advisor: 王藹農

Abstract


Manifold learning algorithms are techniques used to reduce the dimension of data sets. These methods include nonlinear (implicit) ones and linear (projective) ones. Among the nonlinear methods are Laplacian eigenmaps and locally linear embedding (LLE); among the linear methods are metric multidimensional scaling (MDS), ISOMAP, locality preserving projections (LPP), and their derivatives. All of these methods give rise to trace minimization problems and, consequently, eigenvalue problems. We give a common framework for them and discuss their relationships.
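As a brief illustration of the trace-minimization viewpoint described above, consider the standard formulations of Laplacian eigenmaps [1] and LPP [2]; this is a sketch in notation chosen here (W, D, L, Y, X, A), not an excerpt from the thesis. Given a weighted adjacency matrix $W$ of a neighborhood graph on $n$ data points, with degree matrix $D = \operatorname{diag}\big(\sum_j W_{ij}\big)$ and graph Laplacian $L = D - W$, Laplacian eigenmaps seeks an embedding $Y \in \mathbb{R}^{n \times d}$ solving

\[
\min_{Y^{\top} D Y = I} \ \operatorname{tr}\!\left(Y^{\top} L Y\right),
\]

whose minimizer is given by the eigenvectors of the generalized eigenvalue problem $L y = \lambda D y$ associated with the smallest nonzero eigenvalues. LPP restricts the embedding to a linear projection of the data matrix $X$, i.e. $Y = X^{\top} A$, which turns the same objective into

\[
\min_{A^{\top} X D X^{\top} A = I} \ \operatorname{tr}\!\left(A^{\top} X L X^{\top} A\right),
\]

again a trace minimization solved by a generalized eigenvalue problem, now in the projection matrix $A$.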

Keywords

Manifold learning


References


[1] M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.
[2] X. He and P. Niyogi. Locality preserving projections. In S. Thrun, L. K. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16, pages 153–160. MIT Press, 2004.
[3] E. Kokiopoulou and Y. Saad. Orthogonal neighborhood preserving projections. In Proceedings of the Fifth IEEE International Conference on Data Mining, ICDM ’05, pages 234–241, Washington, DC, USA, 2005. IEEE Computer Society.
[4] E. Kokiopoulou and Y. Saad. Orthogonal neighborhood preserving projections: A projection-based dimensionality reduction technique. IEEE Trans. Pattern Anal. Mach. Intell., 29(12):2143–2156, Dec. 2007.
[5] S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
