Manifold learning is a family of methods for reducing the dimension of data, and these methods may be divided into linear and nonlinear ones. The nonlinear methods include Laplacian eigenmaps and locally linear embeddings; the linear methods include MDS, ISOMAP, LPP, and their derivatives. The solutions of these methods arise from trace minimization problems, which are equivalent to eigenvalue problems. We give a common framework for them and discuss the relationships among them.
Manifold learning algorithms are techniques used to reduce the dimension of data sets. These methods include the nonlinear (implicit) ones and the linear (projective) ones. Among the nonlinear are Laplacian eigenmaps and locally linear embeddings (LLE); among the linear are metric multidimensional scaling (MDS), ISOMAP, locality preserving projections (LPP), and their derivatives. All these methods give rise to trace minimization problems and, as a result, eigenvalue problems. We give a common framework for them and discuss their relationships.
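To make the abstract's central claim concrete, the following is a minimal sketch (not from the text, and using an assumed toy graph) of how one such trace minimization reduces to an eigenvalue problem. It uses Laplacian eigenmaps as the representative: minimize tr(Yᵀ L Y) subject to Yᵀ D Y = I, where L = D − W is the graph Laplacian, solved via the symmetrically normalized standard eigenproblem.

```python
import numpy as np

# Assumed toy data: the weighted adjacency matrix of a 6-node path graph.
n = 6
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

# Degree matrix D and graph Laplacian L = D - W.
D = np.diag(W.sum(axis=1))
L = D - W

# Laplacian eigenmaps: minimize tr(Y^T L Y) subject to Y^T D Y = I.
# Substituting Y = D^{-1/2} V turns this into a standard symmetric
# eigenvalue problem for the normalized Laplacian D^{-1/2} L D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_norm = D_inv_sqrt @ L @ D_inv_sqrt

# eigh returns eigenvalues in ascending order with orthonormal eigenvectors.
vals, vecs = np.linalg.eigh(L_norm)

# Discard the trivial eigenvector (eigenvalue ~ 0) and keep the next k
# eigenvectors; these give the k-dimensional embedding coordinates.
k = 2
Y = D_inv_sqrt @ vecs[:, 1:k + 1]
```

The same pattern (a trace objective under an orthogonality-type constraint, solved by taking extremal eigenvectors) is what the abstract attributes to MDS, ISOMAP, LPP, and LLE as well; only the matrices in the objective and constraint change from method to method.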