Abstract—The need to analyze high-dimensional data in various areas, such as image processing, human gene regulation, and smart grids, raises the importance of dimensionality reduction. While classical linear dimensionality reduction methods are easy to implement and efficient to compute, they fail to discover the true structure of high-dimensional data lying on a non-linear subspace. To overcome this issue, many non-linear dimensionality reduction approaches, such as Locally Linear Embedding, Isometric Embedding, and Semidefinite Embedding, have been proposed. Although these approaches can learn the global structure of non-linear manifolds, they are computationally expensive, which potentially limits their use in large-scale applications involving very high-dimensional data. This paper proposes a framework that combines random projections with non-linear dimensionality reduction methods to increase computational speed and reduce memory usage while preserving the non-linear geometry of the data. Experiments with various combinations of random projections and non-linear dimensionality reduction methods on a handwritten-digits dataset demonstrate that this framework is both computationally efficient and globally optimal.
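The core idea the abstract describes, projecting high-dimensional data down with a random matrix before applying a (more expensive) non-linear method, can be sketched as follows. This is a minimal illustration of the random-projection stage only, using NumPy; the data, dimensions, and scaling convention are assumptions for demonstration, not the paper's actual experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for high-dimensional data: n points in D dimensions.
# (Illustrative values only; the paper's experiments use handwritten digits.)
n, D, k = 100, 1000, 50
X = rng.normal(size=(n, D))

# Random projection: a Gaussian matrix scaled by 1/sqrt(k) so that
# pairwise Euclidean distances are approximately preserved
# (Johnson-Lindenstrauss lemma).
R = rng.normal(size=(D, k)) / np.sqrt(k)
Y = X @ R  # projected data, shape (n, k)

# The distance between two points should be roughly unchanged,
# so a non-linear method run on Y sees nearly the same geometry.
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Y[0] - Y[1])
ratio = d_proj / d_orig
```

The lower-dimensional `Y` would then be fed to a non-linear method such as Locally Linear Embedding or Isomap, whose cost scales with the input dimension, which is where the speed and memory savings come from.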
Index Terms—Random projections, nonlinear dimensionality reduction, manifold learning, high dimensional data.
Long Cheng is with the Kiwii Power Technology Co., Ltd, Troy, NY, USA (e-mail: email@example.com).
Chenyu You and Yani Guan are with the Rensselaer Polytechnic Institute, Troy, NY, USA (e-mail: firstname.lastname@example.org, email@example.com).
Cite: Long Cheng, Chenyu You, and Yani Guan, "Random Projections for Non-linear Dimensionality Reduction," International Journal of Machine Learning and Computing, vol. 6, no. 4, pp. 220-225, 2016.