Method | Description |
---|---|
Principal component analysis (PCA) [44] | PCA decomposes data into a set of successive orthogonal components that explain a maximum amount of the variance |
Kernel PCA (KPCA) [45] | KPCA extends PCA by using kernel functions to achieve non-linear dimensionality reduction |
Latent semantic analysis (LSA) [46] | LSA is similar to PCA but differs in that the data matrix does not need to be centered |
Gaussian random projection (GRP) [47] | GRP projects the original input features onto a randomly generated matrix where components are drawn from a Gaussian distribution |
Sparse random projection (SRP) [48] | SRP projects the original input features onto a sparse random matrix, which is a memory-efficient alternative to a dense Gaussian random projection matrix |
Multidimensional scaling (MDS) [49] | MDS is a technique for analyzing similarity or dissimilarity data, seeking a low-dimensional representation in which the pairwise distances closely reflect the distances in the original high-dimensional space |
Isomap [50] | Isomap is a manifold learning algorithm, seeking a lower-dimensional embedding that maintains geodesic distances between all points |
Locally linear embedding (LLE) [51] | LLE projects the original input features to a lower-dimensional space by preserving distances within local neighborhoods |
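All of the methods in the table above are available in scikit-learn. The sketch below is an illustrative assumption about usage, not part of the surveyed works: it applies each method to a synthetic dataset (random blobs, a hypothetical stand-in for real data) and reduces it to two components. Note that LSA corresponds to scikit-learn's `TruncatedSVD`, which, as described above, skips the centering step of PCA.

```python
# Minimal sketch: applying each dimensionality-reduction method in the table
# with scikit-learn. The synthetic data and n_components=2 are illustrative
# assumptions, not choices made in the surveyed papers.
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA, KernelPCA, TruncatedSVD
from sklearn.random_projection import (GaussianRandomProjection,
                                       SparseRandomProjection)
from sklearn.manifold import MDS, Isomap, LocallyLinearEmbedding

# Toy dataset: 100 samples with 10 features.
X, _ = make_blobs(n_samples=100, n_features=10, random_state=0)

reducers = {
    "PCA": PCA(n_components=2),
    "KPCA": KernelPCA(n_components=2, kernel="rbf"),
    "LSA": TruncatedSVD(n_components=2),  # LSA = truncated SVD, no centering
    "GRP": GaussianRandomProjection(n_components=2, random_state=0),
    "SRP": SparseRandomProjection(n_components=2, random_state=0),
    "MDS": MDS(n_components=2, random_state=0),
    "Isomap": Isomap(n_components=2),
    "LLE": LocallyLinearEmbedding(n_components=2, random_state=0),
}

# Every method shares the fit_transform interface, so they are interchangeable
# as preprocessing steps in a pipeline.
for name, reducer in reducers.items():
    X_low = reducer.fit_transform(X)
    print(f"{name}: {X.shape} -> {X_low.shape}")
```

Because all eight estimators expose the same `fit_transform` interface, they can be swapped freely when comparing how the choice of dimensionality reduction affects a downstream model.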