Discriminant Analysis of Hyperspectral Imagery Using Fast Kernel Sparse and Low-Rank Graph
2017
Due to the high-dimensional characteristic of hyperspectral images, dimensionality reduction (DR) is an important preprocessing step for classification. Recently, sparse and low-rank graph-based discriminant analysis (SLGDA) has been developed for DR of hyperspectral images, in which the properties of sparsity and low-rankness are simultaneously exploited to capture both local and global structures. However, SLGDA may not achieve satisfactory results when handling complex data with an inherently nonlinear structure. To address this problem, this paper presents two kernel extensions of SLGDA. In the first, classical kernel SLGDA (cKSLGDA), the kernel trick is exploited to implicitly map the original data into a high-dimensional space. From a different perspective, we further propose a Nyström-based kernel SLGDA (nKSLGDA) that constructs a virtual kernel space via the Nyström method, in which virtual samples can be explicitly obtained from the original data. Both cKSLGDA and nKSLGDA yield more informative graphs than SLGDA and outperform other state-of-the-art DR methods. More importantly, nKSLGDA can outperform cKSLGDA at much lower computational cost.
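To make the Nyström idea concrete, here is a minimal NumPy sketch of how explicit "virtual samples" can be built from a subset of landmark points, so that inner products between the virtual samples approximate the full kernel matrix. This is an illustration of the general Nyström method only, not the authors' nKSLGDA implementation; the RBF kernel, landmark count `m`, and function names are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m=20, gamma=1.0, seed=0):
    """Map samples X to explicit 'virtual samples' Z such that
    Z @ Z.T approximates the full kernel matrix K(X, X)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    landmarks = X[idx]
    W = rbf_kernel(landmarks, landmarks, gamma)   # m x m landmark Gram matrix
    C = rbf_kernel(X, landmarks, gamma)           # n x m cross-kernel
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10                           # drop near-zero eigenvalues
    # Virtual samples: Z = C V diag(1/sqrt(lambda)), so Z Z^T = C W^+ C^T ~ K.
    return C @ vecs[:, keep] / np.sqrt(vals[keep])

# Toy data: 100 samples with 5 features (stand-ins for pixel spectra).
X = np.random.default_rng(1).normal(size=(100, 5))
Z = nystrom_features(X, m=20)
print(Z.shape)
```

Because the virtual samples `Z` are explicit vectors, a linear method such as SLGDA can then be run directly on them, which is the computational advantage over working with the full kernel matrix.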
Keywords:
- Kernel (linear algebra)
- Kernel Fisher discriminant analysis
- Kernel embedding of distributions
- Mathematics
- Mathematical optimization
- Dimensionality reduction
- Kernel method
- Variable kernel density estimation
- Linear discriminant analysis
- Kernel principal component analysis
- Artificial intelligence
- Computer vision
- Pattern recognition
- Discrete mathematics
References: 49
Citations: 23