Unsupervised feature selection via discrete spectral clustering and feature weights

2023 
Most existing unsupervised feature selection methods learn the cluster structure through spectral clustering and then use a regression model to regress the data matrix onto the indicator matrix, yielding a feature selection matrix. In these methods, the clustering indicator matrix is usually continuous-valued, which is not ideal given its supervising role in feature selection. Motivated by this, unsupervised feature selection via discrete spectral clustering and feature weights (FSDSC) is proposed in this paper. First, FSDSC integrates the regression model and spectral clustering in a unified framework for feature selection and introduces a feature weight matrix whose diagonal elements directly express the importance of each feature. Compared with the common feature selection matrix, which requires constraints such as sparse regularization terms, the feature weight matrix reduces the complexity of the model and simplifies the computation of feature evaluation. Second, regarding the values of the indicator matrix, spectral clustering is improved to obtain a discrete clustering indicator matrix, which provides clearer guidance for feature selection. Finally, to avoid trivial solutions, an orthogonality constraint is imposed on the transformation matrix. The combination of the orthogonal regression model and spectral clustering enables the algorithm to perform feature selection and manifold learning simultaneously, thereby preserving the local geometric structure of the data. Experimental comparisons with other state-of-the-art unsupervised feature selection algorithms demonstrate the effectiveness of the proposed method.
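The pipeline the abstract describes — build a graph Laplacian, obtain a discrete cluster indicator matrix, then score each feature by how well the indicators explain it — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual optimization: FSDSC solves a joint objective with an orthogonal transformation matrix, whereas here the spectral embedding is simply discretized with k-means and the diagonal feature weights are approximated by per-feature explained variance under the indicator matrix. All function names, the Gaussian-affinity graph, and the `sigma` bandwidth parameter are assumptions for illustration.

```python
import numpy as np

def kmeans(F, k, iters=50, seed=0):
    # plain Lloyd's iterations, used here only to discretize the embedding
    rng = np.random.default_rng(seed)
    centers = F[rng.choice(len(F), size=k, replace=False)]
    for _ in range(iters):
        dist = ((F[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = F[labels == c].mean(0)
    return labels

def discrete_indicator(X, k, sigma=1.0):
    # Gaussian affinity graph and unnormalized Laplacian L = D - A
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    L = np.diag(A.sum(1)) - A
    _, vecs = np.linalg.eigh(L)
    F = vecs[:, :k]                    # continuous spectral embedding
    labels = kmeans(F, k)              # discretize into hard clusters
    return np.eye(k)[labels]           # 0/1 cluster indicator matrix Y

def feature_weights(X, Y):
    # weight of feature j = fraction of its variance explained by
    # regressing column j of X onto the discrete indicator matrix Y
    w = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        xj = X[:, j]
        coef, *_ = np.linalg.lstsq(Y, xj, rcond=None)
        ss_res = ((xj - Y @ coef) ** 2).sum()
        ss_tot = ((xj - xj.mean()) ** 2).sum()
        w[j] = 1.0 - ss_res / max(ss_tot, 1e-12)
    return w
```

Features can then be ranked by their weights and the top-scoring ones kept, which mirrors the role the abstract assigns to the diagonal feature weight matrix: the discrete indicators act as pseudo-labels, and a feature is important insofar as the cluster structure explains it.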