LEAST SQUARES SUPPORT TENSOR MACHINE

2013 
Least squares support vector machine (LS-SVM), a variant of the standard support vector machine (SVM), operates directly on patterns represented as vectors and obtains an analytical solution by solving a set of linear equations instead of a quadratic programming (QP) problem. Tensor representation helps to reduce overfitting in vector-based learning, and tensor-based algorithms require a smaller set of decision variables than vector-based approaches. These properties make tensor learning especially suited for small-sample-size (S3) problems. In this paper, we generalize the vector-based learning algorithm LS-SVM to a tensor-based method, the least squares support tensor machine (LS-STM), which accepts tensors as input. As in LS-SVM, the classifier is obtained by solving a system of linear equations rather than a QP problem. Because LS-STM works in tensor space with a tensor representation, it estimates fewer parameters than LS-SVM and avoids discarding a great deal of useful structural information. Experimental results on several benchmark datasets indicate that LS-STM is competitive with LS-SVM in classification performance.
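To illustrate the "linear system instead of QP" idea referred to above, the following is a minimal sketch of a standard LS-SVM classifier in NumPy, which solves the KKT linear system of the least squares formulation. It is not the paper's LS-STM implementation; the kernel choice (RBF), the hyperparameters `gamma` and `sigma`, and the function names are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    sq_a = np.sum(A**2, axis=1)[:, None]
    sq_b = np.sum(B**2, axis=1)[None, :]
    return np.exp(-(sq_a + sq_b - 2.0 * A @ B.T) / (2.0 * sigma**2))

def ls_svm_train(X, y, gamma=1.0, sigma=1.0):
    """Train an LS-SVM classifier by solving its KKT linear system
    (no QP solver needed).  X: (n, d) data, y: labels in {-1, +1}."""
    n = X.shape[0]
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    # Assemble the (n+1) x (n+1) system
    #   [ 0    y^T              ] [ b     ]   [ 0 ]
    #   [ y    Omega + I/gamma  ] [ alpha ] = [ 1 ]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, bias b

def ls_svm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    """Sign of the LS-SVM decision function on new points."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

LS-STM, as described in the abstract, replaces the vector weight with a tensor-structured one and likewise reduces training to solving systems of linear equations; the sketch above only shows the vector-based LS-SVM baseline it generalizes.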