Functional Linear Regression with Mixed Predictors.

2020 
We consider a general functional linear regression model that allows for both functional and high-dimensional vector covariates. Furthermore, the proposed model can accommodate discretized observations of the functional variables and different reproducing kernel Hilbert spaces (RKHS) for the functional regression coefficients. Based on this general setting, we propose a penalized least squares approach in RKHS, where the penalties enforce both smoothness and sparsity on the functional estimators. We also show that the excess prediction risk of our estimators is minimax optimal under this general model setting. Our analysis reveals an interesting phase transition phenomenon: the optimal excess risk is determined jointly by the sparsity and the smoothness of the functional regression coefficients. We devise a novel optimization algorithm that simultaneously handles the smoothness and sparsity penalization.
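For concreteness, here is a minimal sketch of the kind of model and penalized criterion the abstract describes; the notation, penalty forms, and tuning parameters below are illustrative assumptions, not taken from the paper. With a scalar response $y_i$, functional covariates $X_{i1}(\cdot), \dots, X_{ip}(\cdot)$ whose coefficients $\beta_j$ lie in (possibly different) RKHSs $\mathcal{H}_j$, and a high-dimensional vector covariate $z_i \in \mathbb{R}^q$ with coefficient $\gamma$, a mixed functional linear model reads

\[
y_i = \sum_{j=1}^{p} \int X_{ij}(t)\,\beta_j(t)\,dt + z_i^{\top}\gamma + \varepsilon_i, \qquad i = 1, \dots, n,
\]

and a penalized least squares estimator enforcing smoothness (via squared RKHS norms) and sparsity (via non-squared, group-lasso-type norms) might take the form

\[
\min_{\beta_j \in \mathcal{H}_j,\; \gamma}\;
\frac{1}{n}\sum_{i=1}^{n}\Big(y_i - \sum_{j=1}^{p}\int X_{ij}(t)\,\beta_j(t)\,dt - z_i^{\top}\gamma\Big)^{2}
+ \sum_{j=1}^{p}\big(\lambda_1 \|\beta_j\|_{\mathcal{H}_j} + \lambda_2 \|\beta_j\|_{\mathcal{H}_j}^{2}\big)
+ \mu\,\|\gamma\|_{1}.
\]

In practice the integrals are evaluated from discretized observations of the $X_{ij}$, and a representer-theorem argument typically reduces each $\beta_j$ to a finite kernel expansion, which is what makes optimization over the $\mathcal{H}_j$ computationally tractable.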