Model averaging prediction for time series models with a diverging number of parameters

2020 
Abstract An important problem with the model averaging approach is the choice of weights. In this paper, a generalized Mallows model averaging (GMMA) criterion for choosing weights is developed in the context of an infinite-order autoregressive (AR(∞)) process. The GMMA method adapts to circumstances in which the dimensions of the candidate models can be large and increase with the sample size. The GMMA method is shown to be asymptotically optimal in the sense of achieving the lowest out-of-sample mean squared prediction error (MSPE) for both independent-realization and same-realization predictions, which, as a byproduct, resolves a conjecture put forward by Hansen (2008) that the well-known Mallows model averaging criterion of Hansen (2007) is asymptotically optimal for predicting the future of a time series. The rate at which the GMMA-based weight estimator converges to the optimal weight vector minimizing the independent-realization MSPE is also derived. Both simulation experiments and real data analysis illustrate the merits of the GMMA method in the prediction of an AR(∞) process.
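To make the weight-selection idea concrete, the following is a minimal sketch of Mallows-type model averaging in the spirit of Hansen (2007), applied to nested AR(k) candidate models: each candidate is fitted by least squares, and the weights are chosen on the simplex to minimize a Mallows criterion (residual sum of squares plus a penalty proportional to the weighted model dimension). The GMMA-specific adjustments for candidate dimensions that diverge with the sample size are not reproduced here; all function names and the choice of estimating σ² from the largest candidate model are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize

def fit_ar_candidates(y, max_order):
    """Fit nested AR(k) candidates, k = 1..max_order, by least squares.

    Returns the common response vector (dropping the first max_order
    observations), the matrix of in-sample fitted values, and the
    candidate model dimensions."""
    n = len(y)
    Y = y[max_order:]
    fits, dims = [], []
    for k in range(1, max_order + 1):
        # Design matrix of k lagged values, aligned with Y.
        X = np.column_stack([y[max_order - j: n - j] for j in range(1, k + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        fits.append(X @ beta)
        dims.append(k)
    return Y, np.column_stack(fits), np.array(dims, dtype=float)

def mallows_weights(Y, F, dims, sigma2):
    """Minimize C(w) = ||Y - F w||^2 + 2*sigma2*(dims @ w) over the simplex."""
    M = F.shape[1]

    def criterion(w):
        resid = Y - F @ w
        return resid @ resid + 2.0 * sigma2 * (dims @ w)

    cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    bounds = [(0.0, 1.0)] * M
    w0 = np.full(M, 1.0 / M)
    res = minimize(criterion, w0, bounds=bounds, constraints=cons, method="SLSQP")
    return res.x

# Usage: simulate an autoregressive series and average nested AR(k) fits.
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

Y, F, dims = fit_ar_candidates(y, max_order=12)
# One common (assumed) choice: estimate sigma^2 from the largest candidate.
sigma2_hat = np.mean((Y - F[:, -1]) ** 2)
w_hat = mallows_weights(Y, F, dims, sigma2_hat)
print(np.round(w_hat, 3))
```

The averaged predictor is then the weighted combination of the candidate AR forecasts; the paper's contribution is showing that this style of weight choice remains asymptotically optimal when the candidate dimensions grow with the sample size.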