Autoregressive–moving-average model

In the statistical analysis of time series, autoregressive–moving-average (ARMA) models provide a parsimonious description of a (weakly) stationary stochastic process in terms of two polynomials, one for the autoregression (AR) and the second for the moving average (MA). The general ARMA model was described in the 1951 thesis of Peter Whittle, Hypothesis testing in time series analysis, and it was popularized in the 1970 book by George E. P. Box and Gwilym Jenkins.

Given a time series of data $X_t$, the ARMA model is a tool for understanding and, perhaps, predicting future values in this series. The AR part involves regressing the variable on its own lagged (i.e., past) values. The MA part involves modeling the error term as a linear combination of error terms occurring contemporaneously and at various times in the past. The model is usually referred to as the ARMA(p, q) model, where p is the order of the AR part and q is the order of the MA part (as defined below). ARMA models can be estimated by using the Box–Jenkins method; a fitting example is sketched at the end of this section.

The notation AR(p) refers to the autoregressive model of order p. The AR(p) model is written

$$X_t = c + \sum_{i=1}^{p} \varphi_i X_{t-i} + \varepsilon_t$$

where $\varphi_1, \ldots, \varphi_p$ are parameters, $c$ is a constant, and the random variable $\varepsilon_t$ is white noise. Some constraints are necessary on the values of the parameters so that the model remains stationary. For example, processes in the AR(1) model with $|\varphi_1| \geq 1$ are not stationary.

The notation MA(q) refers to the moving average model of order q:

$$X_t = \mu + \varepsilon_t + \sum_{i=1}^{q} \theta_i \varepsilon_{t-i}$$

where $\theta_1, \ldots, \theta_q$ are the parameters of the model, $\mu$ is the expectation of $X_t$ (often assumed to equal 0), and $\varepsilon_t, \varepsilon_{t-1}, \ldots$ are again white noise error terms.

The notation ARMA(p, q) refers to the model with p autoregressive terms and q moving-average terms. This model contains the AR(p) and MA(q) models:

$$X_t = c + \varepsilon_t + \sum_{i=1}^{p} \varphi_i X_{t-i} + \sum_{i=1}^{q} \theta_i \varepsilon_{t-i}$$
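To make the ARMA(p, q) recursion concrete, the following minimal NumPy sketch simulates a process directly from the defining equation above. The function name simulate_arma, the coefficient values, the burn-in length, and the Gaussian choice of white noise are illustrative assumptions, not part of the original text.

```python
import numpy as np

def simulate_arma(phi, theta, c=0.0, sigma=1.0, n=500, burn=100, seed=0):
    """Simulate X_t = c + sum_i phi_i X_{t-i} + eps_t + sum_j theta_j eps_{t-j}
    with Gaussian white noise eps_t ~ N(0, sigma^2) (an illustrative choice)."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    total = n + burn
    eps = rng.normal(0.0, sigma, size=total)
    x = np.zeros(total)
    for t in range(total):
        # AR part: weighted sum of the p previous values of the series
        ar_part = sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        # MA part: weighted sum of the q previous white-noise shocks
        ma_part = sum(theta[j] * eps[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x[t] = c + ar_part + eps[t] + ma_part
    return x[burn:]  # drop the burn-in so start-up transients are discarded

# Example: an ARMA(2, 1) process with phi = (0.75, -0.25), theta = (0.65),
# arbitrary but stationary coefficient choices.
series = simulate_arma(phi=[0.75, -0.25], theta=[0.65], n=1000)
print(series[:5])
```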

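The estimation step mentioned above (the Box–Jenkins approach) is, in practice, typically carried out by maximum likelihood. The sketch below assumes the statsmodels library is available: it uses ArmaProcess to simulate data and the ARIMA class with differencing order d = 0, which reduces to an ARMA(p, q) fit. The specific coefficients are arbitrary example values.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

# ArmaProcess takes the lag-polynomial coefficients, so the AR side is
# [1, -phi_1, -phi_2] and the MA side is [1, theta_1].
np.random.seed(12345)
ar = np.array([1, -0.75, 0.25])   # phi_1 = 0.75, phi_2 = -0.25
ma = np.array([1, 0.65])          # theta_1 = 0.65
y = ArmaProcess(ar, ma).generate_sample(nsample=1000)

# Fit an ARMA(2, 1) model by maximum likelihood; with d = 0,
# ARIMA(p, d, q) is simply ARMA(p, q).
result = ARIMA(y, order=(2, 0, 1)).fit()
print(result.summary())           # estimated constant, AR and MA coefficients
```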
[ "Autoregressive model", "Series (mathematics)", "arma process", "white noise estimator", "recursive extended least squares" ]