In statistics, econometrics and signal processing, an autoregressive (AR) model is a representation of a type of random process; as such, it is used to describe certain time-varying processes in nature, economics, etc. The autoregressive model specifies that the output variable depends linearly on its own previous values and on a stochastic term (an imperfectly predictable term); thus the model is in the form of a stochastic difference equation. Together with the moving-average (MA) model, it is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure; it is also a special case of the vector autoregressive model (VAR), which consists of a system of more than one interlocking stochastic difference equation in more than one evolving random variable. Contrary to the moving-average model, the autoregressive model is not always stationary, as it may contain a unit root. The notation $AR(p)$ indicates an autoregressive model of order $p$.
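The kind of process described above is easy to simulate. The following is a minimal sketch (not part of the original article) of an AR(1) simulation; the parameter values are chosen arbitrarily for illustration.

```python
import random

def simulate_ar1(c, phi, n, sigma=1.0, seed=0):
    """Simulate n steps of the AR(1) process X_t = c + phi * X_{t-1} + eps_t,
    where eps_t is Gaussian white noise with standard deviation sigma."""
    rng = random.Random(seed)
    # Start at the stationary mean c / (1 - phi) when it exists (|phi| < 1).
    x = c / (1 - phi) if abs(phi) < 1 else 0.0
    series = []
    for _ in range(n):
        x = c + phi * x + rng.gauss(0.0, sigma)
        series.append(x)
    return series

# Illustrative parameters: with |phi| < 1 the process is stationary
# and fluctuates around its mean c / (1 - phi) = 2.
series = simulate_ar1(c=1.0, phi=0.5, n=500)
```

With $|\varphi_1| \geq 1$ instead, the same recursion drifts or explodes rather than reverting to a fixed mean, which is the non-stationary (unit root or explosive) case mentioned above.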
The AR(p) model is defined as

$$X_t = c + \sum_{i=1}^{p} \varphi_i X_{t-i} + \varepsilon_t$$

where $\varphi_1, \ldots, \varphi_p$ are the parameters of the model, $c$ is a constant, and $\varepsilon_t$ is white noise. This can be equivalently written using the backshift operator $B$ as

$$X_t = c + \sum_{i=1}^{p} \varphi_i B^i X_t + \varepsilon_t$$

so that, moving the summation term to the left side and using polynomial notation, we have

$$\Phi(B) X_t = c + \varepsilon_t, \qquad \Phi(B) := 1 - \sum_{i=1}^{p} \varphi_i B^i.$$

An autoregressive model can thus be viewed as the output of an all-pole infinite impulse response filter whose input is white noise. Some parameter constraints are necessary for the model to remain wide-sense stationary. For example, processes in the AR(1) model with $|\varphi_1| \geq 1$ are not stationary. More generally, for an AR(p) model to be wide-sense stationary, the roots of the polynomial $\Phi(z) := 1 - \sum_{i=1}^{p} \varphi_i z^i$ must lie outside the unit circle, i.e., each (complex) root $z_i$ must satisfy $|z_i| > 1$.

In an AR process, a one-time shock affects values of the evolving variable infinitely far into the future. For example, consider the AR(1) model $X_t = c + \varphi_1 X_{t-1} + \varepsilon_t$. A non-zero value for $\varepsilon_t$ at, say, time $t = 1$ affects $X_1$ by the amount $\varepsilon_1$. Then by the AR equation for $X_2$ in terms of $X_1$, this affects $X_2$ by the amount $\varphi_1 \varepsilon_1$. Then by the AR equation for $X_3$ in terms of $X_2$, this affects $X_3$ by the amount $\varphi_1^2 \varepsilon_1$.
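The stationarity condition above, that all roots of $\Phi(z)$ lie outside the unit circle, can be checked numerically. This is an illustrative sketch (not from the original article) using NumPy's polynomial root finder:

```python
import numpy as np

def is_stationary(phis):
    """Check wide-sense stationarity of an AR(p) model with coefficients
    phi_1, ..., phi_p: all roots of Phi(z) = 1 - phi_1 z - ... - phi_p z^p
    must lie strictly outside the unit circle."""
    # np.roots expects coefficients ordered from highest degree to lowest:
    # -phi_p z^p - ... - phi_1 z + 1
    coeffs = [-p for p in reversed(phis)] + [1.0]
    roots = np.roots(coeffs)
    return all(abs(r) > 1.0 for r in roots)
```

For example, `is_stationary([0.5])` is true (the single root is $z = 2$), `is_stationary([1.0])` is false (a unit root at $z = 1$), and an AR(2) model such as `is_stationary([0.5, 0.3])` is handled the same way through its two roots.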
Continuing this process shows that the effect of $\varepsilon_1$ never ends, although if the process is stationary the effect diminishes toward zero in the limit. Because each shock affects $X$ values infinitely far into the future from when it occurs, any given value $X_t$ is affected by shocks occurring infinitely far into the past. This can also be seen by rewriting the autoregression in its equivalent moving-average (MA($\infty$)) form, in which $X_t$ is expressed as a weighted sum of all past shocks.
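The shock-propagation argument above can be made concrete: in the AR(1) model, a unit shock at time 1 affects $X_{1+k}$ by $\varphi_1^k$. A minimal sketch (not from the original article), with the horizon chosen arbitrarily:

```python
def ar1_impulse_response(phi, horizon):
    """Effect of a unit shock eps_1 on X_{1+k} for k = 0, ..., horizon-1
    in the AR(1) model: the shock propagates as phi ** k."""
    return [phi ** k for k in range(horizon)]

ir = ar1_impulse_response(0.5, 5)
# -> [1.0, 0.5, 0.25, 0.125, 0.0625]: the effect never reaches exactly zero,
# but decays geometrically when |phi| < 1 (the stationary case)
```

With $|\varphi_1| \geq 1$ the same sequence fails to decay, which is the non-stationary behavior noted earlier.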
