Conditional and Marginal Mutual Information in Gaussian and Hyperbolic Decay Time Series

2016 
We consider the amount of available information about an arbitrary future state of a Gaussian stochastic process. We derive an infinite series for the marginal mutual information in terms of the autocorrelation function. We derive an infinite series for the newly available information for prediction, the conditional mutual information, in terms of the moving average parameters, and directly characterize predictability in terms of sensitivity to random shocks. We apply our results to long memory, or more generally, hyperbolic decay models, and give information‐theoretic characterizations of the transitions from persistence to anti‐persistence and from stationary long memory to nonstationarity, as well as of a stationary regime in which the mutual information is not summable.
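To make the quantities in the abstract concrete, the sketch below uses the standard identity that two jointly Gaussian variables with correlation coefficient rho have mutual information -0.5*log(1 - rho^2); the marginal mutual information between the present and a future state of a Gaussian process at lag k follows by plugging in the autocorrelation rho(k). The hyperbolic-decay autocorrelation rho(k) = C*k^(2d-1) and the values of C and d are purely illustrative assumptions, not the paper's derivation or parameter choices.

```python
import numpy as np

def gaussian_mutual_information(rho):
    """Mutual information (in nats) between two jointly Gaussian variables
    with correlation coefficient rho: I = -0.5 * log(1 - rho**2)."""
    rho = np.asarray(rho, dtype=float)
    return -0.5 * np.log1p(-rho**2)

# Illustrative hyperbolic-decay autocorrelation rho(k) = C * k**(2d - 1),
# a standard long-memory form; C and d here are assumed example values.
C, d = 0.5, 0.3
lags = np.arange(1, 1001)
rho = C * lags.astype(float) ** (2 * d - 1)

mi = gaussian_mutual_information(rho)
print(mi[:5])        # marginal mutual information at the first few lags
print(mi.cumsum()[-1])  # partial sum over lags, probing (non-)summability of the MI
```

Because rho(k) decays hyperbolically rather than exponentially, the per-lag mutual information decays slowly as well, which is the kind of behaviour behind the summable/non-summable distinction the abstract refers to; this numerical probe is only a heuristic illustration of that point.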