Informational Divergence and Entropy Rate on Rooted Trees with Probabilities

2013 
Rooted trees with probabilities are used to analyze properties of variable-length codes. A bound is derived on the difference between the entropy rate of the code and that of a memoryless source, expressed in terms of normalized informational divergence. The bound is used to derive converses for exact random number generation, resolution coding, and distribution matching.
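To make the objects named in the abstract concrete, the following is a minimal sketch (not the paper's construction): it assumes a small full binary rooted tree whose edges carry hypothetical branching probabilities, derives the induced leaf distribution, and computes the leaf entropy, the expected path length, an entropy rate in bits per tree step, and the informational divergence from the leaf distribution induced by a memoryless fair-coin source, normalized by the expected length. The particular tree, probabilities, and normalization convention are illustrative assumptions, not taken from the paper.

```python
from math import log2

# Hypothetical branching probabilities: node path -> probability of taking branch '0'.
branching = {
    "": 0.7,
    "1": 0.4,
}
# Leaves of a full binary tree (Kraft equality holds).
leaves = ["0", "10", "11"]

def path_probability(path, branch_probs):
    """Probability of reaching a leaf: product of branch probabilities along its path."""
    p, prefix = 1.0, ""
    for bit in path:
        q0 = branch_probs[prefix]
        p *= q0 if bit == "0" else 1.0 - q0
        prefix += bit
    return p

# Leaf distribution P induced by the tree, and the leaf distribution Q induced
# by a memoryless Bernoulli(1/2) source walking the same tree.
P = {leaf: path_probability(leaf, branching) for leaf in leaves}
Q = {leaf: 0.5 ** len(leaf) for leaf in leaves}

expected_length = sum(P[x] * len(x) for x in leaves)
entropy = -sum(P[x] * log2(P[x]) for x in leaves if P[x] > 0)
divergence = sum(P[x] * log2(P[x] / Q[x]) for x in leaves if P[x] > 0)

print(f"E[length]              = {expected_length:.4f}")
print(f"H(P)                   = {entropy:.4f} bits")
print(f"entropy rate H(P)/E[L] = {entropy / expected_length:.4f} bits/step")
print(f"D(P||Q)/E[L]           = {divergence / expected_length:.4f} bits/step")
```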