Information measures and performance bounds for array processors

2005 
Information measures and performance bounds are derived for frequency-domain linear array processors deployed in homogeneous Gaussian random fields. J divergence, a measure of the (net) information rate of an array, is shown to be a useful measure of how effectively detection and estimation functions can be performed in optimum and conventional array-processing structures. In a detection context, J divergence becomes a detection index that can be interpreted in terms of array gain and output signal-to-noise ratio. Comparisons between the divergence of optimum and conventional processors indicate, for example, that optimum processing can provide on the order of 13-dB gain over conventional processing when trying to detect a 20-dB signal in the presence of a 20-dB interference located within the Rayleigh limit of the array. In an estimation context, J divergence can be used to derive “critical divergence” and Cramér-Rao bounds on resolution variance. These bounds indicate that an output signal-to-noise ratio of approximately 25 dB is required to obtain a 10:1 improvement over the classical Rayleigh resolution limit. The Rayleigh limit is argued to have significance only at output signal-to-noise ratios of approximately 10 dB. The argument is based on a new resolution limit termed the critical divergence limit. This limit is shown to give resolution limits approximately three times the Cramér-Rao bound, indicating that the latter bound is an optimistic resolution limit. [Supported by Office of Naval Research, Statistics and Probability Branch, Arlington, VA.]
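For reference, a brief sketch of the standard definitions behind the quantities named above; the abstract does not state them explicitly, so the forms below are the usual ones assumed here, not necessarily the exact expressions of the paper. Between a signal-present hypothesis H1 and a noise-only hypothesis H0 with densities p(x | H1) and p(x | H0), Kullback's symmetric J divergence is

J(H_1, H_0) = \int \bigl[ p(x \mid H_1) - p(x \mid H_0) \bigr] \, \ln \frac{p(x \mid H_1)}{p(x \mid H_0)} \, dx ,

and for a Gaussian mean-shift problem with common covariance \Sigma it reduces to the familiar detection index

J = (\boldsymbol{\mu}_1 - \boldsymbol{\mu}_0)^{\mathsf{T}} \Sigma^{-1} (\boldsymbol{\mu}_1 - \boldsymbol{\mu}_0) = d^2 ,

which is why it can be read as an output signal-to-noise ratio for the optimum processor. Likewise, the Cramér-Rao bound on the variance of an unbiased estimate \hat{\theta} of a bearing or resolution parameter \theta is the standard

\operatorname{var}(\hat{\theta}) \ge \frac{1}{I(\theta)}, \qquad I(\theta) = \mathbb{E}\!\left[ \left( \frac{\partial \ln p(\mathbf{x}; \theta)}{\partial \theta} \right)^{\!2} \right].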