
Additive white Gaussian noise

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature. The modifiers denote specific characteristics:

- Additive, because it is added to any noise that might be intrinsic to the information system.
- White, because it has uniform power spectral density across the frequency band of the information system, by analogy with the color white, which contains uniform emissions at all frequencies in the visible spectrum.
- Gaussian, because its amplitude has a normal (Gaussian) distribution in the time domain with a mean of zero.

Wideband noise comes from many natural noise sources, such as the thermal vibrations of atoms in conductors (referred to as thermal noise or Johnson–Nyquist noise), shot noise, black-body radiation from the earth and other warm objects, and celestial sources such as the Sun. The central limit theorem of probability theory indicates that the sum of many random processes tends to have a Gaussian, or normal, distribution.

AWGN is often used as a channel model in which the only impairment to communication is the linear addition of wideband or white noise with a constant spectral density (expressed as watts per hertz of bandwidth) and a Gaussian amplitude distribution. The model does not account for fading, frequency selectivity, interference, nonlinearity, or dispersion. However, it produces simple, tractable mathematical models that are useful for gaining insight into the underlying behavior of a system before these other phenomena are considered.

The AWGN channel is a good model for many satellite and deep-space communication links. It is not a good model for most terrestrial links because of multipath, terrain blocking, interference, etc. However, in terrestrial path modeling, AWGN is commonly used to simulate the background noise of the channel under study, in addition to the multipath, terrain blocking, interference, ground clutter, and self-interference that modern radio systems encounter in terrestrial operation.

Channel capacity

The AWGN channel is represented by a series of outputs $Y_i$ at discrete time index $i$. $Y_i$ is the sum of the input $X_i$ and the noise $Z_i$, where the $Z_i$ are independent and identically distributed, drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Y_i = X_i + Z_i, \qquad Z_i \sim \mathcal{N}(0, N).$$

The $Z_i$ are further assumed to be uncorrelated with the $X_i$.

The capacity of the channel is infinite unless the noise variance $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called power constraint: for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we require

$$\frac{1}{k} \sum_{i=1}^{k} x_i^2 \leq P,$$

where $P$ represents the maximum channel power. The channel capacity for the power-constrained channel is therefore given by

$$C = \max_{f(x) \,:\, E[X^2] \leq P} I(X;Y),$$

where $f(x)$ is the distribution of $X$. Expanding $I(X;Y)$ in terms of differential entropy gives

$$I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X).$$

But $X$ and $Z$ are independent, therefore

$$I(X;Y) = h(Y) - h(Z).$$
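The derivation above stops at $I(X;Y) = h(Y) - h(Z)$. For reference, a short worked continuation recovers the familiar closed form, under the standard assumptions (not spelled out in the excerpt above) that $h(Z) = \tfrac{1}{2}\log_2(2\pi e N)$ for Gaussian $Z$, and that a Gaussian maximizes differential entropy for a given variance:

```latex
% Since Y = X + Z with E[X^2] <= P and X, Z independent,
% Var(Y) <= P + N, and h(Y) is largest when Y is Gaussian.
\begin{align*}
  C &= \max_{f(x)} \bigl( h(Y) - h(Z) \bigr) \\
    &\le \tfrac{1}{2}\log_2\!\bigl(2\pi e (P + N)\bigr)
       - \tfrac{1}{2}\log_2\!\bigl(2\pi e N\bigr) \\
    &= \tfrac{1}{2}\log_2\!\Bigl(1 + \tfrac{P}{N}\Bigr),
\end{align*}
% with equality when X ~ N(0, P), giving the power-constrained
% AWGN capacity C = (1/2) log2(1 + P/N) bits per channel use.
```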
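Finally, as a concrete illustration of the channel model $Y_i = X_i + Z_i$ itself, here is a minimal NumPy sketch of a discrete-time AWGN channel. The BPSK input, the noise variance $N = 0.5$, and the fixed seed are illustrative assumptions, not part of the original article:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, chosen only for reproducibility

def awgn_channel(x, noise_variance):
    """Pass input samples x through a discrete-time AWGN channel.

    Implements Y_i = X_i + Z_i with Z_i i.i.d. ~ N(0, noise_variance),
    drawn independently of the input, as in the model above.
    """
    z = rng.normal(loc=0.0, scale=np.sqrt(noise_variance), size=len(x))
    return x + z

# Illustrative use: a BPSK codeword (+/-1 symbols, average power P = 1)
# sent through a channel with noise variance N = 0.5.
x = rng.choice([-1.0, 1.0], size=8)       # input symbols X_i
y = awgn_channel(x, noise_variance=0.5)   # noisy outputs Y_i
print("input :", x)
print("output:", np.round(y, 3))
```

Drawing the noise from the generator independently of the input mirrors the assumption that the $Z_i$ are i.i.d. zero-mean Gaussian and uncorrelated with the $X_i$.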

[ "Communication channel", "gaussian mixture noise", "additive white gaussian noise awgn channel", "additive white gaussian channel", "Shannon–Hartley theorem", "scalar costa scheme" ]