
Group delay and phase delay

In signal processing, group delay is the time delay of the amplitude envelopes of the various sinusoidal components of a signal through a device under test, and is a function of frequency for each component. Phase delay, in contrast, is the time delay of the phase, as opposed to the time delay of the amplitude envelope.

All frequency components of a signal are delayed when passed through a device such as an amplifier or a loudspeaker, or when propagating through space or a medium such as air. This delay differs across frequencies unless the device has the property of being linear phase. ('Linear phase' and 'minimum phase' are often used interchangeably, but they are in fact quite different properties.) Because the components are not delayed by the same amount of time at the output of the device, a signal consisting of multiple frequency components suffers distortion: its shape changes beyond any constant delay or scale change. A sufficiently large delay variation can cause problems such as poor fidelity in audio, or intersymbol interference (ISI) in the demodulation of digital information from an analog carrier signal; high-speed modems use adaptive equalizers to compensate for non-constant group delay.

Group delay is a useful measure of this time distortion. It is calculated by differentiating the phase response φ(ω) of the device under test (DUT) with respect to angular frequency: the group delay τ_g(ω) = −dφ(ω)/dω is the negative slope of the phase response at any given frequency. Variations in group delay cause signal distortion, just as deviations from linear phase do.
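The definition above can be checked numerically. The following is a minimal sketch (the three-tap filter is a made-up example, not from the source): a symmetric FIR filter is linear phase, so its group delay should be constant; here it comes out to exactly 1 sample at every frequency.

```python
import numpy as np

# Hypothetical symmetric (linear-phase) FIR impulse response.
h = np.array([0.25, 0.5, 0.25])

n = 512
H = np.fft.rfft(h, n)                  # frequency response H(w)
w = 2 * np.pi * np.fft.rfftfreq(n)     # angular frequency in rad/sample
phase = np.unwrap(np.angle(H))         # continuous phase response phi(w)

# Group delay: negative derivative of phase with respect to frequency.
group_delay = -np.gradient(phase, w)

# Phase delay: -phi(w)/w (skip w = 0 to avoid division by zero).
phase_delay = -phase[1:] / w[1:]

print(group_delay[100], phase_delay[100])  # both ~1.0 samples
```

For this symmetric filter the phase is φ(ω) = −ω, so group delay and phase delay coincide; for a non-linear-phase system the two curves would differ.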
In linear time-invariant (LTI) system theory, control theory, and digital or analog signal processing, the relationship between the input signal x(t) and the output signal y(t) of an LTI system with impulse response h(t) is governed by a convolution operation:

    y(t) = (h ∗ x)(t) = ∫ h(t − τ) x(τ) dτ

Or, in the frequency domain,

    Y(ω) = H(ω) X(ω)

where X(ω), Y(ω), and H(ω) are the Fourier transforms of the input, output, and impulse response, respectively.
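The equivalence of the two relations can be demonstrated in a few lines. This is a minimal sketch with a made-up input and impulse response; the FFTs are zero-padded to the full linear-convolution length so the circular product matches the time-domain convolution.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)    # arbitrary input signal x(t), discretized
h = np.array([0.5, 0.3, 0.2])  # hypothetical impulse response h(t)

# Time domain: y = h * x (linear convolution).
y_time = np.convolve(h, x)

# Frequency domain: Y = H . X, with zero-padding to avoid wrap-around.
n = len(x) + len(h) - 1
y_freq = np.fft.irfft(np.fft.rfft(h, n) * np.fft.rfft(x, n), n)

print(np.allclose(y_time, y_freq))  # True: the two relations agree
```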
