Modeling of waveform distortion due to optical filtering

2000 
The usable bandwidth of an optical filter is limited not only by signal attenuation but also by the waveform distortion that occurs when the optical signal passes through the edge of the filter. The waveform distortion due to optical filtering is investigated under the assumption of a linearly chirped Gaussian pulse. While first-order optical filtering (a linear slope in dB) does not induce waveform distortion, second-order optical filtering induces pulse distortion similar to fiber dispersion.
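The effect described above can be sketched numerically: pass a linearly chirped Gaussian pulse through a filter whose log-magnitude response is quadratic in frequency (second-order filtering) and compare the pulse widths before and after. This is a minimal illustration, not the paper's model; the pulse width, chirp parameter, and filter curvature below are assumed values chosen only for demonstration.

```python
import numpy as np

# Assumed parameters (for illustration only, not from the paper)
T0 = 10e-12          # Gaussian pulse width, 10 ps
C = 2.0              # linear chirp parameter
N = 4096
t = np.linspace(-200e-12, 200e-12, N)
dt = t[1] - t[0]

# Linearly chirped Gaussian pulse: E(t) = exp(-(1 + jC) t^2 / (2 T0^2))
E = np.exp(-(1 + 1j * C) * t**2 / (2 * T0**2))

f = np.fft.fftfreq(N, dt)
Ef = np.fft.fft(E)

# Second-order filter: quadratic roll-off in dB around the carrier.
# (A first-order filter, linear in dB, would leave the pulse undistorted
# per the abstract; the quadratic term is what reshapes the pulse.)
a2 = 1e-20           # dB/Hz^2 curvature (assumed)
H_dB = -a2 * f**2
H = 10 ** (H_dB / 20)

E_out = np.fft.ifft(Ef * H)

# Compare RMS temporal widths before and after filtering
def rms_width(x, field):
    p = np.abs(field)**2
    p = p / p.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean)**2 * p).sum())

w_in = rms_width(t, E)
w_out = rms_width(t, E_out)
print(f"RMS width in:  {w_in * 1e12:.2f} ps")
print(f"RMS width out: {w_out * 1e12:.2f} ps")
```

Because the chirp maps frequency onto time, trimming the spectrum with the quadratic filter edge reshapes the pulse envelope, qualitatively echoing the dispersion-like distortion reported in the abstract.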