All-Digital Bandwidth Mismatch Calibration of TI-ADCs Based on Optimally Induced Minimization

2020 
Parameter mismatch in time-interleaved analog-to-digital converters (TI-ADCs) is a significant concern for guaranteeing output linearity. Several solutions have been presented for offset, gain, time-skew, and bandwidth mismatches, but many rely on hardware-expensive methods. This article proposes an all-digital calibration algorithm for TI-ADC bandwidth mismatch, which identifies the optimal correction coefficients for derivative-based digital filters. The analyzed convergence logic further relaxes the hardware requirements. Numerical simulations and experimental results validate the efficiency of the calibration. A commercial 12-bit, 3.6-GS/s, two-channel TI-ADC was used to verify the proposed calibration algorithm under real operating conditions.
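The abstract names the moving parts without showing the algorithm: a derivative-based correction filter per channel and a search for its optimal coefficients. The Python sketch below is an illustration under stated assumptions, not the paper's method. It simulates a two-channel TI-ADC whose sub-ADCs have mismatched first-order input bandwidths, then runs a hypothetical calibration loop that grid-searches a gain term g and a derivative-filter weight c to minimize the power of the mismatch image spur. All names (frontend, spur_power), the test-tone setup, the bandwidth values, and the grid search itself are illustrative assumptions; the paper's "optimally induced minimization" convergence logic is not reproduced here.

    import numpy as np

    FS = 3.6e9     # interleaved rate, matching the 3.6-GS/s two-channel part
    N = 8192
    K_IN = 1638    # coherent test-tone bin (assumed single-tone stimulus)
    F_IN = K_IN * FS / N

    def frontend(x, fc):
        # First-order low-pass H(f) = 1/(1 + j*f/fc), standing in for one
        # sub-ADC's analog input bandwidth (fc = -3 dB frequency).
        f = np.fft.rfftfreq(len(x), d=1.0 / FS)
        return np.fft.irfft(np.fft.rfft(x) / (1.0 + 1j * f / fc), n=len(x))

    def derivative(x):
        # Central-difference FIR approximating d/dt, playing the role of a
        # derivative-based digital filter; real designs use longer taps.
        return np.convolve(x, [0.5, 0.0, -0.5], mode="same")

    def spur_power(y):
        # Power of the interleaving image at FS/2 - F_IN, the spectral
        # signature of channel mismatch in a two-channel TI-ADC.
        return np.abs(np.fft.rfft(y)[N // 2 - K_IN]) ** 2

    t = np.arange(N) / FS
    x = np.sin(2.0 * np.pi * F_IN * t)

    # Channel 0 (even samples) and channel 1 (odd samples) see slightly
    # different front-end bandwidths, so a bandwidth-mismatch spur appears.
    y = frontend(x, 1.8e9)
    y[1::2] = frontend(x, 1.7e9)[1::2]

    # Hypothetical calibration loop: grid-search a gain term g and a
    # derivative weight c that minimize the spur power (an LMS-style
    # update would converge to the same optimum).
    d = derivative(y)
    best = (0.0, 0.0, spur_power(y))
    for g in np.linspace(-0.02, 0.02, 81):
        for c in np.linspace(-0.05, 0.05, 101):
            yc = y.copy()
            yc[1::2] = (1.0 + g) * y[1::2] + c * d[1::2]
            p = spur_power(yc)
            if p < best[2]:
                best = (g, c, p)

    g, c, p = best
    print(f"g = {g:+.4f}, c = {c:+.4f}, spur improved by "
          f"{10.0 * np.log10(spur_power(y) / p):.1f} dB")

Applying the correction only to the mismatched channel mirrors the usual per-channel structure of TI-ADC calibration: the reference channel is left untouched and the other channel's samples are filtered until the image spur at FS/2 - F_IN is minimized.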