Reducing the spectral nonlinearity error caused by varying integration time

2018 
Abstract In spectral analysis, it is common to adjust a grating spectrometer's integration time so that it operates at an appropriate sensitivity, yielding a high signal-to-noise ratio (SNR) and an unsaturated spectrum. However, varying the integration time during measurement introduces errors. This work studies the influence of varying the spectrometer integration time on spectral analysis and proposes a method for formulating a calibration model that suppresses the effect of spectra acquired at different sensitivities due to differing integration times. The strategy is to construct a model that is insensitive to variations in integration time: spectra collected at different integration times are incorporated into the model establishment, and the resulting model is found to satisfactorily suppress the influence of integration time variation. An experiment was designed in which transmission spectra of an Intralipid suspension were collected at different integration times; the spectral data were then processed by the "interactive-validation" method, which evaluates the error caused by differing integration times, and by the new modeling method. The experimental results showed that when the integration time was varied to acquire high-SNR spectra, the model established via the novel modeling strategy, whose robustness was improved by incorporating spectra from varying integration times, effectively suppressed the error caused by integration time variation.
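
The abstract gives no implementation details, but its core strategy — pooling calibration spectra acquired at several integration times so the multivariate model learns to ignore the resulting intensity scaling — can be sketched as below. This is a minimal illustration, not the authors' code: the simulated spectra, the choice of PLS regression, the integration-time levels, and all variable names are assumptions made for demonstration.

```python
# Minimal sketch (assumed, not the authors' code) of the pooled-integration-time
# calibration strategy described in the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(900, 1700, 200)  # nm; assumed NIR range

def simulate_spectra(conc, t_int, noise=0.002):
    """Toy transmission spectra: amplitude scales with integration time,
    shape depends on analyte concentration; additive detector noise."""
    band = np.exp(-((wavelengths - 1300) / 80) ** 2)  # one absorption band
    base = 1.0 - np.outer(conc, band) * 0.5           # concentration effect
    return t_int * base + rng.normal(0, noise, (len(conc), wavelengths.size))

conc_cal = rng.uniform(0.5, 2.0, 60)   # calibration concentrations
conc_tst = rng.uniform(0.5, 2.0, 20)   # test concentrations

# Conventional model: calibration spectra at a single integration time.
X_single = simulate_spectra(conc_cal, t_int=1.0)

# Proposed strategy: pool calibration spectra over several integration times
# so the model becomes insensitive to integration-time variation.
t_levels = [0.5, 1.0, 1.5, 2.0]
X_pooled = np.vstack([simulate_spectra(conc_cal, t) for t in t_levels])
y_pooled = np.tile(conc_cal, len(t_levels))

# Test spectra acquired at an integration time unseen during calibration.
X_test = simulate_spectra(conc_tst, t_int=1.75)

for name, X, y in [("single-t model", X_single, conc_cal),
                   ("pooled-t model", X_pooled, y_pooled)]:
    pls = PLSRegression(n_components=4).fit(X, y)
    rmsep = np.sqrt(np.mean((pls.predict(X_test).ravel() - conc_tst) ** 2))
    print(f"{name}: RMSEP = {rmsep:.4f}")
```

Under this toy setup, the pooled model typically predicts the spectra taken at the unseen integration time with a markedly lower RMSEP than the single-time model, mirroring the robustness improvement the abstract reports.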