Measurement error due to self-absorption in calibration-free laser-induced breakdown spectroscopy

2021 
Abstract

Self-absorption of spectral lines is known to lower the performance of analytical measurements via calibration-free laser-induced breakdown spectroscopy. However, the error growth due to this effect has not been clearly assessed. Here we propose a method to quantify the measurement error due to self-absorption based on the calculation of the spectral radiance of a plasma in local thermodynamic equilibrium. Validated through spectroscopic measurements on a binary alloy thin film with a compositional gradient, the method shows that the degradation of measurement performance due to self-absorption depends on the spectral shape of the analytical transition and on the intensity measurement method. Thus, line-integrated intensity measurements of Stark-broadened lines enable accurate analysis, even at large optical thickness, provided the line width and plasma size are precisely known. The error growth due to self-absorption is significantly larger for line shapes dominated by Doppler broadening and for line-center intensity measurements. These findings represent a significant advance in compositional measurements via calibration-free laser-induced breakdown spectroscopy, as they enable straightforward selection of the most appropriate analytical lines.
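The contrast between line shapes and measurement methods described above can be illustrated with a textbook radiative-transfer sketch (not the paper's actual computation): for a homogeneous plasma slab in local thermodynamic equilibrium, the emitted radiance follows 1 - exp(-tau(lambda)), where tau is the optical thickness profile of the line. Comparing the self-absorbed signal to its optically thin limit for a Lorentzian (Stark-dominated) and a Gaussian (Doppler-dominated) profile shows that the line-integrated intensity of the Lorentzian shape deviates least, while the line-center intensity suffers most, consistent with the abstract's claim. All numerical choices (unit FWHM, tau0 = 10, grid extent) are illustrative assumptions.

```python
import numpy as np

def profile_lorentz(x):
    """Peak-normalized Lorentzian of unit FWHM (Stark-like shape)."""
    return 1.0 / (1.0 + (2.0 * x) ** 2)

def profile_gauss(x):
    """Peak-normalized Gaussian of unit FWHM (Doppler-like shape)."""
    return np.exp(-4.0 * np.log(2.0) * x ** 2)

def growth_ratio(profile, tau0, half_width=50.0, n=200001):
    """Ratio of the self-absorbed line-integrated intensity of a
    homogeneous LTE slab, integral of (1 - exp(-tau0*phi)), to its
    optically thin limit tau0 * integral of phi."""
    x = np.linspace(-half_width, half_width, n)
    dx = x[1] - x[0]
    phi = profile(x)
    thick = np.sum(1.0 - np.exp(-tau0 * phi)) * dx
    thin = tau0 * np.sum(phi) * dx
    return thick / thin

tau0 = 10.0  # line-center optical thickness (illustrative value)
r_stark = growth_ratio(profile_lorentz, tau0)    # Stark-broadened line
r_doppler = growth_ratio(profile_gauss, tau0)    # Doppler-broadened line
# Line-center measurement: same factor for any peak-normalized profile.
r_center = (1.0 - np.exp(-tau0)) / tau0

print(f"integrated, Lorentzian: {r_stark:.3f}")
print(f"integrated, Gaussian:   {r_doppler:.3f}")
print(f"line center:            {r_center:.3f}")
```

A ratio of 1 would mean no self-absorption error; at the same optical thickness the Lorentzian integrated intensity stays closest to the thin limit (its strong wings remain optically thin), the Gaussian falls off faster, and the line-center intensity is most strongly saturated.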