Measurement error due to self-absorption in calibration-free laser-induced breakdown spectroscopy. Anal Chim Acta 2021;1185:339070. PMID: 34711325. DOI: 10.1016/j.aca.2021.339070.
Received: 06/22/2021; Revised: 09/03/2021; Accepted: 09/14/2021.
Abstract
Self-absorption of spectral lines is known to degrade the performance of analytical measurements in calibration-free laser-induced breakdown spectroscopy. However, the resulting measurement error has not been clearly quantified. Here we propose a method to quantify the measurement error due to self-absorption, based on calculation of the spectral radiance of a plasma in local thermodynamic equilibrium. Validated through spectroscopic measurements on a binary alloy thin film with a compositional gradient, the method shows that the performance loss due to self-absorption depends on the spectral shape of the analytical transition and on the intensity measurement method. Line-integrated intensity measurements of Stark-broadened lines thus enable accurate analysis even at large optical thickness, provided the line width and plasma size are precisely known. The error growth due to self-absorption is significantly larger for line shapes dominated by Doppler broadening and for line-center intensity measurements. These findings represent a significant advance in compositional measurements via calibration-free laser-induced breakdown spectroscopy, as they enable straightforward selection of the most appropriate analytical lines.
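The effect described in the abstract can be illustrated with a minimal sketch: for a uniform plasma slab in local thermodynamic equilibrium, the spectral radiance in units of the (assumed constant) source function is L(x) = 1 − exp(−τ(x)), where τ(x) follows the line profile and τ₀ is the line-center optical depth. This is a hypothetical toy model, not the authors' actual computation; the profile widths and optical depths below are illustrative assumptions. It shows why a Lorentzian (Stark-dominated) line keeps gaining line-integrated intensity through its wings at large optical thickness, while a Gaussian (Doppler-dominated) line and all line-center intensities saturate.

```python
import numpy as np

def lorentzian(x, w):
    """Area-normalized Lorentzian profile with FWHM w (Stark-dominated shape)."""
    g = w / 2.0
    return (g / np.pi) / (x ** 2 + g ** 2)

def gaussian(x, w):
    """Area-normalized Gaussian profile with FWHM w (Doppler-dominated shape)."""
    s = w / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-x ** 2 / (2.0 * s ** 2)) / (s * np.sqrt(2.0 * np.pi))

def radiance(tau0, profile, x, w):
    """Spectral radiance of a uniform LTE slab, in units of the source
    function: L(x) = 1 - exp(-tau(x)), with line-center optical depth tau0."""
    phi = profile(x, w)
    tau = tau0 * phi / phi.max()  # scale so the peak optical depth equals tau0
    return 1.0 - np.exp(-tau)

# Wavelength offset in units of the FWHM; tau0 values are illustrative.
x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]
w = 1.0
for tau0 in (0.1, 1.0, 10.0):
    L_lor = radiance(tau0, lorentzian, x, w)
    L_gau = radiance(tau0, gaussian, x, w)
    print(f"tau0={tau0:5.1f}  integrated(Lorentz)={L_lor.sum() * dx:.3f}  "
          f"integrated(Gauss)={L_gau.sum() * dx:.3f}  "
          f"peak={L_lor.max():.3f}")
```

At τ₀ = 10 the line-center radiance is essentially saturated for both shapes, whereas the line-integrated Lorentzian intensity is still well above the Gaussian one, consistent with the abstract's point that line-integrated measurements of Stark-broadened lines remain usable at large optical thickness while line-center and Doppler-dominated measurements degrade faster.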