Laser-induced breakdown spectroscopy (LIBS) is used to analyze the lead content in soils. The measured spectral line profile is fitted with a Lorentzian function to determine the background and the full-width at half-maximum (FWHM) of the spectral line. A self-absorption correction model based on the spectral-broadening information is introduced to calculate the true spectral line intensity, which is related to the elemental concentration. The results show that the background intensity obtained by profile fitting is effective and important because it removes the interference of spectral broadening, and that correcting for the self-absorption effect yields better precision in the calibration analysis.
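As a minimal sketch of the profile-fitting step described above, the following fits a Lorentzian plus a constant background to a synthetic spectral line and extracts the background level and FWHM. The wavelength region, noise level, and parameter values are hypothetical, chosen only for illustration; they are not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, x0, gamma, a, bg):
    """Lorentzian line profile on a constant background.
    x0: line center, gamma: half-width at half-maximum,
    a: peak amplitude above background, bg: background level."""
    return a * gamma**2 / ((x - x0)**2 + gamma**2) + bg

# Synthetic spectral line (hypothetical region near the Pb I 405.78 nm line).
rng = np.random.default_rng(0)
wl = np.linspace(404.5, 407.0, 400)
signal = lorentzian(wl, 405.78, 0.08, 1200.0, 150.0) \
         + rng.normal(0.0, 10.0, wl.size)

# Fit, with initial guesses taken from the raw data.
p0 = [wl[np.argmax(signal)], 0.1, signal.max() - signal.min(), signal.min()]
popt, _ = curve_fit(lorentzian, wl, signal, p0=p0)
x0, gamma, a, bg = popt

fwhm = 2.0 * gamma                 # FWHM of a Lorentzian is twice gamma
net_intensity = np.pi * a * gamma  # background-free integrated line intensity
print(f"center={x0:.3f} nm, FWHM={fwhm:.3f} nm, background={bg:.1f}")
```

Subtracting the fitted `bg` (rather than reading the background off the raw wings, where broadening still contributes) is what makes the net line intensity usable for the subsequent self-absorption correction.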
Laser-induced breakdown spectroscopy (LIBS) has been used to detect atomic species in various environments. Quantitative analysis of C, H, O, N, and S in representative coal samples is carried out with LIBS, and the effects of particle size are analyzed. A high-power pulsed Nd:YAG laser is focused on the coal sample at atmospheric pressure, the emission spectra from the laser-induced plasmas are measured by time-resolved spectroscopy, and the intensities of the analyzed spectral lines are obtained by observing the laser plasma at a delay time of 0.4 μs. The experimental results show that the slope of the calibration curve is nearly 1 when the concentration of the analyzed element is relatively low, and nearly 0.5 when the concentration of C is higher than that of the other elements. In addition, using a calibration-free model without the self-absorption effect, the results show that decreasing the particle size leads to an increase in the plasma temperature.
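The slope behavior described above can be illustrated with a toy curve-of-growth model: for optically thin lines the intensity scales roughly as I ∝ C (log-log slope near 1), while strong self-absorption drives the scaling toward I ∝ √C (slope near 0.5). The concentrations and proportionality constant below are invented purely for illustration.

```python
import numpy as np

# Hypothetical element concentrations (wt.%) and two idealized regimes
# of the curve of growth.
c = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
intensity_thin = 40.0 * c            # optically thin: I proportional to C
intensity_abs = 40.0 * np.sqrt(c)    # self-absorbed: I proportional to sqrt(C)

def loglog_slope(conc, inten):
    """Least-squares slope of log(I) versus log(C), i.e. the exponent
    b in the power law I ~ C**b."""
    slope, _intercept = np.polyfit(np.log(conc), np.log(inten), 1)
    return slope

b_thin = loglog_slope(c, intensity_thin)
b_abs = loglog_slope(c, intensity_abs)
print(f"optically thin slope ~ {b_thin:.2f}, self-absorbed slope ~ {b_abs:.2f}")
```

Fitting the log-log slope of a measured calibration curve in this way is one common diagnostic for whether a line is in the self-absorbed regime.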