Quantitative Analysis by Atomic Absorption
Atomic absorption spectrometry exploits the ability of free atoms to absorb light at very specific wavelengths.
Light of a specific wavelength and initial intensity I₀ is focused on the flame cell containing ground-state atoms. The initial intensity is decreased by an amount determined by the concentration of atoms in the flame cell. The light is then directed to the detector, where the reduced intensity Iₜ is measured. The amount of light absorbed is determined by comparing Iₜ with I₀ according to Beer's law:
A = log(I₀/Iₜ) = α · c · d   (1)

where α is the absorption coefficient, c the concentration of the analyte and d the path length of the absorbing cell.
Absorbance A is the most convenient quantity for characterizing light absorption in absorption spectrophotometry, since it varies linearly with the concentration c.
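As a quick numerical illustration of equation (1), the absorbance can be computed directly from the two measured intensities. This is a minimal sketch in Python; the intensity values are hypothetical.

```python
import math

def absorbance(i0, it):
    """Absorbance from Beer's law: A = log10(I0 / It)."""
    return math.log10(i0 / it)

# Hypothetical intensities: 10% of the incident light is absorbed.
print(round(absorbance(100.0, 90.0), 4))  # 0.0458
```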
Equation (1) can be used for quantitative analysis by atomic absorption spectrometry (AAS). A calibration curve is used to determine the unknown concentration of an element (e.g. nickel) in a solution. The instrument is calibrated using several solutions of known concentrations of the element under examination. The calibration curve plots the amount of radiation absorbed against the concentration of the element in solution (Fig. I.1).
The sample solution is fed into the instrument, and the unknown concentration of the element (e.g. nickel) is then read from the calibration curve (Fig. I.1).
For example, for the Ni solution above an absorbance of 0.37 was obtained, which corresponds to a concentration of 12 mg/l.
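The same readout can be reproduced numerically with a least-squares fit over the linear region of the calibration. The sketch below assumes hypothetical calibration standards, chosen so that an absorbance of 0.37 maps to roughly 12 mg/l as in the example above.

```python
import numpy as np

# Hypothetical calibration standards (mg/l) and their measured absorbances.
conc_std = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
abs_std  = np.array([0.000, 0.154, 0.309, 0.463, 0.617])

# Least-squares fit of the linear (Beer's-law) region: A = slope * c + intercept.
slope, intercept = np.polyfit(conc_std, abs_std, 1)

# Read an unknown Ni concentration off the calibration line.
a_sample = 0.37
c_sample = (a_sample - intercept) / slope
print(f"Ni concentration: {c_sample:.1f} mg/l")  # ~12.0 mg/l
```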
Over the region where the Beer's law relationship is observed, the calibration yields a straight line. As the concentration and absorbance increase, nonideal behavior in the absorption process can cause a deviation from linearity, as shown in Fig. I.1. There are several reasons for this nonideal behavior, such as temperature and spatial nonhomogeneities in the absorbing cell, line broadening, absorption at nearby lines and stray light.
As shown above, after such a calibration is established (Fig. I.1) the absorbance of solutions of unknown concentrations may be measured and the corresponding concentrations can be determined from the calibration curve.
The instrument performance for an element can be monitored by the following parameters:
- Characteristic concentration for the element
- Detection limit
The characteristic concentration for an element (often called "sensitivity") is a conventional figure of merit, defined as the concentration of analyte giving an absorbance of 0.00436 (corresponding to a percent transmittance of 99%). Usually the wavelength providing the best sensitivity is used, although a less sensitive wavelength may be more appropriate for a high concentration of analyte. A less sensitive wavelength may also be appropriate when significant interferences occur at the most sensitive wavelength.
Characteristic conc. (mg/l) = [conc. of standard (mg/l) × 0.0044] / measured absorbance
There are several practical reasons for wanting to know the value of the characteristic concentration for an element. For example, knowing the expected characteristic concentration of an element allows an operator to determine if all instrumental conditions are optimized and if the instrument is performing according to specifications. This is accomplished by simply measuring the absorbance of a known concentration of the element and comparing the results to the expected value.
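This check is easy to script. The sketch below is a minimal illustration; the 4 mg/l standard and its absorbance of 0.176 are hypothetical values.

```python
def characteristic_concentration(c_standard_mg_l, measured_absorbance):
    """Concentration (mg/l) expected to give an absorbance of 0.0044."""
    return c_standard_mg_l * 0.0044 / measured_absorbance

# Hypothetical check: a 4 mg/l standard reads A = 0.176,
# giving a characteristic concentration of 0.1 mg/l.
print(characteristic_concentration(4.0, 0.176))  # 0.1
```

If the value obtained this way deviates markedly from the expected one, the instrumental conditions are likely not optimized.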
Even though the magnitude of the absorbance signal can be predicted from the value given for the characteristic concentration, no information is given on how small an absorbance signal can be measured.
The smallest measurable concentration of an element, i.e. the detection limit of the element, is determined by the magnitude of the absorbance observed for the element and by the stability of the absorbance signal.
The detection limit (according to IUPAC) is the smallest concentration or absolute amount of analyte that has a signal significantly larger than the signal arising from a reagent blank.
Mathematically, the analyte's signal at the detection limit, S_DL, is given by:

S_DL = S_reag + 3 · σ_reag

where S_reag is the signal for a reagent blank and σ_reag is the known standard deviation of the reagent blank's signal.
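Expressed in code (a trivial sketch; the blank signal and its standard deviation below are hypothetical numbers):

```python
def signal_at_detection_limit(s_reag, sigma_reag):
    """IUPAC-style detection-limit signal: blank signal plus three blank standard deviations."""
    return s_reag + 3.0 * sigma_reag

# Hypothetical reagent-blank statistics.
print(signal_at_detection_limit(0.001, 0.0005))  # 0.0025
```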
Other approaches for defining the detection limit have also been developed. In atomic absorption spectrometry the detection limit for a given element is usually determined by analyzing a dilute solution of this element and recording the corresponding absorbances. The experiment is repeated 10 times. The 3σ of the recorded absorbance signal can be considered the detection limit for the specific element under the experimental conditions used: wavelength, type of flame, instrument.
For example, let us suppose that the detection limit for Cu has to be determined by AAS under certain experimental conditions. A 0.1 ppm Cu solution is analyzed 10 times by AAS at a wavelength of 324.8 nm and the corresponding absorbance values are recorded:
Experiment # | Absorbance
--- | ---
1 | 0.006
2 | 0.005
3 | 0.007
4 | 0.007
5 | 0.006
6 | 0.007
7 | 0.005
8 | 0.004
9 | 0.005
10 | 0.004
Average | 0.0056
σ | 0.0012
3σ | 0.0036
Therefore, the detection limit for Cu (the Cu signal at the detection limit) under the above conditions is an absorbance of 0.0036, which corresponds to a Cu solution concentration of approximately:

0.1 ppm × (0.0036 / 0.0056) = 0.064 ppm

(assuming we are working in the linear region of the calibration curve).
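The same 3σ computation can be reproduced from the tabulated readings; a minimal sketch in Python:

```python
import statistics

# The ten absorbance readings of the 0.1 ppm Cu solution from the table above.
readings = [0.006, 0.005, 0.007, 0.007, 0.006,
            0.007, 0.005, 0.004, 0.005, 0.004]

mean = statistics.mean(readings)    # 0.0056
sigma = statistics.stdev(readings)  # ~0.0012 (sample standard deviation)
dl_absorbance = 3 * sigma           # ~0.0035 (0.0036 after rounding sigma)

# Convert to concentration, assuming the linear region of the calibration curve.
dl_concentration = 0.1 * dl_absorbance / mean
print(f"Detection limit: {dl_concentration:.3f} ppm")  # ~0.063 ppm
```

Carrying the unrounded σ through gives about 0.063 ppm; rounding σ to 0.0012 before multiplying, as in the table, gives the 0.064 ppm quoted above.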