Analytical Chemistry - Calibration
Calibration of an instrument in analytical chemistry is the operation that determines the functional relationship between measured values (signal intensities S at certain signal positions z_i) and analytical quantities characterizing the types of analytes q_i and their amount (content, concentration) n. Calibration includes the selection of the model (its functional form), the estimation of the model parameters and their errors, and the validation of the model.
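As a minimal sketch of this parameter and error estimation (in Python with NumPy; the standard concentrations and signals below are made-up illustrative values, not data from the references), one can assume the common straight-line model S = a + b·n and fit it by ordinary least squares:

```python
import numpy as np

# Hypothetical calibration data (illustrative values only): known
# analyte concentrations n and the measured signals S they produced.
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])         # n, e.g. mg/L
signal = np.array([0.02, 0.21, 0.39, 0.62, 0.80, 1.01])  # S

# Assume the linear model S = a + b*n and estimate the parameters
# a (intercept) and b (slope) by ordinary least squares.
X = np.column_stack([np.ones_like(conc), conc])
(a, b), *_ = np.linalg.lstsq(X, signal, rcond=None)

# Estimate the parameter errors: residual variance, then the
# parameter covariance matrix and its diagonal standard errors.
dof = len(signal) - 2
s2 = np.sum((signal - X @ np.array([a, b])) ** 2) / dof
cov = s2 * np.linalg.inv(X.T @ X)
se_a, se_b = np.sqrt(np.diag(cov))

print(f"intercept a = {a:.4f} +/- {se_a:.4f}")
print(f"slope     b = {b:.4f} +/- {se_b:.4f}")
```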
During calibration, the instrument is "taught", using the known values of calibration standards, what result it should report. A calibration curve is then constructed: the graphical relationship between the known values (such as concentrations) of a series of calibration standards and the instrument's response to them. Calibration software can also be used to automate calibration processes and to manage and report results for various measurement disciplines, including electrical, temperature, pressure, flow, and RF.
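Once the curve has been constructed, an unknown sample is quantified by inverting the fitted relationship, i.e., converting its measured response back into a concentration. A short sketch of this inverse prediction (again in Python, with hypothetical data and a straight-line fit):

```python
import numpy as np

# Hypothetical calibration standards: known concentrations and the
# instrument responses they produced (illustrative values only).
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # e.g. mg/L
signal = np.array([0.02, 0.21, 0.39, 0.62, 0.80, 1.01])

# Fit the calibration curve: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Inverse prediction: convert an unknown sample's measured signal
# into an estimated concentration using the fitted line.
unknown_signal = 0.55
estimated_conc = (unknown_signal - intercept) / slope
print(f"estimated concentration: {estimated_conc:.2f} mg/L")
```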
Proper calibration is important because it ensures that an instrument (analytical device) remains within validated performance limits and reports results accurately. Calibration is one of the primary processes used to maintain instrument accuracy.
References
- K. Danzer, L. A. Currie, Pure Appl. Chem., Vol. 70, No. 4, pp. 993–1014 (1998).
- D. Harvey, "Modern Analytical Chemistry", McGraw-Hill, 2000.
- D. A. Skoog, F. J. Holler, T. A. Nieman, "Principles of Instrumental Analysis", Saunders College Publishing, Philadelphia, 1998.