Analytical Chemistry - Calibration

CALIBRATION

Calibration of an instrument in Analytical Chemistry is the operation that determines the functional relationship between measured values (signal intensities S at certain signal positions z_i) and analytical quantities characterizing types of analytes q_i and their amount (content, concentration) n. Calibration includes the selection of the model (its functional form), the estimation of the model parameters and their errors, and the validation of the model.
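
To make the model selection and parameter estimation step concrete, the sketch below fits a straight-line calibration model S = b0 + b1*q to a small set of hypothetical standards by ordinary least squares and reports the parameter estimates together with their standard errors. The data values, the linear model, and the variable names are assumptions for illustration only, not part of any particular instrument's procedure.

    import numpy as np

    # Hypothetical calibration data: analyte amounts q (e.g., mg/L) for a
    # series of standards and the measured signal intensities S.
    q = np.array([0.0, 1.0, 2.0, 4.0, 8.0])       # known amounts (assumed values)
    S = np.array([0.02, 0.21, 0.39, 0.80, 1.62])  # instrument responses (assumed values)

    # Fit the straight-line model S = b0 + b1*q by ordinary least squares and
    # obtain the parameter covariance matrix for simple error estimates.
    (b1, b0), cov = np.polyfit(q, S, deg=1, cov=True)
    se_b1, se_b0 = np.sqrt(np.diag(cov))

    print(f"slope     b1 = {b1:.4f} ± {se_b1:.4f}")
    print(f"intercept b0 = {b0:.4f} ± {se_b0:.4f}")

    # Residual standard deviation: a quick check of how well the chosen
    # model reproduces the calibration standards (part of validation).
    residuals = S - (b0 + b1 * q)
    s_res = np.sqrt(np.sum(residuals**2) / (len(q) - 2))
    print(f"residual standard deviation = {s_res:.4f}")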

During calibration, the instrument is “taught”, using calibration standards of known value, what result it should report. A calibration curve is then constructed: the graphical relationship between the known values (e.g., concentrations) of a series of calibration standards and their instrument responses. Calibration software can also be used to automate calibration processes and to manage and report results for various measurement disciplines, including electrical, temperature, pressure, flow, and RF.
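
As a rough illustration of how such a curve is used, the following sketch (again with hypothetical numbers) fits a straight-line calibration curve to a set of standards and then converts the measured response of an unknown sample into an amount by inverse prediction. In practice the curve would come from the instrument's own calibration routine or software.

    import numpy as np

    # Hypothetical calibration standards: known amounts and instrument responses.
    q_std = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
    S_std = np.array([0.02, 0.21, 0.39, 0.80, 1.62])

    # Straight-line calibration curve S = b0 + b1*q fitted to the standards.
    b1, b0 = np.polyfit(q_std, S_std, deg=1)

    # Inverse prediction: convert the measured response of an unknown sample
    # into an amount using the fitted curve, q = (S - b0) / b1.
    S_unknown = 0.55   # assumed response of the unknown sample
    q_unknown = (S_unknown - b0) / b1
    print(f"estimated amount of the unknown sample: {q_unknown:.3f} (same units as q_std)")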

Proper calibration is important because it ensures that an instrument (analytical device) remains within its validated performance limits and reports results accurately. Calibration is one of the primary processes used to maintain instrument accuracy.


References

  1. K. Danzer, L.A. Currie, Pure & Appl. Chem., Vol. 70, No. 4, 993-1014 (1998)
  2. D. Harvey, “Modern Analytical Chemistry”, McGraw-Hill Companies Inc., 2000
  3. D.A. Skoog, F.J. Holler, T.A. Nieman, “Principles of Instrumental Analysis”. Saunders College Publishing: Philadelphia, 1998.

