
Introductory Chapter: History and Scope of Quality Control in Laboratories

By Gaffar Sarwar Zaman

Submitted: May 8th, 2017. Reviewed: January 29th, 2018. Published: August 22nd, 2018.

DOI: 10.5772/intechopen.74593


1. Introduction

Laboratory quality control encompasses the essential monitoring of analytical testing to detect analytical and interfering flaws, miscalculations and blunders that arise during the analytical process, and ultimately to prevent the erroneous reporting of results to the patient. Quality control is routinely used to monitor the performance of a test and to assess its accuracy. Levey and Jennings were the pioneers of using statistics for the standardization and calibration of analytical procedures. The substances used for quality control are known as quality control (QC) materials; they are usually aliquoted in a stable form. Nowadays, most laboratories purchase QC material from reputable companies instead of preparing it in-house. The QC material is usually supplied in powdered form and can be stored for more than a year. The concept of statistical quality control greatly reduced the cost of quality control through the use of sampling.

2. Historical perspective of quality control in laboratories

Walter Andrew Shewhart (March 18, 1891–March 11, 1967) was an American engineer, statistician and physicist, also called “the Father of Modern Quality Control,” and he is credited as the founder of the “Shewhart cycle” [1] (Figure 1). Shewhart was instrumental in the introduction and development of “process control” in 1924. The main aim of this method was to prevent the manufacture of defective products. To this end, he also devised control charts, known as the “Average Shewhart Chart” and the “R Shewhart Chart” (Range Shewhart Chart). W. Edwards Deming was another pioneer in this field [2, 3, 4, 5] (Figure 2). He developed 14 rules comprising a series of successive testing steps. His contributions boosted Japan’s rapid industrial growth in the post-war period.

Figure 1.

Walter Andrew Shewhart (March 18, 1891–March 11, 1967) [reference: http://www.york.ac.uk/depts/maths/histstat/people/shewhart.gif] [4].

Figure 2.

William Edwards Deming (October 14, 1900–December 20, 1993). http://www.fda.gov/oc/initiatives/criticalpath/stanski/stanski.html [5].

2.1. Levey-Jennings chart

This is a chart on which quality control data are plotted and from which one can judge visually whether a particular laboratory test is performing as expected. The chart is named after Stanley Levey and Elmer R. Jennings, who first introduced it in the 1950s. It has become so popular that it is now used even with automated analyzers.
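To make the idea concrete, the following is a minimal sketch (not taken from the chapter or its references) of how a Levey-Jennings evaluation might be set up: control limits are derived as the mean and standard deviation of baseline QC measurements, and each new control value is flagged by how far it falls from that mean. All values and thresholds are purely illustrative assumptions.

```python
# Minimal Levey-Jennings-style evaluation (illustrative values only).
from statistics import mean, stdev

baseline = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9, 5.0, 5.2]  # hypothetical baseline QC results
m, s = mean(baseline), stdev(baseline)

def levey_jennings_flag(value, m, s):
    """Classify a QC value against the usual 2s / 3s control limits."""
    z = (value - m) / s
    if abs(z) > 3:
        return "reject (beyond 3s)"
    if abs(z) > 2:
        return "warning (beyond 2s)"
    return "in control"

for qc in [5.0, 5.3, 5.6]:  # hypothetical new control results
    print(qc, levey_jennings_flag(qc, m, s))
```

On a plotted chart the same information appears as points against horizontal lines at the mean and at ±1, ±2 and ±3 standard deviations.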

When Rausch and Freier proposed that serum pools should be used in place of samples from patients, the Levey-Jennings chart became even more popular. These pooled samples, initially called “standards,” ultimately came to be known as “control samples” [6].

In 1954, E.S. Page of the University of Cambridge introduced the sequential analysis technique known as the cumulative sum control chart (CUSUM). CUSUM was devised as a method to detect changes in a technique or in quality, and also to indicate when corrective action should be taken. However, the CUSUM chart was only examined for practical use in the laboratory many years later. It was followed by the invention of the “Exponentially Weighted Moving Average” (EWMA) chart in 1959 by the American S.W. Roberts [7]. The EWMA chart, too, was adapted for medical laboratory applications much later [8].
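As an illustration of the two statistics named above, the sketch below shows one common textbook form of the tabular CUSUM and of the EWMA; the target value, allowance k and weight lam are illustrative assumptions, not values from the cited works.

```python
# Tabular CUSUM and EWMA statistics in one common textbook formulation (illustrative parameters).

def cusum(values, target, k):
    """Return (upper_sum, lower_sum) after each observation; sums grow when results drift from target."""
    hi = lo = 0.0
    out = []
    for x in values:
        hi = max(0.0, hi + (x - target) - k)   # accumulates positive drift beyond the allowance k
        lo = min(0.0, lo + (x - target) + k)   # accumulates negative drift beyond the allowance k
        out.append((round(hi, 3), round(lo, 3)))
    return out

def ewma(values, target, lam=0.2):
    """Return the exponentially weighted moving average series, started at the target value."""
    z = target
    out = []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(round(z, 3))
    return out

qc_values = [5.0, 5.1, 5.3, 5.4, 5.5, 5.6]   # hypothetical drifting QC results
print(cusum(qc_values, target=5.0, k=0.1))
print(ewma(qc_values, target=5.0))
```

In both cases an out-of-control signal is raised when the statistic crosses a decision limit chosen in advance.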

However, the use of patient results for quality control only began during 1960–1970. One of the pioneers in this field was the Japanese statistician Kaoru Ishikawa. Quality control was applied to patients’ tests in hematology and biochemistry laboratories [9]. In 1963, Dennis Dorsay was one of the pioneers who stressed the importance of erythrocyte indexes for quality control in hematology analyzers [10].

In 1966, Frank Ductra advocated the repeated assaying of whole blood samples from two successive days in place of control samples, which revolutionized the quality control process [11].

In 1965, Michael Waid and Robert Hoffmann introduced the unique “average of normals” (AON) method, in which systematic errors are detected from the arithmetic average of normal test results produced by biochemical analyses [12].
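The sketch below illustrates the average-of-normals idea under stated assumptions: only patient results falling inside a (hypothetical) reference interval are averaged, and a shift of that average beyond limits derived from historical data would suggest systematic error. The analyte, reference interval and data are illustrative, not from the original publication.

```python
# Average-of-normals (AON) sketch with hypothetical patient data.
from statistics import mean

def average_of_normals(results, ref_low, ref_high):
    """Average only the results that lie within the reference interval."""
    normals = [r for r in results if ref_low <= r <= ref_high]
    return mean(normals) if normals else None

# Hypothetical daily patient glucose results (mmol/L), reference interval 3.9-5.6
day_results = [4.2, 5.1, 6.8, 4.9, 5.5, 12.3, 4.4, 5.0]
aon = average_of_normals(day_results, 3.9, 5.6)
print("AON =", aon)  # compared against control limits derived from historical AON values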

A quality control method for hematology analyzers, called “Bull’s algorithm” or “X_B,” was introduced in 1974 by the American hematologist Brian Bull [13].

The introduction of computer simulations brought about a major change in quality control. They were introduced by the Swedish clinical engineer Torsten Aronson, the medical doctor Carl-Henric de Verdier and the physicist Torgny Groth [10, 14]. In the same year, Arthur Gottmann and Jerome Nosanchuk [13, 14] used the new method of comparing each patient’s results with previous results, within a specified time, to detect errors made by analyzers. It proved to be a reliable and cost-effective quality method. The distinctive feature of this method was the use of patients’ results instead of control samples, without making any distinction between normal and pathological values.

Nosanchuk and Gottmann also suggested the “delta check method” (comparison with the patient’s previous record) and the “rate check method,” in which the time elapsed between measurements is taken into account [15]. The usefulness of the moving average was elucidated by the Canadian clinical chemist George Cembrowski and the American clinical pathologist James Westgard in 1975 (Figure 3). A year later, David Witte and co-workers advocated the use of the anion gap equation for the quality control of automated blood gas and electrolyte analyzers [16].
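A minimal sketch of two of these patient-based checks follows, using illustrative thresholds of my own choosing; the anion gap is computed in its common form Na − (Cl + HCO3), without potassium.

```python
# Delta check and anion gap checks with illustrative thresholds.

def delta_check(current, previous, hours_elapsed, max_delta, max_hours=72):
    """Flag a result that changed more than expected since the patient's last measurement."""
    if hours_elapsed > max_hours:
        return False  # previous result too old to compare
    return abs(current - previous) > max_delta

def anion_gap(na, cl, hco3):
    """Simple anion gap (mmol/L) without potassium."""
    return na - (cl + hco3)

print(delta_check(current=142, previous=128, hours_elapsed=24, max_delta=8))  # True: investigate
print(anion_gap(na=140, cl=104, hco3=24))  # 12 mmol/L, within the commonly quoted 8-16 range
```

A persistent shift of the average anion gap across many patients points to a problem in one of the electrolyte channels rather than in the patients themselves.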

Figure 3.

Professor James O. Westgard is President of Westgard QC, Inc., a small business providing education and training for laboratory quality management. He is an Emeritus Professor in the Department of Pathology and Laboratory Medicine at the University of Wisconsin Medical School [reference: https://www.labqualityconfab.com/speakers/james-o-westgard] [17].

“A multi-rule Shewhart Chart for quality control in clinical chemistry” was published by Westgard during the 1980s, marking a major breakthrough in quality control for laboratories. It set out simple rules for implementing the Levey-Jennings chart.

The first international quality standard for operations in a clinical laboratory was also established during the 1980s. During the 1990s, the theoretical and practical application of biological variation as analytical targets in clinical chemistry [18, 19] was worked on by Fraser and his co-workers. Notable among them was Eugene Harris, the American clinical chemist, who was instrumental in formulating the theory of biological variation through his expertise in statistics and informatics [20]. Another notable contribution is that of Carmen Ricos [21, 22, 23] and her group of researchers, most of them Spanish, who collected data on quality specifications and biological variation for a number of biochemical parameters.

The concept of “OPSpecs charts” [24] was proposed by Westgard in 1994. Non-analytical errors, that is, errors that occur before or after analysis, were also discussed extensively during the 1990s. The configuration of laboratory information systems (LISs) led to the prevention of post-analytical errors and some types of pre-analytical errors.

Later, Westgard introduced Six Sigma theory into clinical chemistry, which proved to be another method of establishing quality specifications [25].

3. Salient features

QC should proceed through three main steps:

  1. Each analytic method should have its own statistical limits of variation.

  2. These limits should be used to evaluate the QC data generated for each type of test.

  3. The various errors should be eliminated, and if an error is found:

    1. The cause of the error should be identified.

    2. Action should be taken to correct the error.

    3. The patients’ data should be re-analyzed.

Multirule procedure: this comprises decision criteria to determine whether an analytic run is in control; it is used to detect random and systematic error over time and was developed by Westgard and Groth [26].
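The sketch below illustrates how such a multirule evaluation is commonly implemented, applying the frequently cited 1_3s, 2_2s, R_4s, 4_1s and 10_x rules to z-scores of control results (value minus the method mean, divided by its standard deviation). It is an illustrative reading of those rules, not Westgard and Groth's original implementation.

```python
# Common Westgard-style multirule evaluation applied to control z-scores (newest value last).

def multirule(z):
    """Return the rule violations triggered by a series of control z-scores."""
    violations = []
    if abs(z[-1]) > 3:
        violations.append("1_3s: one control beyond 3 SD")
    last2 = z[-2:]
    if len(z) >= 2 and (all(v > 2 for v in last2) or all(v < -2 for v in last2)):
        violations.append("2_2s: two consecutive controls beyond 2 SD on the same side")
    if len(z) >= 2 and max(last2) > 2 and min(last2) < -2:
        violations.append("R_4s: range across controls exceeds 4 SD")
    last4 = z[-4:]
    if len(z) >= 4 and (all(v > 1 for v in last4) or all(v < -1 for v in last4)):
        violations.append("4_1s: four consecutive controls beyond 1 SD on the same side")
    last10 = z[-10:]
    if len(z) >= 10 and (all(v > 0 for v in last10) or all(v < 0 for v in last10)):
        violations.append("10_x: ten consecutive controls on one side of the mean")
    return violations or ["in control"]

print(multirule([0.5, 1.2, 2.3, 2.6]))  # hypothetical run that triggers the 2_2s rule
```

In practice a 1_2s result is often treated only as a warning that triggers inspection with the remaining rules before a run is rejected.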

Proficiency testing, internal quality control, laboratory inspections, clinical utilization and quality assurance monitoring play an important role as indicators of analytic performance. Management of quality consists of quality design, quality control and quality improvement [26] (Figure 4).

Figure 4.

Relation between various aspects of quality control [26].

Use of automated analyzers in clinical laboratories: nowadays, almost every laboratory uses automated analyzers. They are more reliable, can process more samples at a time, and save both time and cost in the long run. Most manufacturers provide quality control material along with a quality control guide, which has made it easier for laboratories to assess the quality of the various parameters.

© 2018 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
