Proficiency testing performance in US laboratories: results reported to the Centers for Medicare & Medicaid Services, 1994 through 2006.
Context.--Beginning in 1994, clinical laboratories performing nonwaived testing were required, under the regulations implementing the Clinical Laboratory Improvement Amendments of 1988 (CLIA), to enroll and participate in a proficiency testing (PT) program approved by the Centers for Medicare & Medicaid Services. Successful PT performance is a requirement for maintaining CLIA certification to perform testing in certain specialties and subspecialties and for specific analytes.
Objective.--To evaluate the PT performance from 1994 through 2006 of hospital and independent laboratories (HI) compared with all other testing sites (AOT) for selected commonly performed tests and analytes.
Design.--Proficiency testing data, from 1994 through 2006, were electronically reported to the Centers for Medicare & Medicaid Services by approved PT programs as required by CLIA regulations. Approximately 16 million PT event scores from 36 000 unique testing sites were sorted into 2 groups based on the type of testing facility: HI or AOT.
Results.--The PT performance scores for 15 of the most commonly performed tests demonstrated a decline in failure rates for both HI and AOT laboratory groups during 1994 through 2006 (analytes/tests reported in this article include alanine aminotransferase, amylase, bilirubin, cholesterol, digoxin, glucose, hemoglobin, leukocyte count, potassium, prothrombin time, theophylline, thyroxine, triglycerides, white blood cell differential, and uric acid). For most analytes, the difference in failure rates between HI and AOT was statistically significant. The AOT group started with higher failure rates than the HI group and remained higher for all analytes during most years, although that difference diminished over time. The AOT group showed a greater decline in PT failure than the HI group, and AOT performance improved during this period for all analytes.
Conclusions.--The PT performance improved dramatically for the AOT group from 1994 through 2006 as measured by a decrease in the percentage of laboratories with unsatisfactory performance for 15 selected analytes. The PT performance in the HI group improved modestly for some analytes during this same period, whereas, for other analytes, the group showed no apparent improvement.
(Arch Pathol Lab Med. 2010;134:751-758)
John M. Krolak; James H. Handsfield
Under the regulations implementing the Clinical Laboratory Improvement Amendments of 1988 (CLIA), laboratories conducting nonwaived testing (ie, moderate and high complexity testing) are required to enroll and participate in a proficiency testing (PT) program approved by the Centers for Medicare & Medicaid Services (CMS) for prescribed tests (regulated analytes) within certain specialties and subspecialties of testing. (1-4) In accordance with CLIA regulations, PT events are generally conducted 3 times per year and consist of 5 challenges per event. For most specialty and subspecialty tests (immunohematology and cytology are exceptions), satisfactory performance in a PT event is achieved by attaining an overall testing event score of at least 80%. (1) This requires a laboratory to attain the target result, as determined by the PT program, using the CLIA-prescribed acceptance criteria, for at least 4 of the 5 challenges (80%) in a given PT event. Failure to achieve satisfactory performance for any analyte on 2 consecutive testing events or for 2 of 3 events is considered unsuccessful performance, which can result in sanctions against the laboratory. (1,3,5) To meet CLIA requirements, CMS-approved PT programs are required to report PT event result scores (percentage of correct results) to the CMS for each laboratory enrolled in their program for the regulated analytes they perform.
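The event-scoring and unsuccessful-performance rules described above can be sketched as follows (a minimal sketch; the function names and data layout are illustrative, not CMS definitions):

```python
# Minimal sketch of the CLIA PT scoring rules described above.
# Function names and data layout are illustrative, not CMS definitions.

def event_score(challenge_results):
    """Event score: percentage of correct results among the challenges
    (usually 5) in a single PT testing event."""
    return 100.0 * sum(challenge_results) / len(challenge_results)

def event_satisfactory(challenge_results, threshold=80.0):
    """Satisfactory performance: an overall event score of at least 80%,
    ie, at least 4 of 5 challenges correct."""
    return event_score(challenge_results) >= threshold

def unsuccessful_performance(event_passes):
    """Unsuccessful performance for an analyte: unsatisfactory scores on
    2 consecutive testing events, or on 2 of 3 events."""
    # 2 consecutive failures (covers the case of only 2 events)
    if any(not a and not b for a, b in zip(event_passes, event_passes[1:])):
        return True
    # 2 failures within any window of 3 events
    return any(
        sum(1 for passed in event_passes[i:i + 3] if not passed) >= 2
        for i in range(len(event_passes) - 2)
    )
```

For example, a laboratory with 4 of 5 challenges correct (80%) passes the event, whereas passing only 1 of 3 consecutive events would constitute unsuccessful performance.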
An analysis of the PT scores data reported to CMS during 1994, the first year of compulsory PT for all laboratories performing nonwaived tests, has been previously reported. (5) Those results showed a difference in PT performance between laboratories that had been previously regulated under CLIA (most hospital and independent laboratories [HI]) and those that were previously unregulated (all other testing sites [AOT]). The latter was a large group, which included long-term care facilities, nursing homes, ambulatory sites, mobile clinics, and physician office laboratories (POLs). The PT performance scores achieved by the HI group were consistently better than those of the AOT group. In subsequent reports, certified laboratories enrolled in PT programs showed improved performance over time, and inspection and PT performance data indicated that the quality of testing improved in certified sites. (2-4) In this study, we analyzed the PT score data reported to the CMS in 1994 through 2006 for 15 representative tests, using the same grouping of laboratories (HI and AOT) as previously grouped by Stull et al (5) for the 1994 data. This study was conducted to determine whether a change in PT performance for each group, as measured by PT failure rates, occurred over time and whether the PT performance level between the 2 groups of laboratories continued to differ during the 13 years after implementation of the CLIA PT requirements. The evaluation criteria for each analyte/test have not changed since 1994, although a rule change regarding the grading consensus requirements was finalized in 2003, primarily to reduce ungraded challenges.
MATERIALS AND METHODS
The study data evaluated were extracted from the CMS PT Monitoring Database. These data are PT scores electronically reported to CMS by approved PT programs in a standardized format and consist of a value between 0 and 100 representing the percentage of correct responses (usually from 5 challenges) during any one testing event for each analyte for each laboratory. For the analytes/tests reported in this article, the database included 3 testing event scores per laboratory per year.
For purposes of this study and according to CLIA requirements, a score of 80% or higher is a passing score (satisfactory performance), and a score of less than 80% is a failing score (unsatisfactory performance) for each testing event. The annual rate of unsatisfactory performances (failure rate) for each analyte/test is the calculated percentage of all testing events during the year for which scores of less than 80% were reported for the laboratories in each group. Laboratories were classified into 2 groups, HI and AOT, based on laboratory type recorded in the Online Survey, Certification, and Reporting database.
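The annual failure-rate calculation defined above amounts to a simple proportion; a brief sketch (hypothetical names, with scores in the 0 to 100 format the PT programs report):

```python
# Sketch of the failure-rate definition above: the percentage of all
# testing events in a year scoring below 80. Names are hypothetical.

def annual_failure_rate(event_scores):
    """event_scores: all PT event scores (0-100) reported during one year
    for one analyte across the laboratories in a group (HI or AOT)."""
    failing = sum(1 for score in event_scores if score < 80)
    return 100.0 * failing / len(event_scores)
```

Scores of [100, 80, 60, 100, 40], for instance, give a failure rate of 40% (2 of 5 events below 80).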
By regulation, all laboratories performing nonwaived tests for analytes in this study are required to participate in PT. Because the data contain virtually all laboratories, there is no need for statistical inference. The volume of the data gives all statistical procedures a power that approaches 1, and the statement of statistical significance is generally not helpful. The Cochran-Mantel-Haenszel test was used to calculate the odds ratio for AOT referenced to HI.
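The pooled odds ratio used here can be sketched by stratifying 2 x 2 tables of failing and passing events (AOT vs HI) by year and applying the Mantel-Haenszel estimator. The counts below are hypothetical, and this omits the test statistic and confidence interval that a full Cochran-Mantel-Haenszel analysis would also report:

```python
# Sketch of a Mantel-Haenszel pooled odds ratio over year-stratified
# 2x2 tables (AOT vs HI, failing vs passing events). Counts below are
# hypothetical.

def mantel_haenszel_or(tables):
    """tables: one (a, b, c, d) tuple per stratum (eg, per year), where
    a = AOT failing events, b = AOT passing events,
    c = HI failing events,  d = HI passing events."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two hypothetical years in which AOT fails more often than HI:
example_tables = [(90, 910, 20, 980), (60, 940, 15, 985)]
pooled_or = mantel_haenszel_or(example_tables)  # about 4.6
```

A pooled odds ratio above 1.0 indicates higher odds of a failing event in the AOT group, referenced to HI, after adjusting for year.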
Number of PT Programs
In 1994, 16 CMS-approved PT programs offered the tests that were included in this evaluation. During the subsequent 13 years, 5 of these programs discontinued offering PT, thus resulting in 11 approved programs in 2006. The list shown in Table 1 specifies the PT programs included in this evaluation.
Selection of Analytes and Tests
The 15 analytes and tests selected for this evaluation are a subset of those evaluated previously. (5) They were chosen because they were representative of different laboratory specialties and subspecialties (general chemistry, toxicology, and hematology) and were the most commonly performed tests on the basis of their frequency in the PT database for both populations of laboratory sites (HI and AOT).
In 1994, the first year of mandatory PT participation by all laboratories performing nonwaived testing, the number of laboratories participating in PT for at least one of the regulated analytes, tests, specialties, or subspecialties, according to data submitted by PT programs to CMS, was 23 544 (Table 2). This number peaked at 41 436 in 1996 and, in subsequent years, steadily declined to approximately 30 000 laboratories, where it appears to have leveled off during 2003 through 2006. In Table 2, laboratories are divided into 2 groups: HI and AOT. The change in numbers of laboratories participating in PT from year to year correlates with CMS data showing fewer laboratories holding CLIA certificates of compliance and accreditation (those laboratories performing nonwaived testing for which PT participation is required) and with a concomitant increase in laboratories with certificates of waiver for which PT participation is not required. (6)
Figure 1 shows the distribution of the 4 major types of laboratories participating in PT in the years 1996 and 2006, according to data reported by each of the approved PT programs to CMS. The HI sites included 9 498 laboratories in 1996 and 8 876 in 2006 (a 7% decrease), whereas the AOT sites comprised 30 899 laboratories in 1996 and 20 281 laboratories in 2006 (a 34% decrease). Of those AOT laboratories, POLs represented the largest sector (57% and 50%, in 1996 and 2006, respectively).
The annual rates of unsatisfactory performance (failure rates) for 15 selected tests/analytes for all testing events for all laboratories within the HI and AOT groups for the years 1994 through 2006 are shown in Figure 2 (designated as all laboratories). For most of the analytes, in most years, the HI laboratories had significantly lower rates of PT failure than AOTs did, as demonstrated by the range of odds ratios. The few exceptions in which the failure rates of the AOT and HI laboratories showed no significant differences were digoxin in 2006, theophylline in 2006, and thyroxine in 1995. The data indicate that the PT failure rates for the AOT group ranged from 5% to 11% in 1994 for all analytes. These failure rates decreased during the following 12 years to levels of 2% to 7% in 2006. In contrast, failure rates for the HI group remained less than 3% during the entire 1994 through 2006 period for all analytes, except the following values: cholesterol in 1995; digoxin in 1995; glucose in 1994 and 1995; thyroxine in 1994, 1995, 1996, and 1998; and white blood cell differential in 1998 and 2003. For the HI group, the glucose analyte had the highest failure rates of 4% and 6% in 1994 and 1995, respectively, and that rate dropped to 2% by 1998 and remained between 2% and 3% through 2006.
To verify that the improvement in PT performance observed for all laboratories was not simply due to some laboratories dropping out of PT (eg, by no longer testing for certain analytes or by shifting to waived tests), we also analyzed the annual PT failure rates for a subset of laboratories (also sorted into HI and AOT laboratory groups). This subset consists of approximately 13 000 laboratories that were in the PT scores database for the initial year (1994) and remained in the database throughout the entire 13 years (1994-2006). These results are depicted in Figure 3 and designated as subset of laboratories.
The data for this subset of laboratories (Figure 3) show patterns similar to those observed for all laboratories (Figure 2), with a significant difference in PT performance observed between HI and AOT laboratory groups (odds ratio > 1.0) for most analytes in most years. The AOT group in this subset of laboratories showed decreasing failure rates (improvement in PT performance) over time for most analytes. Exceptions include leukocyte count, prothrombin time, and white blood cell differential, which showed little to no change over time. The failure rates for the HI laboratory group in this subset for most analytes (exceptions include glucose and thyroxine) were at or around 2% or less with slight improvement or no change over time, the same patterns seen with all HI laboratories. The AOT group in this subset of laboratories that participated in PT for the full 13 years had lower PT failure rates overall than the AOT group in the total all laboratories data set (Figure 2), which included new laboratories enrolling and laboratories disenrolling during various years of the total 13-year timeframe. With some exceptions (bilirubin, cholesterol, glucose, thyroxine, and white blood cell differential), in this subset of laboratories, as seen with all laboratories, the failure rates of the AOT group diminished and approached the failure rates for the HI group in this subset during the final 1 or 2 years of the study period (2005-2006). The failure rates for 6 analytes (alanine aminotransferase, amylase, digoxin, potassium, theophylline, and triglycerides) were not significantly different between the AOT and HI groups in 2006, 12 years after mandatory PT participation.
[FIGURE 2 OMITTED]
[FIGURE 3 OMITTED]
Of the 30 chemistry and hematology tests and analytes evaluated in an earlier report, 15 were selected for further evaluation because they were reflective of tests performed in most laboratory sites, including POLs and other sites (AOT), as well as hospital and independent laboratories (HI). (5) In selecting these analytes, our attempt was to make the study as comprehensive as possible while limiting the amount of data so the results could be presented in a clear and concise manner.
Although the rates of PT failure were disparate between the HI and AOT groups for most analytes throughout this 13-year time period, factors contributing to this disparity are not fully understood. Previous reports have documented many causes for discordant PT results among laboratory types, which include clerical errors, sample handling, analytical error, random error, and errors of unknown origin. (7-17) It is likely that differences in personnel qualifications and turnover of laboratory staff, testing methodologies and equipment, and managerial factors between the HI and AOT laboratory groups contributed to the observed differences in PT performance.
Additionally, the observed differences in failure rates between the 2 laboratory groups may relate to their familiarity with the PT process, as measured by the length of time the laboratory has participated in PT. The AOT group contains most of the laboratories that were not previously regulated and did not participate in PT before 1994, whereas the HI group is composed of most of the previously regulated laboratories that participated in PT before 1994. The failure rates for the AOT group, all of which started higher in 1994 than the failure rates for laboratories in the HI group, diminished and approached the failure rates of the HI group for several analytes in 2006, 12 years after PT participation was mandated. In addition, a comparison of failure rates for the subset of AOT laboratories that participated in PT for the full 13 years showed overall lower failure rates than all AOT laboratories, which included newly enrolled and disenrolled laboratories. Also, the AOT group had a more pronounced decrease in failure rates during the 3 to 4 years immediately following 1994, the year PT participation became mandatory, for some analytes examined. A similar observation was made in an evaluation of performance in PT surveys conducted by the College of American Pathologists, in which it was reported that laboratories demonstrated "consistent and statistically significant improvement in performance for the first 3 to 4 years of proficiency testing." (14 (p307))
These data suggest that the longer laboratories participate in PT, the better their performance. Others have noted that, over time, as laboratories gradually gain experience in handling PT samples and reporting results, performance improves. (18) Experience with the PT process most likely contributed to the decrease in PT failure rates observed in this study. As previously unregulated laboratories implemented PT processes and procedures in their laboratories, continued participation in PT became routine and errors became less frequent. (3,5,11)
In contrast to the AOT group, the failure rate trends for the HI group for most of the analytes were flat or nearly flat lines during the entire period. With 2 exceptions (glucose and thyroxine), the trend lines for the annual failure rates remained between approximately 1% and 2% for the entire 13 years. Most of these laboratories had likely participated in PT for many years before 1994, and these failure rates may represent the best aggregate PT performance that can be achieved with the technology and laboratory environments available at the time. An overall failure rate of 1% to 2% may represent random error or errors attributed to the PT process itself that are not easily addressed to effect further improvement.
In addition to gaining experience with the PT process, another likely contributing factor to improved PT performance, since 1994, is the continuing implementation of innovations in testing technology that has resulted in more reliable test systems for many analytes. Test systems have vastly improved in both ease of use and speed of operation. Improvements such as simpler sample handling and sample addition, automatic or less frequent calibration, and more available or automated forms of quality control have made test systems more robust. Such changes, devised by test system manufacturers, have likely contributed to fewer errors during the analytic phase of laboratory testing, which may have translated into improved PT performance. (19)
Changes in technology have also contributed to the change in numbers of laboratories participating in PT, as shown in Figure 1. The 34% decrease in AOT sites from 1996 to 2006 is likely due to the increased availability of simple, point-of-care tests that have been exempted from PT requirements under CLIA by being classified as waived test systems or devices. This change correlates with more laboratories holding CLIA certificates of waiver and, consequently, fewer laboratories with certificates of compliance or accreditation that perform nonwaived testing for which PT is required. (6,20) It is important to note that the attrition rate of the AOT laboratories participating in PT during this period was nearly 5 times higher than that of the HI laboratories. Although we attempted to control for this by comparing performance of subsets of laboratories for which results were reported for the full 13-year period, one component of the apparent improvement in performance of the AOT laboratories, which cannot be ruled out when looking at the full complement of sites, is selective PT disenrollment by poor performers. These laboratories may have continued testing for certain analytes but switched to waived tests for those analytes. Although many waived tests are as robust as, or more robust than, the test methods these laboratories may have used previously, it is possible that performance issues were carried into the waived subset of laboratories. Such performance issues are not readily detected because these laboratories are not subject to routine regulatory oversight or required to participate in PT.
Some laboratories not enrolled in PT before 1994 may not have understood, or likely underestimated, the educational value of PT participation. (4) Efforts by CMS, PT programs, and laboratory-accrediting organizations to educate laboratories about the benefits of PT may have resulted in more laboratories regularly reviewing and evaluating their performance on PT challenges. Some laboratory-accrediting organizations make available literature sources and training opportunities for laboratories seeking to improve their testing abilities. The PT programs give their participating laboratories informative feedback on testing performance about the sources of errors. This information is primarily disseminated through the individual participant PT result reports and PT event summary reports. In addition to routine PT samples, many PT programs also offer their participants an array of educational opportunities, such as supplemental, non-CLIA-graded PT samples and study exercises designed to address various aspects of laboratory testing principles. All CMS-approved PT programs are required to offer and maintain some form of technical assistance, with the ability to answer questions and address concerns from participants regarding laboratory enrollment in PT. Most PT programs make available, by telephone, consultants for such assistance for each of the testing specialties they provide. Also, many PT programs now have Web sites that offer an additional source of information and support for their subscribers. Therefore, over time, many laboratories may have realized that these available resources from PT programs and others can be effective mechanisms for those wishing to learn from the PT experience to effect quality improvements.
Furthermore, a number of educational resources, previously unavailable to some laboratories have, since 1994, become generally accessible and have also helped some laboratories become more cognizant of PT and thus contributed to their improvement in testing performance. Both professional journals and trade publications have published numerous articles that explain and outline the CLIA requirements for PT and provide detailed and comparative information regarding PT enrollment and participation. (22,23) Likewise, professional laboratory organizations, such as American Association for Clinical Chemistry, American Society for Clinical Pathologists, American Society for Clinical Laboratory Scientists, Clinical Laboratory Management Association, and the College of American Pathologists, have targeted laboratories with newsletters, online publications, audio teleconferences, and workshops providing information concerning PT and quality laboratory testing. (24-28) Guidelines for investigating PT failures and establishing procedures leading to improved laboratory testing as a result of PT failure have been available from the Clinical and Laboratory Standards Institute since 1999. (29)
An evaluation of PT performance data for CLIA-certified laboratories was published in a recent report by the US Government Accountability Office (GAO). (30) The report indicated that during the 5 years from 1999 to 2003, laboratories accredited by the College of American Pathologists and the Joint Commission (generally hospitals and large independent laboratories) had an increase in PT failures (as measured by unsuccessful performance, which is equivalent to unsatisfactory performance in 2 consecutive or 2 of 3 PT events) across all analytes, whereas laboratories inspected by state survey agencies or accredited by COLA (generally POLs) showed a decrease in unsuccessful performance during this period. Although the conclusions of the GAO report (30) appear to conflict with the conclusions from this evaluation of data from the same CMS source, the 2 studies cannot be directly compared. Differences between the studies include the populations of laboratories for which PT data were evaluated, the scope of testing (all testing in the GAO report versus selected analytes in this study), the study periods, and the definition of failure. The GAO report evaluated unsuccessful performance (failure to attain a passing score in 2 consecutive or 2 of 3 PT events), whereas this study evaluated unsatisfactory performance (failure to attain a passing score in a single testing event). The data presented in this article for the AOT sites (primarily POLs) show improvement in PT performance during the entire 13 years for the total group of enrolled laboratories for all the analytes evaluated. This improvement in performance and the overall higher failure rates found among this group of testing sites appear to be consistent with the conclusions made in the GAO report. (30)
However, our data analysis demonstrated that HI laboratories (which would include most laboratories accredited by the College of American Pathologists and the Joint Commission) also showed either improved performance or maintenance of failure rates between 1% and 3% for the analytes evaluated during the 13 years. These results appear to be inconsistent with the GAO's conclusions. However, for 12 of the 15 analytes, the HI group of laboratories did show a slight to significant (white blood cell differential) increase in failure rates when only rates for 1999 are compared with those for 2003. This may actually represent a change in the grading system, rather than a change in actual failure rates, because 2003 was the implementation year for a CLIA regulatory change that reduced the grading requirement from 90% to 80% consensus. This change meant more challenges could be graded by peer comparison, resulting in some additional failures that would otherwise have passed by default for not being graded because of a lack of consensus.
A recent report (31) reviewing PT performance, primarily in small laboratories, shows findings for 8 chemistry and hematology analytes that are similar to those of the current study and suggests that these declining failure rates, which are associated with experience in performing proficiency testing, will positively affect patient test performance.
Certain limitations are present in the interpretation of the data evaluated and presented in this article. The PT programs use various grading schemes and, in many cases, apply peer group grading, in which results are compared and graded within groups of laboratories reporting results using the same instruments or test methods. The PT performance, therefore, as measured by peer group grading, rather than by a comparison of PT results to results from a reference method, for example, is not a measure of accuracy. Taken a step further, when comparing PT performance evaluated on the basis of failure rates that are reflective of peer group grading, it must be acknowledged that the peer groups within the HI and AOT are likely not equivalent because the instruments and methods used in the 2 groups of laboratories may not be the same. Also, it must be noted that within the 2 groups of laboratories, different PT programs may be favored that could also introduce some bias in the results. A comparison of the failure rates between these 2 groups of laboratories does provide general information on trends over time but does not conclusively show equivalence in performance.
The results presented in this article comparing failure rates for selected analytes between 2 large groups of laboratory types represents one way to evaluate this rich data set. The objective of this study was to build on what had been previously reported in evaluating trends in performance between the 2 groups of laboratories over time. (5) Other analyses could be performed but were outside of the scope of this study. Future reports may include analyses and comparisons of the data with respect to other variables, such as laboratory accreditation status, PT programs, specific laboratory types, and additional analytes or tests.
During the 12 years after the 1994 implementation of mandatory PT participation for all laboratories performing nonwaived testing under CLIA, laboratories have shown an improvement in PT performance, as demonstrated by the gradual reduction of PT failures for 15 commonly tested analytes (Figures 2 and 3). This is most evident in AOT laboratories, composed primarily of POLs. Although, as initially reported, (5,32) there was a substantial difference in PT performance between the 2 laboratory groups in the first year of full CLIA PT implementation, over time this difference has diminished for some of the analytes studied, but statistically significant differences persist for the remaining analytes.
The observed improvement in laboratory PT performance, whether in HI testing sites or AOT, has important and positive implications relevant to CLIA. Because the intent of the CLIA regulations was to ensure the accuracy of clinical laboratory testing, regardless of testing location, this current study demonstrates that there has been an improvement in testing performance, as measured by PT, since the implementation of CLIA. These data are consistent with other studies that have shown improved clinical laboratory testing performance related to PT participation. (2,7,17,18)
The authors would like to thank D. Joe Boone, PhD; Thomas L. Hearn, PhD; Shahram Shahangian, PhD; Angela Ragin-Wilson, PhD; and Daniel W. Tholen, MS, for their helpful review and feedback in the preparation of this manuscript. This study received no funding support from any sponsor and the aforementioned individuals did not receive compensation for their work.
(1.) Title 42: Public health, chapter IV, Centers for Medicare & Medicaid Services, Department of Health and Human Services, subchapter G--standards and certification, part 493: laboratory requirements. In: Code of Federal Regulations. http://ecfr.gpoaccess.gov/cgi/t/text/text-idx?c=ecfr&tpl=/ecfrbrowse/Title42/42cfr493_main_02.tpl. Accessed July 7, 2009.
(2.) Ehrmeyer SS, Laessig RH. Has compliance with CLIA requirements really improved quality in US clinical laboratories? Clin Chim Acta. 2004;346(1):37-43.
(3.) Ehrmeyer SS, Laessig RH. Effect of legislation (CLIA '88) on setting quality specifications in US clinical laboratories. Scand J Clin Lab Invest. 1999;59(7):563-567.
(4.) Higgins JC. The status of physician office labs since CLIA'88. J Med Pract Manage. 2000;16(2):99-102.
(5.) Stull TM, Hearn TL, Hancock JS, Handsfield JH, Collins CL. Variation in proficiency testing performance by testing site. JAMA. 1998;279(6):463-467.
(6.) Centers for Medicare & Medicaid Services. CLIA database information. http://www.cms.hhs.gov/CLIA/17_CLIA_Statistical_Tables_Graphs.asp#TopOfPage. Accessed February 8, 2008.
(7.) Hertzberg MS, Mammen J, McCraw A, Nair SC, Srivastava A. Achieving and maintaining quality in the laboratory. Haemophilia. 2006;12(suppl 3):61-67.
(8.) Westgard JO, Westgard SA. The quality of laboratory testing today: an assessment of σ metrics for analytic quality using performance data from proficiency testing surveys and the CLIA criteria for acceptable performance. Am J Clin Pathol. 2006;125(3):343-354.
(9.) Hurst J, Nickel K, Hilborne LH. Are physicians' office laboratory results of comparable quality to those produced in other laboratory settings? JAMA. 1998;279(6):468-471.
(10.) Steindel SJ, Howanitz PJ, Renner SW. Reasons for proficiency testing failures in clinical chemistry and blood gas analysis: a College of American Pathologists Q-Probes study in 665 laboratories. Arch Pathol Lab Med. 1996;120(12):1094-1101.
(11.) Jenny RW, Jackson-Tarentino KY. Causes of unsatisfactory performance in proficiency testing. Clin Chem. 2000;46(1):89-99.
(12.) Isenberg HD, D'Amato RF. Does proficiency testing meet its objective? J Clin Microbiol. 1996;34(11):2643-2644.
(13.) Johnson PR. The contribution of proficiency testing to improving laboratory performance and ensuring quality patient care. Clin Leadersh Manag Rev. 2004;18(6):335-341.
(14.) Tholen D, Lawson NS, Cohen T, Gilmore B. Proficiency testing performance and experience with College of American Pathologists' programs. Arch Pathol Lab Med. 1995;119(4):307-311.
(15.) Carey RN, Cembrowski GS, Garber CC, Zaki Z. Performance characteristics of several rules for self-interpretation of proficiency testing data. Arch Pathol Lab Med. 2005;129(8):997-1003.
(16.) Waugh JM, Collier CP, Day AG, Waugh M, Raymond MJ. Proficiency testing performance: a case study with modeling. Clin Biochem. 2002;35(6):447-453.
(17.) Hoeltge GA, Phillips MG, Styer PE, Mockridge P. Detection and correction of systematic laboratory problems by analysis of clustered proficiency testing failures. Arch Pathol Lab Med. 2005;129(2):186-189.
(18.) Reilly AA, Salkin IF, McGinnis MR, et al. Evaluation of mycology laboratory proficiency testing. J Clin Microbiol. 1999;37(7):2297-2305.
(19.) Tholen DW. Improvements in performance in medical diagnostic tests documented by interlaboratory comparison programs. Accred Qual Assur. 2002;7(4):146-152.
(20.) Centers for Disease Control and Prevention. Good laboratory practices for waived testing sites: survey findings from testing sites holding a certificate of waiver under the Clinical Laboratory Improvement Amendments of 1988 and recommendations for promoting quality testing. MMWR Morb Mortal Wkly Rep. 2005;54(RR13):1-25.
(21.) COLA. Fast facts Web site. http://www.cola.org/fastfacts.html. Accessed February 8, 2008.
(22.) Medical Laboratory Observer. MLO archive Web site. http://www.mlo-online.com/Archives/default.aspx. Accessed October 11, 2006.
(23.) Advance for Medical Laboratory Professionals. Search article Web site. http://laboratorian.advanceweb.com/. Accessed February 8, 2008.
(24.) McDowell J. CDC ponders proficiency testing updates--can revised regulations catch up with new tests and analytes? Clin Lab News. 2007;33(4):1, 8, 10.
(25.) American Society for Clinical Laboratory Science (ASCLS). http://www.ascls.org. Accessed October 11, 2006.
(26.) American Society for Clinical Pathology (ASCP). ASCP education and assessment. http://www.ascp.org. Accessed February 8, 2008.
(27.) Clinical Laboratory Management Association (CLMA). Archive search, proficiency testing. http://www.clma.org. Accessed February 8, 2008.
(28.) College of American Pathologists (CAP). Accreditation and laboratory improvement. http://www.cap.org. Accessed February 8, 2008.
(29.) Using Proficiency Testing to Improve the Clinical Laboratory: Approved Guideline. 2nd ed. Wayne, PA: Clinical and Laboratory Standards Institute; 2007. CLSI document GP27-A2.
(30.) United States Government Accountability Office. Report to Congressional Requesters--Clinical Lab Quality: CMS and Survey Organization Oversight Should Be Strengthened. GAO Web site; 2006. http://www.gao.gov/cgi-bin/getrpt?GAO-06-416. Accessed October 11, 2006.
(31.) Edson DC, Massey LD. Proficiency testing performance in physician's office, clinic, and small hospital laboratories, 1994-2004. Lab Med. 2007;38(4):237-239.
(32.) St. John TM, Lipman HB, Krolak JM, Hearn TL. Improvement in physician's office laboratory practices, 1989-1994. Arch Pathol Lab Med. 2000;124(7):1066-1073.
Devery Howerton, PhD; John M. Krolak, PhD; Adam Manasterski, PhD; James H. Handsfield, MPH
Accepted for publication July 10, 2009.
From the Division of Laboratory Systems, National Center for Preparedness, Detection and Control of Infectious Diseases, Centers for Disease Control and Prevention, Atlanta, Georgia.
The authors have no relevant financial interest in the products or companies described in this article.
Reprints: Devery Howerton, PhD, Laboratory Quality Management Program, Coordinating Center for Infectious Diseases, Centers for Disease Control and Prevention, Mailstop D-10, 1600 Clifton Rd, Atlanta, GA 30329 (e-mail: email@example.com).
Table 1. US Department of Health and Human Services-Approved Proficiency Testing Programs Offering Chemistry and Hematology Analytes, 1994 Through 2006

Accutest
American Academy of Family Physicians
American Association of Bioanalysts
American Academy of Pediatrics (a)
American Osteopathic Association (b)
American Proficiency Institute
College of American Pathologists
External Comparative Evaluation for Laboratories
Idaho Bureau of Laboratories (c)
Medical Laboratory Evaluation
New Jersey Department of Health
New York State Department of Health
Pacific Biometrics Research Foundation (a)
Puerto Rico Department of Health
Solomon Park Research Institute (a)
Wisconsin State Laboratory of Hygiene

(a) Program discontinued in 2000. (b) Program discontinued in 1996. (c) Program discontinued in 2004.

Table 2. Number of Laboratories in the Clinical Laboratory Improvement Amendments of 1988 (CLIA) Proficiency Testing Database, 1994-2006

Year    Hospital/Independent    All Other           Total
        Laboratories, No.       Laboratories, No.   Laboratories, No.
1994     8788                   14 756              23 544
1995     8687                   23 911              32 598
1996     9886                   31 550              41 436
1997    10 124                  30 376              40 500
1998     9831                   29 012              38 843
1999     9532                   26 625              36 157
2000     9313                   25 113              34 426
2001     9234                   23 515              32 749
2002     9173                   22 978              32 151
2003     8818                   21 362              30 180
2004     8990                   21 764              30 754
2005     9003                   20 794              29 797
2006     9071                   20 625              29 696

Figure 1. Distribution of laboratory types participating in proficiency testing, 1996 and 2006.
* Other includes ambulances, ambulatory surgery centers, ancillary test sites, assisted living facilities, blood banks, community clinics, comprehensive outpatient rehabilitative sites, end-stage renal disease dialysis centers, federally qualified health centers, health fairs, health maintenance organizations, home health agencies, hospices, industrial facilities, insurance testing sites, intermediate care facilities, mobile laboratories, pharmacies, other practitioner sites, prisons, public health laboratories, rural health care clinics, school/student health services, skilled nursing/nursing facilities, and tissue bank/repositories.

Laboratory Type      1996     2006
Hospital             61.3%    22.2%
Independent           7.2%     8.3%
Physician Office     57.2%    49.8%
Other*               19.3%    19.7%

Note: Table made from bar graph.
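The year-by-year counts in Table 2 imply the enrollment shift that Figure 1 summarizes. As a minimal, illustrative sketch (not part of the study's methods), the share of total enrollment held by "all other testing sites" (AOT) can be derived directly from the Table 2 counts; three representative years are shown:

```python
# Illustrative only: AOT share of total PT enrollment from Table 2 counts.
# Keys are years; values are (hospital/independent, all other) laboratory counts.
counts = {
    1994: (8788, 14756),
    1996: (9886, 31550),
    2006: (9071, 20625),
}

def aot_share(hi, aot):
    """Percentage of enrolled laboratories that are 'all other testing sites'."""
    return round(100 * aot / (hi + aot), 1)

for year, (hi, aot) in sorted(counts.items()):
    print(f"{year}: AOT = {aot_share(hi, aot)}% of {hi + aot} laboratories")
```

This reproduces the pattern described in the abstract: AOT enrollment peaked as a share of the total in the mid-1990s and then declined slowly while hospital/independent counts stayed comparatively stable.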