Test Verification & Validation
Before a laboratory can report a single patient result using a new method or analyzer, it must prove that the test works as intended. This process is strictly regulated by CLIA’88 and accreditation bodies like CAP and The Joint Commission. There is a critical distinction between Validation (required for non-FDA approved tests) and Verification (required for FDA-approved commercial tests). In Hematology, where most analyzers are FDA-cleared, the primary task is Verification
Definitions
- Validation: Establishing performance specifications for a test system that has not been cleared by the FDA (e.g., Laboratory Developed Tests - LDTs, or “Home-Brew” tests). The lab must prove accuracy, precision, sensitivity, specificity, and reportable range from scratch
- Verification: Confirming that an FDA-cleared (unmodified) test system performs according to the manufacturer’s claims in your specific laboratory. You are verifying that you can get the same results the manufacturer got
The Verification Protocol (Performance Specifications)
For a standard Hematology analyzer (e.g., verifying a new Sysmex XN), the laboratory must verify four key performance characteristics: Accuracy, Precision, Reportable Range (Linearity), and Reference Intervals
1. Accuracy (Method Comparison)
Accuracy is the “closeness to the true value.”
- Procedure: Run at least 40 patient samples on the new instrument (Test method) and the old instrument (Reference method/Gold Standard). The samples should span the entire clinical range (low, normal, and high values)
- Analysis: Perform Linear Regression analysis
- Correlation Coefficient (r): Should be \(> 0.95\). Indicates the two methods move in sync
- Slope: Should be close to 1.0. Deviation from 1.0 indicates proportional error
- Y-Intercept: Should be close to 0. Deviation from 0 indicates constant error (bias)
- Bias Plot: Look for systematic differences. If the new machine consistently reads Platelets 10% lower than the old machine, the lab may need to apply a factor or investigate calibration
2. Precision (Reproducibility)
Precision is the “closeness of repeated measurements.” It proves the machine is consistent
- Within-Run Precision (Repeatability): Run the same sample 20 times in a row. Calculate the Mean, Standard Deviation (SD), and Coefficient of Variation (CV)
- Goal: The CV should be very low (e.g., \(< 1.5\%\) for RBCs)
- Between-Run Precision (Day-to-Day): Run control material (Low, Normal, High) once a day for 20 days. This accounts for daily variables like temperature, different techs, and instrument drift
- Goal: Prove long-term stability
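The within-run statistics can be sketched as follows; the 20 RBC replicates are hypothetical values chosen to illustrate the mean/SD/CV calculation:

```python
import statistics

# Hypothetical within-run study: the same sample run 20 times (RBC, x10^12/L)
runs = [4.52, 4.50, 4.53, 4.51, 4.49, 4.52, 4.50, 4.51, 4.53, 4.48,
        4.52, 4.51, 4.50, 4.49, 4.52, 4.51, 4.50, 4.53, 4.51, 4.50]

mean = statistics.mean(runs)
sd = statistics.stdev(runs)      # sample SD (n-1 denominator)
cv = sd / mean * 100             # coefficient of variation, in percent

# Goal from the protocol above: CV < 1.5% for RBCs
print(f"mean={mean:.3f}, SD={sd:.4f}, CV={cv:.2f}%, pass={cv < 1.5}")
```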
3. Reportable Range (Linearity)
This verifies the range of values over which the instrument provides accurate results without dilution
- Procedure: Run a linearity kit (commercial standards with known values covering the low to high range) or create a set of dilutions from a high-patient pool (e.g., 100%, 75%, 50%, 25%, 0%)
- Analysis: Plot the Expected Value vs. the Observed Value. The line should be straight
- Upper Limit of Linearity (top of the AMR, the Analytical Measurement Range): Determine the highest number the machine can report directly (e.g., WBC = 400,000). Values above this must be diluted
- Lower Limit of Detection: Determine the lowest number distinguishable from noise (e.g., Plt = 5,000)
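A simple way to check the dilution series is percent recovery at each level (observed as a percentage of expected). The pool value, dilution fractions, analyzer readings, and the ±10% recovery limit below are all illustrative assumptions:

```python
# Hypothetical linearity check: dilutions of a high WBC pool (x10^9/L)
high_pool = 400.0
fractions = [0.0, 0.25, 0.50, 0.75, 1.0]
expected  = [f * high_pool for f in fractions]
observed  = [0.1, 99.0, 201.5, 298.0, 396.0]   # hypothetical analyzer readings

# Percent recovery at each non-zero level; ±10% used here as an example limit
for exp, obs in zip(expected[1:], observed[1:]):
    recovery = obs / exp * 100
    print(f"expected={exp:6.1f}  observed={obs:6.1f}  recovery={recovery:5.1f}%")
```

If recovery falls off at the top level, the true upper limit of the AMR is lower than claimed.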
4. Reference Intervals (Normal Ranges)
The lab must verify that the manufacturer’s suggested normal range applies to their specific patient population
- Transference: If the demographics match, the lab can “transfer” the manufacturer’s range by testing 20 healthy donors. If \(\le 2\) results fall outside the range (90% acceptance), the range is verified
- Establishment: If the range cannot be transferred (or for a new LDT), the lab must test 120 healthy donors to statistically calculate the central 95% interval
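The transference rule (≤2 of 20 outside the range) is easy to express in code. The hemoglobin range and donor values below are hypothetical, chosen only to show the counting logic:

```python
# Hypothetical transference study: 20 healthy donors vs. an assumed
# manufacturer's hemoglobin range of 12.0-16.0 g/dL
low, high = 12.0, 16.0
donors = [13.1, 14.2, 12.5, 15.8, 13.9, 14.6, 12.1, 15.2, 13.4, 14.0,
          16.3, 13.7, 14.9, 12.8, 15.5, 13.2, 14.4, 11.8, 14.7, 13.6]

outliers = [v for v in donors if not (low <= v <= high)]
verified = len(outliers) <= 2    # <=2 of 20 outside (90% within) = verified
print(f"outliers={len(outliers)}, range verified={verified}")
```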
Carryover Studies
In Hematology, carryover is a critical check. It ensures that a sample with a very high count doesn’t contaminate the next sample
- Procedure: Run a High sample (H1, H2, H3) followed immediately by a Low sample (L1, L2, L3)
- Formula: \(\%\text{Carryover} = \frac{L1 - L3}{H3 - L3} \times 100\)
- Goal: Carryover should be negligible (typically \(< 1\%\)). This proves the instrument’s wash cycle is effective
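The carryover formula above can be computed directly; the H and L run values are hypothetical:

```python
# Hypothetical carryover study: three high WBC runs then three low runs (x10^9/L)
H1, H2, H3 = 98.5, 98.2, 98.4
L1, L2, L3 = 2.15, 2.10, 2.09

# %Carryover = (L1 - L3) / (H3 - L3) x 100
carryover = (L1 - L3) / (H3 - L3) * 100
print(f"carryover={carryover:.3f}%, pass={carryover < 1.0}")
```

Intuitively, L1 is the low sample most exposed to contamination from the high runs, and L3 is the "clean" baseline, so the numerator isolates any residual high-sample signal.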
Calibration Verification (Cal Ver)
Method verification is a one-time event performed when the machine is installed. Calibration Verification is an ongoing quality requirement under CLIA
- Frequency: Every 6 months, or after major maintenance (e.g., replacing a laser)
- Process: Run linearity standards (Low, Mid, High) to prove the calibration curve is still valid
- Outcome: If Cal Ver passes, you do not need to recalibrate. If it fails, you must Recalibrate and then run controls
Documentation & Sign-Off
The Medical Director (Pathologist) is ultimately responsible for the test quality
- The Validation Package: A comprehensive binder (or digital file) containing all raw data, statistical graphs, precision studies, and the final conclusion (“Method is acceptable for clinical use”)
- Approval: The Medical Director must sign and date the verification summary before patient testing begins. Reporting patients before this signature is a major regulatory violation