Laboratory Mathematics
Proficiency in laboratory mathematics is essential for the Hematology laboratory scientist to validate automated results, prepare reagents, and interpret quality control data. While automation handles routine calculations, the laboratory scientist must be able to manually verify linearity dilutions, correct for interferences, and assess the statistical validity of test methods. This competency spans three distinct areas: general arithmetic (dilutions/solutions), analytical geometry (standard curves), and statistics (QC and diagnostic efficiency)
Concentration, Volume, & Dilutions
This area covers the manipulation of liquid specimens and reagents. Understanding the distinction between “ratio” and “dilution” is critical: confusing the two skews a result by roughly 10% (1:10 vs. 1:11), and misapplying a dilution factor can produce tenfold errors in reported results
- Solutions and Concentration: The most common expression is Percent Solution (parts per 100). The formula \(C_1V_1 = C_2V_2\) is the standard for altering concentrations (e.g., diluting a stock bleach solution to a working cleaning solution); a worked sketch follows this list
- Dilutions: A dilution is the ratio of sample volume to total volume (sample + diluent)
- Formula: \(\text{Dilution} = \text{Sample Volume} / (\text{Sample Volume} + \text{Diluent Volume})\)
- Example: Mixing 100 µL of blood with 900 µL of saline is a 1:10 dilution (1 part blood in 10 parts total)
- Ratio vs. Dilution: A “1:10 ratio” usually implies 1 part sample to 10 parts diluent (1:11 dilution). Scientists must verify the specific intent of the procedure
- Dilution Factor (DF): The reciprocal of the dilution. It is the number used to multiply the analyzer result to obtain the final patient report. For a 1:10 dilution, the DF is 10
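A minimal Python sketch of the arithmetic above; the bleach concentrations and function names are illustrative, not taken from any procedure:

```python
# Dilution vs. dilution factor, and the C1V1 = C2V2 rearrangement.

def dilution(sample_vol, diluent_vol):
    """Dilution = sample volume / (sample volume + diluent volume)."""
    return sample_vol / (sample_vol + diluent_vol)

def stock_volume_needed(c1, c2, v2):
    """Solve C1V1 = C2V2 for V1: how much stock to dilute."""
    return c2 * v2 / c1

# 100 µL blood + 900 µL saline -> 1:10 dilution, DF = 10
d = dilution(100, 900)          # 0.1 (1 part in 10 total)
df = 1 / d                      # 10.0 (multiply the analyzer result by this)
print(f"1:{df:.0f} dilution, DF = {df:.0f}")

# Prepare 500 mL of a 10% working solution from a 50% stock
v1 = stock_volume_needed(c1=50, c2=10, v2=500)
print(f"{v1:.0f} mL stock + {500 - v1:.0f} mL diluent")  # 100 + 400
```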
- Specific Hematology Corrections
- Corrected WBC (for nRBCs): Nucleated RBCs interfere with WBC counts
- \(\text{Corrected WBC} = (\text{Uncorrected WBC} \times 100) / (\text{nRBCs per 100 WBCs} + 100)\)
- Citrate Correction: Results from a Sodium Citrate tube (used when EDTA causes platelet clumping) must be multiplied by 1.1, because the 9:1 blood-to-anticoagulant ratio dilutes the specimen to 9/10 of whole blood (\(10/9 \approx 1.1\))
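Both corrections in a short Python sketch, using hypothetical counts:

```python
# nRBC correction and citrate-tube correction, as defined above.

def corrected_wbc(uncorrected_wbc, nrbc_per_100_wbc):
    """Corrected WBC = (uncorrected WBC x 100) / (nRBCs per 100 WBCs + 100)."""
    return uncorrected_wbc * 100 / (nrbc_per_100_wbc + 100)

def citrate_corrected(count):
    """9 parts blood : 1 part citrate -> multiply by 10/9 (~1.1)."""
    return count * 10 / 9

# Analyzer WBC 12.0 x 10^9/L with 25 nRBCs per 100 WBCs
print(corrected_wbc(12.0, 25))        # 12.0 * 100 / 125 = 9.6

# Platelet count 180 x 10^9/L from a citrate tube
print(round(citrate_corrected(180)))  # 200
```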
Molarity & Normality
These units measure concentration based on molecular weight and chemical reactivity, providing greater precision than percent solutions
- Molarity (\(M\)): Moles of solute per Liter of solution. It is based on the Gram Molecular Weight (GMW): one mole of a compound weighs the sum of its atomic weights, expressed in grams
- Normality (\(N\)): Gram Equivalent Weights per Liter. It focuses on reactivity (Valence)
- Valence: The number of displaceable \(H^+\) ions (acid) or \(OH^-\) ions (base)
- Relationship: \(N = M \times \text{Valence}\). Normality is always equal to or greater than Molarity. For \(\text{HCl}\) (valence 1), 1 \(M\) = 1 \(N\). For \(\text{H}_2\text{SO}_4\) (valence 2), 1 \(M\) = 2 \(N\)
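These relationships in a small sketch; the atomic weights are standard values, the masses and volumes are illustrative:

```python
# Molarity from mass and GMW; Normality from Molarity and valence.

def molarity(grams, gmw, liters):
    """M = (grams / GMW) per liter of solution."""
    return grams / gmw / liters

def normality(m, valence):
    """N = M x valence."""
    return m * valence

# NaCl: GMW = 22.99 (Na) + 35.45 (Cl) = 58.44 g/mol, valence 1
m = molarity(grams=58.44, gmw=58.44, liters=1.0)
print(m, normality(m, valence=1))    # 1.0 M = 1.0 N

# H2SO4: GMW = 98.08 g/mol, valence 2 (two displaceable H+)
m = molarity(grams=49.04, gmw=98.08, liters=1.0)
print(m, normality(m, valence=2))    # 0.5 M = 1.0 N
```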
Standard Curves
Standard curves (Calibration curves) graphically establish the relationship between analyte concentration and instrument signal (Absorbance). They define the Analytical Measurement Range (AMR)
- Beer’s Law: States that Absorbance is directly proportional to Concentration (\(A = abc\)). Ideally, plotting concentration (X-axis) against absorbance (Y-axis) yields a straight line passing through zero
- Hemoglobin Curve: A classic linear curve. Patient results are calculated by interpolating against the line or using a calculation factor derived from the slope
- Coagulation Curves: Factor assays (e.g., Factor VIII) are not linear on standard graph paper. They are plotted on Semi-Log or Log-Log paper (Clotting time vs. % Activity) to produce a straight line for interpolation. In these assays, concentration and clotting time are inversely related
- Linearity: The reportable range is limited to the highest and lowest standards. Results outside this range cannot be extrapolated: a sample above the range must be diluted, re-tested, and multiplied by the dilution factor, as sketched below
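A sketch of fitting and interpolating a linear standard curve by least squares; the hemoglobin standards below are made-up values chosen to be perfectly linear:

```python
import numpy as np

conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])          # standards, g/dL (X)
absorbance = np.array([0.0, 0.14, 0.28, 0.42, 0.56])   # instrument signal (Y)

slope, intercept = np.polyfit(conc, absorbance, 1)     # least-squares line

def patient_conc(a):
    """Interpolate a patient absorbance; valid only within the AMR."""
    c = (a - intercept) / slope
    if not conc.min() <= c <= conc.max():
        raise ValueError("outside AMR: dilute and re-test")
    return c

print(f"{patient_conc(0.35):.1f} g/dL")   # 12.5
```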
Mean, Median, Mode, & Confidence Intervals
These statistical parameters are the foundation of Quality Control (QC). They describe the behavior of data populations (Control Material or Patient Data)
- Measures of Central Tendency
- Mean: The arithmetic average. It represents the Target Value and measures Accuracy
- Median: The middle value. Used for reference ranges in non-Gaussian (skewed) populations
- Mode: The most frequent value. In Hematology, the peak of the RBC or PLT Histogram represents the Mode (most common cell size)
- Measures of Dispersion
- Standard Deviation (SD): The measure of scatter around the mean. It represents Precision (Reproducibility)
- Coefficient of Variation (CV): The SD expressed as a percentage of the mean (\(CV = (SD / \text{Mean}) \times 100\)). It allows comparison of precision between different tests
- Confidence Intervals (CI): Based on the Gaussian (Bell) curve
- \(\pm\) 2SD: Includes 95.5% of values. This is the standard acceptance range for Laboratory QC
- Reference Intervals: Normal ranges are typically established as the Mean \(\pm\) 2SD of a healthy population. By definition, approximately 5% of healthy people will fall outside this range
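The same statistics in a short sketch, run on a hypothetical set of control values:

```python
import statistics

control = [14.0, 14.2, 13.9, 14.1, 14.0, 13.8, 14.3, 14.1]  # g/dL, hypothetical

mean = statistics.mean(control)
sd = statistics.stdev(control)      # sample SD: scatter around the mean
cv = sd / mean * 100                # precision as a percent of the mean

low, high = mean - 2 * sd, mean + 2 * sd   # the +/- 2SD QC acceptance range
print(f"Mean {mean:.2f}, SD {sd:.3f}, CV {cv:.2f}%")
print(f"QC range: {low:.2f} to {high:.2f}")
```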
Sensitivity, Specificity, & Predictive Value
These parameters evaluate the diagnostic utility of a test - how well it distinguishes disease from health
- Sensitivity: The ability to detect those with the disease (True Positive Rate)
- Utility: Screening. A highly sensitive test rarely misses a case (Low False Negatives)
- Mnemonic: SnNout (Sensitive tests, when Negative, rule OUT disease)
- Specificity: The ability to identify those without the disease (True Negative Rate)
- Utility: Confirmation. A highly specific test rarely gives a false alarm (Low False Positives)
- Mnemonic: SpPin (Specific tests, when Positive, rule IN disease)
- Predictive Values (PPV/NPV): The probability that a specific test result (Positive or Negative) is correct for a specific patient. Unlike Sensitivity/Specificity, these values vary with the Prevalence of the disease in the population: the PPV drops significantly when screening for rare diseases, as the sketch below demonstrates
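A sketch of the four parameters from a 2x2 table. The counts are hypothetical, chosen so the same test (95% sensitive, 95% specific) is applied at two prevalences to make the prevalence effect on PPV visible:

```python
def diagnostic_efficiency(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # P(disease | positive result)
        "npv": tn / (tn + fn),           # P(no disease | negative result)
    }

common = diagnostic_efficiency(tp=95, fp=45, tn=855, fn=5)    # 10% prevalence, n=1,000
rare = diagnostic_efficiency(tp=95, fp=495, tn=9405, fn=5)    # 1% prevalence, n=10,000

print(f"PPV at 10% prevalence: {common['ppv']:.2f}")   # ~0.68
print(f"PPV at 1% prevalence:  {rare['ppv']:.2f}")     # ~0.16
```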