- Logan, Aaron C;
- Vashi, Nikita;
- Faham, Malek;
- Carlton, Victoria;
- Kong, Katherine;
- Buño, Ismael;
- Zheng, Jianbiao;
- Moorhead, Martin;
- Klinger, Mark;
- Zhang, Bing;
- Waqar, Amna;
- Zehnder, James L;
- Miklos, David B
Minimal residual disease (MRD) quantification is an important predictor of outcome after treatment for acute lymphoblastic leukemia (ALL). Bone marrow ALL burden ≥ 10⁻⁴ after induction predicts subsequent relapse. Likewise, MRD ≥ 10⁻⁴ in bone marrow before initiation of conditioning for allogeneic (allo) hematopoietic cell transplantation (HCT) predicts transplantation failure. Current methods for MRD quantification in ALL are not sufficiently sensitive for use with peripheral blood specimens and have not been broadly implemented in the management of adults with ALL. Consensus-primed immunoglobulin (Ig) and T cell receptor (TCR) amplification with high-throughput sequencing (HTS) permits use of a standardized algorithm for all patients and can detect leukemia at 10⁻⁶ or lower. We applied the LymphoSIGHT HTS platform (Sequenta Inc., South San Francisco, CA) to quantification of MRD in 237 samples from 29 adult B cell ALL patients before and after allo-HCT. Using primers for the IGH-VDJ, IGH-DJ, IGK, TCRB, TCRD, and TCRG loci, MRD could be quantified in 93% of patients. Leukemia-associated clonotypes at these loci were identified in 52%, 28%, 10%, 35%, 28%, and 41% of patients, respectively. MRD ≥ 10⁻⁴ before HCT conditioning predicted post-HCT relapse (hazard ratio [HR], 7.7; 95% confidence interval [CI], 2.0 to 30; P = .003). In post-HCT blood samples, MRD ≥ 10⁻⁶ had 100% positive predictive value for relapse, with a median lead time of 89 days (HR, 14; 95% CI, 4.7 to 44; P < .0001). The use of HTS-based MRD quantification in adults with ALL offers a standardized approach with sufficient sensitivity to quantify leukemia MRD in peripheral blood. Use of this approach may identify a window for clinical intervention before overt relapse.
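The risk stratification described above reduces to comparing a leukemic clonotype frequency (per total nucleated cell equivalents) against the 10⁻⁴ pre-HCT and 10⁻⁶ post-HCT thresholds. A minimal sketch of that threshold logic follows; the function names, the assumption that sequencing reads have already been converted to cell equivalents, and the flag labels are all illustrative, not part of the LymphoSIGHT pipeline:

```python
# Illustrative sketch of the MRD threshold logic from the abstract.
# Assumes leukemic clonotype counts are already expressed as cell
# equivalents; the actual conversion from reads is platform-specific.

PRE_HCT_THRESHOLD = 1e-4   # bone marrow MRD level predicting post-HCT relapse
POST_HCT_THRESHOLD = 1e-6  # blood MRD level with 100% PPV for relapse

def mrd_frequency(leukemic_cell_equivalents: int,
                  total_cell_equivalents: int) -> float:
    """Leukemic clonotype frequency per nucleated cell."""
    if total_cell_equivalents <= 0:
        raise ValueError("total cell equivalents must be positive")
    return leukemic_cell_equivalents / total_cell_equivalents

def risk_flags(frequency: float) -> dict:
    """Classify an MRD frequency against the two thresholds above."""
    return {
        "pre_hct_high_risk": frequency >= PRE_HCT_THRESHOLD,
        "post_hct_mrd_positive": frequency >= POST_HCT_THRESHOLD,
    }

# Example: 5 leukemic cell equivalents among 1,000,000 assayed cells
freq = mrd_frequency(5, 1_000_000)   # 5e-06
print(freq, risk_flags(freq))
```

At this example frequency (5 × 10⁻⁶) the sample would be MRD-positive by the post-HCT blood criterion but below the pre-HCT bone marrow threshold, which is exactly the sensitivity gap that motivates the HTS approach.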