Minimal residual disease (MRD) monitoring has proven to be one of the most substantial independent prognostic factors in patients with acute lymphoblastic leukemia (ALL) [1, 2]. MRD monitoring is defined as any approach—including cytogenetics, flow cytometry, PCR-based tools, and high-throughput sequencing methods—aimed at detecting and possibly quantifying residual tumor cells below the sensitivity level of cytomorphology. To be informative, MRD assays for ALL should allow for the detection of 1 leukemic cell among 100,000 or more normal cells in virtually all patients. Currently, the standardized methods for studying MRD in ALL are multi-parametric flow cytometry (MFC) of leukemia-associated immunophenotypes (LAIP) and, even more so, polymerase chain reaction (PCR) amplification-based methods that use leukemia-specific (fusion gene transcripts) or patient-specific [immunoglobulin/T-cell receptor (IG/TR) gene rearrangement] targets.
In the past, it has been debated whether peripheral blood (PB), rather than bone marrow (BM), could be used for MRD monitoring, regardless of the technique used (MFC or PCR). In B-ALL, MRD levels tend to be 1–3 logs lower in PB than in BM [23, 24], and blood analyses therefore cannot replace bone marrow assessments. T-lineage ALL MRD assessments are also typically carried out on BM samples.
MRD is a time point–dependent variable. MRD levels at different time points have different prognostic values for relapse: early MRD assessment identifies patients with a rapid tumor clearance and a very low risk of relapse, whereas any persisting MRD at the end of consolidation therapy is associated with a particularly poor prognosis.
This approach takes advantage of the presence of protein antigens in the nucleus, cytoplasm, or on the surface of the cell, which are sequentially acquired during normal cell development. The LAIP must be identified for each ALL case at diagnosis, before any therapy, by comparing the marker profile of leukemia cells to reference bone marrow samples. Marker profile comparisons are achieved through various combinations of monoclonal antibodies against surface, cytoplasmic, or nuclear leukocyte antigens. A second approach is the so-called “different from normal (DFN)” analysis, which identifies leukemic blasts by recognizing immunophenotypic changes with respect to a normal counterpart population (typically hematopoietic progenitors of similar lineage and maturational stage) through the evaluation of antigen expression patterns [25, 26]. At present, the most commonly used MFC panels comprise 6–8 monoclonal antibody (MoAb) combinations. The refinement of MFC has required parallel advances in hardware, software, and reagents.
Flow cytometry can be successfully applied to the majority of cases (>90%) and can reach a sensitivity of 10⁻⁴. Flow cytometry analysis is quick, producing MRD evaluations in only a few hours, and is therefore also useful for assessing the therapeutic response after the first 2 induction weeks [27, 28]. However, some limitations do exist: samples must be analyzed immediately after collection to avoid cell death, a problem that arises when shipment is required for centralized evaluation at MRD referral laboratories. Furthermore, a) post-induction regeneration of normal lymphoid cells co-expressing some ALL-type antigens can lead to false-positive results in B-ALL cases, and b) bone marrow sample hypocellularity and, in some patients, a phenotypic shift can lead to erroneous or difficult interpretations. The EuroFlow Consortium has optimized and standardized immunostaining protocols for the diagnosis, classification, and prognostic subclassification of hematologic malignancies, as well as for the detection of MRD during clinical follow-up. However, experienced operators are still needed to correctly evaluate MRD results [30, 31].
The most commonly used molecular approach is the study of antigen-receptor gene rearrangements, i.e., immunoglobulin (IG) and T-cell receptor (TR) gene rearrangements.
Another method for molecular MRD detection and monitoring is based on fusion transcript analysis. Overall, more than 40% of ALL patients carry chromosomal translocations that generate chimeric transcripts. These are potentially ideal targets for MRD assessment, since they are critical primary events expressed in all leukemic cells and are remarkably stable during the course of the disease. Within B-lineage ALL, the most common translocation detected in adult cases is the Philadelphia translocation (Ph), also called the Philadelphia chromosome, leading to the breakpoint cluster region-Abelson murine leukemia viral oncogene homolog 1 (BCR-ABL1) fusion gene.
The novel next generation flow (NGF)-MRD approach takes advantage of innovative tools and procedures recently developed by the EuroFlow Consortium for sample preparation, antibody combinations (including the choice of antibody type and fluorochrome), and identification of B-ALL pathways in the BM, which allows the degree of immunophenotypic deviation of B-ALL cells from normal B-cell precursors to be defined (also in regenerating BM). Additionally, for T-ALL, a comparable strategy is used to obtain reliable (evidence-based) antibody combinations that discriminate T-ALL cells from the various types of normal T-cells and from other cells with cross-lineage marker expression [44, 45].
NGF-MRD is faster, reproducible, and has greater applicability (>95%) compared to molecular-based MRD detection. Moreover, the costs of reagents and assays are estimated to be lower than those of NGS. NGF-MRD reaches a sensitivity close to 10⁻⁶, while conventional flow tools reach sensitivities in the range of 10⁻⁴–10⁻⁵. The higher sensitivity of NGF-MRD is mostly due to the use of standardized approaches, including instrument settings, sample processing with a bulk lysis procedure, immunostaining, data acquisition, and data analysis with standardized (even automated) gating strategies for the definition of cell populations. However, acquisition of a large number of cells is needed to reach the required sensitivity. In the forthcoming decade, new flow technologies are expected to further improve the applicability and specificity of flow MRD measurements.
Droplet digital PCR (ddPCR) has the potential to overcome the limitations of qPCR. ddPCR is the third-generation implementation of conventional PCR and allows the quantification of nucleic acid targets without the need for calibration curves [47, 48]. As reported in several studies [49, 50], ddPCR appears to be more accurate than qPCR since: i) each sample is partitioned into droplets and each droplet is analyzed individually, ii) small changes in fluorescence intensity are more readily detected, and iii) the ratio of target DNA molecules to PCR reagents in each droplet is substantially higher, which increases amplification efficiency. Finally, the presence of inhibitors negatively affects qPCR efficiency but not that of ddPCR.
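The calibration-free quantification in ddPCR rests on Poisson statistics: target molecules partition randomly among thousands of droplets, so the fraction of end-point-negative droplets alone fixes the mean number of copies per droplet. A minimal sketch in Python (the droplet counts and the ~0.85 nL droplet volume are illustrative assumptions, not values from this review):

```python
import math

def copies_per_droplet(positive_droplets, total_droplets):
    """Mean target copies per droplet, estimated from end-point counts.

    Targets partition randomly, so the negative fraction follows Poisson
    statistics: P(negative) = exp(-lam), hence lam = -ln(negative fraction).
    No calibration curve is required.
    """
    negative_fraction = 1.0 - positive_droplets / total_droplets
    return -math.log(negative_fraction)

def concentration_per_ul(positive_droplets, total_droplets, droplet_nl=0.85):
    """Absolute target concentration in copies per microliter of reaction mix."""
    lam = copies_per_droplet(positive_droplets, total_droplets)
    return lam / (droplet_nl * 1e-3)  # convert droplet volume from nL to uL

# Illustrative run: 1,200 positive droplets out of 20,000 accepted droplets
lam = copies_per_droplet(1200, 20000)      # ~0.062 copies per droplet
conc = concentration_per_ul(1200, 20000)   # ~73 copies per uL
```

The Poisson correction is what makes low-level quantification robust: at low target loads almost every positive droplet contains a single molecule, so counting droplets approaches counting molecules directly.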
The ddPCR technology has been applied to various fields of medical diagnostics, in particular molecular oncology [47-51], prenatal diagnosis, and hematologic malignancies [47,52-55]. These studies have established analytical parameters to investigate the applicability of ddPCR for MRD detection and concluded that ddPCR has a sensitivity, accuracy, and reproducibility at least comparable to those of qPCR. Regarding MRD evaluations, some discordances have been observed at very low disease levels; in this setting, ddPCR showed excellent analytical performance in quantifying low-positive samples defined as positive non-quantifiable (PNQ) by qPCR and in identifying false MRD+ cases. Recently, the clinical significance of ddPCR has been reported in a pediatric cohort of ALL patients. Nevertheless, no established guidelines for ddPCR MRD analysis and interpretation have so far been defined. A major standardization effort is underway within the EuroMRD Consortium.
Several groups have shown the value of NGS technologies for MRD detection in precursor and mature B-cell tumors [13-18,56-58]. NGS can be used to detect clone-specific immunoglobulin/T-cell receptor (IG/TR) gene rearrangements.
Sensitivity is a critical aspect of MRD detection. Methods allowing a sensitivity higher than the 10⁻⁵ routinely achieved by qPCR are of interest for identifying very low-level disease. Studies using NGS platforms in ALL and chronic lymphocytic leukemia have demonstrated that a sensitivity level of 10⁻⁶ is achievable when higher amounts of DNA are used. This also translates into the possibility of detecting early clonal evolution, a relatively frequent occurrence in relapsed ALL.
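The link between DNA input and achievable sensitivity is simple arithmetic: a diploid human cell contains roughly 6.6 pg of genomic DNA, so the DNA mass assayed fixes the number of genome equivalents interrogated and hence the best-case detection limit. A back-of-the-envelope sketch (the 6.6 pg figure is a standard approximation, not a value from this review):

```python
PG_DNA_PER_DIPLOID_CELL = 6.6  # ~6.6 pg of genomic DNA per diploid human cell

def genome_equivalents(dna_input_ug):
    """Number of cell (genome) equivalents interrogated for a given DNA input."""
    return dna_input_ug * 1e6 / PG_DNA_PER_DIPLOID_CELL  # ug -> pg, then divide

def best_case_sensitivity(dna_input_ug):
    """Theoretical detection limit: one leukemic genome among all genomes assayed."""
    return 1.0 / genome_equivalents(dna_input_ug)

# A typical 0.5 ug qPCR input interrogates ~7.6e4 genomes (limit ~1.3e-5);
# pooling 6.6 ug of DNA interrogates ~1e6 genomes, making 1e-6 reachable.
low = best_case_sensitivity(0.5)
high = best_case_sensitivity(6.6)
```

This is why the studies cited above only reach 10⁻⁶ "when higher amounts of DNA are used": the detection limit cannot be better than one target among the genomes actually placed in the reaction.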
Many authors have reported that NGS appears to be more specific than qPCR in predicting relapse in ALL patients after induction, as well as after allogeneic SCT. A comparative MRD analysis between qPCR and NGS within the Berlin-Frankfurt-Münster (BFM) trials showed a change in risk stratification, mainly due to the different interpretation of low-positive samples by the two techniques. With NGS, the quantitative range coincides with the sensitivity, whereas with qPCR the quantitative range is usually above the sensitivity threshold, leading to cases defined as PNQ, as previously described.
However, NGS has substantial intrinsic complexity and involves major costs. Furthermore, robust and broadly applicable standardized NGS-based MRD protocols are not yet available in academic laboratories. The EuroClonality-NGS Consortium is working to standardize guidelines for analysis and data interpretation. ddPCR and NGS appear to be feasible and attractive alternative methods for MRD assessment that can help to more precisely classify cases that qPCR is unable to detect or quantify.
Like other quantitative methods, MRD quantification techniques have a lower limit of detection and a lower limit of quantification. Therefore, MRD negativity is not synonymous with the absence of residual disease, which is why several authors use the term “measurable residual disease” instead of “minimal residual disease.” The sensitivity of a measurement is determined by the particular technique and the number of cells analyzed. Current treatment protocols require a sensitivity of at least 10⁻⁴. Some commercial approaches for NGS MRD detection claim to reach sensitivities down to 10⁻⁷. However, it is important to note that the amount of input DNA is crucial for reaching a particular sensitivity; this often represents a serious limitation in aplastic samples collected during treatment. An MRD assessment using 100,000 cells can never reach a sensitivity of 10⁻⁶, even if the readout suggests that it did.
To allow for a correct interpretation of MRD results, the MRD report must provide information on the MRD technique used, MRD markers, theoretical limit of detection, and limit of quantification of the assay.
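The limit of detection reported in such an assessment is ultimately bounded by the number of cells (or genome equivalents) analyzed. A minimal sketch of that bound, assuming, as many flow laboratories do, that a cluster of some tens of events is required before a population is called; the threshold of 20 events is an illustrative assumption, not a value from this review:

```python
def limit_of_detection(cells_analyzed, min_events=20):
    """Best achievable limit of detection for a given number of analyzed cells.

    min_events is the assumed minimum cluster size needed to call a
    leukemic population reliably (illustrative threshold).
    """
    return min_events / cells_analyzed

# With 100,000 cells the LOD cannot be better than 2e-4; even a single
# event among 100,000 cells corresponds only to 1e-5, so a 1e-6 claim
# from this input is not meaningful.
lod_small = limit_of_detection(100_000)    # 2e-4
lod_large = limit_of_detection(10_000_000) # 2e-6, the high-cell-input regime
```

Reporting the theoretical limit of detection alongside the MRD value, as recommended above, makes this input-dependence explicit for the reader of the report.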
The clinical impact of MRD is now widely accepted, and MRD is regarded today as the most important prognostic factor in the state-of-the-art management of ALL. MRD provides different information depending on when it is assessed (very early during treatment, after induction/consolidation, and before and after SCT) and, more recently, it can be refined by the evaluation of additional genomic markers.
The initial MRD response to therapy is a relevant prognostic factor in both childhood and adult ALL; indeed, MRD negativity at very early time points during induction therapy correlates with an excellent outcome in both childhood and adult ALL, and is indicative of a high sensitivity to chemotherapy. The first pivotal study on molecular MRD analysis was carried out by the German Multicenter Study Group for Adult ALL (GMALL) on a large cohort of Ph– patients with standard-risk and high-risk features (N=580); it showed that the molecular response to standard induction and consolidation treatment was the only significant prognostic factor for remission duration and survival in both risk groups. These data have been confirmed by other groups, regardless of the cut-off values, MRD technique, timing of MRD analysis, and target patient population chosen.
As mentioned, MRD after induction/early consolidation is the most critical decision-making parameter for allogeneic transplantation. A recent meta-analysis of 21 published reports, including over 2,000 patients, confirmed that pre-transplant MRD positivity is a significant negative predictor of relapse-free survival (RFS), event-free survival (EFS), and OS; as expected, a positive MRD prior to transplant was not associated with a higher rate of non-relapse mortality. These results show that MRD evaluation before transplant is extremely useful for treatment intensification, since it offers the opportunity to use immunotherapeutic compounds (e.g., blinatumomab, inotuzumab, and in the future possibly CAR-T cells) aimed at obtaining an MRD-negative status.
With regard to the post-transplant setting, MRD monitoring has been used much less frequently after SCT, primarily because donor chimerism monitoring provides an alternative for early relapse detection. Nevertheless, it has been shown that an
At relapse, molecular evaluation of
Novel therapies, such as monoclonal antibodies, bispecific T-cell engagers, and chimeric antigen receptor T-cells (CAR-T), are an exciting advancement in the immunotherapeutic treatment of relapsed/refractory B-ALL. These new therapeutic approaches make MRD an almost perfect therapeutic target, considering that MRD+ patients harbor significantly fewer leukemic cells and, therefore, present a more manageable clinical profile than patients in hematologic relapse.
MRD is a powerful and independent predictor of outcome in both childhood and adult ALL, during treatment and in the pre- and post-SCT settings. However, MRD is a context-dependent variable with different prognostic meanings in first-line treatment compared with salvage therapy, in Ph– compared with Ph+ ALL, and for early response assessment compared with post-remission monitoring. Molecular rearrangements (gene fusion and
No potential conflicts of interest relevant to this article were reported.