Q&A: Integrative risk model may predict Crohn’s development in healthy at-risk individuals
Key takeaways:
- Researchers used demographic, biomarker and fecal microbiome data to develop the risk model.
- In the validation cohort, the model demonstrated a C-index of 0.79, with a time-dependent AUC of 0.8.
A new integrative model demonstrated “strong predictive performance” in identifying future risk for Crohn’s disease development among healthy first-degree relatives of patients with CD, according to a researcher.
“Previous studies have shown evidence of pre-disease signals that precede the development of inflammatory bowel disease, particularly CD, years before diagnosis,” Sun-Ho Lee, MD, PhD, of the University of Toronto and Zane Cohen Center for Digestive Diseases at Mount Sinai Hospital in Canada, told Healio. “However, these pre-disease biomarkers, when considered individually, have demonstrated only modest predictive performance.
“Therefore, we aimed to combine various pre-disease biomarkers to predict the future risk for CD in a prospective cohort of healthy first-degree relatives.”
Using data from the Crohn’s and Colitis Canada Genetic, Environmental, Microbial (GEM) Project, Lee and colleagues developed and validated the GEM-integrative risk score (GEM-IRS). They included 2,619 healthy first-degree relatives — 1,170 and 1,141 from the North American training and testing cohorts, respectively, and 308 from an Israeli cohort.
According to results published in Gastroenterology, 2.3% of first-degree relatives developed CD during a median follow-up of 6.8 years.
In an exclusive interview with Healio, Lee outlined the predictive capability of GEM-IRS and discussed how this model might be leveraged to improve early disease detection.
Healio: Why did your team undertake this investigation?
Lee: By integrating biomarkers that reflect gut inflammation, intestinal permeability and the gut microbiome and its functional pathways, we hypothesized that we could accurately stratify the risk for developing CD among healthy first-degree relatives, achieving decent predictive performance.
Similar studies in other chronic immune-mediated diseases, such as type 1 diabetes and rheumatoid arthritis, have shown that risk-stratification during the preclinical phase is possible through a combination of family risk, genetic risk and serology.
In both diseases, intervention trials to prevent or delay disease onset in healthy at-risk individuals have been conducted. The first step in designing such prevention trials for CD is to accurately risk-stratify first-degree relatives at risk for developing CD, which was the primary research question of this study.
Healio: Describe the development and validation process.
Lee: We used baseline samples collected at enrollment in the Crohn’s and Colitis Canada GEM Project, a global multicenter study that has been prospectively following healthy first-degree relatives of patients with CD since 2008. Participants aged 6 to 35 years, either offspring or siblings of patients with CD, were enrolled. More than 5,000 healthy first-degree relatives have been followed for a median of approximately 10 years, with 123 developing CD and 24 developing ulcerative colitis.
Using machine learning, we integrated high-dimensional gut microbiome data derived from sequencing, along with demographic factors and biomarkers of gut permeability and inflammation. The model was developed in one subset of the cohort, the discovery cohort, and validated in a separate, independent validation cohort. Machine learning allowed us to account for intercorrelations among biomarkers, which are often missed with traditional statistical approaches.
Healio: How did you determine the model’s predictive power?
Lee: The predictive power of the model was assessed through measures applied to the validation cohort. The concordance index (C-index), a measure used in time-to-event analyses, indicates how well the risk model predicts the order of events. Like an area under the curve, a C-index of 0.5 corresponds to chance, while a value closer to 1 indicates perfect prediction.
The C-index for GEM-IRS in the validation cohort was an impressive 0.79. The time-dependent AUC also reached 0.8, suggesting consistent and strong predictive performance over time.
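For intuition, the C-index is the fraction of comparable subject pairs in which the subject with the higher risk score develops disease first. The sketch below is a generic, naive illustration of that calculation with invented toy data; it is not the GEM study's implementation, and the function name is my own.

```python
from itertools import combinations

def concordance_index(times, events, scores):
    """Naive pairwise C-index for right-censored time-to-event data.

    times:  observed follow-up times
    events: 1 if the event (e.g., CD diagnosis) occurred, 0 if censored
    scores: model risk scores (higher = higher predicted risk)
    """
    concordant = 0.0
    comparable = 0
    for i, j in combinations(range(len(times)), 2):
        # Order the pair so subject i has the earlier observed time.
        if times[j] < times[i]:
            i, j = j, i
        # A pair is comparable only if the earlier time is an actual event;
        # tied times are skipped here for simplicity.
        if not events[i] or times[i] == times[j]:
            continue
        comparable += 1
        if scores[i] > scores[j]:
            concordant += 1.0   # higher-risk subject failed first: concordant
        elif scores[i] == scores[j]:
            concordant += 0.5   # tied scores count as half
    return concordant / comparable

# Toy data: risk scores perfectly track the order of disease onset.
times = [2.0, 5.0, 8.0, 9.0]
events = [1, 1, 0, 1]
scores = [0.9, 0.6, 0.4, 0.2]
print(concordance_index(times, events, scores))  # → 1.0
```

Production survival-analysis libraries (e.g., lifelines or scikit-survival) provide optimized versions of this statistic along with time-dependent AUC estimators.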
The model’s predictive performance was consistent across geographic regions, reaching 0.8 in both the North American validation cohort and a separate Israeli cohort from the GEM Project, demonstrating the model's generalizability. Additionally, we estimated the probability of CD incidence at 1 year and up to 9 years based on GEM-IRS, providing both relative risk and absolute incidence rates in the validation cohort.
For example, if a healthy first-degree relative has a GEM-IRS in the top quartile of the first-degree relative population, the estimated probabilities of developing CD at 1, 5 and 9 years were 1.2%, 5.3% and 8.5%, respectively.
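As a toy illustration of how the three reported figures could be turned into a risk estimate at an arbitrary follow-up time, the sketch below linearly interpolates between the horizons quoted above. The interpolation scheme, the pro-rating below 1 year and the function name are my own assumptions for illustration, not part of GEM-IRS, which produces its estimates from the fitted survival model.

```python
def top_quartile_cd_probability(years: float) -> float:
    """Cumulative probability of developing CD by `years` of follow-up
    for a top-quartile GEM-IRS, interpolated from the reported values."""
    # Reported cumulative probabilities at 1, 5 and 9 years (from the interview).
    horizons = [(1.0, 0.012), (5.0, 0.053), (9.0, 0.085)]
    if years <= 1.0:
        return 0.012 * years  # crude pro-rating below the first horizon
    for (t0, p0), (t1, p1) in zip(horizons, horizons[1:]):
        if years <= t1:
            # Linear interpolation between adjacent reported horizons.
            return p0 + (p1 - p0) * (years - t0) / (t1 - t0)
    return horizons[-1][1]  # no estimate beyond 9 years; clamp at 8.5%

print(f"{top_quartile_cd_probability(5):.1%}")  # → 5.3%
```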
Healio: What additional takeaways were key to the results?
Lee: Further analysis identified key factors contributing to the model in predicting CD development, including several novel microbial taxa and functional pathways. The pathogenesis of CD is known to be multifactorial, and our model incorporates various biomarkers that reflect multiple disease pathways, some of which may serve as potential therapeutic targets to reduce the risk for developing CD.
Healio: How might these results inform patient care going forward?
Lee: As of now, we do not have an intervention strategy to prevent or delay the onset of CD in first-degree relatives. However, the GEM-IRS model shows that by integrating various risk markers, we can stratify healthy first-degree relatives according to their risk for developing CD with a reasonable degree of accuracy.
The hope is that these findings will lead to the design of prevention studies targeting those at higher risk for developing CD, potentially using interventions to modify key disease risk factors identified in this study.
Although it is still too early to suggest a concrete approach for reducing CD risk in healthy at-risk relatives, accumulating evidence from this prospective first-degree relative cohort and other pre-disease cohorts, including the PREDICTS study and Nurses’ Health Study, indicates that there is a prolonged preclinical phase leading up to a CD diagnosis.
With improved risk stratification and a better understanding of the preclinical phase, we will be better positioned to design prevention studies that delay CD onset in at-risk individuals.
Healio: What additional research is needed?
Lee: With the risk score derived from this study, we can begin estimating the risk for CD development over time in healthy first-degree relatives based on baseline biomarkers. However, the predictive performance must be improved before pursuing a prevention trial.
Further studies are required to better define the preclinical stages of CD by incorporating novel molecular, cellular, proteomic and metabolomic signatures. Additionally, we need to investigate the dynamic changes in these pre-disease signatures over time.
The PROMISE Consortium, an international collaboration, aims to investigate and validate pre-disease signatures that precede IBD diagnosis. Collaborative efforts are underway to apply these novel omics approaches to better define CD’s preclinical phase.
Pilot intervention studies are also in progress, which may provide evidence for delaying or preventing CD development in at-risk individuals, as seen in type 1 diabetes and rheumatoid arthritis.
Healio: Is there anything else our readers should know about this?
Lee: IBD is a continuum that begins with a long asymptomatic pre-disease phase and carries lifelong consequences. A family history of IBD should no longer be overlooked. A better understanding of disease triggers from pre-disease cohorts will not only enhance risk stratification in at-risk populations but also identify potential targets for early-stage interventions that modify disease risk.
This approach may hold the key to breaking through the current therapeutic ceiling in IBD treatment.