
February 25, 2022

AI platform predicts thyroid cancer, pathologic stage, BRAF status using ultrasound images

An artificial intelligence platform predicted thyroid cancer with high accuracy based on analysis of routine ultrasound images, according to study results presented at the Multidisciplinary Head and Neck Cancers Symposium.

“If [the thyroid nodule] is cancerous, we can further predict the tumor stage, the nodal stage and the presence or absence of BRAF mutation,” said senior author Annie W. Chan, MD, director of the head and neck radiation oncology research program at Massachusetts General Hospital and Harvard Medical School, who presented the findings.

[Infographic: Accuracy of AI platform. Data derived from Paul R, et al. Abstract 10. Presented at: Multidisciplinary Head and Neck Cancers Symposium; Feb. 24-26, 2022; Phoenix.]

Thyroid nodules have become very common, largely due to increased detection and improved diagnostics, Chan told Healio.

“Ultrasound is the gold standard for imaging thyroid nodules,” she said. “Radiologists determine whether a nodule is suspicious for malignancy or not by assessing certain features of the nodule, and determine a Thyroid Imaging Reporting and Data System (TI-RADS) score. If the TI-RADS score is high, patients then get a biopsy to rule out or confirm malignancy.”

However, this process has drawbacks, including the amount of time required and significant interobserver variability, Chan said.

“The current approach also does not allow staging of thyroid cancer. One can only find out the staging after surgery is performed,” she added.

Chan and colleagues developed a multimodal AI model for thyroid cancer screening and staging from ultrasound images. The model combined TI-RADS-defined ultrasound properties, used as machine-learning features, with radiomics, which extracted quantitative features from the images; topological data analysis, which identified spatial relationships between data points in the images; and deep learning, in which algorithms processed the data through neural network layers to generate predictions. The combination of these methods enabled acquisition of more data with less noise, according to Chan.
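
The abstract does not include implementation details, but the general pattern of fusing hand-crafted and learned features can be sketched in a few lines of Python. The file names, feature arrays and the gradient-boosting classifier below are illustrative placeholders under assumed inputs, not the authors' pipeline.

    # Sketch of multimodal feature fusion for nodule classification (illustrative only).
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    # Hypothetical precomputed features, one row per ultrasound image:
    radiomics = np.load("radiomics_features.npy")   # quantitative texture/shape features
    tda = np.load("tda_features.npy")               # topological summaries of the image
    deep = np.load("deep_features.npy")             # activations from a neural network
    tirads = np.load("tirads_features.npy")         # encoded TI-RADS descriptors
    labels = np.load("malignancy_labels.npy")       # 1 = malignant, 0 = benign

    # Fuse the modalities by concatenation and train a single classifier on the result.
    X = np.hstack([radiomics, tda, deep, tirads])
    model = GradientBoostingClassifier().fit(X, labels)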

The researchers acquired 1,346 thyroid nodule images through routine diagnostic ultrasound from 784 patients to train and validate the platform. They divided the images into internal training and validation sets and an external validation set.
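
The abstract does not describe how the split was performed. Because several images can come from the same patient, one common precaution is to split at the patient level so that no patient contributes images to both sets; the sketch below assumes that approach, with placeholder file and variable names.

    # Illustrative patient-level split; array names are placeholders.
    import numpy as np
    from sklearn.model_selection import GroupShuffleSplit

    X = np.load("fused_features.npy")         # one row per ultrasound image
    y = np.load("malignancy_labels.npy")
    patients = np.load("patient_ids.npy")     # a patient may contribute several images

    # Hold out 20% of patients (not images) for validation to avoid leakage.
    splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
    train_idx, val_idx = next(splitter.split(X, y, groups=patients))
    X_train, y_train = X[train_idx], y[train_idx]
    X_val, y_val = X[val_idx], y[val_idx]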

Samples from fine-needle biopsy confirmed malignancy, operative reports confirmed pathologic staging and genomic sequencing confirmed mutational status.

The multimodal AI platform achieved 98.7% accuracy (0.99 area under the curve [AUC]) in prediction of thyroid nodule malignancy within the internal data set. This represented a significant improvement compared with the accuracy of the individual models, including radiomics (88.7% [0.87 AUC], P < .001), deep learning (87.4% [0.92 AUC], P = .002), topological data analysis (81.5% [0.81 AUC], P < .001) and TI-RADS (80% [0.76 AUC], P < .001).
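
As a point of reference, accuracy and AUC figures like these are typically computed from held-out predictions as in the following generic sketch; the label and probability arrays are placeholders, not the study data.

    # Computing accuracy and AUC from held-out predictions (generic example).
    import numpy as np
    from sklearn.metrics import accuracy_score, roc_auc_score

    y_true = np.load("val_labels.npy")            # 1 = malignant, 0 = benign
    y_prob = np.load("val_predicted_probs.npy")   # predicted probability of malignancy

    accuracy = accuracy_score(y_true, (y_prob >= 0.5).astype(int))
    auc = roc_auc_score(y_true, y_prob)
    print(f"accuracy = {accuracy:.1%}, AUC = {auc:.2f}")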

The model showed 93% accuracy for cancer prediction within the external validation data set.

In addition, a multimodal model that used radiomics, topological data analysis and machine-learning TI-RADS showed accuracy of 93% (0.93 AUC) for distinguishing T stage, 89% (0.88 AUC) for N stage and 98% (0.96 AUC) for extrathyroidal extension. It showed 96% accuracy (0.97 AUC) for identification of BRAF V600E mutations, which are common in papillary thyroid cancer and can be treated with targeted therapies.

“Our next step is to perform prospective validation by incorporating our AI platform in multicenter clinical trials,” Chan told Healio. “We would also like to test our AI platform, which is low-cost and noninvasive, in economically disadvantaged countries where resources are limited.”