Radiation exposure from mammography screening may have been overestimated
Women receive approximately 30% less radiation during a screening mammogram than previously thought, according to study results presented at the Annual Meeting of the American Association of Physicists in Medicine.
Current dose estimates focus on the radiation deposited in glandular tissue, the breast tissue most susceptible to radiation damage. However, Andrew Hernandez, MD, a PhD candidate in biomedical engineering at the University of California, Davis, and colleagues suggested that because these methods assume a uniform blend of glandular and fatty tissue, radiation dose to the breast has been overestimated.
“With the availability of 3D breast imaging techniques, we are able to understand how the glandular tissue and the fat tissue are oriented within the breast,” Hernandez said during a press conference. “Using this information, we are able to show that the arrangement of the glandular tissue has a really important impact on the resulting radiation dose.”
For small-, medium- and large-sized phantoms, assuming a homogeneous rather than heterogeneous tissue distribution could result in a 20% to 40% difference between the assumed radiation dose and the dose actually received, Hernandez said.
Hernandez and colleagues fit the 3D glandular distributions from breast CT data sets of 219 women to bi-Gaussian functions, which they then used to assign glandular distributions within compressed breast models.
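For readers curious about the mechanics of that fitting step, the sketch below shows what a bi-Gaussian (sum of two Gaussians) fit looks like on made-up, one-dimensional glandular-fraction data. It is an illustration under simplifying assumptions, not the investigators' code; the study fit full 3D distributions, and every name and value here is hypothetical.

```python
# Illustrative sketch only: fit a 1-D glandular-fraction profile to a
# bi-Gaussian (sum of two Gaussians). The study fit full 3-D distributions;
# all data and parameter values below are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def bi_gaussian(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussian components."""
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

# x: position through the compressed breast (cm); y: measured glandular fraction
x = np.linspace(0, 6, 61)
y_measured = bi_gaussian(x, 0.4, 2.0, 0.8, 0.3, 4.0, 1.0) + 0.02 * np.random.randn(x.size)

p0 = [0.5, 2.0, 1.0, 0.5, 4.0, 1.0]  # initial parameter guesses
params, _ = curve_fit(bi_gaussian, x, y_measured, p0=p0)
print("fitted bi-Gaussian parameters:", params)
```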
Researchers then simulated monoenergetic normalized glandular dose values, or DgN(E) values, in compressed phantoms composed of either a homogeneous mixture of glandular and adipose tissue or heterogeneously distributed glandular tissue. These values were then weighted by mammographic X-ray spectra to produce polyenergetic DgN values for different tissue compositions.
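The spectral-weighting step amounts to averaging the monoenergetic DgN(E) values over the relative photon fluence of the mammographic spectrum. The snippet below is a minimal sketch of that weighted average; the spectrum shape and DgN(E) values are invented placeholders, not data from the study.

```python
# Illustrative sketch: combine monoenergetic DgN(E) values into a polyenergetic
# DgN by weighting with a mammographic X-ray spectrum. All arrays are placeholders.
import numpy as np

energies_keV = np.arange(10, 41)                           # photon energies in the spectrum
spectrum = np.exp(-0.5 * ((energies_keV - 20) / 5) ** 2)   # relative photon fluence (placeholder)
dgn_mono = 0.05 + 0.01 * (energies_keV - 10)               # DgN(E) values (placeholder)

# Fluence-weighted average over the spectrum
dgn_poly = np.sum(spectrum * dgn_mono) / np.sum(spectrum)
print(f"polyenergetic DgN = {dgn_poly:.3f}")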
Results showed that, across all phantom sizes and glandular fractions, when the heterogeneous distributions were centered within the breast phantom, polyenergetic DgN heterogeneous values were 30.1% lower than the polyenergetic DgN homogeneous values for the molybdenum (Mo) X-ray spectra and 21.6% lower for the tungsten (W) X-ray spectra.
Thus, radiation dose was overestimated between 25% and 35%, Hernandez said.
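To make the arithmetic behind those percentages concrete, the short example below shows how a reported reduction such as 30.1% relates a homogeneous DgN estimate to a heterogeneous one. The absolute dose values are hypothetical and chosen only to produce a round number.

```python
# Illustrative arithmetic only: percent difference between homogeneous and
# heterogeneous DgN estimates. The absolute values are hypothetical.
dgn_homogeneous = 0.200    # assumed DgN with a uniform tissue mixture (placeholder)
dgn_heterogeneous = 0.140  # assumed DgN with a centered glandular distribution (placeholder)

percent_lower = 100 * (dgn_homogeneous - dgn_heterogeneous) / dgn_homogeneous
print(f"heterogeneous estimate is {percent_lower:.1f}% lower")  # -> 30.0% lower
```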
Because glandular tissue amounts and distribution are different in every woman, Hernandez said that those numbers could fluctuate by an additional 10% either way.
“On a patient-specific level, these differences could be anywhere from 10% to 50%,” Hernandez said. “Before we are able to implement this in the clinic, we need to further validate our methods to confirm these dose reductions and go through peer review.”
The American Cancer Society recommends annual mammography screening for women beginning at age 40 years, whereas the U.S. Preventive Services Task Force says women should wait until they are aged 50 years before commencing biennial screening.
Hernandez pointed out that the decision to screen is based on a calculation that weighs the benefit of screening against its risk, and these data suggest the risk component has previously been overestimated.
“Another thing to mention is that there’s a model we currently use that relates the radiation dose delivered during the mammogram to the risk associated with the patient,” Hernandez said. “There’s a lot of uncertainty associated with that and a lot of conversations that will continue to go on to try to understand it.” – by Anthony SanFilippo
Reference: Hernandez A, et al. Abstract 27307. Presented at: Annual Meeting of the American Association of Physicists in Medicine; July 12-16, 2015; Anaheim, California.
Disclosure: HemOnc Today was unable to obtain a list of relevant financial disclosures at the time of reporting.