December 07, 2024

Higher ferritin threshold cost-effective for diagnosing, treating iron deficiency

Key takeaways:

  • A ferritin threshold of 25 µg/L is cost-effective for identifying and treating women with iron deficiency.
  • Results remained consistent whether women received oral or IV iron supplementation.

SAN DIEGO — A higher ferritin threshold than that used by WHO and CDC is cost-effective for identifying and treating women with iron deficiency, according to study results presented at the ASH Annual Meeting and Exposition.

Results showed comparable costs across the three strategies — a ferritin threshold of 25 µg/L, a threshold of 15 µg/L or no screening — but a higher number of quality-adjusted life-years gained with the 25 µg/L threshold.

The higher threshold resulted in an incremental cost-effectiveness ratio of less than $1,000 per quality-adjusted life-year — well below the accepted willingness-to-pay threshold in the United States of $50,000 to $150,000 per quality-adjusted life-year.

The findings — which appeared consistent regardless of whether treatment consisted of oral or IV iron supplementation — “fill a critical gap” in women’s health, George Goshua, MD, MSc, FACP, assistant professor of medicine (medical oncology and hematology) at Yale School of Medicine and Yale Cancer Center, and colleagues concluded.

“We expected the number to be low, but we were surprised at how low it actually was,” Goshua told Healio. “I think this should prompt us to re-evaluate what we do on a population level in terms of screening.”

‘A health equity issue’

Iron deficiency — the most common micronutrient deficiency — is among the top five causes of years lived with disability worldwide, according to study background. The condition disproportionately affects women and is more common during pregnancy.

If left untreated, iron deficiency can progress to iron deficiency anemia, with a greater impact on physical, emotional and social well-being.

“This condition is relatively straightforward to diagnose — you just need to be aware of it,” Goshua said. “Testing is inexpensive and treatment is not complicated. Iron deficiency — even without anemia — is associated with a whole panoply of symptoms, and there is no reason why anyone should need to live with it when it’s so diagnosable and treatable.”

The burden of iron deficiency is exacerbated in the United States by underdiagnosis and undertreatment, Goshua and colleagues wrote. This is due in part to what Goshua characterized as “the heterogeneous landscape” of ferritin testing, in which “inappropriately low” thresholds — and sometimes no thresholds — are used for diagnosis.

CDC and WHO use a threshold of 15 µg/L, but physiologic studies of iron status and metabolism demonstrate that a ferritin threshold of at least 25 µg/L to 30 µg/L — and possibly as high as 50 µg/L — offers greater sensitivity for diagnosing iron deficiency.

Universal screening for iron deficiency is not performed in the United States. Against that backdrop, Goshua and colleagues conducted the first cost-effectiveness analysis to assess various ferritin thresholds for diagnosing and treating iron deficiency.

“We know this is a health equity issue for nearly half of the population of our country, as well as around the world,” Goshua said. “This probably should have been looked at a long time ago, but we didn’t have high-quality modern-day epidemiologic data until about a year and a half ago. It finally became possible to put forward an economic risk-benefit argument for why we should consider screening and treating on a population level.”

Methodology

Researchers created a Markov simulation model of adult women in the United States to assess the cost-effectiveness of three strategies for iron deficiency screening — a ferritin threshold of less than 25 µg/L, a ferritin threshold of less than 15 µg/L or no screening.

Investigators performed the analysis over a lifetime horizon, establishing a willingness-to-pay threshold of $100,000 per quality-adjusted life-year. They evaluated costs in 2024 U.S. dollars, used the U.S. Medical Expenditure Panel Survey to estimate age- and sex-specific average annual health care costs, and used National Health and Nutrition Examination Survey data to calculate the prevalence of iron deficiency.

Researchers used data from randomized clinical trials of oral and IV iron, as well as WHO VigiBase data, to estimate the probability of iron-related adverse events.

Investigators entered women into the model at age 18 years. Those determined to be iron deficient received iron supplementation and underwent hematology follow-up and retreatment in accordance with average menstrual blood losses until age 51 years, the median age of menopause in the United States.

The model used once-daily oral ferrous sulfate as the base-case iron supplement for iron deficiency treatment. Researchers also conducted a scenario analysis to evaluate treatment with IV iron dextran. They measured effectiveness in quality-adjusted life-years.
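
The abstract does not include the model code, but the general mechanics of a Markov cohort model are straightforward. The sketch below is a deliberately simplified illustration: it cycles a cohort of women through annual model cycles under each screening strategy and accumulates discounted costs and quality-adjusted life-years. Every probability, cost, utility and the strategy sensitivities in it are illustrative placeholders, not study inputs.

```python
# Minimal sketch of a Markov-style cohort model comparing the three
# screening strategies described above. Every probability, cost and
# utility here is an illustrative placeholder -- NOT a study input.

DISCOUNT = 0.03                   # assumed 3% annual discount rate
START_AGE, STOP_SCREEN, END_AGE = 18, 51, 85

# Placeholder per-screen detection probabilities for true iron deficiency;
# a higher ferritin cutoff catches more cases.
STRATEGIES = {
    "ferritin < 25 ug/L": 0.90,
    "ferritin < 15 ug/L": 0.60,
    "no screening": 0.00,
}

P_DEFICIENT = 0.15                # fraction iron deficient each cycle (placeholder)
COST_SCREEN = 20.0                # cost of one ferritin test (placeholder)
COST_TREAT = 50.0                 # cost of a course of oral iron (placeholder)
U_REPLETE, U_DEFICIENT = 0.85, 0.78  # annual utilities (placeholders)

def run_strategy(detect_prob: float) -> tuple[float, float]:
    """Return (discounted lifetime cost, discounted QALYs) per cohort member."""
    cost = qalys = 0.0
    for age in range(START_AGE, END_AGE):
        disc = 1.0 / (1.0 + DISCOUNT) ** (age - START_AGE)
        deficient = P_DEFICIENT   # share of the cohort iron deficient this cycle
        if detect_prob > 0 and age < STOP_SCREEN:
            cost += COST_SCREEN * disc                 # annual ferritin screen
            treated = deficient * detect_prob
            cost += treated * COST_TREAT * disc        # treat detected cases
            deficient -= treated                       # treated women become replete
        qalys += (deficient * U_DEFICIENT + (1 - deficient) * U_REPLETE) * disc
    return cost, qalys

results = {name: run_strategy(p) for name, p in STRATEGIES.items()}
cost0, qaly0 = results["no screening"]
for name, (cost, qaly) in results.items():
    if qaly > qaly0:
        icer = (cost - cost0) / (qaly - qaly0)
        print(f"{name}: ICER vs no screening = ${icer:,.0f}/QALY")
```

A real model would track richer health states (eg, iron deficiency anemia) and draw its inputs from the data sources above; the point here is only the mechanics of cycling a cohort through states while accumulating discounted costs and quality-adjusted life-years.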

The incremental cost-effectiveness ratio of screening vs. no screening served as the primary outcome.
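
For readers unfamiliar with the metric, the incremental cost-effectiveness ratio compares two strategies as the ratio of incremental cost to incremental benefit:

\[
\mathrm{ICER} = \frac{C_{\text{screening}} - C_{\text{no screening}}}{\mathrm{QALY}_{\text{screening}} - \mathrm{QALY}_{\text{no screening}}}
\]

A strategy is deemed cost-effective when its ICER falls below the willingness-to-pay threshold, here $100,000 per quality-adjusted life-year.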

Key findings

The base-case analysis showed a ferritin threshold of less than 25 µg/L to be the cost-effective strategy in all 10,000 Monte Carlo iterations.

Researchers calculated an accrual of about $210,000 in costs for each of the three strategies, but a higher number of quality-adjusted life-years with the 25 µg/L threshold (24.2) than with the 15 µg/L threshold (23.1) or no screening (22.3).

A comparison of the 25 µg/L threshold vs. no screening showed an incremental cost-effectiveness ratio of about $100 per quality-adjusted life-year.
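
As a back-of-the-envelope check on those reported figures: the QALY gain over no screening is 24.2 − 22.3 = 1.9, so an ICER of about $100 per quality-adjusted life-year implies a lifetime incremental cost of only

\[
\Delta C \approx \$100/\mathrm{QALY} \times 1.9\ \mathrm{QALYs} \approx \$190,
\]

which is negligible against roughly $210,000 in total lifetime costs and consistent with the finding of comparable costs across strategies.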

The scenario analysis that examined treatment with iron dextran revealed similar cost-effectiveness, with an incremental cost-effectiveness ratio of $900 per quality-adjusted life-year.

When investigators performed deterministic sensitivity analysis, they determined that no parameter variation altered the conclusion that a ferritin threshold of less than 25 µg/L is the cost-effective approach.

“If we’re talking about pure dollars per quality-adjusted life-year, screening for diagnosis and treating iron deficiency is right on par — or even a little bit better — than many cancer screenings,” Goshua said.

Next steps

ASH is developing guidelines related to iron deficiency diagnosis, and the first iteration should be available in about 18 months.

“Some laboratories in this country don’t even have a ferritin cutoff, including systematic sex-specific differences that should not be present — any amount of ferritin is then considered normal and is not flagged,” Goshua said. “Our hope is that we’ll be able to make the implementation argument and share this information with as many stakeholders as possible so we get to the point where physicians know what normal ferritin should be, and the guidelines will be helpful in that effort.”

It is unclear whether these data will influence larger entities such as WHO or the U.S. Preventive Services Task Force, Goshua said.

“This is the first time a model like this has been presented in the iron deficiency space, but it’s only focused on the United States,” he said. “Our plan is to build this out and adapt it using country-specific estimates for low- and middle-income countries. I suspect we will see similar results, understanding the willingness-to-pay thresholds are going to be lower outside the U.S.

“If those results are in the same ballpark, this could be one of the few interventions that truly could be cost-effective not just in the United States but around the world,” Goshua added. “Those next steps are necessary before an international body like WHO takes these estimates into account.”