Early diagnosis of iron deficiency and iron deficiency anemia with re-evaluated ferritin cut-off concentrations
Iron deficiency and iron deficiency anemia are remarkably common worldwide. However, recommended ferritin cut-off concentrations can vary widely and there is room for improvement in standardization across different assay platforms. Dr Katie Troike and Dr Adam McShane (Department of Pathology and Laboratory Medicine, Cleveland Clinic, Cleveland, OH, USA) recently conducted their own re-evaluation of ferritin cut-off concentrations for these conditions [1] and CLI was delighted to chat to them to discover more about their findings.
What is iron deficiency and iron deficiency anemia?
Dr Katie Troike (KT) Iron deficiency is essentially a lack of iron in the body, and it's the most common nutrient deficiency worldwide. Iron deficiency anemia (IDA) is the condition that develops from iron deficiency as a result of the impaired hematopoiesis that occurs when cells don't have enough iron to synthesize the heme in hemoglobin. IDA is, therefore, also extremely prevalent: about 1.24 billion people globally are currently affected by IDA, and some estimates suggest that the prevalence of iron deficiency is actually double that of IDA. The majority of people affected by these conditions are women and children. These conditions can affect daily life; iron deficiency presents with a lot of symptoms, even in the absence of anemia, such as fatigue, poor concentration, lightheadedness, and bruising. There are many different causes or etiologies. The main cause is simply not taking in enough iron, but increased demand can also cause IDA; anemia of pregnancy, for example, is very common. Certain medications can contribute too, such as proton pump inhibitors used to treat gastroesophageal reflux disease (GERD), which impair iron absorption. Finally, blood loss is a big contributor to IDA: GI bleeding is a common cause, and menstruation is thought to contribute significantly to the increased prevalence in women.
Dr Adam McShane (AMcS) I would just add that this is a disease that is extremely prevalent and also very treatable. We have iron pills, and you can treat the underlying causes, so I think it highlights the importance of laboratory testing for a very prevalent disease that, when treated, can have life-changing consequences for patients.
How are iron deficiency/IDA defined and diagnosed?
KT Serum ferritin is the most common marker used to screen for iron deficiency. Ferritin is an iron storage protein that is only produced in the presence of iron – there's an iron-response element in the ferritin gene that controls its expression levels – so it's a pretty good early indicator of iron deficiency or iron repletion. One caveat is that ferritin is an acute phase reactant, which means that its levels are increased in states of inflammation and infection, so we have to be careful when interpreting ferritin results because they can be affected by other conditions, which may even mask iron deficiency. Ferritin cut-off levels are a very important tool, especially the lower cut-offs for identifying iron deficiency. However, ferritin reference ranges vary a lot between laboratories, primarily because they are typically either established by those laboratories independently using reference interval (RI) studies or adopted and verified from vendor package inserts. Professional organizations also tend to have different cut-off levels. For example, the World Health Organization recommends a 15 ng/mL cut-off for ferritin, but a study of ferritin threshold recommendations found values ranging from 12 to 100 ng/mL, which is obviously a pretty wide spread for diagnosing iron deficiency or IDA. Another challenge is that ferritin RIs are usually sex-specific, as women have historically been reported to have lower serum ferritin concentrations than men, which is a big confounding factor. The majority of labs tend to use lower cut-off concentrations around 15 ng/mL, but it can vary pretty widely. The criteria for diagnosing anemia are a little more straightforward, by which I mean there is more standardization for defining anemia: hemoglobin values less than 13 g/dL for men and less than 12 g/dL for women.
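As a concrete illustration of how these thresholds work together, here is a minimal sketch in Python. It is our illustrative example, not the authors' or any laboratory's actual decision logic: the function name and rule structure are assumptions, while the numeric limits are those quoted in this interview (hemoglobin below 13 g/dL for men and 12 g/dL for women; ferritin lower limits of 15 ng/mL for women and 30 ng/mL for men, the current cut-offs at the authors' laboratory mentioned later in this interview).

```python
# Minimal sketch (illustrative only) of a first-pass screen using the
# sex-specific thresholds discussed in the interview. Real interpretation
# must also account for inflammation, which raises ferritin and can mask
# iron deficiency.

def screen_iron_status(sex: str, hemoglobin_g_dl: float, ferritin_ng_ml: float) -> str:
    """Classify a result against sex-specific hemoglobin and ferritin cut-offs."""
    anemic = hemoglobin_g_dl < (13.0 if sex == "M" else 12.0)
    low_ferritin = ferritin_ng_ml < (30.0 if sex == "M" else 15.0)
    if low_ferritin and anemic:
        return "consistent with iron deficiency anemia"
    if low_ferritin:
        return "consistent with iron deficiency"
    return "no iron deficiency flagged"

# Example: a woman with hemoglobin 11.5 g/dL and ferritin 12 ng/mL
print(screen_iron_status("F", hemoglobin_g_dl=11.5, ferritin_ng_ml=12.0))
```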
Why is there a need to reassess these cut-off levels?
KT I wish I could take credit for it. There's actually been a lot of literature on this topic lately, and I became really interested in the sex-specific cut-off values. Once I dug a little deeper, I found several studies reporting that the lower limit cut-off levels are far too low to detect iron deficiency with adequate sensitivity, especially in women. A lot of labs use limits below that 15 ng/mL concentration; for example, I found one laboratory using 8 ng/mL, which is considerably lower than the recommended limit. The reason for this is perhaps that they performed an RI study and decided that 8 ng/mL was normal in their population. However, many have argued that even 15 ng/mL might actually be too low for diagnosis in most people. For instance, some studies have shown that women with ferritin levels above 15 ng/mL can still have undetectable iron stores in the bone marrow, which is the gold standard for the diagnosis of iron deficiency. Obviously, taking a bone marrow aspirate is very invasive, so it's not typically done, but a study correlating those data found the 15 ng/mL cut-off value to be a little too low. Because iron deficiency is so prevalent, labs that either establish or adopt their RIs are most likely including a lot of people with subclinical iron deficiency in those studies. Hence, we are probably falsely lowering those cut-off values even further, which will contribute to underdiagnosis of these conditions. This is problematic, especially for women, who are more prone to iron deficiency, and also for clinicians who really depend on reference ranges for diagnosis and may be unaware that these values might not be the best tools for the job. As Dr McShane noted, it's really important to appreciate that these conditions are almost completely reversible with iron supplementation; in addition, if the patient has an underlying condition causing the iron deficiency, it's important to identify and treat that as well.
AMcS Foundationally, in lab medicine, we depend on RIs for a lot of the work we do. To determine an RI, we take a large number of outwardly healthy individuals and use the central 95% of the distribution of their results, from the 2.5th to the 97.5th percentile, as the RI. The issue with ferritin studies is that, because of the high prevalence of subclinical iron deficiency, people with iron deficiency will unfortunately end up being unknowingly included in those studies. Also, when the gold standard measurement is by bone marrow aspiration, that's not a study many people are going to line up for. This is how we arrive at the situation where ferritin analysis potentially has low sensitivity for the diagnosis of iron deficiency/IDA, especially in the female population.
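To make the percentile mechanics concrete, here is a minimal sketch of a traditional non-parametric RI calculation, and of how a nominally healthy cohort that unknowingly contains subclinically iron-deficient individuals drags the lower limit down. The distributions and sample sizes are fabricated purely for illustration.

```python
import numpy as np

def reference_interval(values: np.ndarray) -> tuple[float, float]:
    """Non-parametric RI: the central 95% (2.5th-97.5th percentiles) of a cohort."""
    lower, upper = np.percentile(values, [2.5, 97.5])
    return float(lower), float(upper)

rng = np.random.default_rng(0)
# Fabricated ferritin values (ng/mL): a truly healthy majority plus a
# "hidden" subclinically iron-deficient minority in the same cohort.
healthy = rng.lognormal(mean=4.0, sigma=0.6, size=950)
subclinical = rng.lognormal(mean=2.3, sigma=0.5, size=50)

print(reference_interval(healthy))                                 # clean cohort
print(reference_interval(np.concatenate([healthy, subclinical])))  # lower limit falls
```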
What is the evidence suggesting that new RIs could be better than the traditional ones?
KT As I mentioned, there's been a lot of concern recently about ferritin RIs, and many papers have recommended using functional reference limits for ferritin instead of RI studies. These are physiological cut-off concentrations determined in studies where both healthy and diseased individuals are included to cover the complete spectrum of disease. As iron deficiency is highly prevalent and a large proportion of cases are subclinical, these functional reference limits can be very helpful for diagnosis. We wanted to investigate what the cut-off concentrations would be in our population if we defined iron deficiency using different markers and then correlated that with patients' ferritin levels. We collected one year of data from outpatients who had had certain blood tests done: iron, total iron binding capacity, transferrin saturation, hemoglobin, and ferritin. We collected patient age, sex, and values for all of those tests and then divided the patients into groups: iron-replete patients (normal iron, transferrin saturation, and hemoglobin); iron-deficient patients (low iron and low transferrin saturation); and IDA patients (low values for everything, including hemoglobin). Firstly, we wanted to see if the ferritin RIs that we had defined using an RI study still hold true for our current population. This is important because you occasionally need to reassess your RIs to ensure that they are still applicable to the population you're serving. Fortunately, the RIs suggested by the two data sets matched almost exactly. Secondly, we wanted to assess whether the lower cut-offs we had defined are truly the best concentrations to use for the patients we defined as iron deficient or IDA. To do that, we performed a receiver operating characteristic (ROC) curve analysis: we divided our groups into true positives and false positives based on their ferritin levels and essentially determined what the ideal cut-off concentrations would be to capture the majority of patients with iron deficiency or IDA. This analysis showed us that our current RIs are not as sensitive as they could ideally be for diagnosing these conditions. For iron deficiency, sensitivity was about 15% for females and about 26% for males using our current cut-off levels (15 ng/mL for females and 30 ng/mL for males). The ideal cut-off values based on our study would be approximately 45 ng/mL for females and approximately 70 ng/mL for males, which do fall within the – admittedly large – range of 12–100 ng/mL mentioned earlier. For IDA, our ideal reference values would be 30 ng/mL for females (which follows the trend of current recommendations) and 75 ng/mL for males. Increasing these lower limits would improve sensitivity to greater than 50% for iron deficiency and greater than 60% for IDA. Based on what I have seen throughout this investigation, if you want one cut-off level to suit everyone, 30 ng/mL would probably be the most suitable. There are confounding variables in our study that we have to take into consideration, but it's interesting and reassuring that our result of 30 ng/mL matches the level now being recommended in the literature.
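For readers who want to see the mechanics, below is a minimal sketch of this kind of ROC-based cut-off selection in Python. It uses scikit-learn's roc_curve and Youden's J statistic (sensitivity + specificity - 1), one common criterion for an "ideal" cut-off; the exact criterion used in the study is not stated in the abstract, and the data here are fabricated purely for illustration.

```python
import numpy as np
from sklearn.metrics import roc_curve

def optimal_ferritin_cutoff(ferritin: np.ndarray, is_deficient: np.ndarray) -> float:
    """Pick the ferritin cut-off (ng/mL) that maximizes Youden's J on a ROC curve."""
    # Lower ferritin indicates deficiency, so negate the values so that
    # higher scores correspond to the positive (deficient) class.
    fpr, tpr, thresholds = roc_curve(is_deficient, -ferritin)
    youden_j = tpr - fpr  # sensitivity + specificity - 1
    return float(-thresholds[np.argmax(youden_j)])  # undo the sign flip

# Fabricated example: iron-replete vs iron-deficient ferritin distributions.
rng = np.random.default_rng(1)
ferritin = np.concatenate([rng.lognormal(4.0, 0.6, 500),   # replete
                           rng.lognormal(2.7, 0.5, 200)])  # deficient
labels = np.concatenate([np.zeros(500, dtype=int), np.ones(200, dtype=int)])

print(f"suggested cut-off: {optimal_ferritin_cutoff(ferritin, labels):.0f} ng/mL")
```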
AMcS Ours was a large study, with data from over 30 000 patients that we were able to analyze. However, there are still limitations – for example, the patients were not clinically diagnosed as being anemic or not. Additionally, as Katie mentioned, there are confounding variables: ferritin is an acute phase reactant and we didn't rule out inflammation for all of these patients. However, statistically, having a data set as large as ours should provide reasonable accuracy, given that we weren't able to address some of these limitations for each individual patient.
In the light of this analysis, what are the recommendations for new RIs?
KT The results from our study are consistent with new information in the literature suggesting a lower limit of at least 30 ng/mL for every patient – men and women included. However, what this study showed us is that we need to start re-evaluating these RIs and encouraging professional organizations to perform these types of analyses and come to a consensus for us as laboratorians. It makes our jobs easier when we can adopt a clinical guideline, right? We are just one clinic in this country and our data may not necessarily be representative of the entire population, but I think we are seeing a trend pushing for increasing those lower limits of normal for ferritin. Hopefully, we can advocate for professional organizations to conduct large-scale studies or convene expert panels to create consensus guidelines. It would also be great to include data on ethnic background and individuals from the transgender population to ensure that ferritin RIs are relevant for everyone.
AMcS I would say we need to lean on our vendors for better standardization of ferritin and its commutability. For instance, we do have a reference standard, but continued studies show that measurements of ferritin concentrations on platforms from different vendors give different results, and this lack of standardization between assays makes it difficult to provide and work to a standardized threshold. Until this happens, I will continue to lean on our vendors to re-evaluate ferritin RIs. It is possible that the traditional RI study is not appropriate, but the vendors have far more resources across the world for these kinds of analyses than individual laboratories.
How will this benefit patients and what further improvements might be on the horizon?
KT The main benefit for patients is that with more relevant RIs we can catch more cases early and so prevent progression of iron deficiency to anemia. Additionally, any underlying disease causing the iron deficiency or IDA can be identified and treated. Improved quality of life will be a big benefit of early diagnosis, especially for women. The symptoms are often non-specific – "I'm tired, I'm not feeling well" – and, compounded with a sub-optimal ferritin threshold, that can make prompt diagnosis challenging.
In the future, it would be great to have – as Dr McShane mentioned – better standardization of the ferritin assays. I’ve seen a lot of very recent momentum in this area, so hopefully we’re adding to those calls for that kind of change and standardization.
Reference
1. Troike K, McShane AJ. A-159 Re-evaluating ferritin cutoffs for the diagnosis of iron deficiency and iron deficiency anemia. Clin Chem 2024;70(Suppl 1):i59–i60 (https://doi.org/10.1093/clinchem/hvae106.157).
The authors
Dr Katie Troike, PhD
Pathology and Laboratory Medicine Institute, Cleveland Clinic, Cleveland, OH, USA
Email: kt305408@ohio.edu
Dr Adam McShane, PhD
Pathology and Laboratory Medicine Institute, Cleveland Clinic, Cleveland, OH, USA
Email: mcshana@ccf.org