Open access peer-reviewed chapter
By Lauren C. Ramsay and Christopher V. Charles
Submitted: April 22nd 2014. Reviewed: August 21st 2014. Published: June 17th 2015.
Iron deficiency anemia (IDA) is the most prevalent micronutrient condition globally, with nearly 50% of anemia cases caused by iron deficiency according to the World Health Organization (WHO). While the condition does not discriminate between the developed and developing world, incidence remains higher in developing countries. In South-East Asia, the WHO reports the highest prevalence in the world: 65.5% of preschool-age children suffer from anemia, as do 48.2% of pregnant and 45.7% of non-pregnant women.
Because IDA is the most prevalent micronutrient condition in the world, a strong understanding of how to improve iron intake is critical. In many cases this may involve taking iron supplement pills; however, this is by no means the only approach in use. This chapter provides background information on iron deficiency and iron deficiency anemia (IDA), and reviews a number of strategies currently being used to address these conditions around the world. The review is by no means exhaustive, but aims to cover several studies in each of the major iron deficiency intervention strategies.
The word anemia is derived from the Greek ἀναιμία (anaimia), meaning “without blood”. Anemia is a deficiency of red blood cells and/or hemoglobin resulting in a reduction of the oxygen-carrying capacity of blood. An individual with a circulating hemoglobin concentration of less than 120 g/L is generally considered anemic, but the threshold varies with age and sex based on iron requirements, as shown in Figure 1 [1, 4].
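The age- and sex-specific diagnostic thresholds amount to a simple lookup. The sketch below is illustrative only: it uses the commonly cited WHO hemoglobin cut-offs (in g/L), which should be checked against Figure 1, and the group labels are invented for the example.

```python
# Illustrative anemia screen based on commonly cited WHO hemoglobin
# cut-offs (g/L). Values should be verified against Figure 1; the
# group names are hypothetical labels for this sketch.

WHO_THRESHOLDS_G_PER_L = {
    "child_6_59_months": 110,
    "child_5_11_years": 115,
    "child_12_14_years": 120,
    "non_pregnant_woman": 120,
    "pregnant_woman": 110,
    "man": 130,
}

def is_anemic(group: str, hemoglobin_g_per_l: float) -> bool:
    """True if hemoglobin falls below the WHO cut-off for the group."""
    return hemoglobin_g_per_l < WHO_THRESHOLDS_G_PER_L[group]

print(is_anemic("non_pregnant_woman", 115))  # True: below the 120 g/L cut-off
print(is_anemic("man", 135))                 # False: above the 130 g/L cut-off
```

In practice a screen like this is only a first step; as the text notes, severity grading and iron-status biomarkers refine the diagnosis.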
IDA is caused by a long-term iron imbalance that depletes bodily iron stores over time [2, 5]. Iron is a necessary component in the production of red blood cells, so the absence of sufficient supplies decreases hemoglobin production and, subsequently, red blood cell production [2, 5, 6]. The focus of this chapter is IDA, which is typically a result of inadequate dietary intake but can also result from excessive blood loss due to trauma or post-partum hemorrhage, or from other conditions such as parasitic infections, malaria, and inherited hemoglobin disorders [5, 6].
There are several risk factors for IDA, which have resulted in an uneven distribution of prevalence around the world, with the highest concentration in developing countries in Africa and Asia. The most commonly reported risk factors include: poverty [4, 7]; local dietary staples, such as rice, with low bioavailability of iron; genetic hemoglobinopathies; consumption of untreated water; sex, with females typically more likely to experience IDA than males; low parental educational attainment [10-12]; maternal anemia; and food insecurity. The determinants of IDA are clearly complex, involving social, ecological, biological and socioeconomic factors. Across each study, however, poverty is the most salient predictor.
The signs and symptoms of IDA vary among individuals, but the condition always has adverse effects. Pregnant and postpartum women and young children are most susceptible to iron deficiency and IDA because of the high iron demands of growth and pregnancy. These groups also experience some of the most severe symptoms.
Symptoms of IDA may include diminished work capacity, immune system dysfunction, neurocognitive impairment, dizziness, fatigue, and pallor [2, 6, 7]. Severe anemia can significantly reduce a woman’s ability to survive childbirth due to bleeding during and after delivery [14, 15]. Additionally, anemic pregnant women are at higher risk of preterm delivery. For these reasons, anemia is considered a major contributor to maternal mortality in the developing world; the WHO reports that anemia is a factor in 20% of maternal deaths.
Children are also highly susceptible to the adverse effects of IDA and experience additional signs and symptoms if left untreated. Anemic children may experience height and weight disturbances, slowed learning and behavioral development, and interruptions in physical or mental growth. IDA can affect neurocognitive development, resulting in reduced psychomotor and cognitive abilities in children [2, 17]. These are most often measured with standardized measures of mental development, cognitive function tests, psychomotor scales, and educational achievement tests. The impaired cognitive and psychomotor development experienced by children affected by iron deficiency or IDA is speculated to have a causal relationship with future earning potential, though study periods have not yet been long enough to quantify this. The mechanism is thought to be the direct relationship between iron and oxygen delivery to the brain and muscles.
Iron is a critical component of many metabolic functions, particularly the delivery of oxygen throughout the body as an essential component of red blood cells; the majority of human iron stores are found in hemoglobin. Iron is present in every human cell and plays a critical role in many biochemical reactions, including oxygen transportation, energy production, cellular respiration, DNA synthesis and the production of dopamine and serotonin, which are essential to neurotransmission [7, 14, 20, 21]. The storage and transportation of iron are highly dependent on red blood cell morphology and the presence of sufficient hemoglobin. Iron is toxic when too much is stored in the body and may cause tissue damage once storage capacity is exceeded [5, 19]. This problem can be compounded by the very slow rate at which humans naturally lose iron.
Dietary iron comes in two forms: heme and non-heme. Heme iron, which is more easily absorbed, comes from meat, poultry and fish; it is easily absorbed because it is delivered as the stable porphyrin complex and is unaffected by other food components. In contrast, non-heme iron is more difficult to absorb and comes from cereals, legumes and some vegetables.
Non-heme iron from food comes in several different chemical forms, and its absorption depends on both the form and the body’s ability to recognize and absorb the iron. For example, ferric (Fe3+) iron is much more dependent on stomach pH than ferrous (Fe2+) iron; ferric iron requires a more acidic environment to be absorbed. Thus, consuming vitamin C with dietary iron or iron supplementation can be a critical step toward absorbing the maximum amount of elemental iron possible [5, 23]. Non-heme iron absorption is further complicated in the presence of tannins (e.g. in tea) or phytate (e.g. in whole grains), which are known to inhibit absorption [5, 21].
Infection with soil-transmitted intestinal helminths is a common problem among populations living primarily in rural areas of low-income countries. These parasites are a large, polyphyletic grouping of multicellular organisms that can generally be seen with the naked eye in their mature stages. Helminths typically include roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura) and the hookworms (Necator americanus and Ancylostoma duodenale). Some estimates suggest that more than 2 billion people worldwide are infected with one or multiple helminths, with the highest prevalence occurring where sanitation is poor and water supplies are compromised.
Soil-transmitted helminths live in the intestine of an infected person, and are easily spread when a person comes into contact with the feces of an infected individual. This contact might result from the use of gardens, bushes or fields as an open latrine, or from the use of human feces as fertilizer that is intentionally sprayed onto crops and deposited into the soil. A new host is established when an uninfected person then unknowingly ingests the eggs of these parasites.
The association between parasite infection and anemia is well known and widely documented. Primarily, the organisms cause gastrointestinal bleeding, resulting in blood loss, lowered hemoglobin values, and resultant anemia. Further, some organisms disrupt nutrient absorption by damaging the mucosal surface of the gut, leading to poor absorption of micronutrients. For example, the two significant hookworm species, Necator americanus and Ancylostoma duodenale, produce 5000-10,000 and 10,000-25,000 eggs per day, resulting in 0.03 mL and 0.15-0.23 mL of blood loss per worm per day, respectively. In addition, some hookworms release anticoagulant factors that ensure continuous blood flow, further worsening health outcomes.
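To put these per-worm figures in perspective, they can be converted into an approximate daily iron loss. The sketch below is illustrative only: the per-worm blood-loss estimates come from the text, while the blood hemoglobin concentration (150 g/L) and the iron content of hemoglobin (about 3.4 mg per gram) are standard physiological approximations, not values from the cited studies.

```python
# Rough estimate of daily iron loss from a hookworm infection.
# Per-worm blood-loss figures are from the text; the hemoglobin
# concentration and its iron content are standard approximations.

BLOOD_LOSS_ML_PER_WORM_PER_DAY = {
    "Necator americanus": 0.03,
    "Ancylostoma duodenale": 0.20,  # midpoint of the 0.15-0.23 mL range
}

HEMOGLOBIN_G_PER_L = 150.0        # typical blood hemoglobin concentration
IRON_MG_PER_G_HEMOGLOBIN = 3.4    # hemoglobin is roughly 0.34% iron by weight

def daily_iron_loss_mg(species: str, worm_burden: int) -> float:
    """Estimated iron lost per day (mg) for a given worm burden."""
    blood_ml = BLOOD_LOSS_ML_PER_WORM_PER_DAY[species] * worm_burden
    hemoglobin_g = blood_ml / 1000.0 * HEMOGLOBIN_G_PER_L
    return hemoglobin_g * IRON_MG_PER_G_HEMOGLOBIN

# A moderate burden of 50 N. americanus worms loses roughly 0.8 mg of
# iron per day, a meaningful fraction of typical daily iron absorption.
print(round(daily_iron_loss_mg("Necator americanus", 50), 2))
```

Even this crude arithmetic shows why heavy A. duodenale infections, with several times the per-worm blood loss, are so strongly linked to anemia.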
Malaria is a well-known cause of morbidity and mortality in the developing world, caused by infection with parasites of the genus Plasmodium. Estimates of the burden of disease suggest that more than 515 million episodes occur annually, representing 18% of all childhood deaths in Sub-Saharan Africa [28, 29]. Anemia caused by malaria has a multi-factorial pathophysiology whose molecular mechanisms remain unknown. Research has shown that malaria is implicated both in the destruction of red blood cells, through hemolysis of infected and uninfected erythrocytes, and in decreased production of erythrocytes in the bone marrow. Malarial anemia is therefore typically normocytic and normochromic, with a distinct absence of reticulocytes.
Severe malarial anemia is of great public health concern because of the widespread prevalence of malaria in the developing world, where access to appropriate healthcare is limited, and because children and pregnant women are hardest hit by the condition. Malarial anemia is most often observed in areas of high malarial transmission. Severe anemia caused by P. falciparum, one species of malaria parasite, is responsible for approximately one-third of deaths associated with the disease.
Treating iron deficiency and iron deficiency anemia must continue to be a priority of governments, non-governmental organizations and international aid agencies alike. The ideal approach to addressing iron deficiency is a nutritious and well-balanced diet that naturally provides adequate iron intake. However, this is not a reality in much of the developing world, where subsistence comes from cereals with low iron bioavailability. This generates the need for a cost-effective, simple and easy-to-administer solution, which may be found in supplementation and fortification. Food fortification is widely considered the most cost-effective approach to treating IDA, but it is not without its challenges. While fortified products are widespread in the developed world, often backed by government regulations mandating fortification of common food items (e.g. milk, breakfast cereals, and salt), this strategy can be less effective in the developing world. Even the smallest price increase in staple products may deter their purchase and use. With escalating food prices, which peaked in the 2008 food crisis, households in developing countries were spending large shares of their incomes on food and commonly had to switch to cheaper but less nutritious foods to combat hunger. In such a climate, interventions that depend on households purchasing fortified foods may be more difficult to implement.
Several fortification approaches are currently being implemented; one reference describes three methods: 1) fortificants added during food processing (e.g. wheat flour); 2) fortification added at home during food preparation (e.g. multiple micronutrient powders); and 3) genetically engineered foods with enhanced nutrition (e.g. genetically engineered cereals). In general, iron fortification has been very successful around the world; a meta-analysis of randomized controlled trials for iron deficiency alleviation assessed 13 studies on iron fortification for women and 41 for children. These studies included fortification via sodium iron ethylenediaminetetraacetic acid (NaFeEDTA), ferrous sulfate, fortified candies, fortified curry powder and ferrous pyrophosphate. All of the iron fortificants studied significantly improved hemoglobin concentration and reduced anemia in study populations. A separate meta-analysis of 60 studies showed similar results: iron-fortified foods are efficacious in increasing hemoglobin concentrations and reducing IDA.
The Lancet Series (2013) on maternal and child nutrition reported a 67% reduction in iron deficiency anemia during a trial of iron supplements in pregnant women. Additionally, in non-pregnant women, a review of studies showed that intermittent iron supplementation reduced the risk of anemia by 27%. The same review indicated that daily iron supplementation reduced the incidence of low birth weight by 19%. Despite the high success rates of iron supplementation, distribution and cost are major concerns and make compliance with iron deficiency reduction strategies difficult to maintain. Iron supplementation, often in the form of a pill or liquid, is an expensive option for treating iron deficiency and IDA, though it delivers much more elemental iron than fortification.
Apart from the difficulties of program implementation, a cost-effectiveness analysis conducted in four subregions of the world (African, South American, European and South-East Asian) compared the cost-effectiveness (in USD) of iron supplementation versus fortification. This study showed that in the developed world supplementation was more cost-effective, owing to well-developed distribution chains already in place. Despite the higher overall impact of iron supplementation on a population’s health, however, fortification is more cost-effective in rural and developing communities.
The use of micronutrient-fortified candies and lozenges has been tested in some countries, including India and Indonesia. Relatively few randomized controlled trials (RCTs) have assessed the efficaciousness of this intervention compared to many of the other programs reviewed in this chapter.
Nutri-candy was developed as a strategy to alleviate iron deficiency and other micronutrient deficiencies in children aged 2-6 years, pregnant and lactating women, and adolescent girls in the developing world.
In India, where various manufacturers produce nutri-candy, more than five million people in four states take a fortified candy every day at a cost of US $1.33 per person per year, a figure that includes transportation. In West Bengal, a study showed that the use of nutri-candy reduced the prevalence of iron deficiency anemia by 15%. In Haryana, another study found that the prevalence of anemia fell from 50% at baseline to 9.6% in the group receiving a daily nutri-candy.
In Indonesia, a similar study tested the impact of Vitella, a chewy fruit candy fortified with multiple micronutrients, on iron levels in children aged 4-6 years. The study found that the hemoglobin concentration of children receiving the candies increased by 10.2 g/L, and anemia prevalence decreased by 42.1%, after 12 weeks of intervention.
Based on these few studies in India and Indonesia, the use of fortified candies should be further researched and incorporated into national strategies to alleviate iron deficiency. All studies reported high acceptability of the product from both caregivers and children [38-40]. If incorporated into school programs or other national strategies, fortified candies may be very effective in reducing the burden of iron deficiency in low-income countries.
Certain foods can be fortified after harvesting to deliver more iron at consumption, and some can be genetically modified and bred to contain more iron. The universal fortification of foods has been promoted as a strategy to address micronutrient deficiencies around the world; for example, in Canada and the United States it is a legal requirement to fortify milk during processing with Vitamins A, C and D [41, 42]. Universal food fortification is viewed as an efficacious and cost-effective strategy to address IDA.
To use rice as an example, some strains have been genetically modified to contain greater amounts of iron, although research in this area is still relatively new. This can be achieved by introducing genes that increase the storage of iron in polished white rice; one approach adds a gene from soybeans that introduces the iron-storage protein ferritin to rice, and studies have shown that this source of iron is bioavailable.
When rice is polished it loses much of its natural iron, so a second fortification method applies an edible coating that improves the available iron in the rice. A third way to fortify rice uses a product called Ultra Rice, a high-iron simulated rice grain that can be mixed in with regular rice and is highly accepted, with no reports of bad texture or taste [43, 47].
During a randomized controlled trial in the Philippines, iron-fortified rice was proportionately mixed with regular rice to provide more available iron per serving. Hemoglobin concentrations of children in the study sites significantly increased from baseline to endline, and the prevalence of anemia significantly decreased by 4.7%. In Mexico, when female factory workers were given fortified rice five days a week for six months, iron status improved significantly and the prevalence of anemia was reduced by 80%. Notably, a study in Brazil found that the use of fortified rice only once weekly in child-care centres brought about similar results. In addition to improved iron status, another study in Brazil found that participants and their families exhibited a very high rate of acceptance of fortified rice as part of their diet, with no adverse effects or undesirable taste, colour or smell. Iron-fortified rice, with proven effectiveness in improving iron status and high acceptability, may be an important part of national nutrition strategies.
Wheat flour and maize flour can also be fortified to be more iron-rich, and may be useful tools in addressing iron deficiency in communities where large amounts of these flours and their products are consumed. The WHO supports the fortification of wheat and maize flour, but expects it to be most effective in reducing iron deficiency when mandated at the national level.
A trial assessing the efficacy of fortified wheat flour found a lower prevalence of iron deficiency in women who consumed the fortified flour more regularly. In a study of Indian school children given iron-fortified whole wheat flour, the prevalence of iron deficiency significantly declined from 62% to 21% after seven months.
Iron-biofortified pearl millet has also been tested as an approach to addressing iron deficiency in communities of developing countries. A study in Benin found that consuming iron-biofortified pearl millet can double the absorption of iron in women and may be a highly effective approach to combating iron deficiency in millet-eating communities.
Despite the improvement in hemoglobin concentration in trial communities, there were challenges and limitations to this approach to addressing iron deficiency and IDA. Primarily, the cost to households of switching to fortified products proved to be a challenge. For example, in the Philippines, when the government flooded the market with unfortified rice, the population began purchasing the cheaper, unfortified variety rather than the slightly more expensive iron-fortified rice, despite the added benefit. The other studies reviewed did not test the sale of fortified products in markets, so it is uncertain how this would work in different countries. Higher costs could well be a barrier to sales of fortified rice, and the strategy may be difficult to sustain in a competitive market environment.
It is possible to fortify condiments such as fish sauce and soy sauce, which are commonly consumed in Asian countries. A study of iron-fortified Thai fish sauce showed good iron absorption when ferric sulfate-fortified sauce was added to a meal of rice and vegetables, suggesting that fortified fish sauce may be an effective solution to combat iron deficiency in countries with high fish sauce use. A similar study conducted in Switzerland assessed NaFeEDTA-fortified fish sauce and found it to be a potential fortification method for addressing iron deficiency and iron deficiency anemia. In Cambodia, fortified fish sauce and soy sauce are becoming more salient in the iron deficiency discussion, and the government intends to legislate the fortification of these condiments by 2015. In Vietnam, where fish sauce is consumed as part of the regular diet and produced locally, a study found that women using iron-fortified fish sauce for six months had hemoglobin concentrations an average of 8.7 g/L higher than women who used regular fish sauce. However, challenges remain in fortifying condiments, including a change in the colour of the sauce due to the added fortificants, as well as political challenges in monitoring and rolling out this type of initiative [54, 56].
While the cost-effectiveness ratio of this type of fortification has been positive, these analyses do not include the costs of the potential health implications of high-sodium diets. Sodium is known to cause many adverse health problems, including hypertension, increased risk of cardiovascular disease and increased risk of stroke. It is possible that this type of intervention may promote excessive salt intake, and the negative health impacts may outweigh the benefit of increased iron intake. In fact, the WHO recommends that “the use of salt as the vehicle for new fortification initiatives other than iodine and fluoride should be discouraged.”
Micronutrient powders (MNPs) are fortified powders containing at least two micronutrients. These powders are added to food being prepared in the home and then consumed, providing the benefits of the micronutrients they contain. This fortification method has received a great deal of attention in the nutrition world, because micronutrient deficiencies are prevalent in both developed and developing countries and this strategy has the potential to address multiple deficiencies at once.
Sprinkles, a brand of MNP, were developed as a fortification method: a powder with iron and various micronutrients that can be ‘sprinkled’ on complementary foods before being fed to children [61, 62]. Various studies have shown that this is an effective method of increasing hemoglobin concentration and treating iron deficiency anemia in children [62-66]. In Cambodia, two studies demonstrated a significant decline in iron deficiency anemia among groups of children on a daily regimen of Sprinkles versus a control group with no intervention [67, 68]. A study in Kenya assessed Sprinkles in a more real-world setting by selling the product in local markets and found that the sachets remained efficacious in reducing iron deficiency. In India, where IDA is incredibly prevalent, the use of MNPs significantly reduced levels of IDA in children after 24 weeks of treatment. In a meta-analysis of 17 studies, micronutrient powders were shown to significantly reduce the prevalence of anemia by 34% and of iron deficiency anemia by 57%, and to improve hemoglobin concentration. These results came from studies in various countries, including Ghana, India, Bangladesh, Kenya, Pakistan and Haiti, representing a number of regions of the world.
While the randomized clinical trials testing various MNPs have been successful, these all took place in highly controlled situations. Studies have consistently shown success in improving biomarkers such as hemoglobin concentration and in reducing anemia prevalence; however, adherence to such programs in a less-controlled environment may differ. Distribution, cultural cooking practices, and cultural perceptions of MNPs may affect the effectiveness of such programs, though in the case of MNPs there is potential for the intervention to remain efficacious. Micronutrient powders were accepted by caregivers in many communities around the world because they can be incorporated into regular feeding practices [64, 68]. A study of Sprinkles adherence conducted in Bangladesh showed that adherence was higher, and hematological improvements greater, in groups who received more flexible instructions for using 60 Sprinkles packets, rather than being told to use them daily. Further research into adherence and acceptability would be useful for developing strong implementation strategies for different regions of the world.
Early studies demonstrated that cooking food in cast iron pots increases the iron content of certain foods and that this iron is bioavailable. In the face of the global burden of iron deficiency, researchers believed that promoting the use of cast iron pots could be efficacious in reducing the incidence of iron deficiency and iron deficiency anemia. Studies have since assessed the effectiveness, cost-effectiveness and acceptability of this approach.
A laboratory study assessed the impact of different pots on the iron content of green leafy vegetables. It demonstrated that the material of the pot plays a significant role in the bioavailable iron in green leafy vegetables: those cooked in iron utensils had 9% more available iron than those cooked in non-iron pots. There is still some debate over whether the iron that leaches into food is bioavailable, and more extensive analysis is needed of iron availability in other staple foods such as maize and rice.
The main limitation of using cast iron pots to improve iron status is the low acceptability observed during randomized controlled trials [75-77]. Cast-iron pots were reported to be heavy, to rust easily, and to require more attention during cooking because they reach higher cooking temperatures. One study found that participants were selling the cast iron pots in the market to supplement low family incomes. On the other hand, cast iron pots required less wood for stoves since cooking times were faster, and the pots were considered very durable.
Despite increases in hemoglobin concentrations and a reduction in anemia in trial communities, the use of cast iron pots is not, in most cases, an effective strategy for addressing iron deficiency and IDA. In communities where cast iron pots are already in common use, however, promoting continued or increased use has potential to be part of a larger strategy to eliminate iron deficiency and IDA, because these communities already accept the pots.
The Lucky Iron Fish, an intervention based on the same principles as cooking with a cast iron pot, was shown in a randomized controlled trial to be effective in increasing hemoglobin concentrations by 11.6 g/L and halving anemia in the study population, compared to the control group, after 12 months. The method involves boiling water or cooking soup with the Lucky Iron Fish (an iron ingot shaped like a common Cambodian fish) for 10 minutes and adding some form of ascorbic acid (most commonly citrus juice) [80, 81].
While this approach has the potential to be effective in addressing iron deficiency, it has only been tested in women of reproductive age, so it is not certain that the Lucky Iron Fish will provide the right amount of iron for children. The clinical trial included only 6 pregnant women; future research could recruit larger numbers of pregnant and lactating women to test efficaciousness during these times of high iron demand. Furthermore, a better understanding of the type of iron that leaches into water and foods would be beneficial, as would further testing of fortifiable foods. The remaining challenges predominantly surround acceptability and education, as this intervention requires significant behavior change in home practices. Research on the acceptability, adherence and cost-effectiveness of this strategy should take place so it can be carefully compared with alternate interventions for inclusion in a national nutrition strategy.
In Cambodia, it was reported in 2006 that 63% of infants and young children suffered from anemia. Current projects addressing iron deficiency in Cambodia include: government-supported supplementation for pregnant and post-partum women; Sprinkles; weekly iron-folic acid supplements for women of reproductive age; helminth control; fortified foods such as rice and fish sauce; and the Lucky Iron Fish [82-84]. However, the official government stance, as advised by the WHO, is for a supplementation program, and thus no official fortification program is currently in place or widely accepted by the numerous government and non-government groups working to address nutrition issues in Cambodia. Recent research assessing the possibility of iron-fortified condiments has put fortification on the government’s radar and is speculated to result in a change in government legislation. Iron deficiency and its associated anemia remain salient health issues in Cambodia, and looking ahead, a more multi-faceted approach may be necessary to successfully combat this challenge.
The following guidelines are followed by Health Centres in Cambodia: first contact during pregnancy, 60 iron/folic acid tablets provided; second contact during pregnancy, 30 tablets provided; post-partum, 42 tablets provided.
Iron supplementation is known to be incredibly effective due to the high bioavailability of the iron [85-87]. This is especially important during pregnancy, when women’s iron demands are greater and the risks more severe [1, 87]. Daily iron supplementation is the most effective strategy for protecting iron stores in women during pregnancy, but large doses of iron are also associated with negative side effects. These side effects may cause problems with compliance, undermining the effectiveness of such programs. There is currently an urgent need to evaluate the government’s supplementation program, as no studies have reported its effectiveness in at-need populations or monitored changes in iron deficiency and IDA.
In two focus groups held in February 2014 in the remote Preah Vihear province of northern Cambodia, women (approximately 30 in each of the villages of Koh Ker and Ker) reported that less than 5% of participants had visited a health centre during their pregnancy. Collectively, the women reported that they were aware that their community health centre would have provided free iron supplements during pregnancy and after delivery. Despite this, the women chose, whether by personal preference or by circumstance, to deliver in their homes with a local midwife from their village. In the case of the two villages visited, the health centres are far from the villages and expensive to travel to, and the roads are barely passable, becoming even more difficult during the rainy season (June-November). This suggests that in some parts of Cambodia government initiatives may not be reaching marginalized populations, the populations that demonstrate the greatest need.
There are significant differences between fortification and supplementation with regard to economic feasibility, government involvement and effectiveness in improving health. In many developing countries, fortification has the potential to be a more cost-effective and efficacious treatment for iron deficiency and iron-deficiency anemia than supplementation. While iron supplements should remain a critical component of a national strategy to alleviate iron deficiency and its associated anemia, particularly in times of high iron demand such as pregnancy, a multifaceted approach should be developed to combat this micronutrient deficiency more effectively.
Each of the fortification strategies reviewed in this chapter was efficacious in improving iron status in a controlled environment. In many cases, however, this does not guarantee that adopting these strategies will be effective in every community. Adherence and acceptability are critical to the success of these interventions, yet most RCTs do not assess them, as the controlled settings required for trials make this difficult. As with many other development strategies, the appropriate approach will vary across, and even within, countries, which makes developing guidelines for the treatment of ID and IDA difficult. What works in one country will likely not work in exactly the same way in a neighbouring country, and the same can be said from one village to the next. Even so, some of the interventions reviewed in this chapter demonstrated greater success across different regions of the world than others. The use of cast-iron pots, for example, did not prove to be an acceptable strategy in most countries due to the preference for lighter, easier-to-clean aluminum pots.
Multi-micronutrient powders (MNPs) have proven consistently effective in RCTs at alleviating iron deficiency in children. Both government distribution channels and the sale of MNP sachets in a market setting have been successful. A limitation of this approach is that few studies have tested the efficacy of MNPs in treating ID and IDA in groups other than children. In the few studies testing the effectiveness of MNPs in pregnant women, the results were modest, and it has been suggested that MNPs alone are not adequate to address maternal iron deficiency. Challenges with this approach are still being addressed, but it has the potential to be efficacious as part of a national strategy. Since women of reproductive age are in great need of iron fortification programs, multi-micronutrient powders should not be the sole component of a national approach, but could be included as part of a strategy to improve children's iron status.
Similarly, biofortified staples such as rice and wheat flour were highly acceptable and effective in decreasing ID and IDA. Future research could focus on cereals beyond rice and wheat flour, and on the possibility of enriching them with iron either during the breeding and growing of a crop or during processing. Further research on genetically modifying rice to contain more iron is required; otherwise, the iron must always be added during processing and risks being rinsed off when the rice is washed. Biofortified staples can be very sensitive to market prices, which could be a serious limitation in competitive market settings. This approach would work best if governments are able to pass legislation mandating that all of the main staple products in a region be fortified with iron. The associated challenges are considerable, but the potential positive health outcomes are vast.
Ultimately, for any of these fortification strategies to be implemented successfully, extensive education and health behavior campaigns must take place. Properly planned education materials and outreach can promote adherence and acceptability at the community level. Unlike hunger, iron deficiency often cannot be felt immediately by an individual, which is why it, along with other micronutrient deficiencies, is often called "hidden hunger." This matters for education and behavior change because preventive behaviours that are not yet routine (such as washing one's hands or taking an iron supplement) may not seem credible or urgent when there are no immediately visible health concerns. Communicating the future implications and less tangible health impacts of iron deficiency, in order to promote healthier behaviours, may therefore pose a challenge.
One of the largest and most difficult questions is then: who is responsible? At this point in the battle to alleviate iron deficiency and IDA, the most dangerous thing that can be done is nothing. Inaction in the face of this epidemic will benefit no one, and the negative health consequences of iron deficiency will persist. The consequences of iron deficiency for cognitive and physical development, and its threat to maternal health, are severe, and iron deficiency will remain the most prevalent nutrition issue in the world without the combined support of governments, aid agencies and non-governmental organizations alike.