The rising incidence and severity of food allergies in the industrialized world has been hard to miss since the late 1980s. Indeed, the CDC has reported that the prevalence of food allergies in children rose approximately 18% between 1997 and 2007.[i] Food allergies also appear to be becoming more common in adults.
Many theories have been put forth to explain this rise, yet only one seems to fully account for what we definitively know about food allergies, in terms of both their epidemiology and their pathophysiology.
Almost everyone feels on some level that the cultivation and consumption of genetically modified foods is bad, though because the long-term effects of eating genetically modified organisms (GMOs) are still poorly understood, no one can definitively say why. As a result, it is both easy and tempting to blame GMOs for virtually any current epidemic health problem, since in theory they could be causing virtually anything.
Since most people feel that GMOs are bad, and since the mind often demands a concrete reason on which to base its feelings, many people blame GMOs at least in part for the increase in food allergies. One glaring problem with this theory, however, is that many of the most common food allergens—milk, eggs, and peanuts—are not, nor have they ever been in the contemporary sense, genetically modified.
It becomes difficult to blame GMOs for why more and more children are developing life-threatening peanut allergies, to the extent that some schools are banning peanut butter outright, when in fact there do not as yet exist any genetically modified peanuts. Some claim that peanuts have been contaminated by other nearby GMO crops, cotton for instance, but the connection here remains tenuous at best.
Probably the most widely held theory regarding the rise in allergic conditions is the hygiene hypothesis.
Basically, it is believed that the modern environment is now so clean that our immune systems have grown “restless” and go looking for trouble, mounting exaggerated allergic responses to otherwise harmless triggers, as if for lack of anything better to do.
It was noticed, for instance, that the immunoglobulin E (IgE) antibodies that mediate allergic responses originally evolved to fight parasites, so it has been proposed that the scarcity of parasitic disease in the industrialized world has turned allergies into a sort of compensatory outlet for this unoccupied facet of the immune system.
Some support for this theory has come from the observation that allergic conditions, including food allergies, are more common in urban populations than they are in rural populations,[ii] implying that children who live in cities are not being exposed to all the healthy germs existing in the mud, animal excrement, etc., to which all of those filthy rural children who grow up on farms are exposed.
There are a couple of problems with this theory, though. First, it relies largely on ridiculous stereotypes rather than on hard scientific evidence. It is presumed that children who live in rural homes are “dirtier” and get exposed to more germs than those refined, cultured city children whose parents regularly sanitize their toys with disinfecting wipes, but given the ubiquitous presence of germs, can anyone be sure this is actually true?
Consider that many parasites are spread through human feces—one could easily argue that a typical doorknob in a typical urban daycare center would expose an urban child to a far greater diversity of pathogens than, say, a rural child pulling off a pair of muddy boots and then eating a sandwich without his parents giving him any hand sanitizer first.
Beyond ignorant stereotypical assumptions, there really is no reason to expect that rural children would be any less hygienic than their urban counterparts, although it remains true that children raised in rural environments develop fewer food allergies.
Another problem with the hygiene hypothesis is that, in truth, the body does not actually work in the way the hypothesis supposes. The main assumption behind the hygiene hypothesis is that in a more hygienic environment the T-helper 2 (Th2) immune response has nothing to keep itself occupied, so it responds by going haywire and becoming overactive.
And yet, every other structure and function in the human body would seem to follow the opposite pattern: atrophy with disuse. If you immobilize an arm in a cast, for instance, the muscles in that arm do not respond by cramping up and spasming uncontrollably—if they are not being put to use, they simply atrophy. This is because the body is very economical and doesn’t waste precious energy and nutrients maintaining a function no longer in use. A similar phenomenon can be observed when someone accustomed to drinking caffeine or alcohol abruptly stops.
The liver does not react to this by randomly attempting to over-detoxify otherwise nontoxic things because it simply has “nothing better to do.” It simply stops producing the enzymes it previously used to detoxify those substances; hence, any previous tolerance will slowly disappear. And yet, we’re expected to believe that if you remove the threat of parasites through hygiene, the immune system elements which normally address those threats will not simply diminish in proportion, but, instead, actually do the opposite and grow out of control?
Given this, and considering also that hygiene was recognized very early on as possibly the single most important factor in maintaining a state of good health (to the extent that the ancient Greeks literally worshipped the concept as a goddess, Hygieia, responsible for preventing human sickness), it seems difficult to accept that children are now fatally vulnerable to peanuts because they practice hygiene too effectively.
The recent surge of interest in the still poorly understood human microbiome has served to refine the hygiene hypothesis somewhat. Just as anything can theoretically be blamed on GMO foods, since their effects are poorly understood, so too can anything theoretically be blamed on a disturbed intestinal microbiome, because that too remains poorly understood. The newest iteration of the hygiene hypothesis states that being too clean has probably altered the composition of the gut microbiome, and it is through this disturbance that the rise in food allergies is ultimately mediated.
One problem that remains with this theory is that it still doesn’t fit the epidemiological data very well, as I plan to illustrate. Since attempting to experimentally induce food allergies in children would of course be unethical, epidemiological data are really the best we have to work with when it comes to unraveling the causes of food allergy.
The following are all hard facts regarding the epidemiology of food allergies, each supported by firm scientific evidence:
- Food allergies are more common in urban children than rural children.[iii]
- Food allergies are more common in children born at latitudes further from the equator.[iv]
- The incidence of food allergies in infants correlates with the season of their birth.[v]
- Food allergies are more common in children with darker skin.[vi]
To any thinking person, the above facts would suggest that it is not lack of filth in the environment which is predisposing these children to food allergies, but instead merely a lack of adequate sun exposure and vitamin D!
And indeed, another risk factor supported by scientific evidence is:
- Food allergies are more common in children with lower serum vitamin D levels.[vii]
The hygiene hypothesis might explain the urban-rural connection, but is anyone seriously suggesting that black children are any more or less hygienic than white children?
And yet, African ancestry has been identified as a major risk factor for developing food allergies in the industrialized world, as has living further from the equator, being born during autumn or winter, and having lower serum vitamin D levels. It seems far more likely that rural children have fewer food allergies not because they’re filthier and get exposed to more germs, but in fact because they are likely to spend more time outdoors in the sun.
The spike in life-threatening food allergies was first noticed in the late 1980s, and both its extent and its severity have only increased since then. It is intriguing to note how this timeframe coincides with the spread of a brand-new children’s toy: the home video game console.
The Atari 2600, the console that first brought video games into homes on a mass scale, appeared in 1977, and Nintendo’s Famicom—released in North America in 1985 as the Nintendo Entertainment System—debuted in Japan in 1983, revolutionizing the way modern children play. Since then, video games have only grown more popular, and with the subsequent advent of home computers and, later, tablets and smartphones, children are spending more of their playtime indoors than ever before.
Of course, if low vitamin D levels were contributing to children’s food allergies, one would also expect this theory to fit with the pathophysiological basis of food allergies.
Consider the following immunological facts:
- Vitamin D suppresses the maturation of dendritic cells, which initiate the process of allergic sensitization.[viii]
- Vitamin D enhances the maturation of regulatory T cells (Tregs), which suppress the allergic response.[ix]
- Vitamin D stabilizes mast cells by inhibiting interleukin-12 (IL-12) production in dendritic cells[x] and promoting interleukin-10 (IL-10).[xi]
- Vitamin D reduces IgE antibody production in B lymphocytes (B cells).[xii]
- Vitamin D strengthens the gut-mucosal barrier by increasing cadherin expression.[xiii]
In other words, vitamin D alters the function of the immune system at several levels, each corresponding to an overall decreased allergic response, and it strengthens the gut-mucosal barrier, whose integrity is crucial to preventing allergic sensitization to foods. In light of the above, the best explanation for the disturbing increase in the incidence of food allergies would seem to be that the pandemic level of food allergies in children is related to the pandemic level of vitamin D deficiency in children.
True, this is probably not the only contributing factor, but when it comes to treating food allergies and allergic conditions in general, the importance of vitamin D should neither be forgotten nor underestimated. It might well be the root cause.
[i] Branum, A. M., & Lukacs, S. L. (2008). Food Allergy among U.S. Children: Trends in Prevalence and Hospitalizations. NCHS Data Brief, No. 10. Accessed 2/27/19 at https://www.cdc.gov/nchs/products/databriefs/db10.htm
[ii] Gupta, R. S., Springston, E. E., Smith, B., Warrier, M. R., Pongracic, J., & Holl, J. L. (2012). Geographic Variability of Childhood Food Allergy in the United States. Clinical Pediatrics, 51(9), 856–861. https://doi.org/10.1177/0009922812448526
[iv] Osborne, N., Ukoumunne, O., Wake, M., & Allen, K. (2012). Prevalence of Eczema and Food Allergy is Associated with Latitude in Australia. Journal of Allergy and Clinical Immunology, 129(3), 865-867. https://doi.org/10.1016/j.jaci.2012.01.037
[v] Tanaka, K., Matsui T., Sato, A., Sasaki, K., Nakata, J., Nakagawa, T., Sugiura, S., Kando, N., Nishiyama, T., Kojima, S., & Ito, K. (2015). The Relationship between the Season of Birth and Early-Onset Food Allergies in Children. Pediatric Allergy and Immunology, 26(7), 607-613. https://www.ncbi.nlm.nih.gov/pubmed/26177863
[vi] Kumar, R., Tsai, H. J., Hong, X., Liu, X., Wang, G., Pearson, C., Ortiz, K., Fu, M., Pongracic, J. A., Bauchner, H., & Wang, X. (2011). Race, Ancestry, and Development of Food-Allergen Sensitization in Early Childhood. Pediatrics, 128(4), e821-e829. https://www.ncbi.nlm.nih.gov/pubmed/21890831
[vii] Allen, K. J., Koplin, J. J., Ponsonby, A. L., Gurrin, L. C., Wake, M., Vuillermin, P., Martin, P., Matheson, M., Lowe, A., Robinson, M., Tey, D., Osborne, N. J., Dang, T., Tina Tan, H. T., Thiele, L., Anderson, D., Czech, H., Sanjeevan, J., Zurzolo, G., Dwyer, T., Tang, M. L., Hill, D., & Dharmage, S. C. (2013). Vitamin D Insufficiency is Associated with Challenge-Proven Food Allergy in Infants. Journal of Allergy and Clinical Immunology, 131(4), 1109-1116. https://www.ncbi.nlm.nih.gov/pubmed/23453797
[viii] Suaini, N. H., Zhang, Y., Vuillermin, P. J., Allen, K. J., & Harrison, L. C. (2015). Immune Modulation by Vitamin D and Its Relevance to Food Allergy. Nutrients, 7(8), 6088-6108. doi:10.3390/nu7085271
[xiii] Kong, J., Zhang, Z., Musch, M. W., Ning, G., Sun, J., Hart, J., Bissonnette, M., & Li, Y. C. (2008). Novel Role of the Vitamin D Receptor in Maintaining the Integrity of the Intestinal Mucosal Barrier. American Journal of Physiology: Gastrointestinal and Liver Physiology, 294(1), G208-G216. Accessed at https://www.ncbi.nlm.nih.gov/pubmed/17962355