Mismatch theory is a concept in evolutionary biology describing situations in which a trait that evolved under one set of environmental conditions becomes poorly suited, or even harmful, when those conditions change.
The essence of mismatch theory is that organisms possess traits (behavioral, emotional, and biological) that have been passed down through generations, preserved by natural selection because of their adaptive function in a given environment. However, the environment of the evolutionary period can differ markedly from the current environment. Traits that were once adaptive in a certain environment therefore become "mismatched" to the environment in which the organism now lives, which can create a number of problems for the organism. One example is the human taste for foods high in fat and sugar. In Pleistocene environments, sugars and fats were relatively uncommon in the human diet; in the modern Western diet, such foods are easy to acquire. The abundance of these foods, combined with the evolved human preference for them, can and often does contribute to obesity and chronic metabolic syndrome.
The role of evolutionary mismatch in disease has been explored by a group of researchers at Duke University. They theorize that mismatches such as vitamin D deficiency caused by lack of regular exposure to sunlight, chronic stress, and particularly the elimination of keystone organisms such as geohelminths from the human body play a part in the development of autoimmune diseases. These diseases are rare in undeveloped countries where people live in conditions that more closely reflect humankind's biological experience over millennia.
Mismatch in Human Evolution
Humans most recently evolved as hunter-gatherers, in environments where food resources were scarce. In the roughly 10,000 years since the advent of agriculture, the human environment and lifestyle have changed so considerably that evolution has not kept pace. This leads to the notion of mismatch in humans: the legacy of traits acquired in hunter-gatherer times contributes to many modern-day problems such as myopia, breast cancer, diabetes, and osteoporosis, in addition to the oft-discussed rise in obesity.
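The claim that roughly 10,000 years (about 400 human generations) is too short for selection to undo formerly adaptive traits can be illustrated with a toy single-locus selection model. This is an illustrative sketch, not a model from the source; the function name and all parameter values (initial frequency, selection coefficient, generation time) are assumptions chosen for demonstration:

```python
# Toy deterministic haploid selection model: frequency of a formerly
# adaptive allele after the environment changes and the allele becomes
# mildly deleterious. Recursion: p' = p(1 - s) / (1 - s*p), where s is
# the selection coefficient against the allele and 1 - s*p is mean fitness.

def allele_frequency_after(p0, s, generations):
    """Return the allele frequency after the given number of generations."""
    p = p0
    for _ in range(generations):
        p = p * (1 - s) / (1 - s * p)
    return p

# Assumed values: allele near fixation (p0 = 0.95) when agriculture begins,
# weak selection against it afterwards (s = 0.001), ~25-year generations.
generations = 10_000 // 25  # ~400 generations since the advent of agriculture
p_now = allele_frequency_after(0.95, 0.001, generations)
```

Under weak selection (s on the order of 0.001), the allele remains common after 400 generations, consistent with the idea that recent environmental change has outpaced genetic adaptation; only under much stronger selection would the frequency fall appreciably in that time.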
Myopia, or nearsightedness, has become a very prevalent disorder in modern humans. Up to 80% of people in some developed Asian populations suffer from myopia, and up to 30% of people of European descent are affected by the disorder. Some have suggested that the mismatch hypothesis may explain this rise. In hunter-gatherer times, a predisposition for nearsightedness would have been heavily selected against, given the importance of good vision. However, the lifestyle and environmental conditions of that era may have prevented the predisposition from manifesting as the disorder, allowing the alleles that predispose to myopia to evade natural selection. Research has found that the development of myopia is likely correlated with years of traditional, modern schooling, during which children often strain their eyes from a very young age. Because modern environments require schooling, the predisposition for myopia often develops into the disorder, whereas this was not the case in hunter-gatherer times, when children's eyes were rarely strained to the degree they are today. In this way, the rise in myopia can be attributed to the mismatch hypothesis.
The increased prevalence of diabetes in modern times may also be attributed to mismatch theory. The human diet has changed considerably over the 10,000 years since the advent of agriculture. Whereas hunter-gatherers struggled to find foods high in sugar and fat, modern diets are full of both. Human evolutionary history selected for an intense craving for high-sugar, high-fat foods, because these foods are rich in calories and would have provided a hunter-gatherer with much-needed energy. Because such foods were scarce, however, ancient human bodies were adapted to keep most ingested sugars in the bloodstream, maintaining an easy flow of sugar from the bloodstream to the brain. In hunter-gatherer times, a defect in the insulin receptor may therefore have been adaptive: with a non-functioning insulin receptor, a cell's glucose transporter would not work, and sugars would remain in the bloodstream rather than being transported into cells. With this in mind, a predisposition for diabetes may actually have carried a selective advantage during hunter-gatherer times.
In modern human environments, however, sugars and fats are readily available. The evolutionary history that may have predisposed humans to non-functioning insulin receptors is now non-adaptive and leads to the development of diabetes. The mismatch hypothesis, via this substantial change in diet and in the availability of fat and sugar, can partially explain the marked increase in diabetes in modern human populations.
In addition to changes in diet, changes in hygiene may also be responsible for the increase in diabetes. The modern preoccupation with hygiene and antibiotics has changed the composition of the human gut microbiome, which is responsible for much of the digestion of our food. Research has found that in mice treated with antimicrobials to eliminate much of the gut microbiome, a high-fat diet is much more likely to result in obesity and diabetes than in untreated mice on the same diet. Thus, the combination of changes in diet and hygiene may be responsible for the increased prevalence of diabetes in modern human populations, a result that can be explained by the mismatch hypothesis.
Another human disorder that can be explained by mismatch theory is the rise in osteoporosis in modern humans. In advanced societies, many people, especially women, are remarkably susceptible to osteoporosis during aging. Fossil evidence suggests that this was not always the case: bones from elderly hunter-gatherer women often show no evidence of osteoporosis. Evolutionary biologists have posited that the increase in osteoporosis in modern Western populations is likely due to considerably more sedentary lifestyles. Women in hunter-gatherer societies were physically active both from a young age and well into their late-adult lives. This constant physical activity likely led to peak bone mass being considerably higher in hunter-gatherers than in modern humans. While the pattern of bone-mass loss during aging is purportedly the same for both hunter-gatherers and modern humans, the higher peak bone mass associated with greater physical activity may have allowed hunter-gatherers to avoid osteoporosis during aging.
- Flannery, K. V. (1973). The origins of agriculture. Annual Review of Anthropology 2, 271-310.
- Cordain, L., Eaton, S. B., Sebastian, A., Mann, N., Lindeberg, S., Watkins, B. A., O'Keefe, J. H. & Brand-Miller, J. (2005). Origins and evolution of the Western diet: health implications for the 21st century. The American Journal of Clinical Nutrition 81, 341-354.
- Rosner, M. & Belkin, M. (1987). Intelligence, education, and myopia in males. Archives of Ophthalmology 105, 1508-1511.
- Cani, P. D., Bibiloni, R., Knauf, C., Waget, A., Neyrinck, A. M., Delzenne, N. M. & Burcelin, R. (2008). Changes in gut microbiota control metabolic endotoxemia-induced inflammation in high-fat diet-induced obesity and diabetes in mice. Diabetes 57, 1470-1481.
- Lieberman, D. E. (2013). The Story of the Human Body: Evolution, Health, and Disease. Pantheon Books, New York, NY.