Tag Archives: Evolution

An Evolutionary Explanation For Why You Wear Glasses

Empirically testing health-related hypotheses formulated through an evolutionary lens can prove difficult. The modern environment and human experience are radically different from those of the first roughly 6 million years of human evolution. Living on the edge of human existence and at the top end of the techno-scientific scale, we are far removed from the environment to which many of our genes are hypothesized to be suited. Fortunately, the human race is a diverse group of individuals who have dispersed across the globe and acclimated to a variety of circumstances. Notably, a few hunter-gatherer societies remain in parts of Africa. Though neither their genes nor their cultures are identical to those of the original hunter-gatherers, they retain the closest genetic and sociocultural similarity to our ancestors of any group in the modern world. This is not to say that they are “less evolved” than other human societies; that notion is elementary and indicative of evolutionary ignorance. They are very well suited to their habitat, both genetically and culturally. Those of us who are less suited to our environments, both genetically and culturally (i.e., everyone else, particularly in the US), can glean incredible insights from them about the functioning of our own bodies and about the dietary and daily circumstances to which our physiology is best suited.

I recently wrote a primer on evolutionary medicine (which can be found here), which might be beneficial to read before getting into the specifics. This post will focus on myopia, or “near-sightedness,” the visual condition in which objects at a distance are out of focus. Myopia affects about 15% of Africans, a third of Americans and Europeans, and over 75% of Asians – a curious bias that I’ll address later in the article. Fortunately (sort of), myopia is easy to treat with glasses or contacts, and can even be cured to some extent with laser-assisted in situ keratomileusis, commonly known as LASIK. Myopia occurs when the eye is too long, causing light to come to a focus in front of the retina and produce a blurry image. To compensate, corrective lenses refract the light before it hits the cornea. Myopic corrective lenses are thicker at the edges and thinner in the middle, causing the light to diverge slightly before it hits the cornea and ultimately moving the focal point further back in the eyeball. With LASIK, a high-frequency laser is used to vaporize tissue at the center of the cornea (note: no heat is involved; the ablation is a result of the light’s wavelength), reshaping the cornea so that light is correctly refracted.
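The optics here can be sketched with simple thin-lens vergence arithmetic, where power in diopters is the reciprocal of focal length in meters. The numbers below – a 60 D relaxed eye treated as a thin lens in air, and an eyeball roughly half a millimeter too long – are illustrative assumptions, not clinical values for any real eye:

```python
# Illustrative sketch: thin-lens-in-air vergence arithmetic for myopia.
# The 60 D "relaxed eye" and the 17.2 mm eye length are assumed example
# values, not measurements of a real eye.

def focal_length_mm(power_diopters):
    """Focal length in mm for an optical power in diopters (f = 1/P)."""
    return 1000.0 / power_diopters

in_focus = focal_length_mm(60.0)        # ~16.7 mm: where a 60 D eye focuses
print(f"60 D eye focuses at {in_focus:.1f} mm")

# A myopic eye is "too long" -- suppose the retina sits at 17.2 mm.
retina_mm = 17.2
needed_power = 1000.0 / retina_mm       # power that would focus on that retina
corrective_lens = needed_power - 60.0   # negative power => diverging lens
print(f"Required corrective lens: {corrective_lens:.2f} D (diverging)")
```

The required power comes out negative, i.e., a diverging lens – exactly the “thicker at the edges, thinner in the middle” shape described above.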


In order to focus, the eye depends on ciliary muscles attached to the lens. When focusing on something far away, as would often be the case outdoors, the ciliary muscles contract, stretching the lens into a flattened shape. When focusing on something up close, such as a book, television, computer, or phone, the muscles relax, allowing the lens to become more rounded (convex). Think of a camera: to focus on something far away, you use a longer lens or zoom in, which moves the focal point of distant objects further back so they come into focus. To take up-close shots you use a macro lens – a very short, rounded lens that moves the focal point for near objects closer to the lens. The eye works the same way, and myopia is what happens when your zoom function is broken. Evolution, along with an analysis of our current sociocultural context, might be able to tell us why this happens.


I’m a student, and I spend a lot of my time looking at a book, a laptop, or a phone. I love to get outside when I can, but ultimately most of my time is spent looking at things up close. That means the ciliary muscles in my eyes – the zoom muscles – spend most of their time relaxed. Like any other muscle that goes unused, the ciliary muscle will likely begin to atrophy and weaken. (As far as I’m aware, no quantitative studies have been performed on ciliary muscle size or mitochondrial count, probably because this would be difficult or impossible to do on a living person. Perhaps future studies can examine the ciliary muscles of recently deceased individuals and compare those who suffered from myopia with those who had normal vision.) Over time, particularly during critical stages of development in childhood, the muscles may become too weak to contract and properly pull the lens flat, preventing you from focusing on distant objects. Of course, this raises the question of whether the muscle can be strengthened. I don’t know, and I’m not sure I’m willing to find out by using myself as a guinea pig. Unfortunately, that makes me part of the problem of “dysevolution,” a term coined by Harvard paleoanthropologist and human evolutionary biologist Daniel Lieberman. Dysevolution refers to the cycle of treating diseases without trying to change or fix their causes. Our technology and scientific understanding have advanced so rapidly in the past 100 years that we can fix conditions such as myopia with ease. Often this cycle is perpetuated by comfort: why change the way I do things when I can just buy contacts or glasses? My previous post mentions several other possible mismatch diseases, and Lieberman’s book, “The Story of the Human Body,” goes into detail about many of them. For many – if not most – we simply ignore the possible cures and instead opt for a more “comfortable” and easy treatment. This cycle is sure to grow and intensify as time goes on.

Evolutionary medicine is sometimes difficult to test empirically. However, as mentioned above, modern-day hunter-gatherer societies can offer incredible insight and points of comparison for how sociocultural differences may affect our “mismatch diseases.” Studies of this kind are unfortunately few and far between (possibly because research funding also focuses on treatments). Still, studies of hunter-gatherer societies have shown that very few members suffer from myopia (or from many other non-infectious ailments, such as type-2 diabetes, heart disease, osteoporosis, and even cavities). The thought is that they are exposed to a variety of visual stimuli and that their visual environment is constantly changing. This “exercises” their ciliary muscles and keeps them strong. Experiments have also shown that animals deprived of visual stimuli grow elongated eyeballs. Similarly, people who spend more time indoors, particularly studying, as is common in many Asian cultures, exhibit much higher rates of myopia, whereas those who spend more time outdoors, as is more common in many African cultures, tend to have lower rates. Our eyes did not evolve to see things 2 feet from our faces all day long. They evolved to keep us alive amid the plethora of visual stimuli in nature and to help us search for food – two things that many people, particularly children in developed countries, no longer need them to do.

The solution isn’t to give up studying and electronics; it’s much simpler than that. Nearly everyone uses books and electronics, so why doesn’t everyone have myopia? One possibility is genetics, though that doesn’t seem like a plausible explanation: rates of myopia have only skyrocketed in the last century, and any latent mutation for poor vision would almost certainly have been selected against in our ancestors. The likely “cure” for myopia is balance. Spend time outside, especially as a child. The data from lab experiments as well as social statistics seem to point in this direction. If we continue to ignore the cause and only treat the symptoms, we trap ourselves in an ever-growing cycle in which we become more and more dependent upon technology.

Evolution: The Missing Link in Medicine

“Nothing in biology makes sense except in the light of evolution.”

– Theodosius Dobzhansky

Evolution is arguably one of the most widely supported and powerful theories in all of biology, and potentially in science as a whole. It has been a dominant explanation for over 100 years. Once genetics entered the picture in the first part of the 20th century, Darwin’s common descent and Mendel’s inheritance were refined, greatly expanded, and solidified into the modern synthesis of evolution. Consistently verified through genetics, paleontology, geology, ecology, microbiology, and many other fields of science, evolution has become a remarkably potent field of study. It has created huge disciplinary offshoots – including evolutionary biology, evolutionary genetics, and evolutionary anthropology, to name a few – and has become the theoretical foundation for all of biology.

Some people today argue that humans are no longer under evolutionary pressure and thus are no longer evolving. Though this seems to make sense superficially, it is simply not true. The first issue is that humans only live about 80 years – a mere snapshot of our species’ existence. It is difficult to observe phenotypic differences resulting from biological evolution in only a few decades. That being said, scientists have found some very recent biological changes, including the altered expression of the FTO gene. The FTO gene codes for a protein that regulates appetite. While it does not “make” a person obese (genes tend to predispose, not determine), it has been correlated with obesity. The catch? Its link to obesity seems to have appeared only after about 1940, according to a study published just 2 days ago. The study (which can be found here) found that, after 1942, the FTO gene showed a strong correlation with increased BMI. Why, though, would a gene that has not changed suddenly become active?

The Environment

What did change in the 1940’s? Technology. WWII offered an incredible economic boost to the US that massively increased technological enterprise and was the main contributing factor in the world-superpower status the US achieved in the 40’s. As technology increased, labor decreased; after all, the main purpose of technology is to make human life simpler. When human life becomes simpler, people become more sedentary. New technology also allowed for cheaper, higher-calorie, over-processed food. Exactly what happened with FTO will take a while to work out: the difference could be epigenetic alteration, novel environmental stimuli, or even another gene interacting with FTO. While more testing will be needed to show exactly what altered FTO expression in the early 40’s, the fact remains that something did occur, likely stemming from environmental change. Biological evolution doesn’t have to be a change in DNA sequence; that view is far too simplistic. Any time phenotypic or genotypic ratios change on a species-wide level, evolution is occurring. No population is in Hardy-Weinberg equilibrium, and no population ever will be. Humans will continue to evolve biologically. While cultural evolution has exceedingly outpaced biological evolution, creating the illusion that biological evolution has “stopped,” the truth is that culture can either accelerate or dampen biological evolution, depending upon the situation. A cultural shift toward drinking more milk may promote lactase persistence (and in fact, it did), while a cultural propensity to live in climate-controlled housing year-round may slow other aspects of biological evolution. Nature doesn’t necessarily control natural selection; more broadly, the environment (cultural or natural) mediates evolution.
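The Hardy-Weinberg claim above is easy to make concrete: under equilibrium, allele frequencies p and q predict genotype frequencies p², 2pq, and q², and real populations deviate from those predictions. A minimal sketch with invented genotype counts (not data from any study):

```python
# Sketch: expected Hardy-Weinberg genotype counts from observed counts.
# The observed counts below are invented for illustration.

def hardy_weinberg_expected(n_AA, n_Aa, n_aa):
    """Expected (AA, Aa, aa) counts under H-W equilibrium, using the
    allele frequencies implied by the observed genotype counts."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # frequency of allele A
    q = 1 - p
    return (p**2 * n, 2 * p * q * n, q**2 * n)

observed = (300, 600, 100)             # shows an excess of heterozygotes
expected = hardy_weinberg_expected(*observed)
print([round(e) for e in expected])    # [360, 480, 160]
```

Comparing observed against expected counts (e.g., with a chi-square test) is the standard way to show a population is *not* in equilibrium – that is, that evolution is happening.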

So, why is evolution important in medicine? Sure, doctors need to understand things like microbial evolution and how it plays a role in infectious diseases, but what about human evolution? How can a knowledge of human evolution impact medicine?

Cultural evolution has rapidly and drastically altered the human environment, thus changing how the human species evolves. More importantly, our environments have changed so aggressively that our bodies cannot keep up. (Before I go on, I have to make something clear: I am not a proponent of the Paleo Diet; if you’d like to know why, check out this post.) This means our bodies are often best adapted to the environments of the past (though these varied drastically). This has given rise to what are sometimes referred to as “mismatch diseases.” The list is extensive, but it includes maladies such as atherosclerosis, heart disease, type-2 diabetes, osteoporosis, cavities, asthma, certain cancers, flat feet, depression, fatty liver disease, plantar fasciitis, and irritable bowel syndrome, to name a few. Some of these may not be actual mismatch diseases, but many of them likely are. Furthermore, many of these illnesses feed off one another, creating a terrible feedback loop. A hundred years ago you’d likely have died from an infectious disease; today, most people in developed nations will die from heart disease, type-2 diabetes, or cancer.

These diseases don’t have to be essential baggage of modernity. Anthropologists (and some intrepid human evolutionary biologists) study modern-day hunter-gatherer societies in order to glean information about the nature of our species before the Neolithic Revolution. It’s important to note that these are not perfect models (cultural and biological evolution has still occurred in these hunter-gatherer societies), but they are the best available. Interestingly, modern-day hunter-gatherers don’t suffer from many of these mismatch diseases. (This can’t be explained by longevity; hunter-gatherers regularly live into their late 60’s and 70’s. Though unusual to many of us, their lives aren’t as brutish as they are often portrayed.) Diseases such as type-2 diabetes, hypertension, heart disease, osteoporosis, breast cancer, and liver disease are rare among these societies. What’s more, myopia (near-sightedness), asthma, cavities, lower back pain, crowded teeth, collapsed arches, plantar fasciitis, and many other modern ailments are exceedingly rare. So what’s different? The easy answer is their diet, lifestyle, and environment. The difficult answer involves elucidating the physiological importance of certain social norms and the biochemical processes of differing diets. Some very exciting work is beginning to arise in this field, dubbed “evolutionary medicine.”

Modern medicine and medical research focus largely on treating problems, i.e., on drugs and procedures that alleviate symptoms after a disease has manifested. While the cause is noble, and indeed necessary, it’s not enough. This shortsighted logic creates a cycle of sickness and treatment that, in 2012, totaled almost $3 trillion in healthcare costs. Furthermore, the sedentary, Epicurean lifestyle many Americans willingly lead feeds this cycle; among the less privileged, necessity feeds it instead, through the inability to afford healthy food, limited access to health education, and a sociocultural feedback loop of its own.

There will likely never be a drug that can cure cancer (of which there are thousands of variants that can differ even between individuals with the “same” cancer), heart disease, type-2 diabetes, or many of the other previously mentioned noninfectious diseases. The rationale is akin to putting water in your car’s gas tank and hoping additives will make it work as efficiently as gasoline: the car was built to run on gasoline. Similarly, your body did not evolve to eat excessive amounts of salt, carbs, and sugar (the different types of which, particularly glucose and fructose, do not have the same biochemical effects during digestion), to sit for extended periods of time, to wear shoes (particularly those with arch support – a common misconception is that arch support is good for you when, in fact, it weakens the muscles of the arch, leading to ailments such as collapsed arches and plantar fasciitis), to read for several hours at a time, to chew overly processed food, or to do many of the other things that people in developed nations commonly do and often see as luxuries.

Modern medicine needs a paradigm shift. Funding needs to support not only treatments but also investigations into prevention. The medical cause of diabetes may be insulin resistance, but what causes insulin resistance, and how can we prevent it? Sugar may cause cavities, but what can we do to prevent this? Shoes, even comfy ones, may cause collapsed arches, but how do we prevent this? The immediate response may be that this sort of prevention cannot be attained without abandoning modern technology altogether. However, this isn’t the case, and it’s not the argument I’m trying to make. Research should focus on a broad range of interacting variables, including diet, work environment, school environment, and other aspects of evolutionarily novel environments. Only after research from this evolutionary perspective takes place can constructive conversations and beneficial environmental changes occur. We don’t have to abandon modern society to be healthy; we just need to better understand how our lifestyle affects our bodies. Products such as cigarettes and alcohol are already age-limited and touted as dangerous to health. Is junk food, particularly soda, any different? We don’t put age regulations on cigarettes or alcohol to protect bystanders; these regulations protect children, who cannot be relied upon to make proper choices in their naivety. Should soda be under the same constraints?

If medicine and medical research do not undergo this paradigmatic shift and incorporate an evolutionary perspective, the outcome does not bode well for us. Medical costs will continue to rise, with little room for improvement and greater opportunity for socioeconomic factors to determine the quality of healthcare available. This ad hoc, treatment-first approach to medicine is not sustainable, and it is not the best we can do.

Multiplex Automated Genome Engineering: Changing the world with MAGE

Humans have evolved a unique mastery of toolmaking through advanced technology. As an extension of our biological bodies, technology has loosened the grip of natural selection. This is particularly true in the fields of biomedicine and genetic engineering: we now have the ability to directly alter the blueprint of life for any purpose we wish. Beginning in the 1970’s with the creation of recombinant DNA and transgenic organisms, genetic engineering has offered scientists the ability to study genes on a level that may not have seemed possible at the time. The field has provided a wealth of knowledge as well as practical applications, such as knockout mice and the ability to produce near-endless amounts of human insulin for diabetics.

As of 2009, multiplex automated genome engineering (MAGE) has ushered in a new branch of genetic engineering – genomic engineering. We are no longer restricted to altering single genes, but rather are able to alter entire genomes by manipulating several genes in parallel. This new ability, brought about by MAGE technology, allows for nearly endless applications that stretch well beyond medicine or industry; agriculture, evolutionary biology, and conservation biology will benefit tremendously as MAGE technology progresses. Genetic engineering advancements such as MAGE are poised to revolutionize entire fields of science, including synthetic biology, molecular biology, and genetics by offering faster, cheaper, and more powerful methods of genome engineering.

Homologous Recombination

Genetic engineering underwent a revolutionary change in the 1980’s, largely due to the pioneering work of Martin Evans, Mario Capecchi, and Oliver Smithies. Evans and Matthew Kaufman were the first to describe a method for extracting, isolating, and culturing mouse embryonic stem cells. This laid the foundation for gene targeting, a method discovered independently by both Oliver Smithies and Mario Capecchi. Capecchi and his colleagues were the first to suggest that mammalian cells had machinery capable of homologous recombination with exogenous DNA. Smithies took this a step further, demonstrating targeted gene insertion using the β-globin gene. Ultimately, the combined work of Evans, Smithies, and Capecchi on homologous recombination earned them the Nobel Prize in Physiology or Medicine in 2007. The science of homologous recombination has enabled many scientific discoveries, primarily through the creation of knockout mice.

Homologous recombination works under many of the same principles as chromosomal recombination in meiosis, wherein homologous genetic sequences are randomly exchanged. The difference lies in the fact that engineered homologous recombination works with exogenous DNA and at the level of the gene rather than the chromosome.


The method works by using a double-stranded genetic construct with flanking regions homologous to the flanking regions of the gene of interest. This allows the sequence in the middle, containing a positive selection marker and the new gene, to be incorporated. The positive selection marker should be something that can be selected for, such as resistance to a toxin or a color change. Outside one of the flanking regions of the construct lies a negative selection marker; the thymidine kinase gene is commonly used. If recombination is too lenient and the thymidine kinase gene is incorporated into the endogenous DNA, those cells can be detected and disposed of. This prevents too much genetic information from being exchanged.

Using this method, knockout mice can be created. A knockout mouse is a mouse lacking a functional copy of a gene, which allows the gene’s function to be elucidated. Embryonic stem cells are extracted from a mouse blastocyst and introduced to the gene construct via electroporation. Successfully modified stem cells are selected using the positive and negative markers, then isolated and cultured before being inserted back into mouse blastocysts. The blastocysts can then be implanted into female mice, producing chimeric offspring. These offspring may be mated to wild-type mice. If the germ cells of a chimeric mouse were derived from the modified stem cells, then some of its offspring will carry one modified and one wild-type allele – they will be heterozygous. These heterozygous mice can then be interbred, with a portion of the offspring (on average, one quarter) being homozygous for the modified gene. This is the beginning of a mouse line with the chosen gene “knocked out.”
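The breeding arithmetic in that last step is plain Mendelian bookkeeping, and a toy sketch makes it explicit (the genotype labels are arbitrary; here “a” stands for the knocked-out allele):

```python
# Sketch: expected genotype ratios from interbreeding heterozygous (Aa) mice.
# Pure Mendelian bookkeeping -- every combination of one allele from each
# parent is equally likely.

from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Genotype distribution of offspring from two parents, e.g. 'Aa' x 'Aa'."""
    offspring = Counter(
        "".join(sorted(a1 + a2)) for a1, a2 in product(parent1, parent2)
    )
    total = sum(offspring.values())
    return {genotype: n / total for genotype, n in offspring.items()}

print(cross("Aa", "Aa"))   # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```

Roughly a quarter of the pups from a het × het cross are expected to be “aa” knockouts, which is why several litters are usually needed to establish the line.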


Multiplex Automated Genome Engineering Process

The major drawback of the previously described “gene targeting” method is the inability to multiplex. The process is not very efficient, and targeting more than one gene becomes problematic, limiting homologous recombination to single genes. In 2009, George Church and colleagues addressed this issue with the creation of multiplex automated genome engineering (MAGE). MAGE uses hybridizing oligonucleotides to alter multiple genes in parallel. The machine may be thought of as an “evolution machine,” wherein favorable sequences are chosen at a higher frequency than less favorable sequences; the hybridization free energy is a predictor of allelic replacement efficiency. As cycles complete, sequences become more similar to the oligonucleotide sequence, increasing the chance that those sequences will be further altered by hybridization. Eventually, the majority of endogenous target sequences will be completely replaced with the sequence of the oligonucleotide – a process that takes only about 6-8 cycles.
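To see why a handful of cycles suffices, note that if each cycle independently replaces a target allele in some fraction of cells, the modified fraction compounds across cycles. The ~25% per-cycle efficiency below is an illustrative assumption, not a figure from the MAGE papers:

```python
# Sketch: cumulative fraction of cells carrying a given allele replacement
# after repeated MAGE cycles, assuming a fixed, independent per-cycle
# replacement efficiency (0.25 here is an invented example value).

def replaced_fraction(per_cycle_efficiency, cycles):
    """Fraction of the population modified after n independent cycles."""
    return 1 - (1 - per_cycle_efficiency) ** cycles

for n in (1, 4, 8):
    print(n, round(replaced_fraction(0.25, n), 3))
```

At 25% per cycle, about 90% of cells carry the replacement after 8 cycles – in line with the 6-8 cycle figure above.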


After the E. coli cells are grown to mid-log phase, expression of the beta protein is induced. The cells are chilled and the media is drained. A solution containing the oligonucleotides is added, followed by electroporation. This step is particularly harsh, killing many of the cells. The survivors can be chosen based on positive markers (optional, but it increases efficiency) and allowed to reach mid-log phase again before the process is repeated. Church and his colleagues optimized the E. coli strain EcNR2 to work with MAGE. EcNR2 contains a plasmid with the λ phage genes exo, beta, and gam, and is deficient in mismatch repair. When expressed, the phage genes help keep the oligonucleotide annealed to the lagging strand of the DNA during replication, while the mismatch repair deficiency prevents cellular repair mechanisms from reverting the oligonucleotide sequence once it is annealed. Using an improved technique called co-selection MAGE (CoS-MAGE), Church and colleagues created EcHW47, the successor to EcNR2. In CoS-MAGE, cells that exhibit naturally superior oligo uptake are selected before the genes of interest are targeted.

MAGE technology is currently in the process of being refined, but shows incredible promise in practical applications. Some of the immediate applications include the ability to more easily and directly study molecular evolution and the creation of more efficient bacterial production of industrial chemicals and biologically relevant hormones. Once the technique has been optimized in plants and mammals, immediate applications could be realized in GMO production and creation of multi-knockout mice that will give scientists the ability to study gene-gene interactions on a level previously unattainable. A more optimistic and perhaps grandiose vision could see MAGE working towards ending genetic disorders (CRISPR technology, an equally incredible genomic editing technique, may beat MAGE there) and serving as a cornerstone technique in de-extinction. The ability to alter a genome in any fashion brings with it immense power. The possibilities for MAGE are boundless, unimaginable, and are sure to change genomic science.

For more information on Homologous recombination, see:

http://www.bio.davidson.edu/genomics/method/homolrecomb.html

For more information on MAGE, see:

Wang, H. H., Isaacs, F. J., Carr, P. A., Sun, Z. Z., Xu, G., Forest, C. R., & Church, G. M. (2009). Programming cells by multiplex genome engineering and accelerated evolution. Nature, 460(7257), 894-898.

Wang, H. H., Kim, H., Cong, L., Jeong, J., Bang, D., & Church, G. M. (2012). Genome-scale promoter engineering by coselection MAGE. Nature methods, 9(6), 591-593.

For more information on CRISPR (which I highly recommend; it’s fascinating), see:

https://www.addgene.org/CRISPR/guide/

The Paleo Diet – Brilliantly Simple, or Simply Wrong?

Introduction to the Paleo

According to thepaleodiet.com, “the Paleo Diet, the world’s healthiest diet, is based upon the fundamental concept that the optimal diet is the one to which we are genetically adapted.” Who can disagree with that? After all, it does make sense that the best diet would be one that, according to our genetics, our body can utilize most efficiently. However, is this what the Paleo Diet actually offers?

The Paleo Diet claims to offer “modern foods that mimic the food groups of our pre-agricultural, hunter-gatherer ancestors.” First we have to look at what the Paleo Diet means by our “ancestors.” Being a “paleo” diet, it is referring to our ancestors in the Paleolithic era, which extends from about 2.5 million years ago to about 10,000 years ago, just after the end of the last ice age and around the dawn of the Neolithic – or agricultural – revolution. 2.5 million years is a pretty broad range to select a diet from, but perhaps not so broad on an evolutionary timescale.

One issue that arises when studying the diets of ancient hominids is that archaeological sites older than about 10,000 years aren’t all that common. The reason probably lies in the fact that prior to the Neolithic Revolution, people were hunter-gatherers; they didn’t really have permanent settlements. Hunter-gatherers travel to wherever the food is – presumably that which can be hunted (migratory animals such as elk, bison, or caribou, depending upon geographic location) and gathered (berries, nuts, shellfish, and so on). This would vary by the season, and even by the century, as animals permanently migrated to new locations or became over-hunted in their current ones. However, when mankind developed agriculture about 10,000 years ago, people began to establish permanent settlements. These settlements – fueled by the domestication of plants and animals, and thus liberated from hunting and gathering – provide a rich source of archaeological artifacts. It’s difficult to find the few material bits and pieces of a nomadic lifestyle; when people settle in one location for hundreds or even thousands of years, artifacts build up, and the chances of finding something 10 millennia later are much greater.

How do we know about their diet? Archaeological evidence

So, how do we know what the hunter-gatherers ate? One way is to look through the archaeological sites that we do have. Animal bones are often signs that the inhabitants ate meat. Furthermore, we might find tools that could have been used for butchering, along with cut marks on the bones implying that the animal was butchered. We can also track morphological changes over time: changes in the size and structure of certain bones, such as the mandible and cranium, might indicate a change in diet. A diet heavier in meat could require a larger mandible and would imply an increase in calories – calories necessary to support a larger brain in a larger cranium.

Osteological analysis, though, is qualitative at best. It’s important to remember that an archaeological site is merely a snapshot in time. For example, a site abandoned in the winter (perhaps to move somewhere warmer, because of the death of its inhabitants, or for some other reason entirely) might show a heavy use of meat simply because not many plants grow in the colder months. With so few sites, there isn’t very strong evidence one way or the other about diets; small sample sizes can be incredibly biased.

Stable Isotopes

Another way to study ancient diets is stable isotope analysis. If you remember from chemistry class, isotopes are atoms with the same number of protons but differing numbers of neutrons. Because the proton (atomic) number defines an element’s chemical properties, isotopes are the same element, just with slightly different weights. For example, about 99% of the carbon in the atmosphere is C12 – carbon with an atomic mass number (the combined number of protons and neutrons) of 12. This is the most stable form of carbon, and thus the most abundant. Carbon has two other isotopes relevant to scientific studies, C13 and C14. Though there are many more isotopes, they are found in minute amounts and are so unstable that they decay rather quickly.

You have probably heard of carbon dating, which measures the relative abundance of C14 in an organic artifact and derives an approximate date based on the known rate of decay of C14. This works because there is a certain ratio of C12 to C14 in the atmosphere, which is taken up by organisms. After an organism dies, it stops taking in new carbon, and its C14 – which, unlike C12, is radioactive – decays away at a known rate. While this method rests on the assumption that the atmospheric C14-to-C12 ratio was the same in the past, it can often be cross-verified with other forms of dating, such as stratification, phylogenetic dating, other forms of radiometric dating, and sometimes even early writings (for example, the date derived from carbon dating an item purportedly from some event can be compared to a written, dated historical document describing the event).
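The age calculation itself is a one-liner from the exponential decay law, t = −ln(N/N₀)/λ with λ = ln 2 / t½. The ~5,730-year half-life of C14 is the standard value; the measured fractions below are invented examples:

```python
# Sketch of the radiocarbon age calculation: t = -ln(N/N0) / lambda,
# where lambda = ln(2) / half-life. Input fractions are invented examples.

import math

C14_HALF_LIFE = 5730.0                      # years (standard Cambridge value)
DECAY_CONSTANT = math.log(2) / C14_HALF_LIFE

def radiocarbon_age(remaining_fraction):
    """Years since death, given the fraction of original C14 remaining."""
    return -math.log(remaining_fraction) / DECAY_CONSTANT

print(round(radiocarbon_age(0.5)))          # one half-life  -> 5730
print(round(radiocarbon_age(0.25)))         # two half-lives -> 11460
```

Real labs then calibrate these raw ages against records of past atmospheric C14 levels, which is the cross-verification step described above.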

Stable isotope analysis works, as the name implies, by measuring a stable rather than radioactive isotope. Because C13 does not decay (C12 and C13 are the only stable isotopes of carbon, while C14 is the most stable radioactive isotope), it remains in bones and teeth in the same C12:C13 ratio as when the organism was alive. Great! And although our own bodies barely discriminate between C12 and C13, some plants distinguish between them, ever so slightly. Ribulose-1,5-bisphosphate carboxylase/oxygenase – commonly known as RuBisCO – is an enzyme that, in most plants, binds the CO2 entering through the stomata. Rubisco happens to have a slight affinity for C12, meaning the plant – and everything that eats the plant – carries a disproportionate amount of C12 relative to C13. These plants are known as “C3”-pathway plants.
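In practice these small differences are reported in “delta” notation: the sample’s C13/C12 ratio relative to a reference standard, expressed in parts per thousand. A minimal sketch – the VPDB reference ratio is the conventional standard, but the sample ratio here is an invented example:

```python
# Sketch: delta-13C notation used in stable isotope analysis. Ratios are
# reported relative to the VPDB standard in per-mil (parts per thousand).
# The sample ratio passed in at the bottom is an invented example value.

VPDB_R = 0.0112372   # 13C/12C ratio of the VPDB reference standard

def delta13C(sample_ratio):
    """delta-13C in per-mil relative to VPDB."""
    return (sample_ratio / VPDB_R - 1) * 1000

# C3 plants typically land around -27 per-mil; C4 plants around -13 per-mil.
print(round(delta13C(0.0109338), 1))   # -27.0, a typical C3-plant value
```

Because C3 and C4 plants occupy distinct delta-13C ranges, the bone collagen of someone eating mostly C4 grains looks measurably different from that of a C3-plant eater.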

In arid climates, where water is even more precious, plants had to adapt. The problem is that water escapes through the stomata whenever they open to let CO2 in. Therefore, some plants, known as C4 pathway plants, evolved to use another enzyme, PEP carboxylase, to capture CO2. PEP carboxylase binds CO2 much more strongly than RuBisCO does, and shows far less preference for C12 over C13.

Carbon isotopes are used in conjunction with other elemental isotopes, such as nitrogen, to assess the relative ratios of plant to meat in diets. This rests on small, consistent isotopic differences between autotrophs and heterotrophs, and among carnivores, herbivores, and omnivores. For example, organisms higher in the food chain tend to have more N15 than organisms lower in the food chain. It is important to understand the isotopic variation of the particular ecosystem, however, because these signatures can shift, especially when environmental manipulation (such as cooking) comes into play. Ultimately, stable isotope analysis has a modest amount of discriminatory power, but it is not comprehensive: it uses quantitative measurements to make a qualitative claim, and does so on a limited number of samples.
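As a toy illustration of how these ratios translate into dietary claims, here is a simple two-endmember mixing model. The C3 and C4 endpoint values below are rough textbook averages, not measurements from any particular study, and a real analysis would also have to correct for tissue offsets and the local ecosystem variation mentioned above:

```python
# Rough average delta-C13 values (per mil) for the two plant
# pathways – illustrative endpoints only, not study data.
DELTA_C3 = -26.5
DELTA_C4 = -12.5

def fraction_c4(delta_sample: float) -> float:
    """Estimate the fraction of dietary carbon drawn from C4 plants
    (or animals that ate them) by linear interpolation between the
    two plant endpoints."""
    frac = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(frac, 0.0), 1.0)  # clamp to a valid proportion

print(fraction_c4(-26.5))  # 0.0 -> essentially pure C3 diet
print(fraction_c4(-19.5))  # 0.5 -> roughly half the carbon from C4
```

This is exactly the “quantitation making a qualitative claim” caveat in action: the model returns a tidy number, but the answer is only as good as the chosen endpoints and the assumption that nothing else shifted the ratios.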

Problems with the logic of a Paleo Diet

Which “paleo” should we eat like? Inuit people of 10,000 B.C.E.? 200,000-year-old Mitochondrial Eve? 1-million-year-old Homo erectus? Clearly some times, and some species, of hominids ate more meat than others. An Inuit living in northern Canada survived largely off of seal fat, while Homo erectus probably lived more off of fruits and nuts. Humans survived and came to dominate the planet largely because of our adaptability, including our omnivorous diet. Our ability to live on mostly nuts or mostly blubber has granted us the freedom to roam from the heart of Africa to the frozen lakes of Canada. Paleolithic hunter-gatherers simply ate what was available to them.

Many Paleo dieters cite articles discussing the health disparities that arose when agriculture entered the picture. While those disparities are real, they are not necessarily because we stopped eating a “paleo diet.” More likely, health problems arose because we stopped eating such a wide variety of foods. Many ancient peoples went from elk, bison, nuts, and berries to whatever they could domesticate. Eventually, domesticated crops and animals grew in variety and things leveled out a little more. This was likely not a rapid transition; domestication may have started out simply as a way to supplement hunting and gathering before the boom of the Neolithic Revolution. Regardless of your diet, it is important to eat a variety of foods in order to cover all the essential nutrients. Many people in Westernized cultures today eat a much more monotonous diet than they should.

Are we genetically identical to our “Paleo” brothers and sisters?

One of the main arguments for the Paleo Diet is that our genome has changed little since the end of the Paleolithic period, meaning our bodies are still best adapted to the diet of that time. This argument is a bit short-sighted. To claim that our genome has not adapted at all to our Neolithic lifestyle is simply incorrect. It is true that genomic evolution lags far behind our cultural evolution, and is often overshadowed by it. However, there are some key differences between our genomes and those of a Paleolithic hominid. The two best-known adaptations involve the amylase and lactase genes. Amylase is an enzyme that allows for the digestion of starch from grain. As the Neolithic Revolution kicked into gear, those with extra copies of the salivary amylase gene better metabolized all of the new grain they could grow. These extra copies boost the amount of amylase in saliva, so starch digestion begins in the mouth rather than waiting until halfway through, in the gut.

The second mutation is a regulatory one. People are born with a gene that regulates the production of lactase, an enzyme that breaks down the biologically unusable dairy sugar lactose into the biologically usable sugars galactose and glucose. Before the animal husbandry practices of the Neolithic Revolution, the lactase gene would become transcriptionally inactive, or “turned off,” in most people around the age of 5-7. After this age, the child no longer breastfed and really had no need for lactase. However, once people began raising dairy animals, such as goats and cattle, dairy products like milk and yogurt became important staple foods. This seems to have driven positive selection for a genetic mutation that allows the lactase gene to remain “on” throughout life. Those with the lactase and amylase mutations could better exploit dairy and grain products than those without them. So, while our genomes are not radically different, they are indeed different, and have adapted to some of the Neolithic diet changes.

Microbiomes

Although our genome is relatively similar to our ancestors’, our microbiome certainly isn’t. The microbiome is the collection of microorganisms that inhabit us. This might not seem like a big deal, so let me put it in perspective. If you counted every cell in your body, including the microorganisms, human cells would comprise only about 10%. The other 90%? That would be the microbiome. By cell count, you are 90% microorganism. With the recent completion of the Human Microbiome Project, expect to see some incredible discoveries about the differences between ourselves and our Paleo ancestors in the near future.

So how do we study the Paleo microbiome? One way is through ancient DNA. Unfortunately (or fortunately, for researchers today), there were no Paleo dentists, nor any Paleo toothbrushes. When people ate, plaque built up and calcified on their teeth. This calcified plaque is called dental calculus, and it preserves the DNA of the microorganisms that made up the plaque, along with some DNA from the food itself. From this, using next-generation sequencing techniques, we can learn more about the kinds of food and the microorganisms that were present in the bodies of our ancestors. By comparing what we find to oral microbiomes today, we can better understand what Paleo people ate. Microfossils can also be preserved in dental calculus, allowing for visual confirmation of food in the plaque. Again, these are qualitative measures limited by sample size, but they are the best methods we have, and they are producing some excellent research.

Is the food still the same?

People freak out about GMOs. The truth is that basically everything we eat – meats and plants alike – has been genetically modified. Over thousands of years we have artificially selected plants and animals for particular traits. While our genome has changed somewhat since Paleolithic times, plant and animal genomes have changed radically, largely due to human manipulation. So even if you eat according to the Paleo Diet, you are eating the modern-Paleo Diet, not the Paleo-Paleo Diet. You aren’t really eating the way you think your ancestors ate. Our modern plants are “human inventions,” as Dr. Christina Warinner – a leading dental calculus expert at the University of Oklahoma – puts it.

Ultimately, the Paleo Diet, as it is marketed, isn’t really a Paleo diet at all. There’s no harm, and definitely some benefit, in cutting refined sugars and heavily processed meats out of your diet. However, eating modern versions of nuts, fruits, veggies, and more meat isn’t going to make you any more like a Paleo-man or Paleo-woman than eating a normal, balanced diet would. If anything, skipping legumes, dairy, and multi-grain wheat, which the Paleo Diet prohibits, could leave you short on certain nutrients. Technological and agricultural advances have produced some amazing foods that our Paleo ancestors could only have dreamt about. If you really want to be Paleo, then take advantage of the advances in food science. It’s what our ancestors would have done.

*A form of this is also published at http://anthronow.com/wp-content/uploads/2016/04/AnthroZine_1601.pdf