Category Archives: Science

Progressing the Person and Policy

The English word “person” has a long and convoluted history. Though the word itself likely derives from the Latin persona, referring to the masks worn in theatre, its meaning has evolved over time. One of the biggest conceptual overhauls came in the 4th century AD during a church council that was held to investigate the concept…

via Progressing the Person and Policy — Savage Minds

Artificially Intelligent, Genuinely a Person

It’s difficult to overstate our society’s fascination with Artificial Intelligence (AI). From the millions of people who tuned in every week for the new HBO show Westworld to home assistants like Amazon’s Echo and Google Home, Americans fully embrace the notion of “smart machines.” As a peculiar apex of our ability to craft tools, smart…

via Artificially Intelligent, Genuinely a Person — Savage Minds

Medicine, Technology, and the Ever-Changing Human Person

Though we often take for granted that humans are persons, they are not exempt from questions surrounding personhood. Indeed, what it means to be a person is largely an unsettled argument, even though we often speak of “people” and “persons.” Just as it’s important to ask if other beings might ever be persons, it is […]

via Medicine, Technology, and the Ever-Changing Human Person — Savage Minds

Of Primates and Persons

Savage Minds welcomes guest blogger Coltan Scrivner for the month of January. Coltan will be writing a series of posts on personhood from different disciplinary perspectives. When I moved to Chicago for graduate school, one of the first things I did was go to the Lincoln Park Zoo. Just like with other zoos I’ve been […]

via Of Primates and Persons — Savage Minds

A Tale of Cookies and Milk: How We Adapted to Consuming Grains and Dairy

Humans are curious creatures. We like to poke and prod at new things to see what will happen. This curiosity is part of the reason we are successful. Though it can sometimes lead to disastrous outcomes, curiosity can be the reason not only for cultural inventions, but also for biological changes. This is especially true for our diet, which has changed radically in the past 10–20 thousand years. Two of the biggest changes have been our ability to efficiently digest grains and dairy. The agricultural revolution brought a lot of changes to the human diet, and humans experimented with many new types of food. I’m sure the first individual to start eating grain was met with a warmer reception than the one who suggested we start drinking cow and goat milk. At any rate, both ventures wound up changing our biology and culture. Just think: without amylase and lactase, Santa would be having something other than cookies and milk.

The Short Story of Amylase

In order to digest grains or any other starchy food, an organism needs an enzyme called amylase. Amylase hydrolyzes starch, eventually freeing the glucose molecules contained within the food. Though amylase is not unique to humans, there are some unique aspects of human amylase. In humans, there is a positive correlation between the number of copies of AMY1 – the gene responsible for producing salivary amylase – in a genome and the amount of amylase expressed in the saliva. Interestingly, the average human carries about 7 times as many copies of AMY1 as a chimpanzee, suggesting selection on amylase after our split from the common ancestor with chimpanzees. The small differences between the DNA of human AMY1 copies suggest a fairly recent selection event. Moreover, populations with high-starch diets have more AMY1 copies than populations with low-starch diets, further supporting recent and fairly rapid evolution. When it comes to diet, it seems natural selection can act fairly quickly.

The process of carbohydrate digestion begins in the mouth with an enzyme called ptyalin, also known as salivary α-amylase. Ptyalin hydrolyzes the glycosidic bonds within starch molecules, breaking them down into the disaccharide sugar maltose. In the walls of the stomach, specialized cells called parietal cells secrete hydrogen and chloride ions, creating hydrochloric acid. Amylase, which works at an optimum pH of about 7, cannot function in the highly acidic environment of the stomach, so this first phase of starch digestion ends there.

The second part of starch digestion is initiated in the small intestine by an enzyme called pancreatic amylase. Though pancreatic amylase and salivary amylase are coded by two different DNA segments, they sit side by side in the genome. It has been suggested that an endogenous retrovirus inserted DNA between the two copies of amylase that existed in our ancestors’ genome; this insertion altered gene regulation such that one of the copies originally coding for pancreatic amylase began producing amylase in the saliva. This mutation would have had a clear advantage, allowing for greater breakdown of starchy foods. Further evidence for the positive selection of salivary amylase production can be seen in its convergent evolution in mice and humans.

So the story for amylase is fairly short. Our ancestors began with two pancreatic amylase genes, which split to create one pancreatic and one salivary amylase gene. Over time, copy-number variations in genes occurred and were either selected for or against. Random gene duplication in conjunction with varying diets among human populations has resulted in the amylase locus being one of the most variable copy-number loci in the entire human genome.

The Somewhat Longer Story of Lactase

The Neolithic (agricultural) revolution brought about some of the biggest cultural changes our species has ever seen. Small groups of hunter-gatherers began to morph into large agricultural societies that existed in tandem with groups who lived a nomadic herding lifestyle. Nomadic herders could travel between the newly formed settlements, trading meat, milk, or animals for agricultural products such as recently domesticated plants and grains. This substantial change in lifestyle caused a rapid overhaul in many aspects of human biology, including immunity, body size, and the prevalence of certain digestive enzymes.

Lactase is the enzyme that breaks down lactose, the disaccharide sugar found in dairy products, into the monosaccharides glucose and galactose. Lactase is an essential enzyme because it allows infants to break down the lactose in their mother’s milk. However, for a significant portion of the world’s population, the lactase gene is down-regulated during childhood. Curiously, the portion of the world’s population that does not experience this down-regulation is largely of European descent (along with some African and Middle Eastern herding populations). There is an interesting correlation between geographic location and the percentage of the population with lactase persistence: the further north you go in Europe, the more lactase persistence you find. This probably has to do with the fact that the colder climate of Europe, especially northern Europe, left fewer options for food. The ability to digest and reap the benefits of lactose into adulthood could have acted as a major factor in surviving to reproductive age, thus increasing the prevalence of lactase persistence in those cultures.

Milk has a decent amount of calories and fat to keep energy reserves up, allowing people to survive the harsh winters of Northern Europe. In addition, it provides nutrients such as calcium, protein, and vitamins B12 and D. Today in the Western world we see the high caloric and fat content of milk as a threat of weight gain; people living in 7000 B.C., however, would have seen it as a gold mine for survival. As essential as the calories and fats were to Northern European Neolithic people, the vitamin D content of milk may have been equally important. In order for the body to synthesize vitamin D, it needs UVB rays from sunlight. This is an issue at northern latitudes, where there is less sunlight than in most other places on Earth. Moreover, the amount of UVB that reaches the skin depends on the angle at which the Sun’s rays strike the Earth. So even on a clear, sunny winter day, people living at northern latitudes may not be absorbing much UVB.

[Figure: vitamin D synthesis]

One way to combat low levels of UVB is to have fair skin. UVB rays that strike the skin cause the synthesis of cholecalciferol (vitamin D3) from 7-dehydrocholesterol already present in the skin, eventually leading to the production of a usable form of vitamin D. Specifically, 7-dehydrocholesterol is found predominantly in the two innermost layers of the epidermis. This can be an issue for UVB absorbance, since melanin, the pigment responsible for darker skin, absorbs UVB at the same wavelengths as 7-dehydrocholesterol. Indeed, it turns out that fair-skinned people (who tend to live in colder, more northern climates) are more efficient at producing vitamin D than darker-skinned people.

Vitamin D is a truly underappreciated nutrient. It is essential for the absorption of calcium, which is nearly ubiquitous in its usage throughout the body, from brain function to muscle contraction. Recent research has illuminated other roles for vitamin D, including the regulation of genes associated with autoimmune diseases, cancers, and infection. One study in Germany found that participants (average age 62) with the lowest vitamin D levels were twice as likely to die over the following 8 years, particularly of cardiovascular problems, as those with the highest vitamin D levels.

Though it isn’t as important to us today, lactase persistence might have saved the populations of Neolithic people in Northern Europe. Milk’s dose of fat and calories helped bump up energy stores, while the calcium and vitamin D in whole milk provided significant nutritional benefits. Though there are still many questions surrounding the evolution of lactase persistence in sub-populations of humans, the selective advantage of this phenotype is quite clear: those with lactase persistence had supplemental nutrition that might have helped them survive the Northern European winters.

Does Stress Really Cause Stomach Ulcers?

Stomach ulcers, also known as peptic ulcers, have an interesting history in medicine. It was originally believed that stress somehow caused them, and doctors didn’t have much advice to offer patients suffering from one. In the 1980s, however, Barry Marshall and Robin Warren ran experiments suggesting that infection with the bacterium Helicobacter pylori caused peptic ulcers. As with many new discoveries, the news was met with fierce skepticism in the scientific and medical communities. To quell the skeptics, Barry Marshall actually swallowed some H. pylori to prove his hypothesis. This remarkable feat provoked other scientists to begin experimenting with the bacterium, and they found that Marshall was correct: H. pylori causes ulcers by weakening the mucosal lining, allowing stomach acid to come into contact with the stomach lining. With numerous repeated studies finding consistent results, it seemed like a closed case. The discovery was such a big deal that Marshall and Warren won a Nobel Prize for their work.

However, as with most things in human biology, a wrench was eventually thrown into the equation. It was later discovered that a significant portion of the population carries H. pylori in their stomachs but never develops ulcers. How could this be? Though all the details are still not fully elucidated, it seems there is a back-and-forth battle going on in the stomachs of people with H. pylori – one that is relatively harmless and asymptomatic, that is, until the person becomes stressed.

Let’s assume you are one of the majority of people on earth who has H. pylori living in your stomach. The bacterium probably colonized your stomach long ago, but you haven’t noticed anything out of the ordinary. Suddenly, there is a tragic psychological stressor in your life: you lose your job, fail an important final, your significant other ends the relationship – pick your poison. After a few weeks, your life begins to improve and you are feeling better about yourself, except for the excruciating stomach pains you are experiencing, especially after eating. You decide to see a doctor and are prescribed an antibiotic to fight off an H. pylori infection. What happened?

Because a psychological stressor can cause a physiological stress response, your sympathetic nervous system kicks into gear at the onset of the stressor. When the body mounts a stress response, cortisol is released in large amounts. One of cortisol’s properties is that it is anti-inflammatory. This anti-inflammatory effect works by inhibiting the synthesis of a group of compounds known as prostaglandins. More specifically, cortisol prevents the synthesis of arachidonic acid, a precursor to prostaglandins. But what does this have to do with ulcers?

As it turns out, a particular prostaglandin known as PGE2 is responsible for regulating both stomach acid secretion and mucus secretion. As with many compounds in the body, PGE2 has varying effects depending upon the cellular receptor to which it binds. The body is remarkably conservative: often the same molecule can be used for a wide range of effects depending upon the receptor it binds. You can think of the molecule as a skeleton key and the receptors as a bunch of old doors. The key can open any of the doors, but there can be a very different outcome depending on which door is opened. If PGE2 binds EP3 receptors, acid secretion is inhibited; if it binds EP4 receptors, both acid secretion and mucus secretion are stimulated. This mechanism makes sense, as an increase in stomach acid would warrant extra mucus to protect the stomach lining. Following this logic, if stomach acid secretion is down (PGE2 binding EP3), the body is going to conserve a little energy by making cuts to mucus production.

An analysis of this information reveals a few key points. First, it has been shown that cortisol is inversely correlated with stomach acid secretion; this means that under high cortisol, PGE2 is binding the EP3 receptors. So: more cortisol -> less PGE2 -> PGE2 binds EP3. This suggests that PGE2 has a higher affinity for EP3 receptors than for EP4 receptors, meaning it will bind EP3 receptors until most of them are occupied before it begins to bind EP4 receptors. So, when you are stressed, cortisol concentration rises. When cortisol concentration rises, prostaglandin production decreases. At low concentrations, PGE2 binds EP3 preferentially over EP4. This results in a slowing of stomach acid secretion, which in turn lowers mucus secretion. The figure below shows a flow chart of events, starting with chronic stress and ending in a peptic ulcer.

[Figure: flow chart of the cascade from chronic stress to peptic ulcer]

This flow chart illustrates the cascade of events leading to an ulcer. In essence, H. pylori becomes an opportunistic pathogen, taking advantage of the lower levels of mucus, which acts as a barrier between the stomach contents and the sensitive stomach lining.

Taking this information into account, the initial story begins to make more sense. After a few weeks of the blues, you find a new job, ace the test, and get the girl. Things are looking up. Because things are getting back to normal, the parasympathetic nervous system begins to take on its normal hours of operation and the sympathetic nervous system finally winds down. When this happens, your cortisol returns to basal levels, meaning that prostaglandin production is on the rise once again. PGE2 is synthesized in larger amounts, and it begins binding EP4 receptors, turning acid secretion and mucus secretion back on. But there is a problem.

Over the past few weeks, your acid secretion has been down, the stomach’s mucus lining has thinned, and H. pylori has been proliferating at an increasing rate. With your normal defenses down, H. pylori has had the upper hand in the battle, virtually wiping out the remainder of your mucus lining and infecting several cells in the lining of the stomach. As the parasympathetic system continues to stimulate digestion, the stomach acid overwhelms the under-protected stomach and begins to, quite literally, eat through the lining, resulting in an ulcer.

So, stress doesn’t “cause” the ulcer, but it weakens the mucus lining, affording H. pylori an opportunity to finish clearing out the rest of the mucus and establish an infection.
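For the programmatically inclined, here’s a toy sketch of the cascade logic described above. The threshold-style “levels” are my own invention for illustration; the real physiology is continuous and far messier.

```python
# Toy model of the stress -> ulcer cascade (illustrative levels only).

def gut_state(stressed: bool) -> dict:
    """Trace the simplified chain from stress to stomach defenses."""
    cortisol = "high" if stressed else "basal"
    # High cortisol suppresses prostaglandin (PGE2) synthesis.
    pge2 = "low" if cortisol == "high" else "normal"
    # At low concentrations, PGE2 preferentially binds the higher-affinity
    # EP3 receptor (acid off); at higher concentrations it also binds EP4
    # (acid and mucus both on).
    if pge2 == "low":
        acid, mucus = "inhibited", "reduced"
    else:
        acid, mucus = "stimulated", "maintained"
    # H. pylori turns opportunistic once the mucus barrier thins.
    ulcer_risk = "elevated" if mucus == "reduced" else "low"
    return {"cortisol": cortisol, "PGE2": pge2, "acid": acid,
            "mucus": mucus, "ulcer risk": ulcer_risk}

print(gut_state(stressed=True))   # chronic stress scenario
print(gut_state(stressed=False))  # recovery scenario
```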


When DNA Isn’t Enough: Methylation, Forensics, and Twins

DNA evidence is often considered a “home run” in forensics. If you find readable DNA at a crime scene and it matches a suspect, a correct conviction is almost assured. A DNA sample can often point to a single individual with ridiculous specificity – often 1 in a quadrillion or greater (the sketch below shows roughly where numbers like that come from). But what happens when someone else shares your DNA?
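As a quick aside, those astronomical figures come from the “product rule”: per-locus genotype frequencies multiplied across independent loci. The frequencies below are invented for illustration, not real STR population data.

```python
# Product rule: multiply per-locus genotype frequencies across loci.
# The 13 frequencies below are hypothetical, not real population data.
from math import prod

genotype_freqs = [0.08, 0.11, 0.05, 0.09, 0.07, 0.10, 0.06,
                  0.12, 0.08, 0.05, 0.09, 0.07, 0.11]

rmp = prod(genotype_freqs)  # random match probability
print(f"Roughly 1 in {1 / rmp:,.0f}")  # on the order of hundreds of trillions
```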

Monozygotic, or “identical,” twins differ from dizygotic, or “fraternal,” twins in that they come from the same zygote – hence “mono”-zygotic. In other words, identical twins come from one fertilized egg, while fraternal twins come from two. This means that identical twins share the same DNA, while fraternal twins share as much DNA as any other sibling pair. There are, of course, many iterations of monozygosity depending on when during development the split actually takes place. This nuance has led scientists in Germany to a possible solution to the problem of identical twin DNA.

During development, only a few cells are present. These cells begin to differentiate into the different tissue types that they will become. As these cells divide rapidly to produce all of the daughter cells, mutations can occur in the DNA. If a mutation occurs earlier, it will be present in a larger proportion of the daughter cells, and will be more easily detectable during the twin’s lifetime. This differentiation of tissues also means that the earlier the twins split, the fewer mutations they will have in common (and, thus, the more differences you can detect in their DNA). It has been suggested recently that a handful of single nucleotide mutations, or “SNPs,” can be found between twins. However, these SNPs aren’t so easy to find in a sea of 3 billion other nucleotides. To find these few differences, and find them reliably, the entire genome of both twins must be sequenced several times over. In the case of the German scientists, their experiment resulted in 94-fold coverage, meaning they read each of the 3 billion nucleotides 94 times. This must be done to ensure accuracy: at 3 billion nucleotides, even 99.9% accuracy will still result in 3 million errors per pass. If anything, this shows how incredibly accurate our cellular machinery is.
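Here’s the back-of-the-envelope arithmetic, plus a sketch of why deep coverage helps. It assumes independent errors and a simple majority vote, which real sequencing pipelines only approximate.

```python
# Why 94x coverage: per-pass errors are common, but a majority of reads
# being wrong at the same position is vanishingly unlikely (assuming
# independent errors, which real sequencers only approximate).
from math import comb

GENOME = 3_000_000_000
acc = 0.999  # 99.9% per-base accuracy
print(f"Expected miscalls in one pass: {GENOME * (1 - acc):,.0f}")  # ~3 million

n, p = 94, 1 - acc  # 94 reads per position
p_majority_wrong = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                       for k in range(n // 2 + 1, n + 1))
print(f"P(majority of {n} reads wrong at a site): {p_majority_wrong:.1e}")
```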

At any rate, the scientists tested their new method on a set of twins, and it worked: twelve SNPs were identified between the twin brothers. Typically, one experiment is not considered to hold much weight in science, but this particular experiment is backed by well-established genetic theory, and the results were exactly what we would expect.

So, case solved, right? Well, maybe not. It turns out that this method comes with a hefty price tag – over $100,000. This is far too much to be practical in forensic casework, especially when you consider that about 1 person in 167 is an identical twin. Of course, this price will go down as DNA sequencing costs continue to plummet in light of newer, better technology. Still, it will be many years before anything like this is affordable (a typical forensic DNA test costs in the neighborhood of $400-$1,000). Furthermore, the instruments used in this method (next-generation sequencing), though typical in research science, have not been approved for use in court. That in and of itself can be a challenging obstacle to overcome, regardless of cost.

Perhaps in a few decades these issues will be resolved. Perhaps not. Either way, it might be a good idea to have a plan in the meantime. This is (hopefully) where my master’s thesis comes in.

DNA is composed of four nucleotides, commonly noted as A, T, C, and G. Throughout life, a methyl group – a carbon and three hydrogens – attaches to some of the C’s in your genome. This is known as DNA methylation, a big component of the larger phenomenon known as epigenetics. As it turns out, these methyl groups attach more or less randomly to the C’s, though some evidence suggests that environmental conditions may play a part. In any case, the pattern of methyl groups attached to C’s differs among individuals – even identical twins. In fact, studies have shown that newborn twins already exhibit DNA methylation discordance. Presumably, these differences become more pronounced as time goes on. Not many studies have looked at this, but the ones that have also show evidence of greater discordance with age.

There is a potential issue with studying DNA methylation: it doesn’t occur uniformly among tissues. In other words, a blood sample and a skin sample from the same individual will show different patterns of methylation. Moreover, cells within the same tissue can show different methylation patterns. Though not insurmountable, these issues make methylation analysis a tricky subject.

To tackle the first issue of tissue discordance, you could simply match the tissue type of the sample you take from the suspect to the tissue type of the sample from the scene. The second issue of intra-tissue discordance is a bit trickier. For starters, we don’t know too terribly much about how DNA methylation works. Ostensibly, if methylation differences occurred early in development, they would show the same pattern of proliferation as SNPs that occur early in development. This means the same DNA methylation pattern would be present in all of the daughter cells, and would show up easily in a DNA sample from that tissue.

Another possible solution would be to take a statistical approach. This would involve measuring the methylation pattern several times and coming up with an “average” methylation. For example, let’s say there are 10 C’s susceptible to methylation in a particular DNA sequence. If I run 10 samples from a DNA swab, I might find the number of methylated C’s to be: 3, 4, 5, 3, 2, 4, 5, 3, 4, 4. Averaging these gives 3.7 out of 10 possible methylated C’s, so you might say that this DNA sequence shows 37% methylation. If you do the same thing for the other twin and come up with 5.5 out of 10 possible methylated C’s, you could say that the other twin’s sequence shows 55% methylation. Ideally, these numbers would be reproducible, especially as you increase the number of samples and/or the number of potentially methylated C’s per sequence.
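In code, the comparison is as simple as it sounds. The counts for the first twin are the made-up values from the example above; the second twin’s counts are equally hypothetical.

```python
# Minimal sketch of the averaging approach: mean methylated C's per
# replicate, over 10 potentially methylated sites. Counts are invented.
from statistics import mean

twin_a = [3, 4, 5, 3, 2, 4, 5, 3, 4, 4]  # methylated C's per replicate
twin_b = [6, 5, 5, 6, 4, 6, 5, 6, 6, 6]  # hypothetical second twin

sites = 10
pct_a = mean(twin_a) / sites * 100  # 37% methylation
pct_b = mean(twin_b) / sites * 100  # 55% methylation
print(f"Twin A: {pct_a:.0f}%  Twin B: {pct_b:.0f}%")
```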

Compared to the SNP method, my project is less definitive. However, good protocols would still make the method definitive enough. Once you narrow the suspects down to two twins via normal DNA testing, you have two possible outcomes: a match between one twin and the sample at the crime scene, or an inconclusive result. At this point, you just need to differentiate between two people, not 7 billion, so the required statistical power is much, much lower. The big difference between my method and the SNP method is the price. Whereas the SNP method costs between $100,000 and $160,000, my method could be done in-house for less than $5,000. Furthermore, my method is performed on the same instruments as traditional DNA testing, meaning that no new instrumentation needs to be validated for use in court.

So, while it will take some work, and my project is more of a proof-of-concept study, the use of DNA methylation in forensics is generating a lot of attention. One of the issues with methylation in my study, i.e., different patterns in different tissues, has been a major benefit to a different use of DNA methylation: tissue identification. The idea here is that if you can identify methylation patterns that are consistent within a tissue type, you can use those patterns to identify the tissue. Another aspect relevant to my project, the change in methylation with age, has been vetted as a possible investigative tool. If you can identify levels of methylation that are consistent with different age groups, you can potentially “age” a suspect just by their DNA methylation. Studies on methylation aging are few and far between, but preliminary results are promising, suggesting that age-based methylation analysis can get within ±5 years of an individual’s actual age.
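To make that concrete, here’s a minimal sketch of how such an age predictor might be fit. The data are simulated for a single hypothetical age-associated site; real studies fit models across many CpG sites, which is how they reach the ±5-year range.

```python
# Sketch of fitting an age predictor from methylation levels.
# Data are simulated; real studies use many CpG sites, not one.
import numpy as np

rng = np.random.default_rng(0)
ages = rng.uniform(18, 80, size=200)
# Pretend methylation at one age-associated site drifts upward with age.
methylation = 0.30 + 0.004 * ages + rng.normal(0, 0.03, size=200)

# Invert the relationship: predict age from methylation.
slope, intercept = np.polyfit(methylation, ages, 1)
predicted = slope * methylation + intercept
print(f"Mean absolute error: {np.abs(predicted - ages).mean():.1f} years")
```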

As we learn more about DNA methylation, it will become more useful. This is true not only for forensics but also for medicine, since methylation plays an important role in turning genes “on” or “off.” This is particularly true in cancer, where abnormal DNA methylation seems to occur. But before we try to cure cancer with methylation, perhaps we can perform the smaller task of telling two twins apart.

*Also published in part at http://forensicoutreach.com/library/when-dna-isnt-enough-methylation-forensics-and-twins-part-1/

and

http://forensicoutreach.com/library/when-dna-isnt-enough-methylation-forensics-and-twins-part-2/

An Evolutionary Explanation For Why You Wear Glasses

Empirically testing health-related hypotheses formulated through an evolutionary lens can prove difficult. Our environment and experience are radically different from those of the first 6 million years of human evolution. Living at the leading edge of human existence and the top end of the techno-scientific scale, we are far removed from the environment to which many of our genes are hypothesized to be suited. Fortunately, the human race is a diverse group of individuals who have dispersed across the globe and acclimated to a variety of circumstances. A few hunter-gatherer societies remain, for example, in parts of Africa. Though neither their genes nor their cultures are identical to those of the original hunter-gatherers, they retain the closest genetic and sociocultural similarity to our ancestors in the modern world. This is not to say that they are “less evolved” than other human societies – that notion is elementary and indicative of evolutionary ignorance. They are very well suited to their habitat, both genetically and culturally. Fortunately, those of us who are less suited to our environments (i.e., everyone else, particularly in the US) can glean incredible insights from them about the functioning of our own bodies and about the dietary and daily circumstances to which our physiology is best suited.

I recently wrote a primer on evolutionary medicine (which can be found here), which might be worth reading before getting into the specifics. This post will focus on myopia, or “near-sightedness,” the visual condition in which objects at a distance are out of focus. Myopia affects about 15% of Africans, a third of Americans and Europeans, and over 75% of Asians – a curious bias that I’ll address later in the article. Fortunately (sort of), myopia is easy to treat with glasses or contacts, and can even be corrected more permanently with Laser-Assisted in situ Keratomileusis, commonly known as LASIK. Myopia occurs when the eye is too long, causing light to come to a focus in front of the retina and produce a blurry image. Corrective lenses therefore refract the light before it hits the cornea, essentially “overshooting” the refraction: myopic corrective lenses are thicker at the edges and thinner in the middle, causing the light to diverge slightly before it hits the cornea and ultimately moving the focal point back onto the retina. With LASIK, a laser is used to vaporize tissue at the center of the cornea (the ablation comes from the light’s ultraviolet wavelength, not from heat), reshaping the cornea so that light is correctly refracted.
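As a rough illustration of the optics: in the thin-lens approximation (ignoring vertex distance), a myopic eye sees clearly only out to its “far point,” and the diverging lens power needed is simply -1 divided by that distance in meters.

```python
# Thin-lens arithmetic for myopia correction (ignores vertex distance).
# A diverging lens of power P = -1 / far_point makes distant rays appear
# to come from the far point, which the myopic eye can focus.

def corrective_power(far_point_m: float) -> float:
    """Lens power in diopters for a given far point (meters)."""
    return -1.0 / far_point_m

for fp in (2.0, 1.0, 0.5, 0.25):
    print(f"Far point {fp} m -> {corrective_power(fp):+.2f} D")
# Far point 0.5 m -> -2.00 D, a typical mild-to-moderate prescription.
```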


In order to focus, the eye depends on ciliary muscles attached, via suspensory ligaments, to the lens. When focusing on something far away, as would often be the case outdoors, the ciliary muscles relax, letting the ligaments stretch the lens into a flattened shape. When focusing on something up close, such as a book, television, computer, or phone, the muscles contract, allowing the lens to become more rounded (convex). Think of a camera: to bring distant objects into focus, you adjust the lens so their focal point lands on the sensor; for up-close shots, you swap to a short, rounded macro lens that brings the focal point of near objects onto the sensor. This is, roughly, how the eye works. Myopia is what happens when your distance-focusing function is broken. Evolution, along with an analysis of our current sociocultural context, might be able to tell us why this happens.

[Figure: accommodation of the lens]

I’m a student, and I spend a lot of my time looking at a book, a laptop, or a phone. I love to get outside when I can, but ultimately most of my time is spent looking at things up close. That means the ciliary muscles in my eyes – the focusing muscles – spend most of their time contracted. A muscle held in one state for long enough adapts to that state, and ciliary muscles that rarely get to relax may lose their ability to let the lens flatten fully, preventing focus on distant objects. This may matter most if it occurs during critical stages of development in childhood. (As far as I’m aware, no quantitative studies have been performed on ciliary muscle size or mitochondrial count, probably because this would be difficult or impossible to do on a living person. Perhaps future studies could examine the ciliary muscles of recently deceased individuals, comparing those who suffered from myopia with those who had normal vision.) Of course, this raises the question of whether the muscles can be retrained. I don’t know, and I’m not sure I’m willing to find out by using myself as a guinea pig.

Unfortunately, that makes me part of the problem of “dysevolution,” a term coined by Harvard paleoanthropologist and human evolutionary biologist Daniel Lieberman. Dysevolution refers to the cycle of treating diseases without trying to change or fix their causes. Our technology and scientific understanding have advanced so rapidly in the past 100 years that we can patch over things like myopia with ease. Often this cycle is perpetuated by comfort: why change the way I do things when I can just buy contacts or glasses? My previous post mentions several other possible mismatch diseases, and Lieberman’s book, “The Story of the Human Body,” goes into detail about many of them. For many of them – if not most – we simply ignore the possible cures and instead opt for a more “comfortable” and easy treatment. This cycle is sure to grow and intensify as time goes on.

Evolutionary medicine is sometimes difficult to test empirically. However, as mentioned above, modern-day hunter-gatherer societies can offer incredible insight and points of comparison for how sociocultural differences may affect our “mismatch diseases.” Studies of this kind are unfortunately few and far between (possibly because research funding also focuses on treatments). However, studies of hunter-gatherer societies have shown that very few members suffer from myopia (or from many other non-infectious ailments, such as type-2 diabetes, heart disease, osteoporosis, and even cavities). The thought is that they are exposed to a variety of visual stimuli and their visual environment is constantly changing, which “exercises” their ciliary muscles across their full range. Experiments have also shown that animals deprived of visual stimuli will grow elongated eyeballs. Similarly, people who spend more time indoors, particularly studying, as is common in many Asian cultures, exhibit much higher rates of myopia, whereas those who spend more time outdoors, as is more common in many African cultures, tend to have lower rates. Our eyes did not evolve to see things two feet from our faces all day long. They evolved to keep us alive amid the plethora of visual stimuli in nature and to help us search for food: two things that many people, particularly children in developed countries, no longer need to do.

The solution isn’t to give up studying and electronics; it’s much simpler than that. Nearly everyone uses books and electronics, so why doesn’t everyone have myopia? One possibility is genetics, though that doesn’t seem like a plausible explanation on its own: rates of myopia have skyrocketed only in the last century, and any latent mutation for poor vision would almost certainly have been selected against in our ancestors. The likely “cure” for myopia is balance. Spend time outside, especially as a child. The data from lab experiments as well as social statistics point in this direction. If we continue to ignore the cause and only treat the symptoms, we trap ourselves in an ever-growing cycle in which we become more and more dependent upon technology.

Evolution: The Missing Link in Medicine

“Nothing in biology makes sense except in the light of evolution.”

– Theodosius Dobzhansky

Evolution is arguably one of the most widely supported and powerful theories in all of biology, and potentially in science as a whole. It has been the dominant explanation for over 100 years. Once genetics entered the picture in the first part of the 20th century, Darwin’s common descent and Mendel’s inheritance were refined, greatly expanded, and solidified into the modern synthesis of evolution. Consistently verified through genetics, paleontology, geology, ecology, microbiology, and many other fields of science, evolution has become a remarkably productive field of study. It has created huge disciplinary offshoots – including evolutionary biology, evolutionary genetics, and evolutionary anthropology, to name a few – and has become the theoretical foundation for all of biology.

Some people today argue that humans are no longer under evolutionary pressure and thus are no longer evolving. Though this seems to make sense superficially, it is simply not true. The first issue is that humans only live about 80 years – a mere snapshot of our species’ existence – so it is difficult to observe phenotypic change from biological evolution in only a few decades. That said, scientists have found that some very recent biological changes have occurred, including in the effects of the FTO gene. The FTO gene codes for a protein that regulates appetite. While it does not “make” a person obese (genes tend to predispose, not determine), it has been correlated with obesity. The catch? Its effect seems to have appeared only after about 1940, according to a study published just two days ago. The study (which can be found here) found that, for people born after 1942, the FTO gene showed a strong correlation with increased BMI. Why, though, would a gene that has not changed suddenly become active?

The Environment

What did change in the 1940s? Technology. WWII offered an incredible economic boost to the US, massively increasing technological enterprise, and was a main contributing factor in the world-superpower status the US achieved in the ’40s. As technology increased, labor decreased; after all, the main purpose of technology is to make human life simpler. When human life becomes simpler, people become more sedentary. New technology also allowed for cheaper, higher-calorie, over-processed food. Exactly what altered FTO’s effect will take a while to work out: the difference could be epigenetic alteration, novel environmental stimuli, or even another gene interacting with FTO. But while more testing is needed to show exactly what happened in the early ’40s, the fact that something occurred, likely stemming from environmental changes, remains. Biological evolution doesn’t have to be a change in DNA sequence; that view is far too simplistic. Anytime phenotype or genotype ratios change on a species-wide level, evolution is occurring. No population is in Hardy-Weinberg equilibrium, and no population ever will be. Humans will continue to evolve biologically. While cultural evolution has exceedingly outpaced biological evolution, giving the mirage that biological evolution has “stopped,” the truth is that culture can either augment or stagnate biological evolution, depending on the situation. A cultural change toward drinking more milk may augment lactase persistence (and in fact, it did), while a cultural propensity to live in climate-controlled housing year-round may slow other aspects of biological evolution. Nature doesn’t solely control natural selection; more broadly, the environment – cultural or natural – mediates evolution.
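For readers who haven’t met Hardy-Weinberg: it describes the genotype ratios a population settles into when no evolutionary forces are acting, and deviation from those ratios is one signal that evolution is happening. Here’s a quick sketch with invented allele frequencies and observed counts:

```python
# Hardy-Weinberg expectation: for allele frequencies p and q, equilibrium
# genotype frequencies are p^2, 2pq, q^2. Counts below are invented.

p = 0.6   # frequency of allele A
q = 1 - p # frequency of allele a
n = 1000  # individuals sampled

expected = {"AA": p**2 * n, "Aa": 2 * p * q * n, "aa": q**2 * n}
observed = {"AA": 395, "Aa": 445, "aa": 160}  # hypothetical counts

# Chi-square comparison of observed vs. expected genotype counts.
chi_sq = sum((observed[g] - expected[g])**2 / expected[g] for g in expected)
print(expected)  # ~{'AA': 360, 'Aa': 480, 'aa': 160}
print(f"chi-square = {chi_sq:.2f} (df = 1)")  # large values => not in HWE
```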

So, why is evolution important in medicine? Sure, doctors need to understand things like microbial evolution and how it plays a role in infectious diseases, but what about human evolution? How can a knowledge of human evolution impact medicine?

Cultural evolution has rapidly and drastically altered the human environment, thus changing how the human species evolves. More importantly, our environments have changed so aggressively that our bodies cannot keep up. (Before I go on, I have to make something clear. I am not a proponent of the Paleo Diet; if you’d like to know why, check out this post.) This means our bodies are often best adapted to the environments of the past (though these vary drastically). This has given rise to what are sometimes referred to as “mismatch diseases.” The list is extensive, but includes maladies such as atherosclerosis, heart disease, type-2 diabetes, osteoporosis, cavities, asthma, certain cancers, flat feet, depression, fatty liver syndrome, plantar fasciitis, and irritable bowel syndrome, to name a few. Some of these may not be actual mismatch diseases, but many of them likely are. Furthermore, many of these illnesses feed off one another, creating a terrible feedback loop. 100 years ago you’d likely die from an infectious disease; today, most people in developed nations will die from heart disease, type-2 diabetes, or cancer.

These diseases don’t have to be the essential baggage of modernity. Anthropologists (and some intrepid human evolutionary biologists) study modern-day hunter-gatherer societies in order to glean information about the nature of our species before the Neolithic Revolution. It’s important to note that these are not perfect models (cultural and biological evolution have still occurred in these societies), but they are the best available. Interestingly, modern-day hunter-gatherers don’t suffer from many of these mismatch diseases. (This can’t be explained by longevity; hunter-gatherers regularly live into their late 60s and 70s, and though unusual to many of us, their lives aren’t as brutish as they are often portrayed.) Diseases such as type-2 diabetes, hypertension, heart disease, osteoporosis, breast cancer, and liver disease are rare among these societies. What’s more, myopia (near-sightedness), asthma, cavities, lower back pain, crowded teeth, collapsed arches, plantar fasciitis, and many other modern ailments are exceedingly rare. So what’s different? The easy answer is their diet, lifestyle, and environment. The difficult answer involves elucidating the physiological importance of certain social norms and the biochemical processes of differing diets. Some very exciting work is beginning to arise in this field, dubbed “evolutionary medicine.”

Modern medicine and medical research focus largely on treating problems, i.e., on drugs and procedures that alleviate symptoms after a disease has manifested. While that cause is noble, and indeed necessary, it’s not enough. This short-sighted logic creates a cycle of sickness and treatment that, in 2012, totaled almost $3 trillion in healthcare costs. Furthermore, the sedentary, Epicurean lifestyle many Americans lead willingly feeds this cycle; among the less privileged, necessity feeds it through the inability to afford healthy food, limited access to health education, and a sociocultural feedback loop that breeds its own vicious cycle.

There will likely never be a drug that can cure cancer (of which there are thousands of variants that can differ even between individuals with the “same” cancer), heart disease, type-2 diabetes, or many of the other noninfectious diseases mentioned above. The rationale is akin to putting water in your car’s gas tank and hoping additives will make it work as efficiently as gasoline: the car was built to run on gasoline. Similarly, your body did not evolve to eat excessive amounts of salt, carbs, and sugars (and the different sugars, particularly glucose and fructose, do not have the same biochemical effects during digestion), to sit for extended periods of time, to wear shoes (particularly those with arch support – a common misconception is that arch support is good for you when, in fact, it weakens the muscles of the arch, contributing to ailments such as collapsed arches and plantar fasciitis), to read for several hours at a time, or to chew overly processed food – yet these are things people in developed nations commonly do, oftentimes seeing them as luxuries.

Modern medicine needs a paradigm shift. Funding needs to support not only treatments but also investigations into prevention. The medical cause of diabetes may be insulin resistance, but what causes insulin resistance, and how can we prevent it? Sugar may cause cavities, but what can we do to prevent this? Shoes, even comfy ones, may cause collapsed arches, but how do we prevent this? The immediate response may be that this sort of prevention cannot be attained without abandoning modern technology altogether. But that isn’t the case, and it’s not the argument I’m trying to make. Research should focus on a broad range of interacting variables, including diet, work environments, school environments, and other aspects of evolutionarily novel environments. Only after research from this evolutionary perspective takes place can constructive conversations and beneficial environmental changes occur. We don’t have to abandon modern society to be healthy; we just need to better understand how our lifestyle affects our bodies. Products such as cigarettes and alcohol are already age-restricted and touted as dangerous to health. Is junk food, particularly soda, any different? We don’t put age regulations on cigarettes or alcohol to protect bystanders; these regulations protect children, who cannot be relied upon to make proper choices in their naivety. Should soda be under the same constraints?

If medicine and medical research do not undergo this paradigmatic shift and incorporate an evolutionary perspective, the outcome does not bode well for us. Medical costs will continue to rise, with little room for improvement and greater opportunity for socioeconomic factors to determine the quality of healthcare available. This ad hoc, treatment-first approach to medicine is not sustainable, and it is not the best we can do.