Tag Archives: Culture

Progressing the Person and Policy

The English word “person” has a long and convoluted history. Though the word itself likely derives from the Latin persona, referring to the masks worn in theatre, its meaning has evolved over time. One of the biggest conceptual overhauls came in the 4th century AD during a church council that was held to investigate the concept…

via Progressing the Person and Policy — Savage Minds

Why Cultural Appropriation Matters

Cultural appropriation is a tricky topic to unpack and explain in a manner that keeps the attention of those who believe it to be “PC crap,” but also doesn’t dampen the significance of the issue. But we should try anyway.

I’ve no doubt played a role in cultural appropriation throughout my life, with no bad intentions or awareness that I was doing anything harmful. Growing up in okla humma, Choctaw for “Red People,” I was surrounded by Native American culture. Half of the cities I can name in Oklahoma take their names from a Native American word or phrase in the language of one of the 67 tribes represented in the state. You can buy dream catchers and arrowheads at gas stations along the interstate, and Oklahoma museums have some of the largest Native American collections in the world. The designation of Oklahoma as Indian Territory in the 19th century laid the foundation for the incredibly complex and muddled mixing of unique cultures that white people typically lump into “Native American” culture. This amalgamated meta-culture, if you will, has been commodified into a staple of Oklahoma tourist attractions and local affairs. To those born here, the combined Native American culture is a frequent part of everyday life, even though many don’t understand the significance of the cultural artifacts in their original context.

Continue reading Why Cultural Appropriation Matters

Why Can’t Rachel Dolezal be Black?

The news of Rachel Dolezal as someone who has “pretended” to be black came to light at an interesting time. A few weeks ago, former decathlon gold medalist Bruce Jenner came out as a trans woman, henceforth identifying as Caitlyn Jenner. It is broadly accepted among academics that gender and sex are not the same thing; sex is a biological reality, and gender is a social role that someone fills in society. While biological sex tends to be binary (male/female, with exceptions such as intersex conditions), gender can be seen as more of a spectrum. So how does this relate to race, or does it?

As a preface, I am not suggesting that gender is culturally equivalent to race, though both are cultural constructs and neither is a biological reality. Race is a manner in which people are classified by phenotypic characteristics (often skin color), while gender, though often defined by phenotypic characteristics, describes a role in society. Race does not define a societal role. That being said, there are similarities between race and gender insofar as both relate to social identity and both can be seen as a spectrum.

Continue reading Why Can’t Rachel Dolezal be Black?

Evolution: The Missing Link in Medicine

“Nothing in biology makes sense except in the light of evolution.”

– Theodosius Dobzhansky

Evolution is arguably one of the most widely supported and powerful theories in all of biology, and potentially science as a whole. It has been a dominant explanation for over 100 years. Once genetics entered the picture in the first part of the 20th century, Darwin’s common descent and Mendel’s inheritance were improved upon, greatly expanded, and solidified into the modern synthesis of evolution. Consistently verified through genetics, paleontology, geology, ecology, microbiology, and many other fields of science, evolution has become a pervasively potent field of study. It has created huge disciplinary offshoots – including evolutionary biology, evolutionary genetics, and evolutionary anthropology, to name a few – and has become the theoretical foundation for all of biology.

Some people today argue that humans are no longer under evolutionary pressures and, thus, are no longer evolving. Though this seems to make sense superficially, it is simply not true. The first issue is that humans only live about 80 years – a mere snapshot of our species’ existence. It is difficult to observe phenotypic differences resulting from biological evolution in only a few decades. That being said, scientists have found that some very recent biological changes have occurred, including the altered expression of the FTO gene. The FTO gene codes for a protein that regulates appetite. While it does not “make” a person obese (genes tend to predispose, not determine), it has been correlated with obesity. The catch? It seems to have only become active after about 1940, according to a study published just 2 days ago. The study (which can be found here) found that, after 1942, the FTO gene showed a strong correlation with increased BMI. Why, though, would a gene that has not changed suddenly become active?
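To make the study’s finding concrete: a gene whose sequence hasn’t changed can still show a cohort-dependent association with a trait. Here is a toy sketch of how such a gene-by-cohort interaction shows up – every number below is a hypothetical illustration, not data from the FTO study:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Copies of a risk allele (0, 1, or 2) and BMI, for two birth cohorts.
# Values are made up purely to illustrate the pattern.
pre_1942_alleles = [0, 0, 0, 1, 1, 1, 2, 2]
pre_1942_bmi = [22.9, 23.1, 23.0, 23.1, 22.9, 23.0, 23.2, 22.8]

post_1942_alleles = [0, 0, 0, 1, 1, 1, 2, 2]
post_1942_bmi = [24.0, 23.8, 24.2, 26.1, 26.5, 25.9, 28.3, 28.0]

# A near-zero correlation in the earlier cohort and a strong one in the
# later cohort is the signature of a gene-environment interaction: the
# gene is constant, but its effect depends on the environment.
print(pearson_r(pre_1942_alleles, pre_1942_bmi))
print(pearson_r(post_1942_alleles, post_1942_bmi))
```

The real study works with far larger samples and proper interaction models, but the logic is the same: the genotype is unchanged, while its phenotypic consequence shifts with the environment.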

The Environment

What did change in the 1940s? Technology. WWII offered an incredible economic boost to the US that massively increased technological enterprise and was the main contributing factor to the world-superpower status the US achieved in the ’40s. As technology increased, labor decreased. After all, the main purpose of technology is to make human life simpler. When human life becomes simpler, people become more sedentary. New technology also allowed for cheaper, higher-calorie, over-processed food. This one will take a while to work out. The difference could be epigenetic alteration, novel environmental stimuli, or even another gene interacting with FTO. While more testing will be needed to show exactly what happened in the early ’40s that altered FTO expression, the fact that something did occur, likely stemming from environmental changes, still remains. Biological evolution doesn’t have to be the changing of a DNA sequence; that is far too simplistic. Anytime phenotypic or genotypic ratios change on a species-wide level, evolution is occurring. No population is in Hardy-Weinberg equilibrium, and no population ever will be. Humans will continue to evolve biologically. While cultural evolution has exceedingly outpaced biological evolution, giving the mirage that biological evolution has “stopped,” the truth is that culture can either augment or stagnate biological evolution, depending upon the situation. A cultural change to drinking more milk may augment lactase persistence (and in fact, it did), while a cultural propensity to live in climate-controlled housing year-round may slow other aspects of biological evolution. Nature doesn’t necessarily control natural selection; more broadly, the environment (cultural or natural) mediates evolution.
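The Hardy-Weinberg point can be made concrete. Under the equilibrium’s assumptions (random mating; no selection, mutation, migration, or drift), allele frequencies p and q = 1 − p predict genotype frequencies p², 2pq, and q². A population whose observed genotype counts deviate from those expectations is, by that measure, evolving. A minimal sketch, with genotype counts made up for illustration:

```python
def hardy_weinberg_expected(p):
    """Expected genotype frequencies (AA, Aa, aa) for allele frequency p."""
    q = 1.0 - p
    return p * p, 2 * p * q, q * q

def allele_frequency(genotype_counts):
    """Estimate the frequency of allele A from (AA, Aa, aa) counts."""
    aa, het, _ = genotype_counts
    n = sum(genotype_counts)
    return (2 * aa + het) / (2 * n)

def chi_square_deviation(genotype_counts, p):
    """Chi-square statistic measuring departure from H-W expectations."""
    n = sum(genotype_counts)
    expected = [f * n for f in hardy_weinberg_expected(p)]
    return sum((o - e) ** 2 / e for o, e in zip(genotype_counts, expected))

# A population exactly at equilibrium shows essentially no deviation...
counts_stable = (360, 480, 160)  # AA, Aa, aa among 1000 individuals
p = allele_frequency(counts_stable)  # 0.6
print(chi_square_deviation(counts_stable, p))  # ~0: no detectable change

# ...while shifted genotype ratios (here with the same allele frequency) do.
counts_shifted = (500, 200, 300)
print(chi_square_deviation(counts_shifted, p))  # large: the population is evolving
```

Note that the second population has the same allele frequency (0.6) as the first; it is the genotype ratios that have moved, which is exactly the kind of species-wide shift the text describes as evolution.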

So, why is evolution important in medicine? Sure, doctors need to understand things like microbial evolution and how it plays a role in infectious diseases, but what about human evolution? How can a knowledge of human evolution impact medicine?

Cultural evolution has rapidly and drastically altered the human environment, thus changing how the human species evolves. More importantly, our environments have changed so aggressively that our bodies cannot keep up. (Before I go on, I have to make something clear. I am not a proponent of the Paleo Diet; if you’d like to know why, check out this post.) This means our bodies are often best adapted to the environments of the past (though these vary drastically). This has given rise to what are sometimes referred to as “mismatch diseases.” The list is extensive, but includes maladies such as atherosclerosis, heart disease, type-2 diabetes, osteoporosis, cavities, asthma, certain cancers, flat feet, depression, fatty liver disease, plantar fasciitis, and irritable bowel syndrome, to name a few. Some of these may not be actual mismatch diseases, but many of them likely are. Furthermore, many of these illnesses feed off one another, creating a terrible feedback loop. A hundred years ago, you’d likely have died from an infectious disease; today, most people in developed nations will die from heart disease, type-2 diabetes, or cancer.

These diseases don’t have to be essential baggage of modernity. Anthropologists (and some intrepid human evolutionary biologists) study modern-day hunter-gatherer societies in order to glean information about the nature of our species before the Neolithic Revolution. It’s important to note that these are not perfect models (cultural and biological evolution has still occurred in these hunter-gatherer societies), but they are the best available. Interestingly, modern-day hunter-gatherers don’t suffer from many of these mismatch diseases. (This effect can’t be explained by longevity; hunter-gatherers regularly live into their late 60s and 70s. Though unusual to many of us, their lives aren’t as brutish as they are often portrayed.) Diseases such as type-2 diabetes, hypertension, heart disease, osteoporosis, breast cancer, and liver disease are rare among these societies. What’s more, myopia (near-sightedness), asthma, cavities, lower back pain, crowded teeth, collapsed arches, plantar fasciitis, and many other modern ailments are exceedingly rare. So what’s different? The easy answer is their diet, lifestyle, and environment. The difficult answer involves elucidating the physiological importance of certain social norms and the biochemical processes of differing diets. Some very exciting work is beginning to arise in this field, dubbed “evolutionary medicine.”

Modern medicine and medical research focus largely on treating problems, i.e., developing drugs and procedures that alleviate symptoms after a disease has manifested. While the cause is noble, and indeed necessary, it’s not enough. This short-sighted logic of medical research creates a cycle of sickness and treatment that, in 2012, totaled almost $3 trillion in healthcare costs. Furthermore, the sedentary and Epicurean lifestyle that many Americans lead willingly feeds this cycle; among the less privileged, necessity feeds it through the inability to afford healthy food, limited access to health education, and a sociocultural feedback loop that breeds its own vicious cycle.

There will likely never be a drug that can cure cancer (of which there are thousands of variants, which can differ even between individuals with the same diagnosis), heart disease, type-2 diabetes, or many of the other previously mentioned noninfectious diseases. The rationale is akin to putting water in your car’s gas tank and hoping additives will make it work as efficiently as gasoline. The car was built to run on gasoline. Similarly, your body has not evolved to eat an excessive amount of salts, carbs, and sugars (of which the different types, particularly glucose and fructose, do not have the same biochemical effects during digestion), sit for extended periods of time, wear shoes (particularly those with arch support; a common misconception is that arch support is good for you when, in fact, it weakens the muscles of the arch, leading to ailments such as collapsed arches and plantar fasciitis), read for several hours at a time, chew overly processed food, or do many of the other things that people in developed nations commonly do and often see as luxuries.

Modern medicine needs a paradigm shift. Funding needs to support not only treatments, but also investigations into prevention. The medical cause of diabetes may be insulin resistance, but what causes insulin resistance and how can we prevent it? Sugar may cause cavities, but what can we do to prevent this? Shoes, even comfy ones, may cause collapsed arches, but how do we prevent this? The immediate response may be that this sort of prevention cannot be attained without abandoning modern technology altogether. However, this isn’t the case, and it’s not the argument I’m trying to make. Research should focus on a broad range of interacting variables, including diet, work environment, school environment, and other aspects of evolutionarily novel environments. Only after research from this evolutionary perspective takes place can constructive conversations and beneficial environmental changes occur. We don’t have to abandon modern society to be healthy; we just need to better understand how our lifestyle affects our bodies. Substances such as tobacco and alcohol are already age-restricted and touted as dangerous to health. Is junk food, particularly soda, any different? We don’t put age regulations on cigarettes or alcohol to protect bystanders. Instead, these regulations protect children who cannot be relied upon to make proper choices in their naivety. Should soda be under these same constraints?

If medicine and medical research do not undergo this paradigmatic shift and incorporate an evolutionary perspective, the outcome does not bode well for us. Medical costs will continue to rise, with little room for improvement and greater opportunity for socioeconomic factors to dictate the quality of healthcare available. This ad hoc, treatment-based approach to medicine is not sustainable, and it is not the best we can do.

A Case for the Coalescence of Science and the Humanities

To contemplate the nature of humanity, there must exist endeavors from both the sciences and humanities. Each branch of knowledge brings to the table its own unique perspectives, assumptions, and models of learning. The sciences teach us about the natural world and its functioning. From the microscopic investigations of DNA to the search for exoplanets, science has defined the latter part of the Anthropocene – the epoch in which global ecosystems have been subjugated and forever changed by human activities. Science ushered technology into a dimension that was previously unimaginable, where there seems to be no end to the artificial extensions of our biological domains.

With its jumpstart from the 17th century Enlightenment, scientific inquiry and discovery have revolutionized our world. The Age of Enlightenment saw a rise of reason, skepticism, and individual thought for which there was no precedent. A veritable Cambrian Explosion of scientific knowledge, the Enlightenment brought about scientific discoveries that rewrote the trajectory of human existence. Philosophes, freed from the dogmatic ideology of the past, drew up the blueprints of our future. However, as successful and revolutionary as the Enlightenment was, it proved unable to reach the core of the human spirit, unable to tap into the emotional side of human nature. In an attempt to fulfill the unsatiated desire to understand the core of humanity, the Romantic era was ushered in. The 19th century champions of the creative arts filled the emotional gap left by scientific endeavors. Expressions of individuality, residues of the Enlightenment, flooded the arts. The importance of aesthetic value was stressed, and the human imagination was extended in all directions. Romantics attempted to divulge the secrets of the human experience, the continuum on which humans ride in the cosmos. A more focused and anthropocentric approach, Romanticism succeeded where the Enlightenment had failed, but failed where the Enlightenment had succeeded.

As science and the humanities grew increasingly complex, their relationship came to seem a fixed dichotomy, forever irreconcilable. Answering two fundamentally different types of questions, the humanities and science are both essential to a holistic understanding of our existence in the larger cosmos. As our technoscientific advances increase at an astounding rate, our defining of the Anthropocene becomes ever more acute. Advances in science and technology drive our imposition on nature; our ability to repurpose the existing and to create anew is changing the landscape of Earth. To counteract the effects of science and technology on nature, we turn to… science and technology. Science shows us how to do things; however, it lacks the ability to show us what we should do. This requires a taste of the humanities. The humanities represent the venture into and extension of our human continuum. They attempt to unveil and explain the idiosyncrasies of human thought, creativity, and overall existence. Much like the scientific endeavor, the humanities’ endeavor is a never-ending quest. There is always something new to discover that has the potential to shift our way of thinking or understanding.

As we penetrate deeper into the depths of nature, we must apply the knowledge and revelations of the humanities to our excursion. As we continue our dominion over the Earth and extend our understanding of nature, we must give ourselves a course to follow. Because the humanities explore and explain our specialized place in the cosmos, they are in the best position to evaluate our intrusions upon nature. Questions of value cannot be answered by science. As prescient and imaginative as science is, it still follows the shadow of the humanities. Science fiction drives the frontiers of science. Our explorations into human nature and creativity are the precursors of scientific explorations. A coalescence of these two primary branches of learning is essential to our continued existence. Each serves as its own weight in the balance. To see the larger picture of our existence in the cosmos, we must turn to science. To understand our own existence and the intricacies by which it is fueled, the humanities are irreplaceable. To advance wisely in our existence, we must amalgamate the two into a functional framework.

Eschewing Scientific Curiosity in the US – A Slippery Slope.

NPR recently ran a story titled “When Scientists Give Up.” The story revolved around scientists who, in the middle of their careers, decided to switch professions entirely due to issues with funding. Now, I am a bit biased when it comes to the importance of science, and I’ll be the first to admit that. However, I think it’s clear what role science has played, and must continue to play, in our society (unashamedly using this as a plug for a previous blog post that I wrote concerning science in society, which can be found here). And don’t get me wrong: there’s nothing wrong with a change of career, whether it’s due to poor job prospects or simply a change of interest. That being said, what on Earth is an individual who spent a minimum of 8 rigorous years at a well-respected school – gaining knowledge for a very particular career – doing switching careers at 40? On that same note, why is someone in whom the US has invested millions of dollars in grant money changing career paths? Clearly, there is something wrong with this picture.

Science doesn’t prove facts – it explains them. Science doesn’t prove evolution, science explains evolution; science doesn’t prove gravity, science explains gravity; science doesn’t prove that cells form the basis of life, science explains how cells form the basis of life. All of these things are already taken as facts (so, yes, evolution is a “fact” in the same sense gravity or cell theory are “facts.” It’s an observation that science attempts to explain in a systematic, reviewed manner). Now, if science never proves things, how long does it need to work on an explanation before it can be taken for granted? There’s no real answer to this question, though it does require a decent amount of time. The answer is more a function of how well it stands the test than of how long. Gravitational theory has been standing for nearly 400 years; evolutionary theory, cell theory, and germ theory (that is, the theory that microorganisms can cause disease) for about 150 years. Does that mean these theories have gone unamended? Of course not! That’s what science does: it pokes and prods at our ideas, refining them until they are able to stand the test – any test. As iron sharpens iron, so one scientist – or a large community of them – sharpens another. Each of the previously mentioned theories was rather radical in its day, outside the common consensus and understanding of the time. Galileo’s ideas were so radical that he spent the last decade of his life under house arrest – all because he was trying to explain what he was seeing, and that explanation was regarded as too radical.

Money is a precious thing, not to be thrown around lightly (unless, of course, it’s being blown on military-related projects, but that’s another story). Grant writing is tough, and the competition is incredibly fierce. As such, corporations that shell out grant money – hereafter referred to as “grantfathers,” a term I want credit for coining – are careful about to whom they choose to give. Unfortunately, it seems that more and more grantfathers are becoming conservative with their wagers. They’re spending their dollars playing the penny machines rather than the quarter slots. While I understand the safe approach, it’s destroying the very essence and character of science. Yes, ideas must be rigorously tested and must stand the test of time – even the boring ones. However, focusing only on this area of science, and ignoring the frontier-busting, trailblazing, imagination- and passion-driven areas, does an injustice to the scientists, the field of study, and the country.

We didn’t get to where we are by playing it safe. Science – and by extension, technology – demands innovation. Innovation breeds errors. Errors, in the scientific community, breed precision. The current generation is afraid of failure. We are willing to stand up for a cause, maybe more so than most generations hitherto, but we are afraid to actually act upon the cause. This culturing of “skittishness” is driving science and technology into a plateau, shrinking the branches that emerge from the trunk of discovery. The innovation is there – the action is missing. But, who can blame them? You can only study what someone is willing to give you money for. In the modern field of science, you have to look out for numero uno, even if it flies in the face of everything you got into science for in the first place.

If government funding stays on its conservative course, the future of the US as a leader in science and technology will grow dimmer and dimmer, overshadowed by more daring countries. Now, as a scientist, where will you go? Do you stay in the US, where funding is tight and sight is narrow, or go across the pond, where funding, while still competitive, is more open-minded and nurturing of scientific curiosity? As the world’s greatest minds begin to reconvene outside of the US, our position as a global leader will diminish into something more second-rate. Once the scientists are gone, it will prove a difficult task to get them back. Students will seek degrees overseas, where the funding and mentors are to be found. The US has held this position for a long time, but it is slowly slipping as king of the mountain. Once the avalanche starts, it will be difficult to reverse. If the way we currently handle problems that demand pre-emptive action – such as fossil-fuel dependency or drug-resistant bacteria – says anything about how we will handle this one, it may already be too late.

The Role of Science in Society


Carl Sagan once stated, “… the consequences of scientific illiteracy are far more dangerous in our time than in any time that has come before.” This statement becomes truer every day, as scientific and technological innovations are occurring at an ever-increasing rate. Studies suggest that less than 30% of Americans are “scientifically literate,” meaning that over 70% of Americans would have trouble reading – and understanding – the science section of the New York Times. So, why is this important? After all, everyone has their strengths and weaknesses.

The problem with this view is that science is a driving force behind our sociocultural evolution. New ideas and new inventions are constantly redefining how we live our lives. As time goes on, science and technology will define most of life as we live it. Already, this is true. 100 years ago, people often lived without electricity. Today, the most frightening thing most people could imagine would be a total loss of electricity. Imagine all of the things that simply wouldn’t work without it: phones, televisions, the Internet, lighting, heat and A/C, automobiles, and many parts of the manufacturing process for everyday items. We have built a society in the United States that is almost entirely dependent upon electricity. Personally, it’s difficult for me to imagine a world without electricity because everything I know is based upon it. Life has become relentlessly complex and multifaceted. Most people have no idea how the world around them – that is, this semi-artificial world, or anthropogenic matrix – functions.

As time goes on, our day-to-day lives will become less and less “natural” and more and more artificial. This is not inherently bad. However, it does raise the standards for what we must understand about how the world, especially our anthropogenic matrix, works. Failing to maintain a basic understanding of science and technology is destined to segregate the population, facilitating the rise of an “elite” few and a system resembling an oligarchy more than a representative democracy. I’m not much of a conspiracy theorist, and I don’t mean to imply that a “New World Order” is going to secretly control our lives. I do, however, think that if nothing is done about our general ignorance of science, we will slip away from the democracy that we claim to love so dearly. How? How can ignorance of science and technology lead to the failure of democracy? After all, you can vote regardless of your scientific literacy. While it’s true that you can vote while being largely ignorant of how the world works, this is part of the problem. To be clear, I do not think that there should be any kind of scientific literacy test in order to vote. That would only serve as fuel for the ever-broadening gap between those who understand science and those who don’t. In a democracy, everyone should be able to vote. However, given the state of knowledge that we currently have and the increasingly complex world in which we find ourselves, uneducated voting has disastrous consequences.

A Little Politics

Politics is, in its most basic form, the practice of influencing a population. This is done by verbally persuading people to get behind an action that will be set in motion in order to guide the population down a particular path. The United States is a representative democracy, which means officials are elected by the public to govern the public. The United States is not a simple representative democracy; many modifications are in place to give the minority a voice. However, even with these modifications, “majority rules” is still the rule of thumb. On its surface, a “majority rules” system seems ideal. Going with what most people want or believe seems like a solid idea. I agree that this is typically a good philosophy – that is, as long as those voting are educated on the matter at hand.

The Modern Intersection of Science and Government

The base of everything in our lives is built from science; it holds together our infrastructure. When a politician makes a motion to change or regulate something, he or she is making a change that affects our anthropogenic matrix and, consequently, the natural world in which our matrix operates, through such acts as deforestation, ozone depletion, and species extinction. If a constituent does not have a basic understanding of how the world works, then how can that individual make a good decision with regard to electing a public official who will pass laws that affect the world? Moreover, ignorance of science and technology (not to mention of the reasoning and logical evaluation skills that tend to accompany science education) leads to votes based largely on emotion and superficial similarity. If you know very little about a subject, you cannot make an educated decision regarding that subject. If decisions are not based on an educated understanding, they must rest on something else. The next best choice would be decisions based on reason and logic. Unfortunately, the fostering of critical thinking is also absent from many educational settings. Science acts as a major source of training by which people learn to reason and form logical conclusions. In turn, many – though not all – who base their decisions on logical reasoning are in the same group of people who base their decisions on knowledge of science.

If you don’t use a knowledge of science to aid in political decision-making, it’s likely that you are more swayed by charisma and emotional triggers. Those candidates who are more like you, or at least ostensibly like you, are more likely to sway your opinion. After all, that’s what politics is all about – persuading people. If most of your constituents are not scientifically literate, then you as a politician will be less likely to use science as a persuasion tactic and more likely to use charisma and emotionally charged wording that resonates with many of your constituents. Though not a noble method of persuasion, it is a smart one. Unfortunately, this only perpetuates the current epidemic of scientific illiteracy.

Why Public Knowledge of Science Matters

One major problem with scientific illiteracy is that politicians can make a poor decision, intentionally or unintentionally, with no one to call them out. Regulations, or the lack thereof, concerning issues such as climate change, medical research, and irresponsible use of resources must be based on the science that is used to study and understand these matters. If a politician uses a non-scientific basis for creating laws (a basis fueled by a constituency that is scientifically illiterate and, perhaps, an ulterior motive such as a monetary stake in the decision), then consequences are sure to ensue. The effects can be immediate, such as a lack of funding for education or medical research, or delayed, as with the consequences surrounding anthropogenic climate change.

Politics aside, understanding science and technology is imperative to functioning in our increasingly technological world. 100,000 years ago, one had to be a skillful hunter or gatherer; 10,000 years ago, one needed to be adept in agriculture; today, we must stay informed on, at the very least, the basics of science. Expertise is not required for social and political progress, but awareness is essential.

Genesis 1: A Story of Functional Creation, not Material Creation.


When looking to the Bible for information on mankind’s purpose, many modern Christians tend to overlook Genesis. On its surface, the articulation of Genesis 1 appears to be an account of material ontology, or material creation. However, this understanding of the creation account is superficial and requires no investigation into the text or, more importantly, the sociocultural context of the literature. In fact, I will make the argument that reading Genesis 1 as an account of material ontology is not faithful to the original intention or reception of the passage. In lieu of a material ontological reading, I argue for a functional ontological reading of Genesis 1 that stays true to the text and dispels much of the contemporary debate in Christian cosmogony (beliefs about the origins of the universe).

The purpose of a functional interpretation of Genesis 1 is not to abolish inconsistencies in Christian cosmogony (though this is accomplished in the process), but rather to give a more insightful and meaningful reading of the text that communicates to the modern reader the same message that it communicated to the ancient Hebrew listener. In order to fully appreciate the intention of the text, the reader must explore the culture of the ancient Hebrew people, understand the divergence between several Hebrew words and their English translations, and be mindful not to cast a post-Enlightenment, materialistic perspective of the world into the reading of the text. Taking this approach will not only render a functional ontological interpretation of Genesis 1 probable, but will also disavow a material ontological understanding of the creation account.


Cosmology, or the understanding of the universe, has drastically changed over the span of recorded history. The currently accepted cosmological model is a universe expanding from an initial singularity – the Big Bang. Though there are still many unanswered questions about the Big Bang, most of which will likely never be fully elucidated, there has been a steady flow of empirical evidence supporting the theory since the 1960s. This “creation account” is representative of how cosmology functions in the 21st-century Western world – namely, the accumulation of empirical evidence supporting a hypothesis. A relic of the 16th-century Scientific Revolution, our empirical worldview in the West has led to astounding advancements in science, medicine, and technology that have propelled humanity forward at a remarkable rate. However, this empirical approach to the world is so infused with the Western worldview that it is often difficult for one to step outside of this perspective.

We in the Western world tend to see any non-empirical approach to knowledge or truth as primitive and often not worthy of pursuit. This naïve approach becomes problematic when studying the writings of ancient societies that had a different cosmology and cosmogony than our own; the Bible, containing a library of ancient Hebrew and Greek writings, is no exception. The cosmology and cosmography (physical arrangement of the cosmos) of the Biblical authors and their contemporaries differed profoundly from modern-day understanding. This difference is expressed in the literature of those times, including the books of the Bible. Though its intentions were perspicuous to its contemporaries, Biblical literature, particularly the Old Testament, requires deliberation from modern-day Christians.

No sensible person today would deny that they are sitting on a spherical planet that orbits the star at the center of its solar system. In fact, nearly everyone today would agree that our solar system is just one of countless solar systems in the Milky Way galaxy, which is, in turn, but a single galaxy among countless others. This belief is not held by a single ethnic group, religion, or country, but rather by humanity writ large. This is our modern cosmography, and it’s difficult to rationally deny given our current understanding. Mankind’s cosmography has undergone several paradigmatic shifts since the Biblical (or Babylonian) cosmography of the Old Testament authors.

The name “Genesis” is actually derived from a 3rd-century BC Greek translation. The Hebrew name for the first book of the Bible, “Bereshit,” means “In the beginning,” alluding to its opening words. These opening words set up the theme of the chapter, namely, the cosmogony of the Hebrew people. This brings up an important point. It must be understood that the Bible was written for all people of all times, but was not written to all people of all times; the Old Testament was written to the ancient Hebrew people. When a work of literature is written, the author employs imagery and ideas that are familiar and relevant to the intended audience. Genesis, being an origins account, includes cosmology in its narrative. The image below is representative of a typical Babylonian cosmography, which the ancient Mesopotamian peoples, including the Hebrews, embraced.


It’s evident that ancient cosmography was very different from modern cosmography. To the ancient Mesopotamians, however, this cosmography made sense. This was how they understood their world. When the author of Genesis speaks of the “firmament,” we cannot translate it simply as “sky,” as this is not what it meant to the ancient audience. The firmament, to the ancient Israelites, was a solid structure holding back the waters above. This belief in a firmament and waters above was common to Babylonian cosmographies generally.

So, is the Bible wrong or untruthful for mentioning a firmament that we now clearly know is not there? I don’t think we can make the claim that the Bible is “wrong” about this if we keep in mind that it was written to a specific audience at a specific point in time. A material ontological reading – what many today mean by “literal,” though this is a misnomer – presents a problem: the Bible is supposedly unwaveringly truthful, and it claims that there is a firmament in the sky, above which lies some sort of ocean. Now what? Do we accept that there is a firmament in the sky and stop paying our half a penny per tax dollar for NASA, or do we investigate a little more? If we accept a material ontological reading of Genesis 1 but do not accept the cosmography of Genesis 1, then we have quite the theological conundrum. If the intention of Genesis 1 was to communicate material ontology, then it would need to be written using an understanding common to all people of all times in order to get the message across while also preventing falsehood from arising within the text. Perhaps, then, the message of Genesis 1 is not material ontology.

Function and Existence

The nature of existence is not something people contemplate on a regular basis. In a modern Western mentality, the nature of existence is intrinsically tied to biological life. However, “alive” and “existence” communicate two different ideas. A rock exists, but we would not consider it “alive.” So, what does it mean to exist? In ancient Mesopotamia, material properties were not a sufficient condition for existence; an object or being’s existence was contingent upon function. This was true of ancient Near Eastern cultures writ large, including the Israelites. This notion of functional existence was also expressed in the creation stories of the ancient Near Eastern world. In essence, creation stories, including those of the Israelites, were stories about the gods giving function and order to a system.

When investigating the idea of existence, a hermeneutical approach must be incorporated into the analysis. One example of this is the Hebrew word “bārā’,” which is translated as “create.” In the Bible, bārā’ is only used in reference to God. Also, there are a number of instances in the Bible where bārā’ must be understood as functional creation; correspondingly, there seem to be no instances where bārā’ mandates material creation. Exegetical work on bārā’ thus suggests a functional connotation. It might seem odd at first to have more than one word for an action, but this is common among languages. A language belonging to a culture that relies on the phases of the moon might have a dozen or more words for “moon” depending upon the context in which it is used. Language is a tool that is molded by what is important to the user. The existence of a word for creation used in the context of function follows suit with a functional ontological reading of Genesis 1. Functional creation is not only an ancient notion; even today there are examples of existence that rely on function.

John Walton gives a clear, modern example of functional creation in his book “The Lost World of Genesis One.” Imagine a restaurant. When does a restaurant come into existence? Is it a restaurant when the building is constructed (i.e., the material creation)? A building can be or become anything, so this cannot be the case. The most sensible answer seems to be that a restaurant becomes a restaurant after a safety inspector deems it fit to conduct business. A restaurant that is closed, for one reason or another, is not in “existence.” Business, which is the function of the restaurant, is required for existence; thus, its existence is defined by its function. Naming is also related to function. The name Yahweh can be translated as “I am,” which speaks to the Judeo-Christian understanding of God’s function as an eternal and omnipresent being. Of course, material properties must precede function as a necessary condition for existence, but material properties alone may not be a sufficient condition for existence. Restaurants aren’t the only example of functional ontology today. Many things, including corporations, businesses, stocks, the Internet, and governments require, in one way or another, a functional ontology. It is no stretch of the imagination to envision how a culture, void of modern science and empirically based thinking, could have an ontology rooted in function. Consequently, this provides support for the functional worldview of the ancient world, specifically that of the Israelites.

Can Genesis be Material and Functional Ontology?

Given the evidence, it seems to follow that Genesis was written as functional ontology. However, this does not necessarily eliminate the possibility of also reading it as material ontology. Many Christians argue that a “literal” reading of Genesis 1 is required if we are to take the Bible seriously. This proves problematic, as everything we know and understand about life, the universe, and much of science in general is not in accordance with a material ontological reading of Genesis. Many arguments exist to reconcile the discrepancies between science and a material ontological reading of Genesis 1; however, all of them rely on some sort of ad hoc modification leading to a concordist view of the Bible. Concordism, in this context, is the belief that Genesis 1 can be read as material ontology and still be in concordance with modern science.

Concordist views come in many forms, including young earth arguments and old earth arguments. For an old earth argument, the most common approach is to place long periods of time either between the “days” of creation or between the first and second creation account. One main problem with this kind of approach is that it ignores what we currently understand about the ancient Israelite culture. It is not that the concordist hypotheses are too far-fetched (though I would argue that they tend to stretch science and hermeneutics quite thin), but rather that they are missing the point of the story. I am not making an attempt to disprove the science in their arguments; I am trying to show that the science does not matter. The arguments do not seem to take into account the fact that the Bible was not written to us; it was written to the Israelites. Science as we know it today did not exist when the books of the Bible were written, therefore it does not make sense for the Bible to be written with science in mind. The efforts to reconcile modern knowledge with a material ontological reading cause the reader to lose sight of the intention of Genesis 1.

A young earth creationist (YEC) would tend to agree with many of my critiques of concordist views. They see the Bible as absolute truth into which man should not read his finite understanding of science and theology. If Genesis 1 speaks of 24-hour days (yom, in Hebrew), then the creation account must have taken place in 6 literal days, they argue. Attempts at stretching the meaning of words such as yom to accommodate modern cosmology do not enrich the authority or veracity of the text. They maintain that, because we have a finite understanding of the world and certainly of God, the text of Genesis 1 should be taken as a literal account of material origins. This YEC argument seems fair, and it does not have the problem of concordism, but there are some major issues.

By affirming Genesis 1 as material ontology, YEC proponents are, by default, reading their culture into the Bible. The creation account only seems to suggest material ontology to a reader who has the cultural bias of empiricism. Those of us born into the 21st-century Western world are enculturated to see things in a physical and empirical manner. This becomes a problem when we read a work of literature from a culture that did not have this same mentality. A “literal” reading of Genesis 1 means something different for us today than it would have to the ancient Hebrews. The most “literal” reading of the text would be a reading that comprehends the text through the mind of the author. The best way to attain this understanding is to study the culture and recognize the biases that would be present in the author’s writing. In the case of ancient Hebrew literature, there would be a cultural disposition away from physical description. We must take into account the cosmological and epistemological views of a culture when we read its literature. Along with eschewing modern scientific understandings of the world, this absence of cultural interpretation is perhaps the biggest failure of YEC theology.

The ancient Israelites were not concerned with the physical details of creation, so Genesis 1 would not have been written as material ontology. The Israelites were concerned with who created them and why mankind is on earth. A functional ontological reading of Genesis 1 answers these questions and clears up most of the modern-day cosmogony confusion. When viewed as a functional account of origins, the age of the earth, which tends to be at the heart of many concordist beliefs, is not an issue. There is no longer a need for the Judeo-Christian God to be a charlatan; He does not need to hide in the gaps of knowledge; yom can mean a literal 24-hour day; evolution is no longer a threat; and the universe can be 13.7 billion years old. A functional ontological view allows Genesis 1 to succeed in its intention, namely, communicating to the reader (particularly the original audience of ancient Hebrews) who God is and the nature of his relationship to mankind.


We must be careful not to come to Genesis 1 thinking of it as a modern metaphor just because the language or structure is strange to us today. Metaphor and functional origins are qualitatively different characterizations. There are instances of figurative language within the creation story, but this does not mean the story itself is metaphor, nor does it say anything about the meaning of Genesis 1. It is important to understand that this style of writing was the method of conveying truth in the ancient world. Today, we use an empirical method to convey truth; the Hebrews did not see truth in this way, and used the meaning of a story to convey a truth about the nature of the story’s subject. Many Native American tribes convey truth in a similar fashion. Chronological or historical matters are not of significance. Rather, what matters is what the story says about the subject’s character or its relationship to mankind.

Many instances of odd structuring or bizarre language likely occur because of the vast cultural differences. For example, the ordering of events in the creation account stems largely from the Hebrew use of block logic as opposed to the step logic to which we are accustomed. Similarly, early Hebrew writers emphasized theological points and were more concerned with the significance of events than they were with historical linearity. Historicity in Genesis would not have been an issue to the Israelites. Genesis was told as a story of functional ontology, expressing the importance of mankind’s place in relationship with the Creator. These differences in perspective and writing style do not make the story metaphorical or untrue; they simply express the perspectives of the ancient Hebrew people.

Given our cultural disposition toward empiricism, we must deliberate carefully before making assumptions about the meaning of Scripture. The reader must accept that the text was not written with their culture, including its notions about the nature of truth, in mind. The text was written in a manner that reflects the culture of its time, thus the culture must be “translated” alongside the text. The ancient Hebrew audience would have understood the message that was being communicated through the creation story. It was written as a testament to their God’s power and glory. It enlightens the reader on who their God is and where mankind stands in relation to him; mankind is on earth as the image-bearer of the divine. The mention of physical objects in Genesis 1 gives the story context within the ancient Israelite culture, in much the same way that objects are employed to this end by modern authors. Reading Genesis 1 as an account of material origins is simply missing the point. In turn, it causes the text to say something that was never meant to be communicated, and flies in the face of our current understanding of nature and cosmology. Christians today must approach Genesis 1 not as the material ontology that their modern sociocultural context has shaped them to see, but as a functional ontology that reflects the views of its original audience.

For more reading on this subject, check out the following from John Walton:

Ancient Near Eastern Thought and the Old Testament: Introducing the Conceptual World of the Hebrew Bible

The Lost World of Genesis One: Ancient Cosmology and the Origins Debate