Normal Human Occlusion
In 1967, a team of geneticists and anthropologists published an extensive study of a population of Brazilian hunter-gatherers called the Xavante (1). They made a large number of physical measurements, including measurements of the skull and jaws. Of 146 Xavante examined, 95% had "ideal" occlusion, while the 5% with malocclusion had nothing more than mild crowding of the incisors (front teeth). The authors wrote: Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition [tooth wear].
In the same paper, the authors present occlusion statistics for three other cultures. According to the papers they cite, the prevalence of malocclusion was 59% in Japan and 64% in the US (Utah). They also mention another native group near the Xavante, part of the Bakairi tribe, who lived at a government post and presumably ate processed food. The prevalence of malocclusion in this group was 45%.
In 1998, Dr. Brian Palmer (DDS) published a paper describing some of the collections of historical skulls he had examined over the years (2): ...I reviewed an additional twenty prehistoric skulls, some dated at 70,000 years old and stored in the Anthropology Department at the University of Kansas. Those skulls also exhibited positive [good] occlusions, minimal decay, broad hard palates, and "U-shaped" arches.
The final evaluations were of 370 skulls preserved at the Smithsonian Institution in Washington, D.C. The skulls were those of prehistoric North American plains Indians and more contemporary American skulls dating from the 1920s to 1940s. The prehistoric skulls exhibited the same features as mentioned above, whereas a significant destruction and collapse of the oral cavity were evident in the collection of the more recent skulls. Many of these more recent skulls revealed severe periodontal disease, malocclusions, missing teeth, and some dentures. This was not the case in the skulls from the prehistoric periods...
The arch is the part of the upper jaw inside the "U" formed by the teeth. Narrow dental arches are a characteristic feature of malocclusion-prone societies. The importance of arch development is something that I'll be coming back to repeatedly. Dr. Palmer's paper includes the following example of prehistoric (L) and modern (R) arches:
Dr. Palmer used an extreme example of a modern arch to illustrate his point; however, arches this narrow are not uncommon today. Milder forms of this narrowing affect the majority of the population in industrial nations.
In 1962, Dr. D.H. Goose published a study of 403 British skulls from four historical periods: Romano-British, Saxon, medieval and modern (3). He found that the arches of modern skulls were less broad than at any previous time in history. This followed an earlier study showing that modern British skulls had more frequent malocclusion than historical skulls (4). Goose stated that:
Although irregularities of the teeth can occur in earlier populations, for example in the Saxon skulls studied by Smyth (1934), the narrowing of the palate seems to have occurred in too short a period to be an evolutionary change. Hooton (1946) thinks it is a speeding up of an already long standing change under conditions of city life.
Dr. Robert Corruccini published several papers documenting narrowed arches in one generation of dietary change, or in genetically similar populations living rural or urban lifestyles (reviewed in reference #5). One was a study of Caucasians in Kentucky, in which a change from a traditional subsistence diet to modern industrial food habits accompanied a marked narrowing of arches and increase in malocclusion in one generation. Another study examined older and younger generations of Pima Native Americans, which again showed a reduction in arch width in one generation. A third compared rural and urban Indians living in the vicinity of Chandigarh, showing marked differences in arch breadth and the prevalence of malocclusion between the two genetically similar populations. Corruccini states:

In Chandigarh, processed food predominates, while in the country coarse millet and locally grown vegetables are staples. Raw sugar cane is widely chewed for enjoyment rurally [interestingly, the rural group had the lowest incidence of tooth decay], and in the country dental care is lacking, being replaced by chewing on acacia boughs which clean the teeth and are considered medicinal.
Dr. Weston Price came to the same conclusion examining prehistoric skulls from South America, Australia and New Zealand, as well as their living counterparts throughout the world that had adhered to traditional cultures and foodways. From Nutrition and Physical Degeneration: In a study of several hundred skulls taken from the burial mounds of southern Florida, the incidence of tooth decay was so low as to constitute an immunity of apparently one hundred per cent, since in several hundred skulls not a single tooth was found to have been attacked by tooth decay. Dental arch deformity and the typical change in facial form due to an inadequate nutrition were also completely absent, all dental arches having a form and interdental relationship [occlusion] such as to bring them into the classification of normal.
Price found that the modern descendants of this culture, eating processed food, suffered from malocclusion and narrow arches, while another group from the same culture living traditionally did not. Here's one of Dr. Price's images from Nutrition and Physical Degeneration (p. 212). This skull is from a prehistoric New Zealand Maori hunter-gatherer:
Note the well-formed third molars (wisdom teeth) in both of the prehistoric skulls I've posted. These people had ample room for them in their broad arches. Third molar crowding is a mild form of modern face/jaw deformity, and affects the majority of modern populations. It's the reason people have their wisdom teeth removed. Urban Nigerians in Lagos have 10 times more third molar crowding than rural Nigerians in the same state (10.7% of molars vs. 1.1%, reference #6).
Straight teeth and good occlusion are the human evolutionary norm. They're also accompanied by a wide dental arch and ample room for third molars in many traditionally-living cultures. The combination of narrow arches, malocclusion, third molar crowding, small or absent sinuses, and a characteristic underdevelopment of the middle third of the face is part of a developmental syndrome that predominantly afflicts industrially-living cultures.

(1) Am. J. Hum. Genet. 19(4):543, 1967. (free full text)
(2) J. Hum. Lact. 14(2):93, 1998.
(3) Arch. Oral Biol. 7:343, 1962.
(4) Brash, J.C. The Aetiology of Irregularity and Malocclusion of the Teeth. Dental Board of the United Kingdom, London, 1929.
(5) Am. J. Orthod. 86(5):419, 1984.
(6) Odonto-Stomatologie Tropicale 90:25. (free full text)
In his epic work Nutrition and Physical Degeneration, Dr. Weston Price documented the abnormal dental development and susceptibility to tooth decay that accompanied the adoption of modern foods in a number of different cultures throughout the world. He quantified changes in cavity prevalence (sometimes finding increases as large as 1,000-fold), but for the crooked teeth, narrow arches and "dished" faces these cultures developed as they modernized, all we have are Price's anecdotes.
Price published the first edition of his book in 1939. Fortunately, Nutrition and Physical Degeneration wasn't the last word on the matter. Anthropologists and archaeologists have been extending Price's findings throughout the 20th century. My favorite is Dr. Robert S. Corruccini, currently a professor of anthropology at Southern Illinois University. He published a landmark paper in 1984 titled "An Epidemiologic Transition in Dental Occlusion in World Populations" that will be our starting point for a discussion of how diet and lifestyle factors affect the development of the teeth, skull and jaw (Am J. Orthod. 86(5):419)*.
First, some background. The word occlusion refers to the manner in which the top and bottom sets of teeth come together, determined in part by the alignment between the upper jaw (maxilla) and lower jaw (mandible). There are three general categories:
- Class I occlusion: considered "ideal". The bottom incisors (front teeth) fit just behind the top incisors.
- Class II occlusion: "overbite." The bottom incisors are too far behind the top incisors. The mandible may appear small.
- Class III occlusion: "underbite." The bottom incisors are beyond the top incisors. The mandible protrudes.
Malocclusion means the teeth do not come together in a way that's considered ideal. The term "class I malocclusion" is sometimes used to describe crowded incisors when the jaws align properly.
Over the course of the next several posts, I'll give an overview of the extensive literature showing that hunter-gatherers past and present have excellent occlusion, subsistence agriculturalists generally have good occlusion, and the adoption of modern foodways directly causes the crooked teeth, narrow arches and/or crowded third molars (wisdom teeth) that affect the majority of people in industrialized nations. I believe this process also affects the development of the rest of the skull, including the face and sinuses.

In his 1984 paper, Dr. Corruccini reviewed data from a number of cultures whose occlusion has been studied in detail, most of which he had observed personally. He compared two sets of cultures: those that adhered to a traditional style of life and those that had adopted industrial foodways. For several of the cultures, he compared a group to a genetically similar counterpart: for example, the older generation of Pima Indians vs. the younger generation, and rural vs. urban Punjabis. He also included data from archaeological sites and nonhuman primates. Wild animals, including nonhuman primates, almost invariably show perfect occlusion.

The last graph in the paper is the most telling. He compiled all the occlusion data into a single number called the "treatment priority index" (TPI), which represents the overall need for orthodontic treatment. A TPI of 4 or greater indicates malocclusion (the cutoff point is subjective and depends somewhat on aesthetic considerations). Here's the graph:
Every single urban/industrial culture has an average TPI of greater than 4, while all the non-industrial or less industrial cultures have an average TPI below 4. This means that in industrial cultures, the average person requires orthodontic treatment to achieve good occlusion, whereas most people in more traditionally-living cultures naturally have good occlusion.
The best occlusion was in the New Britain sample, a precontact Melanesian hunter-gatherer group studied from archaeological remains. The next best occlusion was in the Libben and Dickson groups, who were early Native American agriculturalists. The Pima represent the older generation of Native Americans that was raised on a somewhat traditional agricultural diet, vs. the younger generation raised on processed reservation foods. The Chinese samples are immigrants and their descendants in Liverpool. The Punjabis represent urban vs. rural youths in Northern India. The Kentucky samples represent a traditionally-living Appalachian community, older generation vs. processed food-eating offspring. The "early black" and "black youths" samples represent older and younger generations of African-Americans in the Cleveland and St. Louis area. The "white parents/youths" sample represents different generations of American Caucasians.
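To make the TPI cutoff concrete, here's a minimal sketch of how one might flag group averages against the treatment threshold. The group names and values are hypothetical placeholders for illustration, not data from Corruccini's paper.

```python
# Classify group-average TPI scores against the malocclusion cutoff.
# An average TPI of 4 or greater indicates a need for orthodontic treatment.
TPI_CUTOFF = 4.0

# Hypothetical group means, loosely mirroring the traditional/industrial split.
group_mean_tpi = {
    "traditional (rural) sample": 2.1,
    "industrial (urban) sample": 5.3,
}

for group, tpi in group_mean_tpi.items():
    status = "malocclusion; treatment indicated" if tpi >= TPI_CUTOFF else "good occlusion"
    print(f"{group}: mean TPI = {tpi} -> {status}")
```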
The point is clear: there's something about industrialization that causes malocclusion. It's not genetic; it's a result of changes in diet and/or lifestyle. A "disease of civilization". I use that phrase loosely, because malocclusion isn't really a disease, and some cultures that qualify as civilizations retain traditional foodways and relatively good teeth. Nevertheless, it's a time-honored phrase that encompasses the wide array of health problems that occur when humans stray too far from their ecological niche. I'm going to let Dr. Corruccini wrap this post up for me:
I assert that these results serve to modify two widespread generalizations: that imperfect occlusion is not necessarily abnormal, and that prevalence of malocclusion is genetically controlled so that preventive therapy in the strict sense is not possible. Cross-cultural data dispel the notion that considerable occlusal variation [malocclusion] is inevitable or normal. Rather, it is an aberrancy of modern urbanized populations. Furthermore, the transition from predominantly good to predominantly bad occlusion repeatedly occurs within one or two generations' time in these (and other) populations, weakening arguments that explain high malocclusion prevalence genetically.
* This paper is worth reading if you get the chance. It should have been a seminal paper in the field of preventive orthodontics, which could have largely replaced conventional orthodontics by now. Dr. Corruccini is the clearest thinker on this subject I've encountered so far.
In April of 1982, archaeologists from around the globe converged on Plattsburgh, New York for a research symposium. Their goal:

...[to use] data from human skeletal analysis and paleopathology [the study of ancient diseases] to measure the impact on human health of the Neolithic Revolution and antecedent changes in prehistoric hunter-gatherer food economies. The symposium developed out of our perception that many widely debated theories about the origins of agriculture had testable but untested implications concerning human health and nutrition and our belief that recent advances in techniques of skeletal analysis, and the recent explosive increase in data available in this field, permitted valid tests of many of these propositions.
In other words, they got together to see what happened to human health as populations adopted agriculture. They were kind enough to publish the data presented at the symposium in the book Paleopathology at the Origins of Agriculture, edited by the erudite Drs. Mark Nathan Cohen and George J. Armelagos. It appears to be out of print, but luckily I have access to an excellent university library.

There are some major limitations to studying human health by looking at bones. The most obvious is that any soft tissue pathology will have been erased by time. Nevertheless, you can learn a lot from a skeleton. Here are the main health indicators discussed in the book:
- Mortality. Archaeologists are able to judge a person's approximate age at death, and if the number of skeletons is large enough, they can paint a rough picture of the life expectancy and infant mortality of a population.
- General growth. Total height, bone thickness, dental crowding, and pelvic and skull shape are all indicators of relative nutrition and health. This is particularly true in a genetically stable population. Pelvic depth is sensitive to nutrition and determines the size of the birth canal in women.
- Episodic stress. Bones and teeth carry markers of temporary "stress", most often due to starvation or malnutrition. Enamel hypoplasia, horizontal bands of thinned enamel on the teeth, is probably the most reliable marker. Harris lines, bands of increased density in long bones that may be caused by temporary growth arrest, are another type.
- Porotic hyperostosis and cribra orbitalia. These are both skull deformities caused by iron-deficiency anemia, and are rather creepy to look at. The anemia is typically caused by malnutrition, but can also result from parasites.
- Periosteal reactions. These are bone lesions resulting from infections.
- Physical trauma, such as fractures.
- Degenerative bone conditions, such as arthritis.
- Isotopes and trace elements. These can sometimes yield information about the nutritional status, diet composition and diet quality of populations.
- Dental pathology. My favorite! This category includes cavities, periodontal disease, missing teeth, abscesses, tooth wear, and excessive dental plaque.
The book presents data from 19 regions of the globe, representing Africa, Asia, the Middle East, Europe and South America, with a particular focus on North America. I'll kick things off with a fairly representative description of health in the upper Paleolithic in the Eastern Mediterranean. The term "Paleolithic" refers to the period from the invention of stone tools by hominids 2.5 million years ago, to the invention of agriculture roughly 10,000 years ago. The upper Paleolithic lasted from about 40,000 to 10,000 years ago. From page 59:

In Upper Paleolithic times nutritional health was excellent. The evidence consists of extremely tall stature from plentiful calories and protein (and some microevolutionary selection?); maximum skull base height from plentiful protein, vitamin D, and sunlight in early childhood; and very good teeth and large pelvic depth from adequate protein and vitamins in later childhood and adolescence...
Adult longevity, at 35 years for males and 30 years for females, implies fair to good general health...
There is no clear evidence for any endemic disease.
The level of skeletal development (including cranial and pelvic) that Paleolithic groups exhibited has remained unmatched throughout the history of agriculture. There may be exceptions, but the trend is clear: cranial capacity, for example, was 11% higher in the upper Paleolithic than it is today. You can see the pelvic data in this table taken from Paleopathology at the Origins of Agriculture.

There's so much information in this book, the best I can do is quote pieces of the editors' summary and add a few remarks of my own. One of the most interesting things I learned from the book is that the diet of many hunter-gatherer groups changed at the end of the upper Paleolithic, foreshadowing the shift to agriculture. From pages 566-568:

During the upper Paleolithic stage, subsistence seems focused on relatively easily available foods of high nutritional value, such as large herd animals and migratory fish. Some plant foods seem to have been eaten, but they appear not to have been quantitatively important in the diet. Storage of foods appears early in many sequences, even during the Paleolithic, apparently to save seasonal surpluses for consumption during seasons of low productivity.
As hunting and gathering economies evolve during the Mesolithic [period of transition between hunting/gathering and agriculture], subsistence is expanded by exploitation of increasing numbers of species and by increasingly heavy exploitation of the more abundant and productive plant species. The inclusion of significant amounts of plant food in prehistoric diets seems to correlate with increased use of food processing tools, apparently to improve their taste and digestibility. As [Dr. Mark Nathan] Cohen suggests, there is an increasing focus through time on a few starchy plants of high productivity and storability. This process of subsistence intensification occurs even in regions where native agriculture never developed. In California, for example, as hunting-gathering populations grew, subsistence changed from an early pattern of reliance on game and varied plant resources to one with increasing emphasis on collection of a few species of starchy seeds and nuts.
...As [Dr. Cohen] predicts, evolutionary change in prehistoric subsistence has moved in the direction of higher carrying capacity foods, not toward foods of higher-quality nutrition or greater reliability. Early nonagricultural diets appear to have been high in minerals, protein, vitamins, and trace nutrients, but relatively low in starch. In the development toward agriculture there is a growing emphasis on starchy, highly caloric food of high productivity and storability, changes that are not favorable to nutritional quality but that would have acted to increase carrying capacity, as Cohen's theory suggests.
Why am I getting the feeling that these archaeologists have a better grasp of human nutrition than the average medical doctor or nutritionist? They have the Price-esque understanding that comes from comparing the diets and multi-generational health of diverse human populations.

Another interesting thing I learned from the book is that Mesolithic populations, groups that were halfway between farming and hunting-gathering, were generally as healthy as hunter-gatherers:

...it seems clear that seasonal and periodic physiological stress regularly affected most prehistoric hunting-gathering populations, as evidenced by the presence of enamel hypoplasias and Harris lines. What also seems clear is that severe and chronic stress, with high frequency of hypoplasias, infectious disease lesions, pathologies related to iron-deficiency anemia, and high mortality rates, is not characteristic of these early populations. There is no evidence of frequent, severe malnutrition, so the diet must have been adequate in calories and other nutrients most of the time. During the Mesolithic, the proportion of starch in the diet rose, to judge from the increased occurrence of certain dental diseases [with exceptions to be noted later], but not enough to create an impoverished diet... There is a possible slight tendency for Paleolithic people to be healthier and taller than Mesolithic people, but there is no apparent trend toward increasing physiological stress during the mesolithic.
Cultures that adopted intensive agriculture typically showed a marked decline in health indicators. This is particularly true of dental health, which usually became quite poor.

Stress, however, does not seem to have become common and widespread until after the development of high degrees of sedentism, population density, and reliance on intensive agriculture. At this stage in all regions the incidence of physiological stress increases greatly, and average mortality rates increase appreciably. Most of these agricultural populations have high frequencies of porotic hyperostosis and cribra orbitalia, and there is a substantial increase in the number and severity of enamel hypoplasias and pathologies associated with infectious disease. Stature in many populations appears to have been considerably lower than would be expected if genetically-determined maxima had been reached, which suggests that the growth arrests documented by pathologies were causing stunting... Incidence of carbohydrate-related tooth disease increases, apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.
Infectious disease increased upon agricultural intensification:

Most [studies] conclude that infection was a more common and more serious problem for farmers than for their hunting and gathering forebears; and most suggest that this resulted from some combination of increasing sedentism, larger population aggregates, and the well-established synergism between infection and malnutrition.
There are some apparent exceptions to the trend of declining health with the adoption of intensive agriculture. In my observation, they fall into two general categories. In the first, health improves upon the transition to agriculture because the hunter-gatherer population was unhealthy to begin with, due to living in a marginal environment or eating a diet with a high proportion of wild plant seeds. In the second category, the culture adopted rice. Rice is associated with less of a decline in health, and in some cases an increase in overall health, than other grains such as wheat and corn. In chapter 21 of the book Ancient Health: Bioarchaeological Interpretations of the Human Past, Drs. Michelle T. Douglas and Michael Pietrusewsky state that "rice appears to be less cariogenic [cavity-promoting] than other grains such as maize [corn]."

One pathology that seems to have decreased with the adoption of agriculture is arthritis. The authors speculate that it may have more to do with strenuous activity than with other aspects of the lifestyle such as diet. Another interpretation is that the hunter-gatherers appeared to have a higher arthritis rate because of their longer lifespans:

The arthritis data are also complicated by the fact that the hunter-gatherers discussed commonly displayed higher average ages at death than did the farming populations from the same region. The hunter-gatherers would therefore be expected to display more arthritis as a function of age even if their workloads were comparable [to farmers].
In any case, it appears arthritis is normal for human beings and not a modern degenerative disease. And the final word:

Taken as a whole, these indicators fairly clearly suggest an overall decline in the quality-- and probably in the length-- of human life associated with the adoption of agriculture.
Vegetarians deserve our respect. They're usually thoughtful, conscientious people who make sacrifices for environmental and ethical reasons. I was vegetarian for a while myself, until I decided I could find ethical meat.

Vegetarianism and especially veganism can get pretty ideological sometimes. People who have strong beliefs like to think that their belief system is best for all aspects of their lives and the world, not just bits and pieces. Many vegetarians believe their way of eating is healthier than omnivory or carnivory. It's easy to believe, since mainstream nutrition research has a distinctly pro-vegetarian slant. One of the classic arguments for vegetarianism goes something like this: our closest living relatives, chimpanzees and bonobos, are mostly vegetarian, so that must be the diet to which we're adapted as well. Here's the problem with that argument:
Where are chimps (Pan troglodytes) on this chart? They aren't on it, for two related reasons: they aren't in the genus Homo, and they diverged from us about 5 million years ago. Homo erectus diverged from our lineage about 1.5 million years ago. I don't know if you've ever seen a Homo erectus skull, but 1.5 million years is clearly enough time to do some evolving. Homo erectus hunted and ate animals as a significant portion of its diet.

If you look at the chart above, Homo rhodesiensis (typically considered a variant of Homo heidelbergensis) is our closest ancestor, and our point of divergence with neanderthals (Homo neanderthalensis). Some archaeologists believe H. heidelbergensis was the same species as modern Homo sapiens. I haven't been able to find any direct evidence of the diet of H. heidelbergensis from bone isotope ratios, but the indirect evidence indicates that they were capable hunters who probably got a large proportion of their calories from meat. In Europe, they hunted now-extinct megafauna such as woolly rhinos. These things make modern cows look like chicken nuggets, and you can bet their fat was highly saturated.

H. heidelbergensis was a skilled hunter and very athletic. They were top predators in their ecosystems, judging by the fact that they took their time with carcasses, butchering them thoroughly and extracting marrow from bones. No predator or scavenger was capable of driving them away from a kill.

Our closest recent relative was Homo neanderthalensis, the neanderthal. They died out around 30,000 years ago. There have been several good studies on the isotope ratios of neanderthal bones, all indicating that neanderthals were basically carnivores. They relied on both land and marine animals, depending on what was available. Needless to say, neanderthals are much more closely related to humans than chimpanzees are, having diverged from us less than 500,000 years ago. That's less than one-tenth the time between humans and chimpanzees. I don't think this necessarily means humans are built to be carnivores, but it certainly blows away the argument that we're built to be vegetarians. It also argues against the idea that we're poorly adapted to eating animal fat.

Historical human hunter-gatherers had very diverse diets, but on average were meat-heavy omnivores. This fits well with the apparent diet of our ancestor H. heidelbergensis, except that we've killed off most of the megafauna, so modern hunter-gatherers have to eat frogs, bugs and seeds.
Europe once teemed with large mammals, including species of elephant, lion, tiger, bear, moose and bison.
America was also home to a number of huge and unusual animals: mammoths, dire wolves, lions, giant sloths and others.
The same goes for Australia, where giant kangaroos, huge wombats and marsupial 'lions' once roamed.
What do these extinctions have in common? They all occurred around the time humans arrived. The idea that humans caused them is hotly debated, because they also sometimes coincided with climatic and vegetation changes. However, I believe the fact that these extinctions occurred on several different continents at about the time humans arrived points to an anthropogenic explanation.
A recent archaeological study from the island of Tasmania, off the coast of Australia, supports the idea that humans were behind the Australian extinctions. Many large animals went extinct around the time humans arrived in Australia, but that time also coincided with a change in climate. What the new study shows is that the same large animals survived for another 5,000 years in Tasmania... until humans arrived there from the mainland. Then they promptly went extinct. That time period didn't correspond to a major climate change, so it's hard to explain it away.

It's a harsh reality that our big brains and remarkable adaptability give us the power to be exceptionally destructive to the environment. We're good at finding the most productive niches available, and exploiting them until they implode. Jared Diamond wrote an excellent book on the subject called Collapse, which details how nearly every major civilization collapse throughout history was caused at least in part by environmental damage. It's been a hallmark of human history since the beginning.
I don't think it will take much to convince you that the trend has accelerated in modern times. Ocean life, our major source of nutrient-rich wild food, has already been severely depleted. The current extinction rate is estimated to be over 1,000 times the baseline, pre-modern level, and rising.
Humans have always been top-level predators. We kill and eat nutrient-dense prey that is often much larger than we are. But today, the extinction of such walking meat lockers has caused us to eat down the food chain. We're turning to jellyfish and sea cucumbers and... gasp... lobsters!
While it's true that we've probably always eaten things like shellfish and insects, I find it disturbing that we've depleted the oceans to the point where we can no longer sustainably eat formerly abundant carnivorous fish like tuna. We need to make a concerted effort to preserve these species because extinction is permanent.
I don't want to live in a future where the only thing on the menu is bacteria patties, the other other other other white meat.
If paleolithic people were healthier than us thanks to their hunter-gatherer lifestyle, why did they have a shorter life expectancy than we do today? I was just reminded by Scott over at Modern Forager about some data on paleolithic (pre-agriculture) vs. neolithic (post-agriculture) life expectancy and growth characteristics. Here's a link to the table, which is derived from an article in the text Paleopathology at the Origins of Agriculture.

The reason the table is so interesting is that it allows us to ask the right question. Instead of "why did paleolithic people have a shorter life expectancy than we do today?", we should ask "how did the life expectancy of paleolithic people compare to that of pre-industrial neolithic people?" That's what will allow us to tease the effects of lifestyle apart from the effects of modern medicine.

The data come from age estimates of skeletons from various archaeological sites representing a variety of time periods in the Mediterranean region. Paleolithic skeletons indicated a life expectancy of 35.4 years for men and 30.0 years for women, figures that include a high rate of infant mortality. This is consistent with data from the Inuit that I posted a while back (life expectancy excluding infant mortality = 43.5 years). With modest fluctuations, the life expectancy of humans in this Mediterranean region remained similar from paleolithic times until the last century. I suspect the paleolithic people died most often from warfare, accidents and infectious disease, while the neolithic people died mostly from chronic disease and from infectious diseases that evolved along with the domestication of animals (zoonotic diseases). But I'm just speculating based on what I know about modern populations, so take that for what it's worth.

The most interesting part of the table is actually not the life expectancy data. It also contains numbers for average stature and pelvic inlet depth, which are both markers of nutritional status during development. Pelvic inlet depth is a measure of the size of the pelvic canal through which a baby passes during birth. It can be measured in men and women, but obviously its implications for birth only apply to women. As you can see in the table, stature and pelvic inlet depth declined quite a bit with the adoption of agriculture, and still have not returned to paleolithic levels to this day.

The idea that a grain-based diet interferes with normal skeletal development isn't new. It's well accepted in the field of archaeology that the adoption of grains coincided with a shortening of stature, thinner bones and crooked, cavity-ridden teeth. This is so well accepted that these sorts of skeletal changes are sometimes used as evidence that grains were adopted in a particular region historically. Weston Price saw similar changes in the populations he studied, as they transitioned from traditional diets to diets rich in white wheat flour, sweets and other processed foods.

The change in pelvic inlet depth is also very telling. Modern childbirth is so difficult, it makes you wonder why our bodies would have evolved to make it so drawn-out and lethal. Without the aid of modern medicine, many of the women who now get C-sections and other birth interventions would not make it. My feeling is that we didn't evolve to make childbirth so lethal; it's more difficult in modern times at least partially because we have a narrower pelvic inlet than our ancestors did.
Another thing Weston Price commented on was the relative ease of childbirth in many of the traditional societies he visited. Here's an excerpt from Nutrition and Physical Degeneration:

A similar impressive comment was made to me by Dr. Romig, the superintendent of the government hospital for Eskimos and Indians at Anchorage, Alaska. He stated that in his thirty-six years among the Eskimos, he had never been able to arrive in time to see a normal birth by a primitive Eskimo woman. But conditions have changed materially with the new generation of Eskimo girls, born after their parents began to use foods of modern civilization. Many of them are carried to his hospital after they had been in labor for several days. One Eskimo woman who had married twice, her last husband being a white man, reported to Dr. Romig and myself that she had given birth to twenty-six children and that several of them had been born during the night and that she had not bothered to waken her husband, but had introduced him to the new baby in the morning.
Now that's what I call fertility!
I recently read this book after discovering it on another health site. It's a compilation of chapters written by several researchers in the fields of comparative biology, paleontology, archaeology and zoology. It's sometimes used as a textbook.
I've learned some interesting things, but overall it was pretty disappointing. The format is disjointed, with no logical flow between chapters. I also would not call it comprehensive, which is one of the things I look for in a textbook. Here are some of the interesting points:
- Humans in industrial societies are the only mammals to commonly develop hypertension, and are the only free-living primates to become overweight.
- The adoption of grains as a primary source of calories correlated with a major decrease in stature, decrease in oral health, decrease in bone density, and other problems. This is true for wheat, rice, corn and other grains.
- Cranial capacity has also declined 11% since the late paleolithic, correlating with a decrease in the consumption of animal foods and an increase in grains.
- According to carbon isotope ratios of teeth, corn did not play a major role in the diet of native Americans until 800 AD. Over 15% of the teeth of post-corn South American cultures showed tooth decay, compared with less than 5% for pre-corn cultures (many of which were already agricultural, just not eating corn).
- Childhood mortality seems to be similar among hunter-gatherers and non-industrial agriculturists and pastoralists.
- Women may have played a key role in food procurement through foraging. This is illustrated by a group of modern hunter-gatherers called the Hadza. While men most often hunt, which supplies important nutrients intermittently, women provide a steady stream of calories by foraging for tubers.
- We have probably been eating starchy tubers for between 1.5 and 2 million years, which predates our species. Around that time, digging tools, (controversial) evidence of controlled fire, and changes in digestive anatomy all point to the use of tubers and cooked food in general. Tubers make sense because they are a source of calories that is much more easily exploited than wild grains in most places.
- Our trajectory as a species has been to consume a diet with more calories per unit fiber. As compared to chimps, who eat leaves and fruit all day and thus eat a lot of fiber to get enough calories, our species and its recent ancestors ate a diet much lower in fiber.
- Homo sapiens has always eaten meat.
The downside is that some chapters have a distinct low-fat slant. One chapter attempted to determine the optimal diet for humans by comparing ours to the diets of wild chimps and other primates. Of course, we eat more fat than a chimp, but I don't think that gets us anywhere. Especially since one of our closest relatives, the neanderthal, was practically a carnivore.
They consider the diet composition of modern hunter-gatherers that eat low-fat diets, but don't include data on others with high-fat diets like the Inuit.
There's some good information in the book, if you're willing to dig through a lot of esoteric data on the isotope ratios of extinct hominids and that sort of thing.
You've heard me say that I believe grains aren't an ideal food for humans. Part of the reason rests on the assertion that we have not been eating grains for long enough to have adapted to them. In this post, I'll go over what I know about the human diet before and after agriculture, and the timeline of our shift to a grain-based diet. I'm not an archaeologist so I won't claim that all these numbers are exact, but I think they are close enough to make my point.
As hunter-gatherers, we ate some combination of the following: land mammals (including organs, fat and marrow), cooked tubers, seafood (fish, mammals, shellfish, seaweed), eggs, nuts, fruit, honey, "vegetables" (stems, leaves, etc.), mushrooms, assorted land animals, birds and insects. The proportion of each food varied widely between groups and even seasons. This is pretty much what we've been living on since we evolved as a species, and even before, for a total of 1.5 million years or so (this number is controversial but is supported by multiple lines of evidence). There are minor exceptions, including the use of wild grains in a few areas, but for the most part, that's it.
The first evidence of a calorically important domesticated crop I'm aware of dates to about 11,500 years ago in the Fertile Crescent, where people were cultivating an early ancestor of wheat called emmer. Other grains popped up independently in what is now China (rice; ~10,000 years ago) and Central America (corn; ~9,000 years ago). That's why people say humans have been eating grains for about 10,000 years.
The story is more complicated than the dates suggest, however. Although wheat had its origin 11,500 years ago, it didn't become widespread in Western Europe for another 4,500 years. So if you're of European descent, your ancestors have been eating grains for roughly 7,000 years. Corn was domesticated 9,000 years ago, but according to the carbon ratios of human teeth, it didn't become a major source of calories until about 1,200 years ago! Many American groups did not adopt a grain-based diet until 100-300 years ago, and in a few cases they still have not. If you are of African descent, your ancestors have been eating grains for anywhere from 9,000 years to not at all, depending on your heritage. The change to grains was accompanied by a marked decrease in dental health that shows up clearly in the archaeological record.
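To put these spans side by side, here's a toy tabulation of the figures quoted above. The 25-year generation length is my own rough assumption for illustration; it doesn't come from any of the sources discussed here.

```python
# Rough grain-exposure timeline, in years before present, using the
# figures from the text. Generation length is an assumed round number.
GENERATION_YEARS = 25  # assumption for illustration only

grain_exposure = {
    "Fertile Crescent (emmer wheat)": 11_500,
    "Western Europe (wheat widespread)": 7_000,  # 11,500 minus a 4,500-year lag
    "China (rice)": 10_000,
    "Central America (corn domesticated)": 9_000,
    "Americas (corn as a staple)": 1_200,
}

for population, years in grain_exposure.items():
    generations = years // GENERATION_YEARS
    print(f"{population}: ~{years:,} years (~{generations:,} generations)")
```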
Practically every plant food contains some kind of toxin, but grains produce a number of nasty ones that humans are not well adapted to. Grains contain a large amount of phytic acid, for example, which strongly inhibits the absorption of a number of important minerals. Tubers, which were our main carbohydrate source for about 1.5 million years before agriculture, contain less of it. This may have been a major reason why stature decreased when humans adopted grain-based agriculture. There are also a number of toxins that occur in grains but not in tubers, such as certain heat-resistant lectins.
Non-industrial cultures often treated their seeds, including grains, differently than we do today. They used soaking, sprouting and long fermentation to decrease the amount of toxins found in grains, making them more nutritious and digestible. Most grain staples are not treated in this way today, and so we bear the brunt of their toxins even more than our ancestors did.
From an evolutionary standpoint, even 11,500 years is the blink of an eye. Add to that the fact that many people descend from groups that have been eating grains for far less time than that, and you begin to see the problem. There is no doubt that we have begun adapting genetically to grains. All you have to do to understand this is look back at the archaeological record, and see the severe selective pressure (read: disease) that grains placed on their early adopters. But the question is, have we had time to adapt sufficiently to make them a healthy food? I would argue the answer is no.
There are a few genetic adaptations I'm aware of that might pertain to grains: the duplication of the salivary amylase gene, and polymorphisms in the angiotensin-converting enzyme (ACE) and apolipoprotein B genes. Some groups carry extra copies of the gene encoding salivary amylase, increasing its production. Amylase breaks down starch, indicating a possible increase in its consumption. The problem is that we were getting starch from tubers before we got it from grains, so it doesn't really argue for either side in my opinion. The ACE and apolipoprotein B polymorphisms may be more pertinent, because they relate to blood pressure and LDL cholesterol. Blood pressure and blood cholesterol are both factors that respond well to low-carbohydrate (and thus low-grain) diets, suggesting that the polymorphisms may be a protective adaptation against the cardiovascular effects of grains.
The fact that up to 1% of people of European descent may have full-blown celiac disease attests to the fact that 7,000 years have not been enough time to fully adapt to wheat on a population level. Add to that the fact that nearly half of genetic Europeans carry genes that are associated with celiac, and you can see that we haven't been weeded out thoroughly enough to tolerate wheat, the oldest grain!
Based on my reading, discussions and observations, I believe that rice is the least problematic grain, wheat is the worst, and everything else is somewhere in between. If you want to eat grains, it's best to soak, sprout or ferment them. This activates enzymes that break down most of the toxins. You can soak rice, barley and other grains overnight before cooking them. Sourdough bread is better than normal white bread. Unfermented, unsprouted whole wheat bread may actually be the worst of all.