
Saturday, December 11, 2010

Dr. Mellanby's Tooth Decay Reversal Diet

I have a lot of admiration for Drs. Edward and May Mellanby. A husband-and-wife team, they discovered vitamin D, and determined that rickets is caused by poor calcium (or phosphorus) status, typically due to vitamin D deficiency. They believed that an ideal diet is omnivorous, based on whole foods, and offers an adequate supply of fat-soluble vitamins and easily absorbed minerals. They also felt that grain intake should be modest, as their research showed that unsoaked whole grains antagonize the effect of vitamins D and A.

Not only did the Mellanbys discover vitamin D and end the rickets epidemic that was devastating Western cities at the time, they also discovered a cure for early-stage tooth decay that has been gathering dust in medical libraries throughout the world since 1924.

It was in that year that Dr. May Mellanby published a summary of the results of the Mellanby tooth decay reversal studies in the British Medical Journal, titled "Remarks on the Influence of a Cereal-free Diet Rich in Vitamin D and Calcium on Dental Caries in Children". Last year, I had to specially request this article from the basement of the University of Washington medical library (1). Thanks to the magic of the internet, the full version of the paper is now freely available online (2).

You don't need my help to read the study, but in this post I offer a little background, a summary and my interpretation.

In previous studies, the Mellanbys used dogs to define the dietary factors that influence tooth development and repair. They identified three, which together made the difference between excellent and poor dental health (from Nutrition and Disease):
  1. The diet's mineral content, particularly calcium and phosphorus
  2. The diet's fat-soluble vitamin content, chiefly vitamin D
  3. The diet's content of inhibitors of mineral absorption, primarily phytic acid
Once they had defined these factors, they set about testing their hypotheses in humans. They performed eight trials, each in a group of institutionalized children whose diet could be completely controlled. The number of cavities in each child's mouth was noted at the beginning and end of the study period. I'll only discuss the three most informative trials, and only the most successful in detail.

I'll start with diet 1. Children on this diet ate the typical fare, plus extra oatmeal. Oatmeal is typically eaten as an unsoaked whole grain (and soaking it isn't very effective in any case), and so it is high in phytic acid, which effectively inhibits the absorption of a number of minerals including calcium. These children formed 5.8 cavities each and healed virtually none-- not good!

Diet number 2 was similar to diet 1, except there was no extra oatmeal and the children received a large supplemental dose of vitamin D. Over 28 weeks, only 1 cavity per child developed or worsened, while 3.9 healed. Thus, simply adding vitamin D to a reasonable diet allowed most of their cavities to heal.

Diet number 3 was the most effective. This was a grain-free diet plus supplemental vitamin D. Over 26 weeks, children in this group saw an average of only 0.4 cavities form or worsen, while 4.7 healed. The Mellanbys considered that they had essentially found a cure for this disorder in its early stages.
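The three diets are easiest to compare side by side. Here's a minimal sketch that tabulates the per-child averages quoted above and computes the net change (the dictionary layout is mine; the numbers are the figures from the post):

```python
# Average cavities per child in the three Mellanby trials discussed above.
diets = {
    "Diet 1 (extra oatmeal)":           {"formed_or_worsened": 5.8, "healed": 0.0},
    "Diet 2 (normal diet + vitamin D)": {"formed_or_worsened": 1.0, "healed": 3.9},
    "Diet 3 (grain-free + vitamin D)":  {"formed_or_worsened": 0.4, "healed": 4.7},
}

for name, d in diets.items():
    # Positive net = net decay; negative net = net healing.
    net = d["formed_or_worsened"] - d["healed"]
    print(f"{name}: net change {net:+.1f} cavities per child")
```

The sign of the net change is the whole story: diet 1 produced net decay, while diets 2 and 3 produced net healing, with the grain-free diet healing the most.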

What exactly was this diet? Here's how it was described in the paper (note: cereals = grains):
...instead of cereals- for example, bread, oatmeal, rice, and tapioca- an increased allowance of potatoes and other vegetables, milk, fat, meat, and eggs was given. The total sugar, jam, and syrup intake was the same as before. Vitamin D was present in abundance in either cod-liver oil or irradiated ergosterol, and in egg yolk, butter, milk, etc. The diet of these children was thus rich in those factors, especially vitamin D and calcium, which experimental evidence has shown to assist calcification, and was devoid of those factors- namely, cereals- which interfere with the process.
Carbohydrate intake was reduced by almost half. Bread and oatmeal were replaced by potatoes, milk, meat, fish, eggs, butter and vegetables. The diet is reminiscent of what Dr. Weston Price used to reverse tooth decay in his dental clinic in Cleveland, although Price's diet did include rolls made from freshly ground whole wheat. Price also identified the fat-soluble vitamin K2 MK-4 as another important factor in tooth decay reversal; it would have been abundant in the Mellanbys' studies due to the dairy. The Mellanbys and Price were contemporaries with parallel and complementary findings. The Mellanbys did not understand the role of vitamin K2 in mineral metabolism, and Price did not seem to appreciate the role of phytic acid from unsoaked whole grains in preventing mineral absorption.

Here are two sample meals provided in Dr. Mellanby's paper. I believe the word "dinner" refers to the noon meal, and "supper" refers to the evening meal:
Breakfast- Omelette, cocoa, with milk.
Lunch- Milk.
Dinner- Potatoes, steamed minced meat, carrots, stewed fruit, milk.
Tea- Fresh fruit salad, cocoa made with milk.
Supper- Fish and potatoes fried in dripping, milk.

Breakfast- Scrambled egg, milk, fresh salad.
Dinner- Irish stew, potatoes, cabbage, stewed fruit, milk.
Tea- Minced meat warmed with bovril, green salad, milk.
Supper- Thick potato soup made with milk.
In addition, children received vitamin D daily. Here's Dr. Mellanby's summary of their findings:
The tests do not indicate that in order to prevent dental caries children must live on a cereal-free diet, but in association with the results of the other investigations on animals and children they do indicate that the amount of cereal eaten should be reduced, particularly during infancy and in the earlier years of life, and should be replaced by an increased consumption of milk, eggs, butter, potatoes, and other vegetables. They also indicate that a sufficiency of vitamin D and calcium should be given from birth, and before birth, by supplying a suitable diet to the pregnant mother. The teeth of the children would be well formed and more resistant to dental caries instead of being hypoplastic and badly calcified, as were those in this investigation.
If I could add something to this program, I would recommend daily tooth brushing and flossing, avoiding sugar, and rinsing the mouth with water after each meal.

This diet is capable of reversing early stage tooth decay. It will not reverse advanced decay, which requires professional dental treatment as soon as possible. It is not a substitute for dental care in general, and if you try using diet to reverse your own tooth decay, please do it under the supervision of a dentist. And while you're there, tell her about Edward and May Mellanby!

Preventing Tooth Decay
Reversing Tooth Decay
Images of Tooth Decay Healing due to an Improved Diet
Dental Anecdotes

Tuesday, July 20, 2010

Real Food XI: Sourdough Buckwheat Crepes

Buckwheat was domesticated in Southeast Asia roughly 6,000 years ago. Due to its unusual tolerance of cool growing conditions, poor soils and high altitudes, it spread throughout the Northern latitudes of Eurasia, becoming the staple crop in many regions. It's used to a lesser extent in countries closer to the equator. It was also a staple in the Northeastern US until it was supplanted by wheat and corn.

Buckwheat isn't a grain: it's a 'pseudograin' that comes from a broad-leaved plant. As such, it's not related to wheat and contains no allergenic gluten. Like quinoa, it has some unusual properties that make it a particularly nutritious food. At about 16 percent protein by calories, it's higher in protein than most true grains. It also has an advantage over grains: its protein is complete, meaning it has a balance of essential amino acids similar to animal foods. Buckwheat is also an exceptional source of magnesium and copper, two important nutrients that may influence the risk of insulin resistance and cardiovascular disease (1, 2).

However, like all seeds (including grains and nuts), buckwheat is rich in phytic acid. Phytic acid complexes with certain minerals, preventing their absorption by the human digestive tract. This is one of the reasons why traditional cultures prepare their grains carefully (3). During soaking, and particularly during fermentation of raw batters, an enzyme called phytase goes to work breaking down the phytic acid. Not all seeds contain enough phytase to break down their phytic acid in a short period of time. Buckwheat contains a lot of phytase, and consequently fermented buckwheat batters contain very little phytic acid (4, 5). Buckwheat is also high in astringent tannins, but thorough soaking in a large volume of water removes them.

Buckwheat is fermented in a number of traditional cultures. In Bhutan, it's fermented to make flatbreads and alcoholic drinks (6). In Brittany (Bretagne; Northwestern France), sourdough buckwheat pancakes are traditional. Originally a poverty food, they are now considered a delicacy.

The following simple recipe is based on my own experimentation with buckwheat. It isn't traditional as far as I know, but it is based on traditional methods used to produce sourdough flatbreads in a number of cultures. I use the word 'crepe' to describe it, though I typically make something more akin to a savory pancake or uttapam. You can use the batter to make crepes if you wish, but this is not a recipe for traditional French buckwheat crepes.

It's important that the buckwheat be raw and whole for this recipe. Raw buckwheat is light green to light brown (as in the photo above). Kasha is toasted buckwheat, and will not substitute properly. It's also important that the water be dechlorinated and the salt non-iodized, as both will interfere with fermentation.

For a fermentation starter, you can use leftover batter from a previous batch (although it doesn't keep very long), or rice soaking water from this method (7).

Ingredients and Materials


  • 2-3 cups raw buckwheat groats
  • Dechlorinated water (filtered, boiled, or rested uncovered overnight)
  • Non-iodized salt (sea salt, pickling salt or kosher salt), 2/3 tsp per cup of buckwheat
  • Fermentation starter (optional), 2 tablespoons
  • Food processor or blender
Recipe
  1. Cover buckwheat with a large amount of dechlorinated water and soak for 9-24 hours. Raw buckwheat is astringent due to water-soluble tannins. Soaking in a large volume of water and giving it a stir from time to time will minimize this. The soaking water will also get slimy. This is normal.
  2. Pour off the soaking water and rinse the buckwheat thoroughly to get rid of the slime and residual tannins.
  3. Blend the buckwheat, salt, dechlorinated water and fermentation starter in a food processor or blender. Add enough water so that it reaches the consistency of pancake batter. The smoother you get the batter, the better the final product will be.
  4. Ferment for about 12 hours, a bit longer or shorter depending on the temperature and whether or not you used a starter. The batter may rise a little bit as the microorganisms get to work. The smell will mellow out. Refrigerate it after fermentation.
  5. In a greased or non-stick skillet, cook the batter at whatever thickness and temperature you prefer. I like to cook a thick 'pancake' with the lid on, at very low heat, so that it steams gently.
Dig in! Its mild flavor goes with almost anything. Batter will keep for about four days in the fridge.
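The quantities above scale linearly, so the whole recipe can be reduced to a couple of ratios. A minimal sketch (the helper name and the rounding are mine; only the salt ratio, the optional starter amount, and the soak/ferment times come from the recipe):

```python
def batter_plan(cups_buckwheat):
    """Scale the sourdough buckwheat batter described above.

    Only the salt ratio (2/3 tsp per cup of groats) and the optional
    2 tbsp of starter are fixed quantities from the recipe; the times
    are the approximate windows given in the steps.
    """
    return {
        "buckwheat_cups": cups_buckwheat,
        "salt_tsp": round(2 / 3 * cups_buckwheat, 2),  # 2/3 tsp per cup
        "starter_tbsp": 2,                             # optional, fixed amount
        "soak_hours": (9, 24),
        "ferment_hours": 12,                           # approximate
    }

print(batter_plan(3))  # 3 cups of groats -> 2.0 tsp salt
```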

Thanks to Christaface for the CC licensed photo (Flickr).

Wednesday, June 16, 2010

Low Micronutrient Intake may Contribute to Obesity

Lower Micronutrient Status in the Obese

Investigators have noted repeatedly that obese people have a lower blood concentration of a number of nutrients, including vitamin A, vitamin D, vitamin K, several B vitamins, zinc and iron (1). Although there is evidence that some of these may influence fat mass in animals, the evidence for a cause-and-effect relationship in humans is generally slim. There is quite a bit of indirect evidence that vitamin D status influences the risk of obesity (2), although a large, well-controlled study found that high-dose vitamin D3 supplementation does not cause fat loss in overweight and obese volunteers over the course of a year (3). It may still have a preventive effect, or require a longer timescale, but that remains to be determined.

Hot off the Presses

A new study in the journal Obesity, by Y. Li and colleagues, showed that compared to a placebo, a low-dose multivitamin caused obese volunteers to lose 7 lb (3.2 kg) of fat mass in 6 months, mostly from the abdominal region (4). The supplement also reduced LDL by 27%, increased HDL by a whopping 40% and increased resting energy expenditure. Here's what the supplement contained:

Vitamin A (containing natural mixed β-carotene) 5000 IU
Vitamin D 400 IU
Vitamin E 30 IU
Thiamin 1.5 mg
Riboflavin 1.7 mg
Vitamin B6 2 mg
Vitamin C 60 mg
Vitamin B12 6 mcg
Vitamin K1 25 mcg
Biotin 30 mcg
Folic acid 400 mcg
Nicotinamide 20 mg
Pantothenic acid 10 mg
Calcium 162 mg
Phosphorus 125 mg
Chloride 36.3 mg
Magnesium 100 mg
Iron 18 mg
Copper 2 mg
Zinc 15 mg
Manganese 2.5 mg
Iodine 150 mcg
Chromium 25 mcg
Molybdenum 25 mcg
Selenium 25 mcg
Nickel 5 mcg
Tin (stannum) 10 mcg
Silicon 10 mcg
Vanadium 10 mcg
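To put the reported numbers in concrete terms, here's a minimal sketch. The percentage changes (-27% LDL, +40% HDL) and the 7 lb fat loss come from the study as quoted above; the baseline lipid values are hypothetical, purely for illustration:

```python
# Percentage changes reported in the Li et al. trial discussed above.
LDL_CHANGE = -0.27   # -27%
HDL_CHANGE = +0.40   # +40%

# Hypothetical baseline lipids in mg/dL -- illustrative only, not from the study.
baseline_ldl = 130.0
baseline_hdl = 45.0

print(f"LDL: {baseline_ldl:.0f} -> {baseline_ldl * (1 + LDL_CHANGE):.1f} mg/dL")
print(f"HDL: {baseline_hdl:.0f} -> {baseline_hdl * (1 + HDL_CHANGE):.1f} mg/dL")

# Reported fat loss over 6 months: 7 lb, converted at 0.4536 kg/lb.
print(f"Fat loss: {7 * 0.4536:.1f} kg")  # ~3.2 kg, matching the figure quoted above
```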

Although the result needs to be repeated, if we take it at face value, it has some important implications:
  • The nutrient density of a diet may influence obesity risk, as I speculated in my recent audio interview and related posts (5, 6, 7, 8, 9).
  • Many nutrients act together to create health, and multiple insufficiencies may contribute to disease. This may be why single nutrient supplementation trials usually don't find much.
  • Another possibility is that obesity can result from a number of different nutrient insufficiencies, and the cause is different in different people. This study may have seen a large effect because it corrected many different insufficiencies.
  • This result, once again, kills the simplistic notion that body fat is determined exclusively by voluntary food consumption and exercise behaviors (sometimes called the "calories in, calories out" idea, or "gluttony and sloth"). In this case, a multivitamin was able to increase resting energy expenditure and cause fat loss without any voluntary changes in food intake or exercise, suggesting metabolic effects and a possible downward shift of the body fat "setpoint" due to improved nutrient status.
Practical Implications

Does this mean we should all take multivitamins to stay or become thin? No. There is no multivitamin that can match the completeness and balance of a nutrient-dense, whole food, omnivorous diet. Beef liver, leafy greens and sunlight are nature's vitamin pills. Avoiding refined foods instantly doubles the micronutrient content of the typical diet. Properly preparing whole grains by soaking and fermentation is equivalent to taking a multi-mineral along with conventionally prepared grains, as absorption of key minerals is increased by 50-300% (10). Or you can eat root vegetables instead of grains, and enjoy their naturally high mineral availability. Or both.
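A "50-300% increase" in absorption is easy to misread: it means 1.5 to 4 times the baseline, not 50-300% of the mineral absorbed. A minimal sketch (the 10% baseline absorption figure is a hypothetical illustration, not from the cited study):

```python
def absorbed(baseline_fraction, pct_increase):
    """Fraction of a mineral absorbed after a given percentage increase."""
    return baseline_fraction * (1 + pct_increase / 100)

# Hypothetical: 10% of a mineral absorbed from conventionally prepared grain.
base = 0.10
print(f"+50%:  {absorbed(base, 50):.0%} absorbed")   # 15%
print(f"+300%: {absorbed(base, 300):.0%} absorbed")  # 40%
```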

Tuesday, May 4, 2010

Traditional Preparation Methods Improve Grains' Nutritive Value

Soaking or Germinating Grains

The most basic method of preparing grains is prolonged soaking in water, followed by cooking. This combination reduces the level of water-soluble and heat-sensitive toxins and anti-nutrients such as tannins, saponins, digestive enzyme inhibitors and lectins, as well as flatulence factors. It also partially degrades phytic acid, which is a potent inhibitor of mineral absorption, an inhibitor of the digestive enzyme trypsin and an enemy of dental health (1). This improves the digestibility and nutritional value of grains as well as legumes.

I prefer to soak all grains and legumes for at least 12 hours in a warm location, preferably 24. This includes foods that most people don't soak, such as lentils. Soaking does not reduce phytic acid at all in grains that have been heat-treated, such as oats and kasha (technically not a grain), because they no longer contain the phytic acid-degrading enzyme phytase. Cooking without soaking first also does not have much effect on phytic acid.

The next level of grain preparation is germination. After soaking, rinse the grains twice per day for an additional day or two. This activates the grains' sprouting program and further increases their digestibility and vitamin content. When combined with cooking, it reduces phytic acid, although modestly. Therefore, most of the minerals in sprouted whole grains will continue to be inaccessible. Many raw sprouted grains and legumes are edible, but I wouldn't use them as a staple food because they retain most of their phytic acid as well as some heat-sensitive anti-nutrients (2).

Grinding and Fermenting Grains

Many cultures around the world have independently discovered fermentation as a way to greatly improve the digestibility and nutritive value of grains (3). Typically, grains are soaked, ground, and allowed to sour ferment for times ranging from 12 hours to several days. In some cases, a portion of the bran is removed before or after grinding.

In addition to the reduction in toxins and anti-nutrients afforded by soaking and cooking, grinding and fermentation goes much further. Grinding greatly increases the surface area of the grains and breaks up their cellular structure, releasing enzymes which are important for the transformation to come. Under the right conditions, which are easy to achieve, lactic acid bacteria rapidly acidify the batter. These bacteria are naturally present on grains, but adding a starter makes the process more efficient and reliable.

Due to some quirk of nature, grain phytase is maximally active at a pH between 4.5 and 5.5, which is mildly acidic. This is why the Weston A. Price Foundation recommends soaking grains in an acidic medium before cooking. The combination of grinding and sour fermentation allows grains to efficiently degrade their own phytic acid (as long as they haven't been heat treated first), making minerals much more available for absorption (4, 5, 6, 7). This transforms whole grains from a poor source of minerals into a good one.

The degree of phytic acid degradation depends on the starting amount of phytase in the grain. Corn, rice, oats and millet don't contain much phytase activity, so they require either a longer fermentation time, or the addition of high-phytase grains to the batter (8). Whole raw buckwheat, wheat, and particularly rye contain a large amount of phytase (9), although I feel wheat is problematic for other reasons.

As fermentation proceeds, bacteria secrete enzymes that begin digesting the protein, starch and other substances in the batter. Fermentation substantially reduces lectin levels, and cooking reduces them further (10). Lectins are toxins that can interfere with digestion and may be involved in autoimmune disease, an idea championed by Dr. Loren Cordain. Grain lectins are generally heat-sensitive, but one notable exception is the nasty lectin wheat germ agglutinin (WGA). As its name suggests, WGA is found in wheat germ, and thus is mostly absent from white flour. WGA may have been another reason why DART participants who increased their wheat fiber intake had significantly more heart attacks than those who didn't. I don't know whether fermentation degrades WGA.

One of the problems with grains is their poor protein quality. Besides containing a fairly low concentration of protein to begin with, they also lack a good balance of essential amino acids. This prevents their efficient use by the body unless a separate source of certain amino acids is eaten along with them. The main limiting amino acid in grains is lysine. Legumes are rich in lysine, which is why cultures around the world pair them with grains. Bacterial fermentation produces lysine, often increasing its concentration many-fold and making grain protein nearly "complete", i.e., containing the ideal balance of essential amino acids, as animal proteins do (11, scroll down to see graph). Not many plant foods can make that claim. Fermentation also increases the concentration of the amino acid methionine and certain vitamins.

Another problem with grain protein is that it's poorly digested relative to animal protein. This means that a portion of it escapes digestion, leading to a lower nutritive value and a higher risk of allergy due to undigested protein lingering in the digestive tract. Fermentation followed by cooking increases the digestibility of grain protein, bringing it nearly to the level of meat (12, 13, 14, 15). This may relate to the destruction of protease inhibitors (trypsin inhibitors, phytic acid) and the partial pre-digestion of grain proteins by bacteria.

Once you delve into the research on traditional grain preparation methods, you begin to see why grain-eating cultures throughout the world have favored certain techniques. Proper grain processing transforms them from toxic to nutritious, from health-degrading to health-giving. Modern industrial grain processing has largely eschewed these time-honored techniques, replacing them with low-extraction milling, extrusion and quick-rise yeast strains.

Many people will not be willing to go through the trouble of grinding and fermentation to prepare grains. I can sympathize, although if you have the right tools, once you establish a routine it really isn't that much work. It just requires a bit of organization. In fact, it can even be downright convenient. I often keep a bowl of fermented dosa or buckwheat batter in the fridge, ready to make a tasty "pancake" at a moment's notice. In the next post, I'll describe a few recipes from different parts of the world.

Further reading:

How to Eat Grains
A Few Thoughts on Minerals, Milling, Grains and Tubers
Dietary Fiber and Mineral Availability
A New Way to Soak Brown Rice

Wednesday, April 28, 2010

Grains as Food: an Update

Improperly Prepared Grain Fiber can be Harmful

Last year, I published a post on the Diet and Reinfarction trial (DART), a controlled trial that increased grain fiber intake using whole wheat bread and wheat bran supplements, and reported long-term health outcomes in people who had previously suffered a heart attack (1). The initial paper found a trend toward increased heart attacks and deaths in the grain fiber-supplemented group at two years, which was not statistically significant.

What I didn't know at the time is that a follow-up study has been published. After mathematically "adjusting" for preexisting conditions and medication use, the result reached statistical significance: people who increased their grain fiber intake had more heart attacks than people who didn't during the two years of the controlled trial. Overall mortality was higher as well, but that didn't reach statistical significance. You have to get past the abstract of the paper to realize this, but fortunately it's free access (2).

Here's a description of what not to eat if you're a Westerner with established heart disease:
Those randomised to fibre advice were encouraged to eat at least six slices of wholemeal bread per day, or an equivalent amount of cereal fibre from a mixture of wholemeal bread, high-fibre breakfast cereals and wheat bran.
Characteristics of Grain Fiber

The term 'fiber' can refer to many different things. Dietary fiber is simply defined as an edible substance that doesn't get digested by the human body. It doesn't even necessarily come from plants. If you eat a shrimp with the shell on, and the shell comes out the other end (which it will), it was fiber.

Grain fiber is a particular class of dietary fiber that has specific characteristics. It's mostly cellulose (like wood; although some grains are rich in soluble fiber as well), and it contains a number of defensive substances and storage molecules that make it more difficult to eat. These may include phytic acid, protease inhibitors, amylase inhibitors, lectins, tannins, saponins, and goitrogens (3). Grain fiber is also a rich source of vitamins and minerals, although the minerals are mostly inaccessible due to grains' high phytic acid content (4, 5, 6).

Every plant food (and some animal foods) has its own chemical defense strategy, and grains are no exception*. It's just that grains are particularly good at it, and they also happen to be one of our staple foods in the modern world. If you don't think grains are naturally inedible for humans, try eating a heaping bowlful of dry, raw whole wheat berries.

Human Ingenuity to the Rescue

Humans are clever creatures, and we've found ways to use grains as a food source, despite not being naturally adapted to eating them**. The most important is our ability to cook. Cooking deactivates many of the harmful substances found in grains and other plant foods. However, some are not deactivated by cooking. These require other strategies to remove or deactivate.

Healthy grain-based cultures don't prepare their grains haphazardly. Throughout the world, using a number of different grains, many have arrived at similar strategies for making grains edible and nutritious. The most common approach involves most or all of these steps:
  • Soaking
  • Grinding
  • Removing 50-75% of the bran
  • Sour fermentation
  • Cooking
But wait, didn't all healthy traditional cultures eat whole grains? The idea might make us feel warm and fuzzy inside, but it doesn't quite hit the mark. A recent conversation with Ramiel Nagel, author of the book Cure Tooth Decay, disabused me of that notion. He pointed out that in my favorite resource on grain preparation in traditional societies, the Food and Agriculture Organization publication Fermented Cereals: a Global Perspective, many of the recipes call for removing a portion of the bran (7). Some of these recipes probably haven't changed in thousands of years. It's my impression that some traditional cultures eat whole grains, while others eat them partially de-branned.

In the next post, I'll explain why these processing steps greatly improve the nutritional value of grains, and I'll describe recipes from around the world to illustrate the point.


* Including tubers. For example, sweet potatoes contain goitrogens, oxalic acid, and protease inhibitors. Potatoes contain toxic glycoalkaloids. Taro contains oxalic acid and protease inhibitors. Cassava contains highly toxic cyanogens. Some of these substances are deactivated by cooking, others are not. Each food has an associated preparation method that minimizes its toxic qualities. Potatoes are peeled, removing the majority of the glycoalkaloids. Cassava is grated and dried or fermented to inactivate cyanogens. Some cultures ferment taro.

** As opposed to mice, for example, which can survive on raw whole grains.

Thursday, April 15, 2010

Copper in Food

Sources of Copper

It isn't hard to get enough copper-- unless you live in an industrial nation. I've compiled a chart showing the copper content of various refined and unrefined foods to illustrate the point. The left side shows industrial staple foods, while the right side shows whole foods. I've incorporated a few that would have been typical of Polynesian and Melanesian cultures apparently free of cardiovascular disease. The serving sizes are what one might reasonably eat at a meal: roughly 200 calories for grains, tubers and whole coconut; 1/4 pound for animal products; 1/2 teaspoon for salt; 1 cup for raw kale; 1 oz for sugar.

Note that beef liver is off the chart at 488 percent of the USDA recommended daily allowance. I don't know if you'd want to sit down and eat a quarter pound of beef liver, but you get the picture. Beef liver is nature's multivitamin: hands down the Most Nutritious Food in the World. That's because it acts as a storage depot for a number of important micronutrients, as well as being a biochemical factory that requires a large amount of B vitamins to function. You can see that muscle tissue isn't a great source of copper compared to other organs, and this holds true for other micronutrients as well.
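To convert the chart's percentages back into milligrams, only one constant is needed. A minimal sketch; the 0.9 mg adult copper RDA is the current US figure and is my assumption about the reference value behind the chart's percentages:

```python
# Assumed adult RDA for copper, used to back out mg from the chart's % figures.
COPPER_RDA_MG = 0.9

def pct_rda_to_mg(pct):
    """Convert a percent-of-RDA figure to milligrams of copper."""
    return pct / 100 * COPPER_RDA_MG

# Beef liver at 488% of the RDA per quarter-pound serving (figure from the text):
print(f"{pct_rda_to_mg(488):.1f} mg copper per serving")  # ~4.4 mg
```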

Beef liver is so full of micronutrients, it shouldn't be eaten every day. Think of it in terms of the composition of a cow's body. The edible carcass is mostly muscle, but a significant portion is liver. I think it makes sense to eat some form of liver about once per week.

Modern Agriculture Produces Micronutrient-poor Foods

The numbers in the graph above come from NutritionData, my main source of food nutrient composition. The problem with relying on this kind of information is that it ignores the variability in micronutrient content due to plant strain, soil quality, et cetera.

The unfortunate fact is that micronutrient levels have declined substantially over the course of the 20th century, even in whole foods. Dr. Donald R. Davis has documented the substantial decline in copper and other micronutrients in American foods over the second half of the last century (1). An even more marked decrease has occurred in the UK (2), with similar trends worldwide. On average, the copper content of vegetables in the UK has declined 76 percent since 1940. Most of the decrease has taken place since 1978. Fruits are down 20 percent and meats are down 24 percent.

I find this extremely disturbing, as it will affect even people eating whole food diets. This is yet another reason to buy from artisanal producers, who are likely to use more traditional plant varieties and grow in richer soil. Grass-fed beef should be just as nutritious as it has always been. Some people may also wish to grow, hunt or fish their own food.

Tuesday, April 6, 2010

Copper and Cardiovascular Disease

In 1942, Dr. H. W. Bennetts dissected 21 cattle known to have died of "falling disease", the name given to the sudden, inexplicable death that struck herds of cattle in certain regions of Australia. Dr. Bennetts believed the disease was linked to copper deficiency. He found that 19 of the 21 cattle had abnormal hearts, showing atrophy and abnormal connective tissue infiltration (fibrosis) of the heart muscle (1).

In 1963, Dr. W. F. Coulson and colleagues found that 22 of 33 experimental copper-deficient pigs died of cardiovascular disease. 11 of 33 died of coronary heart disease, the quintessential modern human cardiovascular disease. Pigs on a severely copper-deficient diet showed weakened and ruptured arteries (aneurysms), while moderately deficient pigs "survived with scarred vessels but demonstrated a tendency toward premature atherosclerosis" including foam cell accumulation (2). Also in 1963, Dr. C. R. Ball and colleagues published a paper describing blood clots in the heart and coronary arteries, heart muscle degeneration, ventricular calcification and early death in mice fed a lard-rich diet (3).

This is where Dr. Leslie M. Klevay enters the story. Dr. Klevay suspected that Ball's mice had suffered from copper deficiency, and decided to test the hypothesis. He replicated Ball's experiment to the letter, using the same strain of mice and the same diet. Like Ball, he observed abnormal clotting in the heart, degeneration and enlargement of the heart muscle, and early death. He also showed by electrocardiogram that the hearts of the copper-deficient mice were often contracting abnormally (arrhythmia).

But then the coup de grace: he prevented these symptoms by supplementing the drinking water of a second group of mice with copper (4). In the words of Dr. Klevay: "copper was an antidote to fat intoxication" (5). I believe this was his tongue-in-cheek way of saying that the symptoms had been misdiagnosed by Ball as due to dietary fat, when in fact they were due to a lack of copper.

Since this time, a number of papers have been published on the relationship between copper intake and cardiovascular disease in animals, including several showing that copper supplementation prevents atherosclerosis in one of the most commonly used animal models of cardiovascular disease (6, 7, 8). Copper supplementation also corrects abnormal heart enlargement-- called hypertrophic cardiomyopathy-- and heart failure due to high blood pressure in mice (9).

For more than three decades, Dr. Klevay has been a champion of the copper deficiency theory of cardiovascular disease. According to him, copper deficiency is the only single intervention that has caused the full spectrum of human cardiovascular disease in animals, including:
  • Heart attacks (myocardial infarction)
  • Blood clots in the coronary arteries and heart
  • Fibrous atherosclerosis including smooth muscle proliferation
  • Unstable blood vessel plaque
  • Foam cell accumulation and fatty streaks
  • Calcification of heart tissues
  • Aneurysms (ruptured vessels)
  • Abnormal electrocardiograms
  • High cholesterol
  • High blood pressure
If this theory is so important, why have most people never heard of it? I believe there are at least three reasons. The first is that the emergence of the copper deficiency theory coincided with the rise of the diet-heart hypothesis, whereby saturated fat causes heart attacks by raising blood cholesterol. Bolstered by some encouraging findings and zealous personalities, this theory took the Western medical world by storm, for decades dominating all other theories in the medical literature and public health efforts. My opinions on the diet-heart hypothesis aside, the two theories are not mutually exclusive.

The second reason you may not have heard of the theory is due to a lab assay called copper-mediated LDL oxidation. Researchers take LDL particles (from blood, the same ones the doctor measures as part of a cholesterol test) and expose them to a high concentration of copper in a test tube. Free copper ions are oxidants, and the researchers then measure the amount of time it takes the LDL to oxidize. I find this assay tiresome, because studies have shown that the amount of time it takes copper to oxidize LDL in a test tube doesn't predict how much oxidized LDL you'll actually find in the bloodstream of the person you took the LDL from (10, 11).

In other words, it's an assay that has little bearing on real life. But researchers like it because for some odd reason, feeding a person saturated fat causes their LDL to be oxidized more rapidly by copper in a test tube, even though that's not the case in the actual bloodstream (12). Guess which result got emphasized?

The fact that copper is such an efficient oxidant has led some researchers to propose that copper oxidizes LDL in human blood, and therefore dietary copper may contribute to heart disease (oxidized LDL is a central player in heart disease-- read more here). The problem with this theory is that there are virtually zero free copper ions in human serum. Then there's the fact that supplementing humans with copper actually reduces the susceptibility of red blood cells to oxidation (by copper in a test tube, unfortunately), which is difficult to reconcile with the idea that dietary copper increases oxidative stress in the blood (13).

The third reason you may never have heard of the theory is more problematic. Several studies have found that a higher level of copper in the blood correlates with a higher risk of heart attack (14, 15). At this point, I could hang up my hat, and declare the animal experiments irrelevant to humans. But let's dig deeper.

Nutrient status is sometimes a slippery thing to measure. As it turns out, serum copper isn't a good marker of copper status. In a 4-month trial of copper depletion in humans, blood copper stayed stable, while the activity of copper-dependent enzymes in the blood declined (16). These include the important copper-dependent antioxidant, superoxide dismutase. As a side note, lysyl oxidase is another copper-dependent enzyme that cross-links the important structural proteins collagen and elastin in the artery wall, potentially explaining some of the vascular consequences of copper deficiency. Clotting factor VIII increased dramatically during copper depletion, perhaps predicting an increased tendency to clot. Even more troubling, three of the 12 women developed heart problems during the trial, which the authors felt was unusual:
We observed a significant increase over control values in the number of ventricular premature discharges (VPDs) in three women after 21, 63, and 91 d of consuming the low-copper diet; one was subsequently diagnosed as having a second-degree heart block.
In another human copper restriction trial, 11 weeks of modest copper restriction coincided with heart trouble in 4 out of 23 subjects, including one heart attack (17):
In the history of conducting numerous human studies at the Beltsville Human Nutrition Research Center involving participation by 337 subjects, there had previously been no instances of any health problem related to heart function. During the 11 wk of the present study in which the copper density of the diets fed the subjects was reduced from the pretest level of 0.57 mg/1000 kcal to 0.36 mg/1000 kcal, 4 out of 23 subjects were diagnosed as having heart-related abnormalities.
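To put those copper densities in daily terms, here's a quick back-of-the-envelope calculation. The 2,500 kcal/day figure is my own illustrative assumption, not a number from the study:

```python
# Copper densities (mg per 1000 kcal) reported in the Beltsville study
PRETEST = 0.57
RESTRICTED = 0.36

# Assumed daily energy intake -- illustrative only, not from the paper
KCAL_PER_DAY = 2500

pretest_mg = PRETEST * KCAL_PER_DAY / 1000       # ≈ 1.43 mg copper/day
restricted_mg = RESTRICTED * KCAL_PER_DAY / 1000  # = 0.90 mg copper/day

print(round(pretest_mg, 2), round(restricted_mg, 2))
```

At that energy intake, the "restricted" diet works out to roughly 0.9 mg of copper per day -- about the current adult RDA -- which gives a sense of how marginal the restriction really was.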
The other reason to be skeptical of the association between blood copper and heart attack risk is that inflammation increases copper in the blood (18, 19). Blood copper level correlates strongly with the marker of inflammation C-reactive protein (CRP) in humans, yet substantially increasing copper intake doesn't increase CRP (20, 21). This suggests that elevated blood copper is likely a symptom of inflammation, rather than its cause, and presents an explanation for the association between blood copper level and heart attack risk.

Only a few studies have looked at the relationship between more accurate markers of copper status and cardiovascular disease in humans. Leukocyte copper status, a marker of tissue status, is lower in people with cardiovascular disease (22, 23). People who die of heart attacks generally have less copper in their hearts than people who die of other causes, although this could be an effect rather than a cause of the heart attack (24). Overall, I find the human data lacking. I'd like to see more studies examining liver copper status in relation to cardiovascular disease, as the liver is the main storage organ for copper.

According to a 2001 study, the majority of Americans may have copper intakes below the USDA recommended daily allowance (25), many substantially so. This problem is exacerbated by the fact that copper levels in food have declined in industrial nations over the course of the 20th century, something I'll discuss in the next post.

Sunday, April 4, 2010

Magnesium and Vitamin D Metabolism

Ted Hutchinson posted a link in the comments section of my last post, pointing to a page on the Vitamin D Council's website where Dr. John Cannell discusses cofactors required for proper vitamin D metabolism. It's actually the site's home page, highlighting how important he feels this matter is. In this case, 'cofactor' simply means another nutrient that's required for the efficient production and use of vitamin D. They include:
  • Magnesium
  • Zinc
  • Vitamin K2
  • Vitamin A
  • Boron
And probably others we aren't yet aware of. On another page, Dr. Cannell links to two papers that review the critical interaction between magnesium status and vitamin D metabolism (1, 2). Here's a quote from the abstract of the second paper:
Magnesium... is essential for the normal function of the parathyroid glands, metabolism of vitamin D and adequate sensitivity of target tissues to [parathyroid hormone] and active vitamin D metabolites. Magnesium deficit is usually associated with hypoparathyroidism, low production of active vitamin D metabolites, in particular 1,25(OH)2 vitamin D3 and resistance to PTH and vitamin D. On the contrary, magnesium excess, similar to calcium, inhibits PTH secretion. Bone metabolism is impaired under positive as well as under negative magnesium balance.
Magnesium status is critical for normal vitamin D metabolism, insulin sensitivity, and overall health. Supplemental magnesium blocks atherosclerosis in multiple animal models (3, 4). Most Americans don't get enough magnesium (5).

The bottom line is that no nutrient acts in a vacuum. The effect of every part of one's diet and lifestyle is dependent on every other part. I often talk about single nutrients on this blog, but my core philosophy is that a proper diet focuses on Real Food, not nutrients. Tinkering with nutritional status using supplements is potentially problematic. Despite what some people might tell you, our understanding of nutrition and human health is currently rather crude-- so it's best to respect the accumulated wisdom of cultures that don't get the diseases we're trying to avoid.

Monday, February 22, 2010

Magnesium and Insulin Sensitivity

From a paper based on US NHANES nutrition and health survey data (1):
During 1999–2000, the diet of a large proportion of the U.S. population did not contain adequate magnesium... Furthermore, racial or ethnic differences in magnesium persist and may contribute to some health disparities.... Because magnesium intake is low among many people in the United States and inadequate magnesium status is associated with increased risk of acute and chronic conditions, an urgent need exists to perform a current survey to assess the physiologic status of magnesium in the U.S. population.
Magnesium is an essential mineral that's slowly disappearing from the modern diet, as industrial agriculture and industrial food processing increasingly dominate our food choices. One of the many things it's necessary for in mammals is proper insulin sensitivity and glucose control. A loss of glucose control due to insulin resistance can eventually lead to diabetes and all its complications.

Magnesium status is associated with insulin sensitivity (2, 3), and a low magnesium intake predicts the development of type II diabetes in most studies (4, 5) but not all (6). Magnesium supplements largely prevent diabetes in a rat model* (7). Interestingly, excess blood glucose and insulin themselves seem to reduce magnesium status, possibly creating a vicious cycle.

In a 1993 trial, a low-magnesium diet reduced insulin sensitivity in healthy volunteers by 25% in just four weeks (8). It also increased urinary thromboxane concentration, a potential concern for cardiovascular health**.

At least three trials have shown that magnesium supplementation increases insulin sensitivity in insulin-resistant diabetics and non-diabetics (9, 10, 11). In some cases, the results were remarkable. In type II diabetics, 16 weeks of magnesium supplementation improved fasting glucose, calculated insulin sensitivity and HbA1c*** (12). HbA1c dropped by 22 percent.

In insulin resistant volunteers with low blood magnesium, magnesium supplementation for four months reduced estimated insulin resistance by 43 percent and decreased fasting insulin by 32 percent (13). This suggests to me that magnesium deficiency was probably one of the main reasons they were insulin resistant in the first place. But the study had another very interesting finding: magnesium improved the subjects' blood lipid profile remarkably. Total cholesterol decreased, LDL decreased, HDL increased and triglycerides decreased by a whopping 39 percent. The same thing had been reported in the medical literature decades earlier when doctors used magnesium injections to treat heart disease, and also in animals treated with magnesium. Magnesium supplementation also suppresses atherosclerosis (thickening and hardening of the arteries) in animal models, a fact that I may discuss in more detail at some point (14, 15).

In the previous study, participants were given 2.5 g magnesium chloride (MgCl2) per day. That's a bit more than the USDA recommended daily allowance (MgCl2 is mostly chloride by weight), in addition to what they were already getting from their diet. Most of a person's magnesium is in their bones, so correcting a deficiency by eating a nutritious diet may take a while.
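For those who want to check the arithmetic, here's a sketch converting the MgCl2 dose to elemental magnesium. The molar masses are standard values; the comparison to the ~420 mg adult RDA is my addition, not from the study:

```python
# Molar masses in g/mol
MG = 24.305
CL = 35.453
MGCL2 = MG + 2 * CL          # ≈ 95.2 g/mol

mg_fraction = MG / MGCL2     # ≈ 0.255: MgCl2 is only ~25% magnesium by weight
dose_g = 2.5                 # daily MgCl2 dose used in the trial

elemental_mg = dose_g * mg_fraction * 1000  # ≈ 638 mg elemental magnesium
print(round(elemental_mg))
```

So the supplement supplied roughly 640 mg of elemental magnesium per day on top of dietary intake, consistent with the "a bit more than the RDA" figure above.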

Speaking of nutritious diets, how does one get magnesium? Good sources include halibut, leafy greens, chocolate and nuts. Bone broths are also an excellent source of highly absorbable magnesium. Whole grains and beans are also fairly good sources, while refined grains lack most of the magnesium in the whole grain. Organic foods, particularly artisanally produced foods from a farmer's market, are richer in magnesium because they grow on better soil and often use older varieties that are more nutritious.

The problem with seeds such as grains, beans and nuts is that they also contain phytic acid which prevents the absorption of magnesium and other minerals (16). Healthy non-industrial societies that relied on grains took great care in their preparation: they soaked them, often fermented them, and also frequently removed a portion of the bran before cooking (17). These steps all served to reduce the level of phytic acid and other anti-nutrients. I've posted a method for effectively reducing the amount of phytic acid in brown rice (18). Beans should ideally be soaked for 24 hours before cooking, preferably in warm water.

Industrial agriculture has systematically depleted our soil of many minerals, due to high-yield crop varieties and the fact that synthetic fertilizers only replace a few minerals. The mineral content of foods in the US, including magnesium, has dropped sharply in the last 50 years. The reason we need to use fertilizers in the first place is that we've broken the natural nutrient cycle in which minerals always return to the soil in the same place they were removed. In 21st century America, minerals are removed from the soil, pass through our toilets, and end up in the landfill or in waste water. This will continue until we find an acceptable way to return human feces and urine to agricultural soil, as many cultures do to this day****.

I believe that an adequate magnesium intake is critical for proper insulin sensitivity and overall health.


* Zucker rats that lack leptin signaling

** Thromboxane A2 is an omega-6 derived eicosanoid that potently constricts blood vessels and promotes blood clotting. It's interesting that magnesium has such a strong effect on it. It indicates that fatty acid balance is not the only major influence on eicosanoid production.

*** Glycated hemoglobin. A measure of the average blood glucose level over the past few weeks.

**** Anyone interested in further reading on this should look up The Humanure Handbook

Tuesday, February 9, 2010

Saturated Fat and Insulin Sensitivity

Insulin sensitivity is a measure of the tissue response to insulin. Typically, it refers to insulin's ability to cause tissues to absorb glucose from the blood. A loss of insulin sensitivity, also called insulin resistance, is a core part of the metabolic disorder that affects many people in industrial nations.

I don't know how many times I've seen the claim in journal articles and on the internet that saturated fat reduces insulin sensitivity. The idea is that saturated fat reduces the body's ability to handle glucose effectively, placing people on the road to diabetes, obesity and heart disease. Given the "selective citation disorder" that plagues the diet-health literature, perhaps this particular claim deserves a closer look.

The Evidence

I found a review article from 2008 that addressed this question (1). I like this review because it only includes high-quality trials that used reliable methods of determining insulin sensitivity*.

On to the meat of it. There were 5 studies in which non-diabetic people were fed diets rich in saturated fat, and compared with a group eating a diet rich in monounsaturated (like olive oil) or polyunsaturated (like corn oil) fat. They ranged in duration from one week to 3 months. Four of the five studies found that fat quality did not affect insulin sensitivity, including one of the 3-month studies.

The fifth study, which is the one that's nearly always cited in the diet-health literature, requires some discussion. This was the KANWU study (2). Over the course of three months, investigators fed 163 volunteers a diet rich in either saturated fat or monounsaturated fat.
The SAFA diet included butter and a table margarine containing a relatively high proportion of SAFAs. The MUFA diet included a spread and a margarine containing high proportions of oleic acid derived from high-oleic sunflower oil and negligible amounts of trans fatty acids and n-3 fatty acids and olive oil.
Yummy. After three months of these diets, there was no significant difference in insulin sensitivity between the saturated fat group and the monounsaturated fat group. Yes, you read that right. Even the study that's selectively cited as evidence that saturated fat causes insulin resistance found no significant difference between the diets. You might not get this by reading the misleading abstract. I'll be generous and acknowledge that the (small) difference was almost statistically significant (p = 0.053).

What the authors decided to focus on instead is the fact that insulin sensitivity declined slightly but significantly on the saturated fat diet compared with the pre-diet baseline. That's why this study is cited as evidence that saturated fat impairs insulin sensitivity. But anyone who has a basic science background will see where this reasoning is flawed (warning: nerd attack. skip the rest of the paragraph if you're not interested). You need a control group for comparison, to take into account normal fluctuations caused by such things as the season, eating mostly cafeteria food, and having a doctor hooking you up to machines. That control group was the group eating monounsaturated fat. The comparison between diet groups was the 'primary outcome', in statistics lingo. That's the comparison that matters, and it wasn't significant. To interpret the study otherwise is to ignore the basic conventions of statistics, which the authors were happy to do. There's a name for it: 'moving the goalpost'. The reviewers shouldn't have let this kind of shenanigans slide.
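To see why the between-group comparison is the one that matters, here's a toy example using invented numbers (these have nothing to do with the actual KANWU data): a within-group change from baseline can reach statistical significance even when the difference between diet groups does not.

```python
from math import sqrt
from statistics import mean, stdev

# Invented changes in insulin sensitivity (arbitrary units) for two diet groups
sfa_delta = [-2, -3, -1, -2, -4, -2, -3, -1]   # "saturated fat" group
mufa_delta = [-1, -2, 0, -1, -3, -1, -2, 0]    # "monounsaturated fat" group

def one_sample_t(xs):
    """t statistic for H0: mean change from baseline = 0 (within-group test)."""
    return mean(xs) / (stdev(xs) / sqrt(len(xs)))

def two_sample_t(xs, ys):
    """t statistic for H0: equal mean change (between-group test, pooled SD)."""
    nx, ny = len(xs), len(ys)
    sp2 = ((nx - 1) * stdev(xs) ** 2 + (ny - 1) * stdev(ys) ** 2) / (nx + ny - 2)
    return (mean(xs) - mean(ys)) / sqrt(sp2 * (1 / nx + 1 / ny))

t_within = one_sample_t(sfa_delta)               # ≈ -6.1, df = 7
t_between = two_sample_t(sfa_delta, mufa_delta)  # ≈ -1.9, df = 14

# Two-sided critical t at alpha = 0.05: 2.365 (df = 7), 2.145 (df = 14)
print(abs(t_within) > 2.365, abs(t_between) < 2.145)
```

Here the saturated fat group "declines significantly from baseline", yet the primary between-group comparison falls short of significance -- exactly the pattern that allowed the KANWU result to be spun.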

So we have five studies through 2008, none of which support the idea that saturated fat reduces insulin sensitivity in non-diabetics. Since the review paper was published, I know of one subsequent study that asked the same question (3). Susan J. van Dijk and colleagues fed volunteers with abdominal overweight (beer gut) a diet rich in either saturated fat or monounsaturated fat. I e-mailed the senior author and she said the saturated fat diet was "mostly butter". The specific fats used in the diets weren't mentioned anywhere in the paper, which is a major omission**. In any case, after 8 weeks, insulin sensitivity was virtually identical between the two groups. This study appeared well controlled and used the gold standard method for assessing insulin sensitivity, called the euglycemic-hyperinsulinemic clamp technique***.

The evidence from controlled trials is rather consistent that saturated fat has no appreciable effect on insulin sensitivity.

Why Are We so Focused on Saturated Fat?

Answer: because it's the nutrient everyone loves to hate. As an exercise in completeness, I'm going to mention three dietary factors that actually reduce insulin sensitivity, and get a lot less air time than saturated fat.

#1: Caffeine. That's right, controlled trials show that your favorite murky beverage reduces insulin sensitivity (4, 5). Is it actually relevant to real life? I doubt it. The doses used were large and the studies short-term.

#2: Magnesium deficiency. A low-magnesium diet reduced insulin sensitivity by 25% over the course of three weeks (6). I think this is probably relevant to long-term insulin sensitivity and overall health, although it would be good to have longer-term data. Magnesium deficiency is widespread in industrial nations, due to our over-reliance on refined foods such as sugar, white flour and oils.

#3: Sugar. Fructose reduces insulin sensitivity in humans, along with many other harmful effects (7).

As long as we continue to focus our energy on indicting saturated fat, it will continue distracting us from the real causes of disease.


* For the nerds: euglycemic-hyperinsulinemic clamp (the gold standard), insulin suppression test, or intravenous glucose tolerance test with Minimal Model. They didn't include studies that reported HOMA as their only measure, because it's not very accurate.

** There's this idea that pervades the diet-health literature that all saturated fats are roughly equivalent, all monounsaturated fats are equivalent, etc., therefore it doesn't matter what the source was. This is beyond absurd and reflects our cultural obsession with saturated fat. It really irks me that the reviewers didn't demand this information.

*** They did find that markers of inflammation in fat tissue were higher after the saturated fat diet.

Saturday, May 2, 2009

Iodine

I recently saw a post on Dr. Davis's Heart Scan Blog that reminded me I intended to write about iodine. Iodine is an essential trace mineral. It's required for the formation of the thyroid hormones T3 and T4. The amount of thyroid hormones in circulation, and the body's sensitivity to them, strongly influences metabolic rate. Iodine deficiency can lead to weight gain and low energy. In more severe cases, it can produce goiter, an enlargement of the thyroid gland.

Iodine deficiency is also the most common cause of preventable mental retardation worldwide. Iodine is required for the development of the nervous system, and also concentrates in a number of other tissues including the eyes, the salivary glands and the mammary glands.

There's a trend in the alternative health community to use unrefined sea salt rather than refined iodized salt. Personally, I use unrefined sea salt on principle, although I'm not convinced refined iodized salt is a problem. But the switch removes the main source of iodine in most peoples' diets, creating the potential for deficiency in some areas. Most notably, the soil in the midwestern United States is poor in iodine and deficiency was common before the introduction of iodized salt.

The natural solution? Sea vegetables. They're rich in iodine, other trace minerals, and flavor. I like to add a 2-inch strip of kombu to my beans. Kombu is a type of kelp. It adds minerals, and is commonly thought to speed the cooking and improve the digestion of beans and grains.

Dulse is a type of sea vegetable that's traditionally North American. It has a salty, savory flavor and a delicate texture. It's great in soups or by itself as a snack.

And then there's wakame, which is delicious in miso soup. Iodine is volatile so freshness matters. Store sea vegetables in a sealed container. It may be possible to overdo iodine, so it's best to eat sea vegetables regularly but in moderation like the Japanese.

Seafood such as fish and shellfish are rich in iodine, especially if fish heads are used to make soup stock. Dairy is a decent source in areas that have sufficient iodine in the soil.

Cod liver oil is another good source of iodine, or at least it was before the advent of modern refining techniques. I don't know if refined cod liver oil contains iodine. I suspect that fermented cod liver oil is still a good source of iodine because it isn't refined.

Saturday, April 4, 2009

A New Way to Soak Brown Rice

I've been looking for a way to prepare whole brown rice that increases its mineral availability without changing its texture. I've been re-reading some of the papers I've accumulated on grain processing and mineral availability, and I've found a simple way to do it.

In the 2008 paper "Effects of soaking, germination and fermentation on phytic acid, total and in vitro soluble zinc in brown rice", Dr. Robert J. Hamer's group found that soaking alone didn't have much of an effect on phytic acid in brown rice. However, fermentation was highly effective at degrading it. What I didn't realize the first time I read the paper is that they fermented intact brown rice rather than grinding it. This wasn't clear from the description in the methods section, but I confirmed it by e-mail with the lead author, Dr. Jianfen Liang. He added that the procedure comes from a traditional Chinese recipe for rice noodles. The method they used is very simple:
  1. Soak brown rice in dechlorinated water for 24 hours at room temperature without changing the water. Reserve 10% of the soaking liquid (should keep for a long time in the fridge). Discard the rest of the soaking liquid; cook the rice in fresh water.
  2. The next time you make brown rice, use the same procedure as above, but add the soaking liquid you reserved from the last batch to the rest of the soaking water.
  3. Repeat the cycle. The process will gradually improve until 96% or more of the phytic acid is degraded at 24 hours.
This process probably depends on two factors: fermentation acidifies the soaking medium, which activates the phytase (phytic acid-degrading enzyme) already present in the rice; and it also cultivates microorganisms that produce their own phytase. I would guess the latter factor is the more important one, because brown rice doesn't contain much phytase.
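As a purely illustrative sketch of the idea, the backslopping cycle can be modeled as each batch closing part of the gap toward full degradation. The starting fraction and improvement rate below are invented assumptions; only the ~96% endpoint comes from the paper:

```python
# Hypothetical model of the backslopping cycle. Reusing 10% of the previous
# soaking liquid enriches phytase-producing microbes, so each 24-hour soak
# degrades more phytic acid than the last. Parameters are made up for
# illustration; only the ~96% ceiling comes from the paper.
def backslop(cycles, start=0.10, ceiling=0.96, rate=0.5):
    """Return the fraction of phytic acid degraded after each batch."""
    degraded = [start]
    for _ in range(cycles):
        gap = ceiling - degraded[-1]
        degraded.append(degraded[-1] + rate * gap)  # close part of the gap
    return degraded

print([round(d, 2) for d in backslop(6)])
```

Whatever the true kinetics, the qualitative point is the same: degradation improves with each successive batch and approaches the ~96% figure reported in the study.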

You can probably use the same liquid to soak other grains and beans.

Wednesday, April 1, 2009

Reversing Tooth Decay

In the last post, I discussed the research of Drs. Edward and May Mellanby on the nutritional factors affecting tooth formation. Dr. Mellanby is the man who discovered vitamin D and identified the cause of rickets. Nutrition has a profound effect on tooth structure, and well-formed teeth are inherently resistant to decay. But is there anything you can do if your teeth are already formed?

Teeth are able to heal themselves. That's how traditional cultures such as the Inuit can wear their teeth down to the pulp due to chewing leather and sand-covered dried fish, yet still have an exceptionally low rate of tooth decay. It's also how the African Wakamba tribe can file their front teeth into sharp points without causing decay. Both cultures lost their resistance to tooth decay after adopting nutrient-poor Western foods such as white flour and sugar.

Teeth are made of four layers. Enamel is the hardest, most mineralized outer shell. Dentin is another protective mineralized layer that's below the enamel. Below the dentin is the pulp, which contains blood vessels and nerves. The roots are coated with cementum, another mineralized tissue.

When enamel is poorly formed and the diet isn't adequate, enamel dissolves and decay sets in. Tooth decay is an opportunistic infection that takes advantage of poorly built or maintained teeth. If the diet remains inadequate, the tooth has to be filled or removed, or the person risks more serious complications.

Fortunately, a decaying or broken tooth has the ability to heal itself. Pulp contains cells called odontoblasts, which form new dentin if the diet is good. Here's what Dr. Edward Mellanby had to say about his wife's research on the subject. This is taken from Nutrition and Disease:
Since the days of John Hunter it has been known that when the enamel and dentine are injured by attrition or caries, teeth do not remain passive but respond to the injury by producing a reaction of the odontoblasts in the dental pulp in an area generally corresponding to the damaged tissue and resulting in a laying down of what is known as secondary dentine. In 1922 M. Mellanby proceeded to investigate this phenomenon under varying nutritional conditions and found that she could control the secondary dentine laid down in the teeth of animals as a reaction to attrition both in quality and quantity, independently of the original structure of the tooth. Thus, when a diet of high calcifying qualities, i.e., one rich in vitamin D, calcium and phosphorus was given to the dogs during the period of attrition, the new secondary dentine laid down was abundant and well formed whether the original structure of the teeth was good or bad. On the other hand, a diet rich in cereals and poor in vitamin D resulted in the production of secondary dentine either small in amount or poorly calcified, and this happened even if the primary dentine was well formed.
Thus, in dogs, the factors that affect tooth healing are the same factors that affect tooth development:
  1. The mineral content of the diet, particularly calcium and phosphorus
  2. The fat-soluble vitamin content of the diet, chiefly vitamin D
  3. The availability of minerals for absorption, determined largely by the diet's phytic acid content (prevents mineral absorption)
What about humans? Drs. Mellanby set out to see if they could use their dietary principles to cure tooth decay that was already established. They divided 62 children with cavities into three different diet groups for 6 months. Group 1 ate their normal diet plus oatmeal (rich in phytic acid). Group 2 ate their normal diet plus vitamin D. Group 3 ate a grain-free diet and took vitamin D.

In group 1, oatmeal prevented healing and encouraged new cavities, presumably due to its ability to prevent mineral absorption. In group 2, simply adding vitamin D to the diet caused most cavities to heal and fewer to form. The most striking effect was in group 3, the group eating a grain-free diet plus vitamin D, in which nearly all cavities healed and very few new cavities developed. Grains are the main source of phytic acid in the modern diet, although we can't rule out the possibility that grains were promoting tooth decay through another mechanism as well.

Dr. Mellanby was quick to point out that diet 3 contained some carbohydrate (~45% reduction) and was not low in sugar: "Although [diet 3] contained no bread, porridge or other cereals, it included a moderate amount of carbohydrates, for plenty of milk, jam, sugar, potatoes and vegetables were eaten by this group of children." This study was published in the British Medical Journal (1) and the British Dental Journal. Here's Dr. Edward Mellanby again:
The hardening of carious areas that takes place in the teeth of children fed on diets of high calcifying value indicates the arrest of the active process and may result in “healing” of the infected area. As might be surmised, this phenomenon is accompanied by a laying down of a thick barrier of well-formed secondary dentine... Summing up these results it will be clear that the clinical deductions made on the basis of the animal experiments have been justified, and that it is now known how to diminish the spread of caries and even to stop the active carious process in many affected teeth.
Dr. Mellanby first began publishing studies showing the reversal of cavities in humans in 1924. Why has such a major medical finding, published in high-impact peer-reviewed journals, faded into obscurity?

Dr. Weston Price also had success curing tooth decay using a similar diet. He fed underprivileged children one very nutritious meal a day and monitored their dental health. From Nutrition and Physical Degeneration (p. 290):
About four ounces of tomato juice or orange juice and a teaspoonful of a mixture of equal parts of a very high vitamin natural cod liver oil and an especially high vitamin butter was given at the beginning of the meal. They then received a bowl containing approximately a pint of a very rich vegetable and meat stew, made largely from bone marrow and fine cuts of tender meat: the meat was usually broiled separately to retain its juice and then chopped very fine and added to the bone marrow meat soup which always contained finely chopped vegetables and plenty of very yellow carrots; for the next course they had cooked fruit, with very little sweetening, and rolls made from freshly ground whole wheat, which were spread with the high-vitamin butter. The wheat for the rolls was ground fresh every day in a motor driven coffee mill. Each child was also given two glasses of fresh whole milk. The menu was varied from day to day by substituting for the meat stew, fish chowder or organs of animals.
Dr. Price provides before and after X-rays showing re-calcification of cavity-ridden teeth on this program. His intervention was not exactly the same as the Mellanbys', but it was similar in many ways. Both diets were high in minerals, rich in fat-soluble vitamins (including D), and low in phytic acid.

Price's diet was not grain-free, but used rolls made from freshly ground whole wheat. Freshly ground whole wheat has high phytase (the enzyme that degrades phytic acid) activity; in conjunction with the long yeast rises common in Price's time, it would have broken down nearly all of its own phytic acid. This would have made it a source of minerals rather than a sink for them. He also used high-vitamin pastured butter in conjunction with cod liver oil. We now know that the vitamin K2 in pastured butter is important for bone and tooth development and maintenance. This was something that Dr. Mellanby did not understand at the time, but modern science has corroborated Price's finding that K2 is synergistic with vitamin D in promoting skeletal and dental health.

If I were to design the ultimate dietary program to heal cavities that incorporates the successes of both doctors, it would look something like this:
  • Rich in animal foods, particularly full-fat pastured dairy products (if tolerated). Also meat, organs, fish, bone broths and eggs.
  • Fermented grains only; no unfermented grains such as oatmeal, breakfast cereal, crackers, etc. No breads except true sourdough (ingredients should not list lactic acid). Or even better, no grains at all.
  • Limited nuts; beans in moderation, only if they're soaked overnight or longer in warm water (due to the phytic acid).
  • Starchy vegetables such as potatoes and sweet potatoes.
  • A limited quantity of fruit (one piece per day or less), but no refined sweets.
  • Cooked and raw vegetables.
  • Sunlight, high-vitamin cod liver oil or vitamin D3 supplements.
  • A generous amount of pastured butter.
  • No industrially processed food.
This diet would maximize mineral absorption while providing abundant fat-soluble vitamins. It probably isn't necessary to follow it strictly. For example, if you eat more mineral-rich foods such as dairy and bone broths, you can probably get away with more phytic acid. Or you might be able to heal cavities eating like this for only one or two meals a day, as Dr. Price demonstrated.

Saturday, March 28, 2009

Preventing Tooth Decay

Meet Sir Edward Mellanby, the discoverer of vitamin D. Along with his wife, Dr. May Mellanby, he identified dietary factors that control the formation and repair of teeth and bones. He also identified the cause of rickets (vitamin D deficiency) and the effect of phytic acid on mineral absorption. Truly a great man! This research began in the 1910s and continued through the 1940s.

What he discovered about tooth and bone formation is profound, disarmingly simple and largely forgotten. I remember going to the dentist as a child. He told me I had good teeth. I informed him that I tried to eat well and stay away from sweets. He explained to me that I had good teeth because of genetics, not my diet. I was skeptical at the time, but now I realize just how ignorant that man was.

Tooth structure is determined during growth. Well-formed teeth are highly resistant to decay while poorly-formed teeth are cavity-prone. Drs. Mellanby demonstrated this by showing a strong correlation between tooth enamel defects and cavities in British children. The following graph is drawn from several studies he compiled in the book Nutrition and Disease (1934). "Hypoplastic" refers to enamel that's poorly formed on a microscopic level.
The graph is confusing, so don't worry if you're having a hard time interpreting it. If you look at the blue bar representing children with well-formed teeth, you can see that 77% of them have no cavities, and only 7.5% have severe cavities (a "3" on the X axis). Looking at the green bar, only 6% of children with the worst enamel structure are without cavities, while 74% have severe cavities. Enamel structure is VERY strongly related to cavity prevalence.

What determines enamel structure during growth? Drs. Mellanby identified three dominant factors:
  1. The mineral content of the diet
  2. The fat-soluble vitamin content of the diet, chiefly vitamin D
  3. The availability of minerals for absorption, determined largely by the diet's phytic acid content
Teeth and bones are a mineralized protein scaffold. Vitamin D influences the quality of the protein scaffold that's laid down. For the scaffold to mineralize, the diet has to contain enough minerals, primarily calcium and phosphorus. Vitamin D allows the digestive system to absorb the minerals, but it can only absorb them if they aren't bound by phytic acid. Phytic acid is an anti-nutrient found primarily in unfermented seeds such as grains. So the process depends on getting minerals (sufficient minerals in the diet and low phytic acid) and putting them in the right place (fat-soluble vitamins).

Optimal tooth and bone formation occurs only on a diet that is rich in minerals and fat-soluble vitamins, and low in phytic acid. Drs. Mellanby used dogs in their experiments, which it turns out are a good model for tooth formation in humans for a reason I'll explain later. From Nutrition and Disease:
Thus, if growing puppies are given a limited amount of separated [skim] milk together with cereals, lean meat, orange juice, and yeast (i.e., a diet containing sufficient energy value and also sufficient proteins, carbohydrates, vitamins B and C, and salts), defectively formed teeth will result. If some rich source of vitamin D be added, such as cod-liver oil or egg-yolk, the structure of the teeth will be greatly improved, while the addition of oils such as olive... leaves the teeth as badly formed as when the basal diet only is given... If, when the vitamin D intake is deficient, the cereal part of the diet is increased, or if wheat germ [high in phytic acid] replaces white flour, or, again, if oatmeal [high in phytic acid] is substituted for white flour, then the teeth tend to be worse in structure, but if, under these conditions, the calcium intake is increased, then calcification [the deposition of calcium in the teeth] is improved.
Other researchers initially disputed the Mellanbys' results because they weren't able to replicate the findings in rats. It turns out, rats produce the phytic acid-degrading enzyme phytase in their small intestine, so they can extract minerals from unfermented grains better than dogs. Humans also produce phytase, but at levels so low they don't significantly degrade phytic acid. The small intestine of rats has about 30 times the phytase activity of the human small intestine, again demonstrating that humans are not well adapted to eating grains. Our ability to extract minerals from seeds is comparable to that of dogs, which shows that the Mellanbys' results are applicable to humans.

Drs. Mellanby found that the same three factors determine bone quality in dogs as well, which I may discuss in another post.

Is there anything someone with fully formed enamel can do to prevent tooth decay? Drs. Mellanby showed (in humans this time) that not only can tooth decay be prevented by a good diet, it can be almost completely reversed even if it's already present. Dr. Weston Price used a similar method to reverse tooth decay as well. I'll discuss that in my next post.

Thursday, February 26, 2009

Dietary Fiber and Mineral Availability

Mainstream health authorities are constantly telling us to eat more fiber for health, particularly whole grains, fruit and vegetables. Yet the only clinical trial that has ever isolated the effect of eating a high-fiber diet on overall risk of death, the Diet and Reinfarction Trial, came up with this graph:



Oops! How embarrassing. At two years, the group that doubled its fiber intake had a 27% greater chance of dying and a 23% greater chance of having a heart attack. The extra fiber was coming from whole grains. I should say, out of fairness, that the result wasn't quite statistically significant (p slightly greater than 0.05) at two years. But at the very least, this doesn't support the idea that increasing fiber will extend your life. I believe this is the only diet trial that has ever looked at fiber and mortality without also changing other variables at the same time.

Why might fiber be problematic? I read a paper recently that gave a pretty convincing answer to that question: "Dietary Fibre and Mineral Bioavailability", by Dr. Barbara F. Harland. By definition, fiber is indigestible. We can divide it into two categories: soluble and insoluble. Insoluble fiber is mostly cellulose and it's relatively inert, besides getting fermented a bit by the gut flora. Soluble fiber is anything that can be dissolved in water but not digested by the human digestive tract. It includes a variety of molecules, some of which are quite effective at keeping you from absorbing minerals. Chief among these is phytic acid, with smaller contributions from tannins (polyphenols) and oxalates. The paper makes a strong case that phytic acid is the main reason fiber prevents mineral absorption, rather than the insoluble fiber fraction. This notion was confirmed here.

As a little side note, polyphenols are those wonderful plant antioxidants that are one of the main justifications for the supposed health benefits of vegetables, tea, chocolate, fruits and antioxidant supplements. The problem is, many of them are actually anti-nutrients. They reduce mineral absorption, reduce growth and feed efficiency in a number of species, and the antioxidant effect seen in human plasma after eating them is due largely to our own bodies secreting uric acid into the blood (a defense mechanism?), rather than the polyphenols themselves. The main antioxidants in plasma are uric acid, vitamin C and vitamin E, with almost no direct contribution from polyphenols. I'm open to the idea that some polyphenols could be beneficial if someone can show me convincing data, but in any case they are not the panacea they're made out to be. Thanks to Peter for cluing me in on this.

Whole grains would be a good source of water-soluble vitamins and minerals, if it weren't for their very high phytic acid content. Even though whole grains are full of minerals, replacing refined grains with whole grains in the diet (and especially adding extra bran) actually reduces the overall absorption of a number of minerals (free text, check out table 4). This has been confirmed repeatedly for iron, zinc, calcium, magnesium and phosphorus. That could well account for the increased mortality in the DART trial.

Refining grains gets rid of the vitamins and minerals, but at least refined grains don't prevent you from absorbing the minerals in the rest of your food. Here's a comparison of a few of the nutrients in one cup of cooked brown vs. unenriched white rice (218 vs. 242 calories):

Brown rice would be quite nutritious if we could absorb all those minerals. There are a few ways to increase mineral absorption from whole grains. One way is to soak them in slightly acidic, warm water, which allows their own phytase enzyme to break down phytic acid. This doesn't seem to do much for brown rice, which doesn't contain much phytase.

A more effective method is to grind grains and soak them before cooking, which helps the phytase function more effectively, especially in gluten grains and buckwheat. The most effective method by far, and the method of choice among healthy traditional cultures around the world, is to soak, grind and ferment whole grains. This breaks down nearly all the phytic acid, making whole grains a good source of both minerals and vitamins.

The paper "Dietary Fibre and Mineral Bioavailability" listed another method of increasing mineral absorption from whole grains that I wasn't aware of. Certain foods can increase the absorption of minerals from whole grains high in phytic acid. These include: foods rich in vitamin C such as fruit or potatoes; meat including fish; and dairy.

Another point the paper made was that the phytic acid content of vegetarian diets is often very high, potentially leading to mineral deficiencies. The typical modern vegetarian diet containing brown rice and unfermented soy products is very high in phytic acid and thus very low in absorbable minerals. The more your diet depends on plant sources for minerals, the more careful you have to be about how you prepare your food.