An Ancestral Perspective on Vitamin D Status, Part 1: Problems With the “Naked Ape” Hypothesis of Optimal Serum 25(OH)D

December 18, 2013
by Chris Masterjohn


It has become increasingly common for health enthusiasts and now even typical patients in the doctor’s office to have a lab estimate their vitamin D status by measuring serum 25-hydroxyvitamin D, abbreviated 25(OH)D, and use vitamin D supplements to bring this value into the “sufficient” range. While even this basic practice is problematic, the extraordinary influence of what I like to call the “naked ape hypothesis of optimal serum 25(OH)D” has created a paradigm wherein “deficiency” and “insufficiency” are together seen as the norm, and 25(OH)D levels that may actually be quite dangerous — especially when not adequately balanced by other nutrients in the diet — are specifically sought out as a panacea.

Indeed, this paradigm heavily influenced some of my early articles on the fat-soluble vitamins, such as my 2006 article “From Seafood to Sunshine: A New Understanding of Vitamin D Safety,” where I contributed original ideas about the interactions between the fat-soluble vitamins, but essentially took the popular ideas advocating high levels of serum 25(OH)D for granted. I now believe my rush to embrace the latter paradigm was quite hasty and that it deserves the same critical attention that I have put into my ideas about vitamin interactions.

In this series, I will argue that we should emphasize serum 25(OH)D much less and emphasize the nutrient density and nutrient balance of the overall diet and lifestyle much more. I will also offer some practical guidelines for interpreting 25(OH)D in a more nuanced and sophisticated way.

I’d like to begin by pointing out some not-so-obvious and yet profound problems with the “naked ape hypothesis of serum 25(OH)D.” For simplicity, I may refer to this hypothesis simply as the “naked ape hypothesis” through the remainder of this post.

If you’re anxious for the bottom line, you can watch the video summary. If you’d like to head straight into the meat of the post, you can find it below.

The “Naked Ape Hypothesis” Defined

The naked ape hypothesis essentially holds that the requirement for a certain 25(OH)D concentration circulating in serum was fixed into our genome during some critical prehistoric window wherein humans exposed themselves to maximal levels of sunshine, that this level can be estimated by examining humans with maximal sun exposure today, that we should presume this level to be safe and optimal, and that the burden of evidence lies on anyone suggesting otherwise.

Dr. Reinhold Vieth has most commonly and popularly articulated the hypothesis using the phrase “naked ape” to describe our prehistoric ancestors. Here is a quote, for example, from a book chapter he co-authored with Dr. Gloria Sidhom in 2010 (1):

During the evolution of our species, requirements for vitamin D were satisfied by the life of the naked ape in the environment for which its genome was optimized through natural selection. The horn of Africa was the original, natural environment for the species, Homo sapiens. Our genome, our physiology, and hence our vitamin D requirement are not thought to have changed in the past 100,000 years. . . . Since early human evolution occurred under UV-rich conditions, typical 25(OH)D concentrations were surely higher than 100 nmol/L [40 ng/mL]. Levels like this are now seen in lifeguards, farmers, or people who sun tan. . . . Since our genome was selected under these conditions through evolution, it should be evident that our biology was optimized for a vitamin D supply that is far higher than we currently regard as normal.

Dr. Vieth articulated the same hypothesis in an incredibly popular and influential 1999 article (2) wherein he provided specific data supporting the high serum 25(OH)D levels referred to in the quote above. As of this writing, Google Scholar estimates that this paper has been cited a whopping 1,169 times. When I made similar points in my 2013 Ancestral Health Symposium presentation in Atlanta this past August, Google Scholar estimated it had been cited 1,126 times, suggesting it may have been cited in 43 additional papers just in the last four months. In the paper, Vieth articulated the hypothesis in a similar way to that quoted above:

Many arguments favoring higher intakes of calcium and other nutrients have been based on evidence about the diets of prehistoric humans. Likewise, the circulating 25-hydroxyvitamin D concentrations of early humans were surely far higher than what is now regarded as normal. Humans evolved as naked apes in tropical Africa. The full body surface of our ancestors was exposed to the sun almost daily. . . . Serum 25(OH)D concentrations of people living or working in sun-rich environments are summarized in Table 1.

And here is the data from Table 1, with the 25(OH)D values converted from nmol/L to ng/mL to make them comparable to the values Americans get back from clinical labs (my apologies to my international readership for this bias; click to enlarge):

The data from Table 1 of Vieth's 1999 paper.
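
Since readers will run into both units in the literature, it may help to see the conversion spelled out. The factor follows from the molar mass of 25(OH)D (roughly 400.6 g/mol), which works out to about 2.5 nmol/L per ng/mL. Here is a minimal sketch in Python; the function names are mine, purely for illustration:

```python
# Minimal sketch: converting serum 25(OH)D between nmol/L and ng/mL.
# The molar mass of 25-hydroxyvitamin D3 is ~400.6 g/mol, so
# 1 ng/mL is ~2.496 nmol/L; in practice a rounded factor of 2.5 is used.

NMOL_PER_NG = 2.5  # conventional rounded conversion factor

def nmol_to_ng(nmol_per_l: float) -> float:
    """Convert a 25(OH)D value from nmol/L to ng/mL."""
    return nmol_per_l / NMOL_PER_NG

def ng_to_nmol(ng_per_ml: float) -> float:
    """Convert a 25(OH)D value from ng/mL to nmol/L."""
    return ng_per_ml * NMOL_PER_NG

# Examples matching values cited in this post:
print(nmol_to_ng(100))  # 40.0 ng/mL, the threshold Vieth cites
print(nmol_to_ng(148))  # 59.2 ng/mL, the Israeli lifeguard value from the 1999 paper
```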

While Dr. Vieth often refers to prehistoric humans as “naked apes,” others promote the same hypothesis without necessarily invoking that specific phrase. In a preliminary criticism of the 2010 vitamin D recommendations from the Institute of Medicine, Dr. Robert Heaney articulated the hypothesis as follows (3):

. . . I believe that the presumption of adequacy should rest with vitamin D intakes needed to achieve the serum 25(OH)D values (i.e., 40–60 ng/mL) that prevailed during the evolution of human physiology. Correspondingly, the burden of proof should fall on those maintaining that there is no preventable disease or dysfunction at lower levels. The [Institute of Medicine] has not met that standard.

Dr. Heaney later co-authored a paper with Dr. Michael Holick, “Why the IOM Recommendations Are Deficient,” wherein they reiterated the naked ape hypothesis, this time ramping the upper end of the range from 60 up to 80 ng/mL, and insisting with a more elaborate argument that the burden of proof lies on anyone who disagrees (4):

Beyond [the other errors and inconsistencies Heaney and Holick argued exist in the Institute of Medicine report], though, serious as they are, lies a much deeper flaw in the approach taken by the panel, exemplified by a quote from one of the panel members to the New York Times at the time of release of the report [reference]. The statement was simply that the “onus” (ie, burden of proof) fell on anyone who claimed benefits for intakes higher than the panel’s current recommendations. This is an approach that is correct for drugs, which are foreign chemicals and which do carry an appropriately heavy requirement for proof. For drugs, the position of privilege is given to the placebo. And in the current IOM report, the privilege is given to a serum 25(OH)D level that is effectively the status quo. We judge that this is exactly backward for nutrients. The privilege instead must be given to the intake that prevailed during the evolution of human physiology, the intake to which, presumably, that physiology is fine-tuned. So far as can be judged from numerous studies documenting the magnitude of the effect of sun exposure [references] the primitive intake would have been at least 4000 IU/day and probably two to three times that level, with corresponding serum 25(OH)D levels ranging from 40 to 80 ng/mL. The fact that primitive levels would have been higher than current IOM recommendations does not, of course, prove their necessity today. But such intakes should be given the presumption of correctness, and the burden of proof must be placed on those who propose that lower intakes (and lower serum levels) are without risk of preventable dysfunction or disease. The IOM, in its report, has utterly failed to recognize or meet that standard.

While these arguments seem compelling on the surface, they rely on deeply problematic assumptions: first, that humans evolved as “naked apes” whose skin would be exposed to maximal sunshine; second, that we can estimate the 25(OH)D levels they would have achieved by examining modern populations exposed to maximal amounts of sun, such as lifeguards in Israel and Missouri; and third, that our requirement for a certain concentration of 25(OH)D circulating in our serum was somehow indelibly fixed into our genome “back when we evolved” in the era of the naked ape.

To elaborate on these assumptions graphically, the naked ape hypothesis assumes that there was some critical window between the loss of fur and the gain of clothing or other aspects of modern, sun-aversive lifestyles, and that this window constitutes the era of the naked ape (click to enlarge):

The critical window between the loss of fur and the gain of clothing.

During this critical window, moreover, our requirement for a precise concentration of 25(OH)D circulating in our serum was indelibly fixed into our genome, never to be altered again (after all, we had already evolved), and this very concentration can be determined quite easily by testing its value in modern lifeguards (click to enlarge):

The figure depicts the practice of determining our optimal and safe values for serum 25(OH)D by using modern lifeguards as a proxy for prehistoric "naked apes."

Let’s take a closer look at some of these assumptions.

Were We Ever “Naked Apes”?

First, were we ever truly “naked apes”? In a recent paper (5), a group of scientists published their estimation of when humans started wearing clothing based on when head lice diverged from clothing lice. They made the reasonable assumption that clothing must have been in widespread use prior to this divergence. While I’m skeptical of these types of “molecular clock” estimates, and while I think certainty about these matters will always remain elusive, we need to use what evidence we have if we’re going to base a scientific hypothesis on our speculations about this era of prehistory, especially when those speculations are used as the basis for clinical recommendations about vitamin supplementation. The authors of the lice paper provided this useful timeline of evidence for the human use of clothing (click to enlarge):

The chart shows a timeline of evidence for the human use of clothing.

I modified the timeline by adding current estimates for the gain of dark skin pigmentation (as reviewed in reference 6).

Archeological evidence for complex, tailored clothing dates back 40,000 years, whereas such evidence for hide scrapers dates back 780,000 years. While the hide scrapers were not necessarily used for clothing, they were likely used to make some type of shelter, which would have expanded the opportunities for shade-seeking behavior beyond those naturally available in the absence of human ingenuity. Molecular estimates for the divergence of head and clothing lice suggest clothing was in widespread use by at least 170,000 years ago. Altogether, these data suggest clothing was in use for at least most of the history of anatomically modern humans, and perhaps for most of human history.

Moreover, scientists estimate that both the loss of fur and the gain of dark skin pigmentation date back to about 1.2 million years ago, while light skin didn’t even begin evolving in Northern Europeans and East Asians until some 30,000 years ago (7), long after the evidence for widespread use of clothing. It thus becomes clear that we were never truly “naked apes” at all. Taken at face value, the available evidence instead suggests that at roughly the same time we lost the protection from the sun afforded by fur we gained the protection afforded by dark skin, and that we began clothing ourselves in more innovative ways long before we began losing the protection of dark skin.

Additionally, a 2002 review by Lawrence Barham pointed out that archaeological evidence for the use of colored pigments in Africa dates back 270,000 years and that the use of these pigments has remained a persistent presence in the archaeological, historical, and ethnographic record from that time through the present (8). While we don’t know what humans were doing with these pigments a quarter million years ago, we know that nowadays people cover their skin with them for ritual purposes, offering at least intermittently an additional source of protection from the sun. This certainly could have been a feature of prehistoric human society.

Beyond this, isn’t it likely that our ancestors used botanical sunscreens? Weston Price, for example, reported that it was universal traditional practice in the Pacific Islands to use coconut oil as a sunscreen (9):

In several [Pacific] islands regulatory measures had been adopted requiring the covering of the body. This regulation had greatly reduced the primitive practice of coating the surface of the body with coconut oil, which had the effect of absorbing the ultra-violet rays thus preventing injury from the tropical sun. This coating of oil enabled them to shed the rain which was frequently torrential though of short duration.

We know that the polyphenols from coconut oil absorb ultraviolet light in the UV-B spectrum, the set of wavelengths responsible for producing vitamin D (10), so we can reasonably expect this practice to have decreased the synthesis of vitamin D in the skin. Plants in general, moreover, produce compounds that protect against UV-B exposure. In fact, the more plants are exposed to UV-B, the more of these compounds they produce (11). This would suggest that humans have always had a rich array of UV-B-protective botanical sunscreens at their fingertips, and that as climate change produced varying levels of UV-B exposure, botanical sunscreens would become proportionally more or less effective at affording such protection, perhaps helping humans reach an equilibrium with their environment by regulating their UV-B exposure as needed.

Thus, our prehistoric ancestors were unlikely to ever have been “naked apes” because they seem to always have been clothed by at least dark skin, and to have used animal hides, clothing, colored pigments, and perhaps botanical sunscreens throughout huge portions of their history, some of these things being used, according to current estimates, long before light skin ever developed.

Do Modern Lifeguards Provide Proxies for Prehistoric 25(OH)D?

Naked apes or not, humans certainly lived in prehistoric times. To what extent can modern lifeguards or any other group provide a proxy for the 25(OH)D levels of prehistoric humans?

If there is any such proxy, modern lifeguards certainly don’t fit the bill.

To begin with, lifeguards are paid to work throughout the day. Traditionally living humans, like other animals, seek shade from the hot sun in a way that lifeguards can’t, even if they have some sunscreen and an umbrella. One paper noted the universality of shade-seeking behavior among non-human primates such as gorillas, chimpanzees, and baboons, and suggested that humans dwelling in the African savannah would similarly have engaged in these behaviors throughout their evolution (12):

Reduced activity during the hottest period of the day is typical of the African hominoids Pan troglodytes and Gorilla gorilla and savannah-dwelling baboons of the genus Papio. In these living primates foraging behavior peaks in the early to mid-morning, and again in the late afternoon. Similar daily activity patterns would probably have been adopted by early hominids.

I think their suggestion about what “would probably have been adopted by early hominids” is awfully speculative, but we know that traditionally living humans in equatorial Africa engage in similar shade-seeking behavior. For example, scientists from Fritz Muskiet’s lab recently measured vitamin D status in Hadza hunter-gatherers and Maasai cattle-herders from this region, and described their behavior at mid-day as follows (13):

Maasai spend most of their days in the sun, wearing clothes that cover mainly their upper body and upper legs. It is important to note that, whenever possible, they avoid direct exposure to the sun and prefer a shady place, especially during midday. . . .

Hadzabe spend most of their days in the sun. Similar to the Maasai tribe, they avoid direct exposure to the fierce sun whenever possible, and most of their activities are planned in the early morning and late afternoon, while spending the middle part of the day sleeping, eating or talking in a cooler place under a tree or rock.

By contrast, how did the authors who wrote the paper showing that Israeli lifeguards have high 25(OH)D describe their behavior? Quite differently (14):

On the job lifeguards are exposed to heat and intense sunlight over almost their entire body surface, for at least eight hours per day, six months per year.

The authors made no qualification that these lifeguards spent mid-day in cool places under trees or rocks. Of course the authors didn’t describe their behavior in great detail, and one could suppose that lifeguards wear sunscreen and use umbrellas to protect themselves from the sun. The lifeguards nevertheless seem to be getting altogether too much sunshine. The most damning evidence that these lifeguards are terrible proxies for judging optimal and safe 25(OH)D, in fact, is their 20-fold increase in the incidence of kidney stones (click to enlarge):

The graph depicts the higher 25(OH)D and dramatically increased risk of kidney stones in Israeli lifeguards.

The 25(OH)D reported for Israeli lifeguards in this book chapter (53.4 ng/mL) is similar to, but somewhat lower than, the value Vieth reported for the same reference in his 1999 paper (148 nmol/L, or 59.2 ng/mL). It’s about twice as high as that found in controls matched for age, sex, and season. The lifeguards had 20 times the risk of kidney stones compared to the general population in northern Israel. In southern Israel, the risk was elevated only 12-fold, but this is because the general population in southern Israel had twice the risk of kidney stones as the general population of northern Israel. This is consistent with observational studies cited in the paper showing that the risk of kidney stones increases the closer you get to the equator, likely because calcification of the kidneys is the most sensitive sign of vitamin D toxicity (15).

These lifeguards did not have elevated levels of calcium in their blood, but they did have elevated levels of calcium in their urine, which the authors attributed to excess 25(OH)D; insufficient urinary output, which the authors attributed to dehydration; and elevated levels of uric acid in their blood, which the authors suggested resulted from “solar damage to the skin.” All three of these could be attributed to too much sun exposure, and the authors suggested that they “probably all contribute to the susceptibility of lifeguards to form kidney stones.”

These data seem to flatly contradict both the idea that vitamin D is only toxic at doses that cause hypercalcemia (contradicted also by animal experiments, including 15) and the idea that you can’t get too much vitamin D from the sun.

It seems unlikely that Dr. Vieth missed this point when he cited the high 25(OH)D of these lifeguards as safe and optimal, considering that the title of the chapter is “Increased Incidence of Nephrolithiasis in Lifeguards in Israel,” and that “nephrolithiasis” is just a fancy name for kidney stones.

There’s another problem with the lifeguards, however. Like all the other groups cited in Vieth’s 1999 paper, including Puerto Rican farmers and hospital workers, lifeguards from Israel and Missouri have substantial European ancestry.

As I pointed out back in December of 2010 in my post, “Vitamin D — Problems With the Latitude Hypothesis,” a 2009 meta-analysis (16) that pooled together the results of 394 studies examining 25(OH)D levels in over 30,000 people across the globe found that while 25(OH)D declined in Caucasians with increasing distance from the equator, non-Caucasians had similar 25(OH)D levels regardless of their distance from the equator. As a result, Caucasians living quite far away from the equator had higher 25(OH)D than non-Caucasians living in equatorial environments. This hints that vitamin D metabolism could differ between Caucasians and non-Caucasians, and it strongly suggests that if we used people without substantial European ancestry as our model for 25(OH)D status in sun-rich environments, we would generate much lower estimates.

Indeed, a paper entitled “Low Vitamin D Status Despite Abundant Sun Exposure” (17) found that, among students and skateboarders with abundant exposure to the Hawaiian sun, whites had a mean 25(OH)D of 37 ng/mL while Asians had a mean value of only 25 ng/mL. Unfortunately the authors did not report sun exposure separately by race, but the average among all the subjects was over 22 hours per week exposing half the body to the sun without any sunscreen.

Consistent with the possibility that Asians have lower 25(OH)D than whites even when exposed to abundant tropical sun exposure, rural Indians who expose their face, chest, back, legs, arms, and forearms to the Indian sun for nine hours per day have even lower levels. One study reported a mean of 24 ng/mL in males and 19 ng/mL in females (18).

Of course if we really wanted to understand likely 25(OH)D values in our prehistoric African ancestors, we would do well to look at Africans rather than Asians. At first glance, the studies of the Maasai and Hadza from Fritz Muskiet’s lab (13) may seem to contradict the point I am making. After all, the means for these populations were between 40 and 50 ng/mL.

We can resolve these discrepancies, however, simply by adjusting the values to account for discrepancies in different methods of measuring 25(OH)D. I’ll explore this problem in more detail in a later post in the series, but for now I’ll just present the adjusted values (click to enlarge):

Once adjusted, the values suggested 30-40 ng/mL is natural among traditionally living non-white equatorial populations.

I adjusted the values to correspond to those of the DiaSorin radioimmunoassay using the discrepancies found in references 17 and 19. While this does not necessarily make the values more accurate, it makes them easier to compare to one another, and since the DiaSorin assay is the one most commonly used in the literature, it makes it easier to situate these values in the context of the general body of scientific literature. What we see is that the variation in values among the different groups is substantially reduced after adjustment, and, with the exception of rural Indians, the values lie between 30 and 40 ng/mL. This is considerably lower than the values presented in Dr. Vieth’s 1999 paper, which ranged from 42 to 65 ng/mL.
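
To make the general approach concrete, here is a minimal sketch of this kind of adjustment, assuming we have ratios relating each measurement method to the DiaSorin radioimmunoassay. The correction factors below are placeholders for illustration only; they are not the actual values derived from references 17 and 19, which I will discuss in a later post.

```python
# A minimal sketch of rescaling 25(OH)D values measured by different
# assays onto a common (DiaSorin-equivalent) scale. The ratios below are
# PLACEHOLDERS, not the factors from references 17 and 19.

# Hypothetical ratio of (DiaSorin result) / (other assay result),
# as would be derived from samples measured by both methods.
ASSAY_TO_DIASORIN_RATIO = {
    "DiaSorin RIA": 1.00,       # reference method, no adjustment
    "LC-MS/MS": 0.85,           # placeholder value
    "Other immunoassay": 0.90,  # placeholder value
}

def to_diasorin_equivalent(value_ng_ml: float, assay: str) -> float:
    """Rescale a 25(OH)D value (ng/mL) to a DiaSorin-equivalent value."""
    return value_ng_ml * ASSAY_TO_DIASORIN_RATIO[assay]

# Example: a value of 46 ng/mL measured by LC-MS/MS would map to
# roughly 39 ng/mL on the DiaSorin scale under these placeholder factors.
print(round(to_diasorin_equivalent(46, "LC-MS/MS"), 1))
```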

I suspect that the rural Indians have lower values because they have low intakes of bioavailable calcium. I will explore the effect of calcium intake on 25(OH)D status in a future post in this series.

Having found some commonalities among non-white populations inhabiting tropical regions, have we finally arrived at our best proxies for prehistoric values?

No, for a rather simple reason: there are no true proxies.

Scientists estimate that the climate was much colder through most of our existence (20). We are currently living in an interglacial period of an ice age, and scientists estimate that for most of our history we lived in glacial periods. While reflective snow and ice cover could increase UV-B exposure during glacial periods in some parts of the world, and thus increase vitamin D synthesis in those regions, this would not have been the case in equatorial Africa. It would have been colder, encouraging a more extensive use of clothing, but not covered in ice. Aerosolized dust and salt were much higher in the colder periods, moreover, which would presumably have decreased UV-B exposure even further.

While no one has yet estimated prehistoric UV-B exposure in fine enough detail for us to reliably estimate the relative opportunities humans had for vitamin D synthesis in the prehistoric past, Barry Lomax, a paleoclimatologist from the University of Nottingham, told me in August that he has received funding to produce such estimates for the past half million years and that we should see such publications emerge in the next year or so. He agreed with me, pending hard data he expects to generate over the course of his research, that it is likely that prehistoric exposure was, on the whole, lower than it is now, with occasional localized pockets of higher exposure.

Overall, then, modern lifeguards with evidence of excessive sun exposure and a dramatically elevated risk of kidney stones are not good proxies for estimating prehistoric serum 25(OH)D in equatorial Africa. Traditionally living non-whites in tropical regions have lower 25(OH)D, in the range of 30-40 ng/mL when adjusted to correspond to the most common assay used in the literature. These populations, however, are not good proxies for prehistoric 25(OH)D because the climate was different in the prehistoric past and vitamin D synthesis probably tended to be lower back then.

Was the Serum 25(OH)D Fixed Into Our Genome Long Ago?

Should we conclude, then, that a serum 25(OH)D requirement somewhat lower than 30-40 ng/mL was indelibly fixed into our genome long, long ago, when we existed as moderately clothed, painted, and sunscreened sophisticated ape-like creatures seeking shade at mid-day under trees, rocks, and shelters constructed out of leather in equatorial Africa?

No, because the evidence suggests that the requirement for 25(OH)D was never indelibly fixed into our genome.

As I pointed out in a previous post entitled “The Scientific Approach of Weston Price, Part 2: Problems With Comparing Different ‘Racial Stocks’ With Inuit Adaptations in Vitamin D Metabolism as an Example,” evidence suggests that the Inuit are adapted to a lower 25(OH)D status than whites with European ancestry (21). On their traditional diet, at least as roughly estimated by a high intake of seal oil, their 25(OH)D is only about 20 ng/mL, yet they have a higher than normal level of the more active form, calcitriol, and a lower than normal amount of parathyroid hormone, a hormone that increases during a deficiency of vitamin D or calcium. The evidence suggests they have heritable adaptations that allow them to boost the total biological activity of vitamin D despite a low supply of 25(OH)D without any sign of a true deficiency.

African Americans may have a similar adaptation (22). We know that at least part of the reason African Americans have lower 25(OH)D than whites is because of genetic differences in vitamin D metabolism (23), and when we consider that despite lower 25(OH)D than whites they have higher levels of the more active calcitriol and greater bone mass, it seems they are likely adapted to a lower level of 25(OH)D.

I will explore the effects of genetics on 25(OH)D status in more detail in future posts. The purpose of briefly presenting this evidence here is simply to point out that there seem to be variations in the normal or optimal 25(OH)D status between different populations. This suggests that the 25(OH)D requirement was not indelibly fixed into our genome at some time in the distant prehistoric past before humans ever emerged out of Africa. It suggests that, on the contrary, the requirement has continued to evolve over time.

The Naked Ape Hypothesis Bites the Dust

The evidence suggests we were never truly naked apes, that modern lifeguards and other populations dominated by European ancestry are not good proxies for the 25(OH)D levels of our prehistoric ancestors inhabiting equatorial Africa, and that the 25(OH)D requirement was never indelibly fixed into our genome but has continued to evolve over time.

We can summarize the problems with the naked ape hypothesis graphically as follows (click to enlarge):

The many problems with the naked ape hypothesis summarized.

What the evidence from modern lifeguards tells us is that it seems to be possible to get too much vitamin D even from the sun, and their 20-fold elevated risk of kidney stones could perhaps be a harbinger of worse things to come, like an increased risk of heart disease from arterial calcification.

As always, of course, context is critical. Perhaps the levels of 25(OH)D found in Israeli lifeguards are harmless or even beneficial in the context of a nutrient-dense and balanced diet rich in all of vitamin D’s synergistic partners. But where does the burden of evidence lie? Drs. Heaney and Holick maintain that “the burden of proof must be placed on those who propose that lower intakes (and lower serum levels) are without risk of preventable dysfunction or disease.” Why shouldn’t the burden of “proof” lie on those who advocate high levels of 25(OH)D as safe and optimal when they are considerably higher than those found in numerous populations with abundant sun exposure and when they are known to be associated with large increases in the risk of soft tissue calcification?

Do we really need to prove a negative — that no problems exist at lower levels — when we have positive evidence of harm at higher levels?

Towards a Better Approach

If we are to construct a more balanced, objective, and useful approach to assessing vitamin D status and determining strategies to remedy insufficiency, we first need to understand vitamin D metabolism in a more nuanced and sophisticated way. I will eventually conclude this series with a practical how-to guide, but we have a bit of ground to cover. The next couple of posts will explain why 25(OH)D is not a specific marker of vitamin D status, despite being consistently used in this way.

I hope you stay tuned and share your thoughts in the comments along the way!

References

1. Vieth R, Sidhom G. Basic Aspects of Vitamin D Nutrition. In: Adler RA, ed. Osteoporosis: Pathophysiology and Clinical Management. 2010. [Google Books]

2. Vieth R. Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety. Am J Clin Nutr. 1999;69:842-56. [PubMed]

3. GrassrootsHealth. Quotes on the State of Vitamin D Science, Reference to IOM Report, from the D*action Panel of Vitamin D Scientists/Researchers. Published November 2010. Accessed December 14, 2013. [Source]

4. Heaney RP, Holick MF. Why the IOM recommendations for vitamin D are deficient. J Bone Miner Res. 2011;26(3):455-7. [PubMed]

5. Toups MA, Kitchen A, Light JE, Reed DL. Origin of Clothing Lice Indicates Early Clothing Use by Anatomically Modern Humans in Africa. Mol Biol Evol. 2011;28(1):29-32. [PubMed]

6. Elias PM, Williams ML. Re-appraisal of current theories for the development and loss of epidermal pigmentation in hominids and modern humans. J Hum Evol. 2013;64:687-92. [PubMed]

7. Beleza S, Santos AM, McEvoy B, Alves I, Martinho C, Cameron E, Shriver MD, Parra EJ, Rocha J. The timing of pigmentation lightening in Europeans. Mol Biol Evol. 2013;30(1):24-35. [PubMed]

8. Barham LS. Systematic Pigment Use in the Middle Pleistocene of South-Central Africa. Current Anthropology. 2002;43(1):181-90. [JSTOR]

9. Price WA. Nutrition and Physical Degeneration. (1939, 1945). pp. 104-7. [Text available at Journeytoforever.org]

10. Nevin KG, Rajamohan T. Virgin coconut oil supplemented diet increases the antioxidant status in rats. Food Chem. 2006;99:260-6. [Science Direct]

11. Bjorn LO, McKenzie RL. Attempts to probe the ozone layer and the ultraviolet-B levels of the past. Ambio. 2007;36(5):366-71. [PubMed]

12. Wheeler PE. The thermoregulatory advantages of heat storage and shade-seeking behavior to hominids foraging in equatorial savannah environments. Journal of Human Evolution. 1994;26:339-350. [Science Direct]

13. Luxwolda MF, Kuipers RS, Kema IP, Dijck-Brouwer DAJ, Muskiet FAJ. Traditionally living populations in East Africa have a mean serum 25-hydroxyvitamin D concentration of 115 nmol/L. Br J Nutr. 2012;108(9):1557-61. [PubMed]

14. Better OS, et al. Increased Incidence of Nephrolithiasis in Lifeguards in Israel. in Massry et al., eds. Phosphate and Minerals in Health and Disease. Plenum Press, New York, 1980. [Springer]

15. Morrissey RL, Cohn RM, Empson RN Jr, Greene HL, Taunton OD, Ziporin ZZ. Relative toxicity and metabolic effects of cholecalciferol and 25-hydroxycholecalciferol in chicks. J Nutr. 1977;107(6):1027-34.

16. Hagenau T, Vest R, Gissel TN, Poulsen CS, Erlandsen M, Mosekilde L, Vestergaard P. Global vitamin D levels in relation to age, gender, skin pigmentation and latitude: an ecologic meta-regression analysis. Osteoporos Int. 2009;20(1):133-40. [PubMed]

17. Binkley N, Novotny R, Krueger D, Kawahara T, Daida YG, Lensmeyer G, Hollis BW, Drezner MK. Low vitamin D status despite abundant sun exposure. J Clin Endocrinol Metab. 2007;92(6):2130-5. [PubMed]

18. Harinarayan CV, Ramalakshmi T, Prasad UV, Sudhakar D. Vitamin D status in Andhra Pradesh: a population based study. Indian J Med Res. 2008;127(3):211-8. [PubMed]

19. de Koning L, Al-Turkmani MR, Berg AH, Shkreta A, Law T, Kellogg MD. Variation in clinical vitamin D status by DiaSorin Liaison and LC-MS/MS in the presence of elevated 25-OH vitamin D2. Clin Chim Acta. 2013;415:54-8. [PubMed]

20. Petit JR, Jouzel J, Raynaud D, Barkov NI, Barnola J-M, Basile I, Bender M, Chappellaz J, Davis M, Delaygue G, Delmotte M, Kotlyakov VM, Legrand M, Lipenkov VY, Lorius C, Pepin L, Ritz C, Saltzman E, Stievenard M. Climate and atmospheric history of the past 420,000 years from the Vostok ice core, Antarctica. Nature. 1999;399:429-36. [Nature]

21. Rejnmark L, Jorgensen ME, Pedersen MB, Hansen JC, Heickendorff L, Lauridsen AL, Mulvad G, Siggaard C, Skjoldborg H, Sorensen TB, Pedersen EB, Mosekilde L. Vitamin D insufficiency in Greenlanders on a westernized fare: ethnic differences in calcitropic hormones between Greenlanders and Danes. Calcif Tissue Int. 2004;74(3):255-63. [PubMed]

22. Weaver CM, McCabe LD, McCabe GP, Braun M, Martin BR, Dimeglio LA, Peacock M. Vitamin D status and calcium metabolism in adolescent black and white girls on a range of controlled calcium intakes. J Clin Endocrinol Metab. 2008;93(10):3907-14. [PubMed]

23. Signorello LB, Shi J, Cai Q, Zheng W, Williams SM, Long J, Cohen SS, Li G, Hollis BW, Smith JR, Blot WJ. Common variation in vitamin D pathway genes predicts circulating 25-hydroxyvitamin D Levels among African Americans. PLoS One. 2011;6(12):e28623. [PubMed]
