To Meat, or Not to Meat? (Is That Really the Question?)

by Kathleen Nay

After eight years of keeping a vegetarian diet, I’m compelled to ask myself: why am I still a vegetarian? And more intriguingly, why are my former-vegan and -vegetarian friends not?

Photo: Pexels.com

Eight years ago, transitioning to a vegetarian diet was my New Year’s resolution. I’d just finished reading Jonathan Safran Foer’s book Eating Animals about the dark side of animal agriculture, and I’d been with my partner—a lifelong vegetarian—for three years. At that point making the swap seemed inevitable, and I’ve pretty much been vegetarian ever since.

It wasn’t a difficult transition. My dad had become vegetarian when I was a pre-teen, and we never had much meat in the house to begin with. Meat was a “special occasion” food, or something I’d order at a restaurant, but rarely prepared at home. For me, the choice was convenient and socially acceptable. I felt convinced that a vegetarian diet was best for the planet, and it neatly sidestepped the complex feelings I had around causing harm to sentient animals and the workers who kill and process them.

But I’ve never lost that particular craving for meat that substitutes just don’t quite satisfy. Some people seem to get over this; my dad, for example, always said that he eventually stopped craving it, and no longer enjoys the taste or texture. Not so for me. If we’re operating on strict definitions of vegetarianism, then I’m technically not one—I sample a bit of turkey at the requisite holiday gatherings, and occasionally give in to a craving for a roast beef sandwich when I need a quick lunch away from home. I try not to hold myself to such high definitional standards, however, and usually identify as a plant-based eater. If I’m honest with myself, I’ve fleetingly thought about abandoning my vegetarianism, though I know that if I were to return to eating meat, I would struggle with the dissonance between my values—the social and environmental benefits of a low-impact diet—and my tastes.

I certainly wouldn’t be the first to experience such turmoil over my diet. I know several individuals who just couldn’t make a plant-based diet stick, and Internet listicles abound with people sharing how they lost their “veginity.” Reportedly, even celebrities once famed for being vegan—Bill Clinton, Anne Hathaway, Natalie Portman, and others—have ended their exclusive plant-food affairs.

So I got curious. Why do so many people, once persuaded to give up meat, transition back to it? How do those reasons compare with their motivations for avoiding animal products in the first place? Do they experience guilt or social pressures around their dietary choices, and why?

Much research has been done on factors that predict the likelihood of someone converting to a vegan or vegetarian diet. For example, being female, having greater educational achievement, and higher IQ scores in childhood have each been linked with greater likelihood of becoming vegan or vegetarian as an adult. Some research has linked feminism with vegetarianism. Other work has demonstrated that people who are oriented toward social dominance—that is, those who believe that hierarchical systems should be maintained, a personality trait that predicts social and political attitudes—are actually less likely to become vegan or vegetarian, and are also likely to view vegetarianism as a social threat.

However, the research into factors predicting lapses from vegetarianism is scant, though there are some studies beginning to appear in the literature. One very recent study by Hodson and Earle (2017) looked at whether ideology plays a role in returning to meat consumption. They found that political conservatism tends to predict lapses from vegetarian/vegan diets, particularly among eaters for whom reasons of justice (animal welfare, environmental concerns) are weakest, and for those who do not have strong social support for their dietary choices.

I wondered what I would find if I surveyed my networks. I created a survey of 25 questions for former vegetarians and vegans about why they went vegetarian in the first place; how long they adhered to a vegetarian diet; and what caused them to revert back to eating animal products. In comparison to Hodson and Earle’s work, my investigation is perhaps less academically rigorous and more qualitative in nature, but still valuable for understanding former vegetarians’ dietary motivations.

Through conversations around Friedman I’ve gathered that there are a fair number of us who once identified as vegetarian and no longer do. But I didn’t limit my query to Friedman students or alumni. A large number of people in my life are or once were vegetarian for religious reasons. I was raised Seventh-day Adventist, a Protestant Christian denomination whose adherents are well known for abstaining from meat, alcohol, and cigarettes, so it was once more common for me to meet lifelong vegetarians than to meet someone who regularly consumed meat. As I’m still well connected with this community, my survey skewed slightly toward former vegetarians who were raised with dietary restrictions and/or adhered to a vegetarian diet because of religious affiliation.

About 200 former vegetarians and vegans responded to my survey. Most respondents—around 77%—were female, while 18% and 4% identified as male and nonbinary, respectively (this is in keeping with considerable research finding that women are more likely to adhere to a vegetarian diet than men). Respondents’ ages ranged from 20 to 63 years, with the median age being 33. People reported having followed a vegetarian diet for an average of 9.2 years, though actual duration ranged widely, from 6 months to 39 years. Overwhelmingly (85%) respondents specified that they had followed a vegetarian diet, as opposed to being vegan, pescatarian, or fluctuating between the three. (For simplicity, I use the word vegetarian in the rest of this article to encompass all of these terms together.)

Age at conversion to vegetarian/vegan diet.

Age at conversion back to meat-inclusive diet.

The largest groups of respondents said they became vegetarian during their teens (45%) and twenties (25%). Respondents also reported transitioning back to eating meat during their twenties (56%) and thirties (22%), potentially suggesting that your parents were right—going vegan in your teens was just a phase. This tracks with ongoing research into the development of the adolescent brain. In a recent episode of the podcast The Gist, journalist Dina Temple-Raston explains that the insular cortex, the area of our brains responsible for causing us to feel empathy, is on hyper alert during adolescence. In her interview with host Mike Pesca, she surmises that “this may explain why you want to save the mountain gorillas when you’re 16, or why you become a vegan.” (Catch Temple-Raston’s Gist interview here.)

Indeed, the most salient reason people gave for rejecting meat in the first place was out of concern for “animal welfare” (20% of received responses). The other most common motivators cited were “health” (17%) and “environment” (16%). That last one especially resonates with me; the enormous environmental footprint of animal agriculture compared to crops is what finally convinced me to give up meat.

But then we get to the crux of my question: what was it that ultimately persuaded my respondents to resume eating animals? Here’s where the data started to get interesting.

The top three reasons respondents provided for why they returned to consuming animal products were “personal taste preferences” (21%), “health” (20%), and “convenience” (16%). Interestingly, health was a significant motivator for transition both toward and away from vegetarianism.

Motivations for converting to vegetarian/vegan diet.

Motivations for converting back to meat-inclusive diet.

That health showed up as a primary motivator in both directions was really curious to me. I wanted to dig in there, so I filtered out all the responses from individuals who said that health motivated them both to adopt a vegetarian diet and to abandon it. Samples of their comments are reproduced in the tables below.

Pro-vegetarian/vegan health motivators.


*A common response I received was that a vegan/vegetarian diet was used to hide or aid an eating disorder. In the words of one respondent: “I said I loved animals too much to eat them but I was also really excited about the opportunity to be able to decline to eat in front of other people with a legit excuse.” Fortunately, this respondent later said that they got therapy and learned coping mechanisms as they gradually reintroduced meat to their diet. However, it would be remiss not to acknowledge that the sudden elimination of entire food groups or adoption of dogmatic dietary practices can be red flags for disordered eating. For a brief exploration of this darker side of vegetarianism, read this Psychology Today article by Hal Herzog, Ph.D.

Pro-meat health motivators.


Other questions that yielded interesting results were about convenience and perceived social/cultural pressures to eat meat. Aside from health concerns, frequently given reasons for reverting to omnivore diets included living or traveling abroad (also “living in the South” and living among First Nations peoples in northern British Columbia); not having the time or patience to prepare vegetarian meals; lack of available options on college campuses or at restaurants; causing conflict with loved ones (family members, partners); not wanting to inconvenience hosts or seem rude/ungrateful; unwillingness to “be constantly reading labels, turning down meal invites from friends”; the financial cost of keeping a vegetarian diet; employment (“I now work in a job where we encourage row crop producers to integrate livestock to regenerate soil health…” “I work in a restaurant”); and peer pressure (“Many of my friends ate meat,” “It was culturally weird among my friends… to not eat meat,” “social pressure around parenting”).

Finally, I asked respondents about whether they felt any guilt around eating animal products since resuming the inclusion of meat in their diets. Responses were about evenly split (48% Yes; 52% No). As expected, the majority of people mentioned feeling guilt over concerns about animal cruelty and environmental impact. Other common reasons included embarrassment for not sticking with what they felt was a positive lifestyle choice, unawareness of the meat’s origins, and contradicting their cultural upbringing or religious beliefs about the uncleanliness of certain meats. When asked how they alleviated their guilt or dealt with cognitive dissonance around choices to eat meat, most respondents said that they try to minimize or moderate their meat intake; attempt to source meat locally/ethically; look for alternate ways to reduce their carbon footprint; acknowledge the animal’s life; rationalize that meat is a necessary inclusion for their personal health; try not to think about it; or simply accept their guilt.

 

Having grown up a mostly vegetarian Seventh-day Adventist, and having later developed a more personal, moralized dietary identity, I find myself reflecting on my own cognitive dissonance when I sneak a turkey sandwich. What does my dietary identity even mean? Upon reflection, it actually means quite little in my case; as I admitted earlier, my interpretation of a vegetarian diet is increasingly relaxed compared with what the term might imply to others. But the distinction between calling myself plant-based as opposed to strictly vegetarian is relatively small—a difference of one or two meals per month, at most. Somehow, saying my diet is “plant-based” makes me feel as though I can hold on to my social and environmental values while giving myself wiggle room to accommodate the irresistible pull of sensory memory and cultural pressure—in case I get caught with said turkey sandwich.

We adhere to dietary labels and self-imposed restrictions in order to project something about ourselves and our values to the world. And yet, some 84% of vegetarians and vegans eventually return to eating meat. If my survey shows me anything, it’s that people’s reasons are vast, varied… and not altogether unreasonable. Now that we’re already a month into our 2018 New Year’s resolutions, I say it’s time to adopt another goal: to start being a little more forgiving of other people’s dietary choices—and our own.

Kathleen Nay is a third-year AFE/UEP dual degree student and co-editor of The Friedman Sprout. For a vegetarian, she spends an unreasonable amount of time thinking about meat.


Gut Microbiota and the Developing Child

by Ayten Salahi

Undernutrition poses a formidable threat to the health and life trajectory of children around the world. A new study examines the role of gut microbiota in modulating nutritional status and early life development, and sheds light on bacterial transplants as a potential new method to tackle this longstanding challenge.

The human gut microbiome is the bacterial ecosystem that lives predominantly in the digestive tract and plays a significant role in our immune response, neurological networks, and both our mental and physical development throughout life. The delicate balance of ‘good’ and ‘bad’ gut bacteria – or gut maturity – partially determines a developing child’s ability to absorb critical nutrients through food. Without that ability during early life, or without medical interventions to restore it, children are likely to suffer the long-term health consequences associated with childhood undernourishment, including physical stunting, immune dysfunction, and neurodevelopmental issues. Childhood undernourishment has also been linked to permanent impairments to health and human capital that affect both current and future generations.

The ‘solution’ to childhood undernutrition is multifaceted. As scientific understanding of the gut microbiota continues to evolve, researchers and healthcare practitioners have begun to shift their focus toward how the microenvironments of our gut bacteria shape our macroenvironments, and whether these microenvironments could signal potential new treatment targets to alleviate the global burden of childhood undernutrition.

Bacterial transplants have been identified as one potential treatment. A study from Blanton et al. examined whether developmental outcomes could be transferred through microbiota – specifically, through fecal transplants. The researchers colonized germ-free mice with gut bacteria from ‘healthy’ and ‘severely stunted’ Malawian infants and children, and fed the mice a diet representative of the traditional Malawian diet of cornmeal, peanuts, and kidney beans. Mice that received transplants from severely undernourished children manifested stunted growth, impaired bone morphology, and metabolic abnormalities in the muscle, liver, brain, and immune system. The study therefore suggests that gut bacteria play a role in the transference of developmental outcomes.

Findings from the same study also suggest that microbiota transplants from healthy donors could potentially prevent growth impairments and undernourished health outcomes in recipient animals, depending on the age of the donor and the type of bacteria. When researchers co-housed mice that had just received microbiota from either 6-month-old healthy donors or severely undernourished donors, microbiota from the healthy donor group overpowered and displaced the microbiota from the undernourished donor group, and prevented developmental impairments in both groups. More research is needed to confirm these findings in humans, but the results present the interesting possibility that gut immaturity can be prevented and repaired through transplantation of microbiota from healthy donors. Future research must also examine whether bacterial transplants can help prevent the long-term mental, physical, and socioeconomic consequences of early-life undernourishment, or meaningfully reduce the global burden of childhood undernutrition.

Study of microbiota in the developing child offers a compelling new lens with which to examine health inequity at the microscopic scale, with macroscopic implications for therapeutic interventions in community health. Adequate nutrition is the cornerstone of human development, and a growing body of evidence suggests that gut microbiota play an important role in promoting early life nutritional status. The potential therapeutic use of bacterial transplants could have significant implications for global nutrition programs seeking to identify new levers to improve childhood nutrition, particularly in resource-poor settings. However, gut microbiota therapeutics constitute only a small and largely theoretical part of the much bigger and more complex picture that is global nutrition. Pervasive issues around sanitation, hygiene practices, and access to potable water and nutritious food continue to constitute some of the greatest challenges to global health worldwide.

Ayten Salahi is a first-year FPAN MS and RD candidate, and is dedicated to the future of policy, programming, and clinical practice in sustainable diets. Ayten came to Friedman after working as a molecular and clinical researcher in neuropharmacology and diabetes management for nearly 8 years.

Evaluating the Pinnertest: The Importance of Scientific Evidence

by Erin Child

So, you think you have a food intolerance? What do you do? You could call your doctor and set up an appointment that is inevitably months away, sit through a 10-minute meeting in which they only look at their computer and refer you to a specialist, THEN go through more testing, and finally (hopefully!) get some answers. Or, you could order an at-home kit that takes 10 minutes to complete and promises results that will get you feeling better, sooner. Which one do you choose? Read on and decide.

In our current world of food intolerances and hypersensitivities in which the best path to treatment is often a conundrum, the Pinnertest promises an easy solution to any dietary woes.  A few months ago, I started noticing ads for this new test popping up on social media. The Pinnertest is an over-the-counter food intolerance testing kit that uses microarray technology to test for IgG (Immunoglobulin G) mediated sensitivities for 200 common foods.

The classic manifestations of true food allergies (hives, oral discomfort, trouble breathing, anaphylaxis, etc) are mediated by overproduction of IgE antibodies. Like IgE, IgG is a type of antibody. And IgG is the most common antibody in the human body. (The immune system releases five types of antibodies: IgA, IgE, IgG, IgD, and IgM.) Instead of testing IgE mediated allergies, the Pinnertest producers claim that the microarray technology allows them to test for IgG mediated intolerances to 200 different foods—including lettuce, quail, and baking powder—using only a few drops of blood. It sounds scientific, but also seemed too good to be true. Was it?

I started my research by reaching out to the Pinnertest folks directly. My goal? To score a pro-bono test to try it out myself and see the results first hand. I was thrilled when a friendly representative at Pinner immediately reached out to set up a phone interview (calendar evite and everything). When the day came, I called—and was sent to voicemail. Twenty minutes and five tries later, I knew I had been ghosted. My subsequent emails were ignored, and my quest to learn first-hand about the scientific evidence backing their product was squashed.

So, I began researching on my own. The Pinnertest website sports a cluttered page of medical study citations that cover work on food allergies, intolerances, and celiac disease—but none of them provides any evidence for using IgG testing to detect food intolerances. My own PubMed search [IgG + food intolerance; Immunoglobulin G + food intolerance] yielded little, but did include one recently retracted 2016 article linking IgG testing to food allergies. The rest of the Pinnertest website leads you down a rabbit hole of B-list celebrity endorsements and every Friedman student’s favorite—Dr. Oz videos! Interestingly, nowhere on the site does it tell you the cost of the test. To find out pricing, you must first enter your information (“Hi, my name is Newt Trition”) before you discover that the test will run you a whopping $490.

To further explore whether this test has any scientific merit, and is worth the hefty price tag, I reached out to the Boston Food Allergy Center (BFAC). Dr. John Leung, MD, the founder and CEO of the BFAC, the Director of the Food Allergy Center at Tufts Medical Center, and Co-Director of the Food Allergy Center at Floating Hospital for Children, took some time out of his day to answer my questions. Dr. Leung said, “We have patients coming into our office on a weekly basis with that kind of report [IgG], who pay out of pocket no matter what insurance they have. [Insurance doesn’t cover the test] because there is a statement from both the American and European Societies for Allergy saying that this test has no clinical significance.”

This is something to consider in any area of medicine—if a test is not covered by insurance, it may be the first sign that clinical significance could be lacking.

My conversation with Dr. Leung was brisk, informative, and confirmed my gut reaction that this test was too good to be true. Furthermore, there is a body of literature providing evidence that IgG-mediated reactions are a sign that a food intolerance does not exist, not the other way around. In a 2008 European Academy of Allergy and Clinical Immunology (EAACI) Task Force Report, the authors wrote, “food-specific IgG4 does not indicate (imminent) food allergy or intolerance, but rather a physiological response of the immune system after exposition to food components.” Simply put, IgG evidence can show that you’ve been eating a food, not that you are intolerant to it. The EAACI has been joined by its Canadian, American, and South African counterparts in clear statements that research does not support the use of IgG-mediated testing for food intolerances at this time.

Having shadowed Dr. Leung at the BFAC, I know that he takes patients’ claims of food intolerances seriously, and is invested in using the best clinical practices and scientific evidence available to make the diagnosis. Concerning IgG-mediated testing, he stated, “There is so little research, so from a clinical view it is not very useful, it doesn’t mean much. It is not diagnostic.” And yet, the Pinnertest website claims that the “Pinnertest is a common procedure in most European countries. In many cases, dietitians and nutritionists will ask for their client’s Pinnertest results before creating any kind of diet plan.” Since this claim directly contradicts the current EAACI recommendation, that’s highly unlikely.

I also had the opportunity to speak with Rachel Wilkinson, MS, RD, LDN, and Practice Manager of the BFAC. Rachel explained, “If patients come in concerned about food intolerances, we can do the hydrogen breath test for lactose, fructose or fructan [found in some vegetables, fruits and grains]. These are the three main ones [food intolerances] we can test for, because we actually have tests for those.” She went on to state, “What was interesting to me about the Pinnertest, was how they say they can specify one specific food–so not just a category. I honestly don’t understand how they would pinpoint a specific food. It makes more sense to me to listen to patients’ histories and to look at how their intestines are able to break down that particular group of sugars. So, I really would love to know how they [Pinnertest] are coming up with this.”

It is important to note that the Pinnertest is not just marketing itself as a food intolerance test. It is also presenting itself as a weight loss tool. Current Frances Stern Dietetic Intern and Masters Candidate Jocelyn Brault, interning at BFAC, indicated her concern: “I think this is also being marketed for weight loss, which you can see throughout their website. This is usually a good sign that we should dig deeper. Is this a proven weight loss method? This claim seemed out of nowhere to me.” Indeed, directly on the Pinnertest box it reads, “Discover which foods are making you sick or overweight.” If taken seriously, this test will result in unnecessary diet restrictions, and potential malnutrition if too many foods are unnecessarily eliminated. Rachel Wilkinson, RD noted, “if you’re going to be avoiding certain types of foods, you need to make sure your diet is still adequate. We do not want to see people over-restricting foods for no reason.”

Over the course of my research and conversations with Dr. Leung, Rachel, and Jocelyn, I confirmed that my initial gut reaction was correct: too good to be true. And here’s the kicker: Pinnertest effectively admits as much. In a tiny disclaimer at the bottom of their website, they write: “Quantification of specific IgE antibodies to foods and inhalants is an FDA-accepted diagnostic procedure for the assessment of allergies. However, the assessment of human IgG antibodies specific for individual food and inhalant antigens is not an FDA-recognized diagnostic indicator of allergy.”

It is a noble task to try to design an allergy test that does not require you to doctor hop, or wait months for an appointment, but the scientific evidence needed to back up the Pinnertest is lacking. Perhaps one day this will shift, and the body of evidence will improve. In the meantime, however, anyone who thinks they might have a food intolerance (or food allergy) is best served by going to their clinician (and then a dietitian). This at-home kit promises a quick fix, but is really just an expensive, dangerous distraction.

Erin Child is a second-semester NICBC student in the dual MS-DPD program. She is fascinated by the science of food allergy and intolerances, and will probably keep writing about them until someone tells her to stop.  With two weeks left in the semester, she would really like a nap. Like right now.

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not encourage the cycle of self-loathing that the “losing weight” resolutions tend to result in year after year.

Right alongside these posts, though, was an overwhelming amount of press praising the Whole30—a 30-day “clean eating” diet that restricts a long list of foods and beverages.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control are in the driver’s seat and could potentially steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets perfectly nutritious foods for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. And most bodies are perfectly capable of handling these foods. They provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies that show removing grains, beans and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted an association with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a fairly widespread social-media support system, and there is plenty of research backing social support as a key to success in any major lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, demanding that much brainpower for thinking, worrying, and obsessing about food wastes mental energy that could be spent in so many other unique and fulfilling ways. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these type of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat so much more than you might have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of The Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating that is replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is indeed a superior choice. At the end of the day, this is a business, created by two Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causing inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice reminds us that, “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts and some vegetables allowed on the diet, like almonds. It is possible to reduce the amount of phytates in an eaten food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting more in touch with food beyond a label and reducing added sugars and alcohol are goals everyone should be encouraged to pursue. Focusing on cooking more from scratch, relying less on processed foods, and learning about how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not eliminate whole food groups like dairy, grains, and legumes. It should not have a time stamp on its end date, and rather, should be a lifelong journey focusing on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible and fulfilling part of life for people in all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram worthy oatmeal, or, if you’re in grad school, it’s part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J, and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason why I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. And there is now an EpiPen® in our kitchen. Just as they are on some airlines, peanuts are now banned from the general living areas of my house, and thus I & my beloved jar of peanut butter have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been told not to consume any peanut products on your flight. Peanut allergy rates in children in the United States have quadrupled from less than 0.5% in the late 1990s to about 2% today, and peanut allergies are the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day some peanut butter dripped from my sandwich to the seat. I panicked, thinking “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

On the heels of my new roommate’s arrival, I was fascinated to find that peanut allergies have been back in the news. On January 5th, 2017, the National Institute of Allergy and Infectious Diseases (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants with severe eczema and/or an egg allergy should be introduced to peanuts between 4 and 6 months. Medium-risk infants with mild eczema should be introduced to peanuts by 6 months, and low-risk infants without eczema or other allergies can be introduced to peanuts any time after they have been introduced to solid foods.

These guidelines fit in with the dual-allergen exposure hypothesis. This suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because their immune system had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens. This leads to an increased chance of an allergic reaction once they eat the food. Current research now supports this hypothesis, and also suggests that by shortening the time between skin exposure and ingestion, we will reduce the number of acute allergic reactions. The sooner an infant starts eating an allergen, the more likely the body will adjust to it without having time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests of hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, that advice appears to have been misguided. My roommate, n=1, was born in the mid-1990s when delaying peanut exposure was coming into vogue. She had severe eczema as an infant and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated with the new research, and wishes there were some way to know whether the outcome would have been different had she tried them at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack called Bamba as one of their first foods. In many developing countries, too, infants are exposed to peanuts early on—both in their environment and in their food—and these countries also have much lower allergy rates.

In 2015, NIAID funded the Learning Early About Peanut Allergy (LEAP) study to determine whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups (based on skin prick test results for peanuts) and then randomized to either eat or avoid peanuts until 60 months old (5 years). Among infants in the negative skin prick test group, 13.7% of those who avoided peanuts developed an allergy, compared with only 1.9% of those who ate peanuts (P<0.001). Among infants in the positive skin prick test group, 35.3% of those who avoided peanuts developed an allergy, compared with 10.6% of those who ate peanuts (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
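To put those percentages in perspective, here is a minimal back-of-the-envelope sketch (my own calculation, not part of the published LEAP analysis) that converts the reported allergy rates into absolute and relative risk reductions:

# Back-of-the-envelope risk reductions from the LEAP allergy rates quoted above.
# Rates are the percentages of each subgroup that developed a peanut allergy by age 5.

def risk_reduction(avoid_rate, eat_rate):
    """Return (absolute, relative) risk reduction for early peanut introduction."""
    absolute = avoid_rate - eat_rate            # percentage-point difference
    relative = 100 * absolute / avoid_rate      # % reduction relative to the avoidance group
    return absolute, relative

groups = {
    "negative skin prick test": (13.7, 1.9),
    "positive skin prick test": (35.3, 10.6),
}

for name, (avoid, eat) in groups.items():
    abs_rr, rel_rr = risk_reduction(avoid, eat)
    print(f"{name}: {abs_rr:.1f} percentage-point drop, ~{rel_rr:.0f}% relative reduction")

# Output:
# negative skin prick test: 11.8 percentage-point drop, ~86% relative reduction
# positive skin prick test: 24.7 percentage-point drop, ~70% relative reduction

In other words, early introduction cut the allergy rate by roughly 86% in the negative skin prick group and by roughly 70% in the positive skin prick group, relative to avoidance.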

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. For now, research on reducing peanut allergies in adults is limited, making it unlikely that we will be eliminating any allergies anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!

WIC at the Crossroads of the Opioid Epidemic

by Danièle Todorov

The complexity and pervasiveness of the opioid epidemic have forced government agencies to be innovative with their resources. The Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) is in a prime position to care for pregnant women affected by the epidemic and has stepped up to the plate.

In January of 2016, then Secretary of Agriculture Tom Vilsack was appointed by President Obama to lead an interagency taskforce to address the opioid epidemic in rural America. Secretary Vilsack, who’s been outspoken about his own mother’s struggle with prescription drug addiction, knew that compassion and collaboration would be vital. His agency, the USDA, has unique resources and relationships in rural areas, putting it in a prime position to address the epidemic.

Addressing the epidemic is no simple task. According to the CDC, 91 Americans died daily from opioid overdose in 2015. Nearly half of these deaths involved a prescription opioid, used in the treatment of pain. In a town hall meeting in Missouri last July, Secretary Vilsack stated that due to “the devastating toll that opioid misuse has taken on our communities, and particularly rural areas, I have tasked USDA with creatively using all of the resources at our disposal to stem the tide of this epidemic” [1]. Interestingly, Secretary Vilsack highlighted WIC as a resource that could be creatively used. “For many women”, he stated, “WIC is their first point of entry into the healthcare system, and we have an opportunity to intercept and potentially prevent dangerous health outcomes for both the mother and the child” [1].

Pain management is an important part of pregnancy care. The prescription of opioids for pain in pregnancy is increasingly common; 1 in 5 Medicaid-enrolled women were prescribed an opioid at some point during their pregnancy in 2014 [2]. However, the effect of opioids on birth outcomes is understudied. In utero opioid exposure may be associated with preterm delivery and low birth weight [3]. Exposed neonates may develop withdrawal symptoms, a condition known as neonatal abstinence syndrome, which is associated with increased risk of seizures and breathing difficulties [3]. Similarly understudied are the rates of opioid abuse during pregnancy. We do know that pregnant women with substance abuse problems are particularly vulnerable to food and job insecurities and unstable housing, which exacerbate potential health complications [4].

The healthcare system often stigmatizes and underserves pregnant women with substance abuse problems. However, WIC is increasing its ability to engage them in care. WIC’s mission is to promote the health of low-income women and their children by providing nutritious food, health education, and referrals. Starting in 2014, WIC agencies have increased staff training surrounding substance abuse [1]. Staff are better equipped to notice potential substance abuse, to educate WIC participants about the dangers of substance abuse during pregnancy and breastfeeding, and to connect them with local resources. These expanded roles align with WIC’s mission, not only because they aim to protect the health of the women they serve, but because WIC “acknowledges that substance use is incompatible with good nutrition” [5].

WIC is forming relationships with women at a promising point in time in their lives. In their staff training guide, WIC cites a study showing that women are “more motivated to improve their lifestyle and health habits during periods when they make the transition from one life situation or role to another… WIC participants are a natural target audience for substance use information because they are, by definition, in the life transition stage of pregnancy and new motherhood” [5].

WIC is playing an important part in the collaborative response to the epidemic. As the head of the USDA, Secretary Vilsack understood that a holistic response was the only effective solution and embraced President Obama’s mandate. “This disease isn’t a personal choice,” says Secretary Vilsack, “and it can’t be cured by willpower alone. It requires responses from whole communities, access to medical treatment, and an incredible amount of support. To me, our mandate is clear: don’t judge, just help” [6]. Secretary Vilsack’s endorsement of his replacement as Secretary of Agriculture, nominee Sonny Perdue, gives hope that the USDA will continue this vital endeavor.

Sources

  1. USDA Office of Communications. Agriculture Secretary Vilsack Announces Substance Misuse Prevention Resources for Low Income Pregnant Women and Mothers in Order to Battle the Opioid Epidemic. 2016.
  2. Desai, R.J., et al. Increase in prescription opioid use during pregnancy among Medicaid-enrolled women. Obstetrics and Gynecology, 2014. 123(5): p. 997.
  3. Patrick, S.W., et al. Prescription opioid epidemic and infant outcomes. Pediatrics, 2015. 135(5): p. 842-850.
  4. Sutter, M.B., S. Gopman, and L. Leeman. Patient-centered Care to Address Barriers for Pregnant Women with Opioid Dependence. Obstetrics and Gynecology Clinics of North America, 2017. 44(1): p. 95-107.
  5. U.S. Department of Agriculture, Food and Nutrition Service. Substance Use Prevention: Screening, Education, and Referral Resource Guide for Local WIC Agencies. 2013.
  6. USDA. Addressing the Heroin and Prescription Opioid Epidemic. 2016 [accessed 02/17/17].

Danièle Todorov is a first-year nutritional epidemiology student with a focus on pregnancy nutrition and birth outcomes.

 

What’s the Deal with Vitamin D?

by Katelyn Castro

There is always one nutrient that seems to linger in the media for a while. Lately, vitamin D has been the lucky winner! Considering that over 40% of Americans are vitamin D deficient, according to the National Health and Nutrition Examination Survey (NHANES), it’s worth taking a closer look.

Depression, cancer, heart disease, and type 1 diabetes are some of the many health conditions that have been linked to vitamin D deficiency. While it is too soon to point to vitamin D as a cure-all, this vitamin may be more important for our health than previously thought—especially during the winter months in New England!

Why is Vitamin D Important?

Vitamin D is most often known for its role in bone health, increasing calcium absorption and helping with bone mineralization alongside calcium and phosphorus. Historically, rickets in children and osteoporosis and bone fractures in adults have been the most common signs of vitamin D deficiency.

As a fat-soluble vitamin and a hormone, vitamin D is also involved in many other important metabolic processes. Did you know vitamin D helps regulate the expression of over one thousand genes in the human genome? For example, vitamin D is needed for protein synthesis in skeletal muscle, which may explain why vitamin D deficiency is associated with poor athletic performance. Vitamin D also helps regulate blood pressure by suppressing renin gene expression, supporting the possible relationship between vitamin D deficiency and risk of heart disease. Additionally, vitamin D status may alter immunity through its role in cytokine production; studies have found that vitamin D deficiency is associated with upper respiratory tract infections. While more research is needed to explore these connections, these findings suggest that vitamin D plays an integral role in bone, muscle, cardiac, and immune health.

Where Do You Get Vitamin D?

Only a few foods are natural sources of vitamin D, including eggs and fatty fish like salmon, mackerel, tuna, and sardines. As a result, vitamin D-fortified foods like dairy products, juices, and breakfast cereals make up the majority of Americans’ vitamin D intake.

Sun exposure, on the other hand, can be the greatest source of vitamin D for some people–hence vitamin D’s nickname, the “sunshine vitamin.” Unlike any other vitamin, vitamin D can be synthesized in the body: when the sun’s ultraviolet B rays reach the skin, they convert a cholesterol derivative (7-dehydrocholesterol) into vitamin D3. Vitamin D3 then diffuses from the skin into the blood and travels to the liver, where it is converted into 25(OH)D, the circulating form measured in blood tests, and then to the kidneys, where it is converted into vitamin D’s active form, 1,25(OH)2D.

Research has found that exposing arms, legs, and face to the sun for 15 to 30 minutes twice a week provides about 1000 international units of vitamin D (equal to about 10 cups of milk!). Despite this robust source of vitamin D, deficiency is surprisingly common in the U.S.

Who is at Risk of Vitamin D Deficiency?

Many circumstances can alter vitamin D synthesis and absorption, increasing risk of vitamin D deficiency. Some of the factors that have been found to impact vitamin D status include the following:

  • Season: According to research, from November through February, people living at latitudes above 37 degrees north or south produce little or no vitamin D from the sun because of the angle of the ultraviolet B rays. While vitamin D is stored in fat tissue and can be released into the blood when needed, our stores typically last only one to two months.
  • Limited Sun Exposure: Vitamin D synthesis can also be blocked when sunscreen is applied correctly or when long robes or head coverings are worn for religious reasons. For example, sunscreen with a sun protection factor (SPF) of 8 decreased vitamin D synthesis in skin by about 95% in one study.
  • Skin Color: People with darker skin pigmentation have been found to have lower levels of vitamin D due to decreased synthesis. This is reflected in the high prevalence of vitamin D deficiency among certain groups: according to NHANES, 82% of African Americans and 69% of Hispanics are vitamin D deficient.
  • Weight: Studies also suggest that overweight and obese people may have higher vitamin D requirements. Because they have more body fat and vitamin D is a fat-soluble vitamin, more of the vitamin is distributed into fat tissue, making it less bioavailable. As a result, more vitamin D may be needed for enough of it to reach the bloodstream and be distributed throughout the body.
  • Age: Older adults have been found to have lower levels of vitamin D, likely due to both decreased sun exposure and less efficient synthesis. One study found that 70-year-olds had only about 25% as much of the vitamin D precursor in their skin as young adults, which reduced vitamin D synthesis in the skin by about 75%.
  • Fat Malabsorption: When a gastrointestinal disorder or other health condition impairs fat absorption (e.g., liver disease, cystic fibrosis, celiac disease, or Crohn’s disease), vitamin D is also poorly absorbed and utilized, since it is a fat-soluble vitamin.

Vitamin D deficiency can be especially concerning because symptoms like bone pain and muscle weakness may go undetected in the early stages of deficiency. Although physicians do not routinely check vitamin D levels, those at risk of deficiency may benefit from a serum 25(OH)D test. This simple test measures the level of vitamin D circulating in the blood, with levels less than 20 nanograms per milliliter commonly used to diagnose deficiency. However, some organizations, like the Endocrine Society, argue that levels greater than 30 nanograms per milliliter are needed for optimal bone and muscle metabolism.

How Much Vitamin D Do You Need?

As with serum levels, no single ideal vitamin D intake has been established, since many factors contribute to vitamin D status. The U.S. Institute of Medicine recommends 600 to 800 international units (IU) of vitamin D daily for adults, assuming minimal sun exposure. The National Osteoporosis Foundation, on the other hand, recommends larger doses of 1000 to 1200 IU daily for adults to support bone health. Although vitamin D toxicity is rare, the Institute of Medicine has set an upper limit of 4000 IU per day, since extremely high levels can lead to calcium buildup, which can cause poor appetite, nausea, vomiting, weakness, and kidney problems.

With only limited amounts of vitamin D provided by food, even fortified food, diet alone is usually inadequate to meet vitamin D needs. For example, you would need to drink about 8 cups of milk every day to reach 800 IU of vitamin D from diet alone! While sun exposure can supplement food intake, many Americans still fall short of their needs due to the factors outlined above.
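(As a quick back-of-the-envelope check on that figure, assuming the roughly 100 IU of vitamin D per cup of fortified milk implied by the comparisons above, with actual fortification varying by product: 800 IU ÷ ~100 IU per cup ≈ 8 cups of milk per day.)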

For the 40% of Americans who have been found to be vitamin D deficient, vitamin D supplementation can be a safe and effective way to meet needs. Whether you’re an avid sunscreen user or living here in New England during the fall and winter months, a daily vitamin D supplement can help ensure that your vitamin D stores are adequate. Multivitamins typically provide 400 IU of vitamin D, but a separate vitamin D supplement (D2 or D3) with 800 or 1000 IU may be needed to meet daily intake recommendations.

Katelyn Castro is a second-year student in the Dietetic Internship/MS Nutrition Program at the Friedman School. During the summer, she enjoys soaking up the sun if only for an excuse to get her daily dose of Vitamin D. During the winter, you can find her trekking through the snow, bundled up like the boy in A Christmas Story, and contemplating whether she needs a D supplement.