Evaluating the Pinnertest: The Importance of Scientific Evidence

by Erin Child

So, you think you have a food intolerance? What do you do? You could call your doctor and set up an appointment that is inevitably months away. Then you sit through a 10-minute meeting in which the doctor only looks at their computer before referring you to a specialist, THEN you go through more testing, and finally (hopefully!) you get some answers. Or, you could order an at-home kit that takes 10 minutes to complete and promises results that will get you feeling better, sooner. Which one do you choose? Read on and decide.

In our current world of food intolerances and hypersensitivities, in which the best path to treatment is often a conundrum, the Pinnertest promises an easy solution to any dietary woe. A few months ago, I started noticing ads for this new test popping up on social media. The Pinnertest is an over-the-counter food intolerance testing kit that uses microarray technology to test for IgG (immunoglobulin G) mediated sensitivities to 200 common foods.

The classic manifestations of true food allergies (hives, oral discomfort, trouble breathing, anaphylaxis, etc.) are mediated by overproduction of IgE antibodies. Like IgE, IgG is a type of antibody; in fact, it is the most common antibody in the human body. (The immune system produces five types of antibodies: IgA, IgE, IgG, IgD, and IgM.) Instead of testing for IgE-mediated allergies, the Pinnertest producers claim that microarray technology allows them to test for IgG-mediated intolerances to 200 different foods—including lettuce, quail, and baking powder—using only a few drops of blood. It sounded scientific, but it also seemed too good to be true. Was it?

I started my research by reaching out to the Pinnertest folks directly. My goal? To score a pro-bono test to try it out myself and see the results first hand. I was thrilled when a friendly representative at Pinner immediately reached out to set up a phone interview (calendar evite and everything). When the day came, I called—and was sent to voicemail. Twenty minutes and five tries later, I knew I had been ghosted. My subsequent emails were ignored, and my quest to learn first-hand about the scientific evidence backing their product was squashed.

So, I began researching on my own. The Pinnertest website sports a cluttered page of medical study citations covering food allergies, intolerances, and celiac disease—none of which provides any evidence for using IgG testing to detect food intolerances. My own PubMed search [IgG + food intolerance; Immunoglobulin G + food intolerance] yielded little, but did turn up one recently retracted 2016 article linking IgG testing to food allergies. The rest of the Pinnertest website leads you down a rabbit hole of B-list celebrity endorsements and every Friedman student’s favorite—Dr. Oz videos! Interestingly, nowhere on the site does it tell you the cost of the test. To find out pricing, you must first enter your information (“Hi, my name is Newt Trition”) before you discover that the test will run you a whopping $490.

To further explore whether this test has any scientific merit, and whether it is worth the hefty price tag, I reached out to the Boston Food Allergy Center (BFAC). Dr. John Leung, the founder and CEO of the BFAC, Director of the Food Allergy Center at Tufts Medical Center, and Co-Director of the Food Allergy Center at Floating Hospital for Children, took some time out of his day to answer my questions. Dr. Leung said, “We have patients coming into our office on a weekly basis with that kind of report [IgG], who pay out of pocket no matter what insurance they have. [Insurance doesn’t cover the test] because there is a statement from both the American and European Societies for Allergy saying that this test has no clinical significance.”

This is something to consider in any area of medicine—if a test is not covered by insurance, it may be the first sign that clinical significance could be lacking.

My conversation with Dr. Leung was brisk, informative, and confirmed my gut reaction that this test was too good to be true. Furthermore, there is a body of literature providing evidence that IgG-mediated reactions are a sign that a food intolerance does not exist, not the other way around. In a 2008 European Academy of Allergy and Clinical Immunology (EAACI) Task Force Report, the authors wrote, “food-specific IgG4 does not indicate (imminent) food allergy or intolerance, but rather a physiological response of the immune system after exposition to food components.” Simply put, IgG evidence can show that you’ve been eating that food, not that you are intolerant to it. The EAACI has been joined by its Canadian, American, and South African counterparts in clear statements that research does not support the use of IgG-mediated testing for food intolerances at this time.

Having shadowed Dr. Leung at the BFAC, I know that he takes patients’ claims of food intolerances seriously, and is invested in using the best clinical practices and scientific evidence available to make the diagnosis. Concerning IgG-mediated testing, he stated, “There is so little research, so from a clinical view it is not very useful, it doesn’t mean much. It is not diagnostic.” And yet, the Pinnertest website claims that “Pinnertest is a common procedure in most European countries. In many cases, dietitians and nutritionists will ask for their client’s Pinnertest results before creating any kind of diet plan.” Since this approach directly contradicts the current EAACI recommendation, that’s highly unlikely.

I also had the opportunity to speak with Rachel Wilkinson, MS, RD, LDN, and Practice Manager of the BFAC. Rachel explained, “If patients come in concerned about food intolerances, we can do the hydrogen breath test for lactose, fructose or fructan [found in some vegetables, fruits and grains]. These are the three main ones [food intolerances] we can test for, because we actually have tests for those.” She went on to state, “What was interesting to me about the Pinnertest, was how they say they can specify one specific food–so not just a category. I honestly don’t understand how they would pinpoint a specific food. It makes more sense to me to listen to patients’ histories and to look at how their intestines are able to break down that particular group of sugars. So, I really would love to know how they [Pinnertest] are coming up with this.”

It is important to note that the Pinnertest is not just marketing itself as a food intolerance test. It is also presenting itself as a weight loss tool. Jocelyn Brault, a current Frances Stern dietetic intern and master’s candidate interning at the BFAC, shared her concern: “I think this is also being marketed for weight loss, which you can see throughout their website. This is usually a good sign that we should dig deeper. Is this a proven weight loss method? This claim seemed out of nowhere to me.” Indeed, directly on the Pinnertest box it reads, “Discover which foods are making you sick or overweight.” If taken seriously, this test could result in needless dietary restrictions, and potential malnutrition if too many foods are eliminated. Rachel Wilkinson noted, “if you’re going to be avoiding certain types of foods, you need to make sure your diet is still adequate. We do not want to see people over-restricting foods for no reason.”

Over the course of my research and conversations with Dr. Leung, Rachel, and Jocelyn, I confirmed that my initial gut reaction was correct: too good to be true. And here’s the kicker: Pinnertest knows it, too. In a tiny disclaimer at the bottom of their website, they write: “Quantification of specific IgE antibodies to foods and inhalants is an FDA-accepted diagnostic procedure for the assessment of allergies. However, the assessment of human IgG antibodies specific for individual food and inhalant antigens is not an FDA-recognized diagnostic indicator of allergy.”

It is a noble task to design an allergy test that does not require you to doctor-hop or wait months for an appointment, but the scientific evidence needed to back up the Pinnertest is lacking. Perhaps one day this will shift, and the body of evidence will improve. In the meantime, however, anyone who thinks they might have a food intolerance (or food allergy) is best served by going to their clinician (and then a dietitian). This at-home kit promises a quick fix, but is really just an expensive, dangerous distraction.

Erin Child is a second-semester NICBC student in the dual MS-DPD program. She is fascinated by the science of food allergy and intolerances, and will probably keep writing about them until someone tells her to stop.  With two weeks left in the semester, she would really like a nap. Like right now.


5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not feed the cycle of self-loathing that “lose weight” resolutions tend to produce year after year.

Right alongside these posts, though, was an overwhelming amount of press celebrating the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 singles out foods that are perfectly nutritious for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. Most bodies are perfectly capable of handling these foods, which provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 also eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or on the website do they provide scientific studies showing that removing grains, beans, and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted the possibility that it is associated with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one would argue that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a widespread social-media support system, and there is plenty of research showing that social support is a major key to success in any lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters had better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, that invites entirely too much brainpower, which could be used in so many other unique and fulfilling ways, to be spent thinking, worrying, and obsessing about food. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be, depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best people can do for themselves and their families is to buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat so much more than you might have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of the Whole30’s success has come from word of mouth: stories and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in the creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating replicated in any society on earth, and it does not appear to be based on any research suggesting that it is a superior choice. At the end of the day, this is a business created by sports nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition, which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causes of inflammation and hormonal imbalance is quite an extreme claim to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice, reminds us that, “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts (like almonds) and some vegetables allowed on the diet. It is possible to reduce the amount of phytates in a food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting more in touch with food beyond the label and reducing added sugar and alcohol are good concepts that everyone should be encouraged to adopt. Focusing on cooking more from scratch, relying less on processed foods, and learning how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not require eliminating whole food groups like dairy, grains, and legumes. It should not have a time stamp on its end date; rather, it should be a lifelong journey focused on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible, and fulfilling part of life for people from all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram worthy oatmeal, or, if you’re in grad school, it’s part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J, and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. There is now an EpiPen® in our kitchen. Just as on some airlines, peanuts are banned from the general living areas of my house, and thus my beloved jar of peanut butter and I have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been told not to consume any peanut products on your flight. Peanut allergy rates among children in the United States have quadrupled, from less than 0.5% in the late 1990s to about 2% today, and peanut allergies are the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day, some peanut butter dripped from my sandwich onto the seat. I panicked, thinking, “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

Coupled with my new roommate’s arrival came the discovery that peanut allergies were back in the news. On January 5, 2017, the National Institute of Allergy and Infectious Diseases (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants, with severe eczema and/or an egg allergy, should be introduced to peanuts between 4 and 6 months. Medium-risk infants, with mild eczema, should be introduced to peanuts by 6 months, and low-risk infants, without eczema or other allergies, can be introduced to peanuts any time after they have been introduced to solid foods.

These guidelines fit with the dual-allergen exposure hypothesis, which suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because their immune system has had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens, which increases the chance of an allergic reaction once they eat the food. Current research supports this hypothesis, and also suggests that shortening the time between skin exposure and ingestion will reduce the number of acute allergic reactions: the sooner an infant starts eating an allergen, the more likely the body will adjust to it without having had time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests of hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, it appears that this advice was misguided. My roommate, n=1, was born in the mid-1990s, when delaying peanut exposure was coming into vogue. She had severe eczema as an infant and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated with the new research, and wishes there were some way to know if the outcome would have been different had she tried them at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack called Bamba as one of their first foods. In many developing countries, infants are likewise exposed to peanuts early on, both in their environment and in their food, and these countries also have much lower allergy rates.

The NIAID-funded Learning Early About Peanut Allergy (LEAP) study, published in 2015, was designed to determine whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups based on skin prick test results for peanuts, and each group was then randomized to either eat or avoid peanuts until 60 months (5 years) old. Among infants in the negative skin prick test group, 13.7% of those who avoided peanuts had developed an allergy, versus only 1.9% of those who ate peanuts (P<0.001). Among infants in the positive skin prick test group, 35.3% of those who avoided peanuts had developed an allergy, versus 10.6% of those who ate peanuts (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
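To put those percentages in perspective, here is a quick back-of-the-envelope sketch of the effect sizes they imply. It uses only the rates quoted above, not the full published dataset, so treat it as illustrative arithmetic rather than a reanalysis of the trial:

```python
# Rough effect-size arithmetic from the LEAP rates quoted above.
# Illustrative only: the group labels and percentages come from this
# article's summary, not from the full published dataset.

def risk_reduction(avoid_pct: float, eat_pct: float) -> tuple[float, float]:
    """Return (absolute reduction in percentage points, relative reduction in %)."""
    absolute = avoid_pct - eat_pct
    relative = absolute / avoid_pct * 100
    return absolute, relative

for group, avoid, eat in [("negative skin prick test", 13.7, 1.9),
                          ("positive skin prick test", 35.3, 10.6)]:
    arr, rrr = risk_reduction(avoid, eat)
    print(f"{group}: {arr:.1f}-point absolute reduction "
          f"(~{rrr:.0f}% relative reduction)")
```

In both groups, early introduction cut the risk of developing a peanut allergy by more than two-thirds relative to avoidance, which helps explain why these findings were persuasive enough to reverse the earlier guidance.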

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. Research on reducing peanut allergies in adults is still limited, making it unlikely that we will be eliminating existing allergies anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!

WIC at the Crossroads of the Opioid Epidemic

by Danièle Todorov

The complexity and pervasiveness of the opioid epidemic has forced government agencies to be innovative with their resources. The Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) is in a prime position to care for pregnant women affected by the epidemic and has stepped up to the plate.

In January of 2016, then-Secretary of Agriculture Tom Vilsack was appointed by President Obama to lead an interagency taskforce addressing the opioid epidemic in rural America. Secretary Vilsack, who has been outspoken about his own mother’s struggle with prescription drug addiction, knew that compassion and collaboration would be vital. His agency, the USDA, has unique resources and relationships in rural areas, putting it in a prime position to address the epidemic.

Addressing the epidemic is no simple task. According to the CDC, 91 Americans died daily from opioid overdose in 2015. Nearly half of these deaths involved a prescription opioid, used in the treatment of pain. In a town hall meeting in Missouri last July, Secretary Vilsack stated that due to “the devastating toll that opioid misuse has taken on our communities, and particularly rural areas, I have tasked USDA with creatively using all of the resources at our disposal to stem the tide of this epidemic” [1]. Interestingly, Secretary Vilsack highlighted WIC as a resource that could be creatively used. “For many women,” he stated, “WIC is their first point of entry into the healthcare system, and we have an opportunity to intercept and potentially prevent dangerous health outcomes for both the mother and the child” [1].

Pain management is an important part of pregnancy care. The prescription of opioids for pain in pregnancy is increasingly common; 1 in 5 Medicaid-enrolled women were prescribed an opioid at some point during their pregnancy in 2014 [2]. However, the effect of opioids on birth outcomes is understudied. In utero opioid exposure may be associated with preterm delivery and low birth weight [3]. Exposed neonates may develop withdrawal symptoms, a condition known as neonatal abstinence syndrome, which is associated with increased risk of seizures and breathing difficulties [3]. Similarly understudied are the rates of opioid abuse during pregnancy. We do know that pregnant women with substance abuse problems are particularly vulnerable to food and job insecurities and unstable housing, which exacerbate potential health complications [4].

The healthcare system often stigmatizes and underserves pregnant women with substance abuse problems. However, WIC is increasing its ability to engage them in care. WIC’s mission is to promote the health of low-income women and their children by providing nutritious food, health education, and referrals. Starting in 2014, WIC agencies have increased staff training surrounding substance abuse [1]. Staff are better equipped to notice potential substance abuse, to educate WIC participants about the dangers of substance abuse during pregnancy and breastfeeding, and to connect them with local resources. These expanded roles align with WIC’s mission, not only because they aim to protect the health of the women they serve, but because WIC “acknowledges that substance use is incompatible with good nutrition” [5].

WIC is forming relationships with women at a promising point in time in their lives. In their staff training guide, WIC cites a study showing that women are “more motivated to improve their lifestyle and health habits during periods when they make the transition from one life situation or role to another… WIC participants are a natural target audience for substance use information because they are, by definition, in the life transition stage of pregnancy and new motherhood” [5].

WIC is playing an important part in the collaborative response to the epidemic. As the head of the USDA, Secretary Vilsack understood that a holistic response was the only effective solution and embraced President Obama’s mandate. “This disease isn’t a personal choice,” says Secretary Vilsack, “and it can’t be cured by willpower alone. It requires responses from whole communities, access to medical treatment, and an incredible amount of support. To me, our mandate is clear: don’t judge, just help” [6]. Secretary Vilsack’s endorsement of his replacement as Secretary of Agriculture, nominee Sonny Perdue, gives hope that the USDA will continue this vital endeavor.

Sources

  1. Agriculture Secretary Vilsack Announces Substance Misuse Prevention Resources for Low Income Pregnant Women and Mothers in Order to Battle the Opioid Epidemic. USDA Office of Communications, 2016.
  2. Desai, R.J., et al. Increase in prescription opioid use during pregnancy among Medicaid-enrolled women. Obstetrics and Gynecology, 2014. 123(5): p. 997.
  3. Patrick, S.W., et al. Prescription opioid epidemic and infant outcomes. Pediatrics, 2015. 135(5): p. 842-850.
  4. Sutter, M.B., S. Gopman, and L. Leeman. Patient-centered Care to Address Barriers for Pregnant Women with Opioid Dependence. Obstetrics and Gynecology Clinics of North America, 2017. 44(1): p. 95-107.
  5. Substance Use Prevention: Screening, Education, and Referral Resource Guide for Local WIC Agencies. U.S. Department of Agriculture, Food and Nutrition Service, 2013.
  6. USDA. Addressing the Heroin and Prescription Opioid Epidemic. 2016 [accessed 02/17/17].

Danièle Todorov is a first-year nutritional epidemiology student with a focus on pregnancy nutrition and birth outcomes.

 

What’s the Deal with Vitamin D?

by Katelyn Castro

There is always one nutrient that seems to linger in the media for a while. Lately, vitamin D has been the lucky winner! Considering that over 40% of Americans are vitamin D deficient, according to the National Health and Nutrition Examination Survey (NHANES), it’s worth taking a closer look at vitamin D.

Depression, cancer, heart disease, and type 1 diabetes are some of the many health conditions that have been linked to vitamin D deficiency. While it is too soon to point to vitamin D as a cure-all, this vitamin may be more important for our health than previously thought—especially during the winter months in New England!

Why is Vitamin D Important?

Vitamin D is most often known for its role in bone health, increasing calcium absorption and helping with bone mineralization alongside calcium and phosphorus. Historically, rickets in children and osteoporosis and bone fractures in adults have been the most common signs of vitamin D deficiency.

As a fat-soluble vitamin and a hormone, vitamin D is also involved in many other important metabolic processes. Did you know vitamin D activates over one thousand genes in the human genome? For example, vitamin D is needed for protein transcription within skeletal muscle, which may explain why vitamin D deficiency is associated with poor athletic performance. Vitamin D also regulates blood pressure by suppressing renin gene expression, supporting the possible relationship between vitamin D deficiency and risk of heart disease. Additionally, vitamin D status may alter immunity due to its role in cytokine production. Studies have found that vitamin D deficiency is associated with upper respiratory tract infections. While more research is needed to explore these connections, these findings continue to suggest that vitamin D plays an integral role in bone, muscle, cardiac, and immune health.

Where Do You Get Vitamin D?

Only a few foods are natural sources of vitamin D, including eggs and fatty fish like salmon, mackerel, tuna, and sardines. Instead, vitamin D-fortified foods like dairy products, juices, and breakfast cereals make up the majority of Americans’ vitamin D intake.

Sun exposure, on the other hand, can be the greatest source of vitamin D for some people–hence vitamin D’s nickname, the “sunshine vitamin.” Unlike any other vitamin, vitamin D can be synthesized in the body: when the sun’s ultraviolet B rays reach the skin, they convert a cholesterol precursor into vitamin D3. Vitamin D3 then diffuses through the skin into the blood and is transported to the liver and kidneys, where it is converted first into 25(OH)D, the circulating form measured in blood tests, and then into the vitamin’s active form, 1,25(OH)2D.

Research has found that exposing arms, legs, and face to the sun for 15 to 30 minutes twice a week provides about 1000 international units of vitamin D (equal to about 10 cups of milk!). Despite this robust source of vitamin D, deficiency is surprisingly common in the U.S.

Who is at Risk of Vitamin D Deficiency?

Many circumstances can alter vitamin D synthesis and absorption, increasing risk of vitamin D deficiency. Some of the factors that have been found to impact vitamin D status include the following:

  • Season: According to research, during the months of November to February, people living at more than 37 degrees latitude north or south produce little or no vitamin D from the sun because of the angle of ultraviolet B sunrays. While vitamin D is stored in fat tissue and can be released into the blood when needed, our stores typically last only one to two months.
  • Limited Sun Exposure: Vitamin D synthesis can also be blocked when sunscreen is applied correctly or when long robes or head coverings are worn for religious reasons. For example, sunscreen with a sun protection factor (SPF) of 8 decreased vitamin D synthesis in skin by about 95% in one study.
  • Skin Color: People with darker skin pigmentation have been found to have lower levels of vitamin D due to decreased synthesis. This is supported by the high prevalence of vitamin D deficiency among certain ethnic groups: according to NHANES, 82% of African Americans and 69% of Hispanics are vitamin D deficient.
  • Weight: Studies also suggest that overweight and obese people may have higher vitamin D requirements. Because vitamin D is fat-soluble, it is distributed throughout their larger volume of fat tissue, making it less bioavailable. As a result, more vitamin D may be needed for enough to reach the bloodstream for distribution throughout the body.
  • Age: Older adults have been found to have lower levels of vitamin D, likely due to both decreased sun exposure and less efficient synthesis. One study found that 70-year-olds had about 25% as much of the vitamin D precursor as young adults, which decreased vitamin D synthesis in the skin by 75%.
  • Fat Malabsorption: When a gastrointestinal disorder or other health condition impairs fat absorption (e.g., liver disease, cystic fibrosis, celiac disease, or Crohn’s disease), vitamin D, being fat-soluble, is also poorly absorbed and utilized.

Vitamin D deficiency can be especially concerning because symptoms like bone pain and muscle weakness may go undetected in the early stages of deficiency. Although physicians do not routinely check vitamin D levels, those at risk of deficiency may benefit from a serum 25(OH)D test. This simple test measures the level of vitamin D circulating in the blood, with levels less than 20 nanograms per milliliter commonly used to diagnose deficiency. However, some organizations, like the Endocrine Society, argue that levels greater than 30 nanograms per milliliter should be the target for optimal bone and muscle metabolism.

How Much Vitamin D Do You Need?

As with vitamin D serum levels, no ideal vitamin D intake has been firmly established, since many factors contribute to vitamin D status. The U.S. Institute of Medicine recommends 600 to 800 international units (IU) of vitamin D daily for adults, assuming minimal sun exposure. The National Osteoporosis Foundation, on the other hand, recommends larger doses of 1000 to 1200 IU daily for adults to support adequate bone health. Although vitamin D toxicity is rare, an upper level of 4000 IU has been set by the Institute of Medicine, since extremely high levels can lead to calcium buildup and cause poor appetite, nausea, vomiting, weakness, and kidney problems.

With the limited amounts of vitamin D provided by foods, even fortified ones, diet alone is usually inadequate to meet vitamin D needs. For example, you would need to drink about 8 cups of milk every day to reach 800 IU of vitamin D from diet alone! While sun exposure can supplement food intake to meet vitamin D needs, many Americans still fall short due to the factors outlined above.
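The arithmetic behind that milk example is simple, assuming the typical U.S. fortification level of roughly 100 IU of vitamin D per cup of milk (the same assumption behind the “10 cups ≈ 1000 IU” comparison earlier; actual fortification varies somewhat by product):

```latex
\frac{800~\text{IU/day}}{\approx 100~\text{IU per cup of fortified milk}} = 8~\text{cups of milk per day}
```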

For the 40% of Americans who have been found to be vitamin D deficient, vitamin D supplementation can be an effective and safe way to meet needs. Whether you’re an avid sunscreen-user or living here in New England during these fall and winter months, a daily vitamin D supplement can ensure that vitamin D stores are adequate. Multivitamins typically provide 400 IU of vitamin D, but a separate vitamin D supplement (D2 or D3) with 800 or 1000 IU may be needed to meet daily intake recommendations.

Katelyn Castro is a second-year student in the Dietetic Internship/MS Nutrition Program at the Friedman School. During the summer, she enjoys soaking up the sun if only for an excuse to get her daily dose of Vitamin D. During the winter, you can find her trekking through the snow, bundled up like the boy in A Christmas Story, and contemplating whether she needs a D supplement.

 

 

What is Intermittent Fasting, and Does It Really Work?

By Hannah Meier

You may have heard of caloric restriction and the myriad benefits it supposedly brings to the metabolic table. New research suggests that intermittent fasting could be a safe way for people to improve their health, but before you adopt this eating pattern, read up on six common mistakes to avoid.

The newest diet to gain popular attention isn’t much of a diet at all. It is something that most people who adhere to a traditional sleeping and waking cycle are already primed to do—and, proponents would argue, is something humans have been doing successfully for centuries. Intermittent Fasting (IF) has garnered support in the fitness community as a weight management tool for bodybuilders and other fitness enthusiasts. Recently, a growing portion of the scientific community has begun to also regard IF as a feasible way to improve metabolic health and perhaps even extend one’s lifespan.

Instead of eating many times throughout the day (between 6:00 a.m. and 10:00 p.m., for example), intermittent fasters couple periods of extended fasting (from 14 to 24 hours) with shorter periods of eating. This can be achieved by a change as simple as lengthening the overnight fast by a few hours each day. Some variations of IF propose reducing intake to 500-600 calories on just two days of the week; others recommend one full, 24-hour weekly fast. There are no particular restrictions on the types of foods allowed, as long as meals are kept within the “eating window” and consumption does not surpass the feeling of comfortable fullness.
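As a concrete sketch of the arithmetic behind an eating window (the clock times below are hypothetical examples, not part of any official IF protocol):

```python
# Illustrative sketch: the daily fast implied by an eating window.
# The times below are hypothetical examples, not an official protocol.

def daily_fast_hours(eating_start: int, eating_end: int) -> int:
    """Hours fasted per day, given an eating window on a 24-hour clock."""
    return 24 - (eating_end - eating_start)

# Eating only between noon and 8 p.m. yields a 16-hour daily fast,
# inside the 14-to-24-hour range described above.
print(daily_fast_hours(12, 20))   # -> 16

# Eating from 6 a.m. to 10 p.m. (the "many times throughout the day"
# pattern) leaves only an 8-hour overnight fast.
print(daily_fast_hours(6, 22))    # -> 8
```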

Experimental studies in rats have suggested that providing the body with an extended fast (up to 24 hours) is physiologically beneficial, potentially improving insulin sensitivity, decreasing resting heart rate and blood pressure, and reducing body-wide inflammation—all of which could contribute to a longer expected lifespan. Further, adapting to a shorter eating window may help to moderate overall calorie intake. Randomized controlled trials demonstrating benefits in humans have yet to be published. Because humans share an evolutionary adaptation to generations of unpredictable periods of fasting and feasting, however, scientists are eager to tease out this connection in future studies.

Still, many nutrition professionals are hesitant to advocate IF as superior to other diets or as a safe and effective approach to weight loss. At the end of the day, reducing calories consumed and increasing energy expended through physical activity is what matters for losing weight, and there are many ways to achieve this goal that do not require adopting a rigid eating schedule. It is important to consider your lifestyle, motivation, and sacrifices you are willing (or not willing) to make in order to reap the potential benefits of intermittent fasting. Like any diet, adherence is key to success. Here are six common mistakes to avoid if you think intermittent fasting sounds like something you want to try.

Six Mistakes Most People Make When They Begin Intermittent Fasting

  1. Giving up too soon

It is normal to feel more irritable or sluggish as the body adapts to a longer fasting period and adjusts its hormonal signaling (most scientists believe this adaptation underlies many of the health benefits of IF). Intermittent Fasters will likely find that true hunger feels different than the hunger pangs and uptick in heartbeat associated with fluctuating blood sugar, which we experience when we are used to frequent eating—learn to recognize it.

  2. Forgetting about quality
The "Basic Seven" Developed by the USDA in 1943

The “Basic Seven” Developed by the USDA in the 1940’s

Even though IF does not restrict the type of foods allowed to be consumed during the eating period, it’s essential to maintain proper nutrition. Metabolic improvements like insulin sensitivity and reduced inflammation could very well be negated if fasters neglect nutritional balance and decide to eat foods high in salt, saturated fat, and refined carbohydrates exclusively, avoiding fruits, vegetables, whole grains and lean proteins. You may still lose weight if you’re consuming fewer calories overall, but the efficiency of your body systems will suffer—and you probably won’t feel too well, either.

 

 

  3. Forgetting to hydrate

Hydration is key, especially during periods of fasting. Adequate hydration is necessary for pretty much every function in the body and will keep you feeling energized and alert. During fasting periods, water, tea, coffee, and other no- or low-calorie beverages are allowed (just watch out for added cream and sugar). Keep tabs on the color of your urine as a gauge of hydration status: if it is darker than the light yellow of hay, you need to drink more fluid.

  4. Exercising too much

Some athletes swear by intermittent fasting as a means to improve performance, burn more fat, and even increase endurance. However, none of these benefits have been consistently backed up by controlled human studies. In fact, many observational studies of Muslim athletes during Ramadan show evidence of decreased performance (though unlike Ramadan observers, athletes practicing IF can often schedule training within their eating window rather than in a fasted state, so these experimental differences could be important in interpreting results). Moderate and consistent exercise is encouraged for general health, but excessive exercise on top of prolonged fasting may send the body into a state of chronic stress, which can lead to inflammation, lean tissue breakdown, insulin resistance, and injury.

  5. Not working with your schedule

There are different variations of IF and the only thing that makes one program more effective than the next is whether or not you can stick to it. For example, don’t decide to fast for 24 hours if you know missing your nightly family dinner will cause mental and social strain. There are many methods for reducing calorie intake for weight loss, and intermittent fasting may not be right for you if it leads to feelings of isolation and reduced quality of life.

  6. Believing that if some is good, more is better

Just because a little bit of fasting may be healthy does not mean that a lot of fasting is healthy. Going too long without food can lead the body into a state known as “starvation mode,” which greatly slows the metabolic rate, begins breaking down muscle for energy, and stores a greater majority of consumed calories as fat. Further, fasting for too long can lead to severe feelings of deprivation and preoccupation with food, culminating in uncontrollable or disordered eating behavior including binging and even anorexia. If you sense your relationship with food is becoming abnormal because of IF, make necessary adjustments and seek help if needed. 

Because IF can represent a major shift in metabolism and routine, most nutrition professionals are hesitant to recommend it as an intervention for just anyone. It is important to work with a licensed professional who understands your needs and who can help you maintain optimal nutrition, physical activity, and mental health during periods of prolonged fasting. Preliminary studies show that IF, when done right, may be a great tool for improving health, but it is not the only option to boost endurance and lose weight.

Hannah Meier is a first-year, second-semester NUTCOM student, registered dietitian and aspiring spokesperson for honest, scientifically driven and individualized nutrition. 

Yoga: Treatment for Type II Diabetes?

by Connie Ray

Scrolling through your Facebook and Instagram feed, you might think yoga is just another avenue for the uber fit and flexible to show off their hot bods at the beach. But research has consistently shown yoga helps improve mood, reduce stress, and increase strength and flexibility.

[Image: Downward-Facing Dog pose. By Iveto (own work), via Wikimedia Commons]

There is also a growing body of evidence that yogic interventions can help treat and prevent chronic diseases. Published in 2016 in the Journal of Diabetes Research, Innes and Selfe’s systematic review concluded that yoga may be beneficial for managing Type II Diabetes.

Type II Diabetes (DM2) is characterized by high blood glucose levels (hyperglycemia) and insulin resistance. It affects approximately 366 million people worldwide, a number which has doubled over the last 30 years. By 2030, this number is projected to reach 552 million.

In the United States, the CDC estimates that 29.1 million people suffer from DM2. In 2010, diabetes was the seventh leading cause of death in the US, and the number of cases is believed to be underreported. DM2 is the country’s single most costly chronic disease, accounting for at least 10% of all US healthcare costs ($245 billion annually).

Unlike Type I Diabetes, which is inherited, DM2 is a preventable disease. While certain factors like race, age, and genetic predisposition play a role in its prevalence, it is believed that over 90% of DM2 cases are attributable to lifestyle factors.

The primary lifestyle factors that increase the risk of DM2 are physical inactivity, overnutrition, and obesity, with sleep impairment, chronic stress, and smoking as secondary contributing factors.

Yoga may be an ideal intervention for treating a multifactorial condition like DM2. Yoga is a mind-body approach to exercise, with roots in India reaching back over 4,000 years. The traditional practice of yoga incorporates eight “pillars,” or aspects of teaching, and the physical poses you see on Instagram, the “asanas,” are only one of the eight. As yoga has become more prevalent in Western culture, it has branched into many different styles, including Bikram (hot yoga), Ashtanga (power yoga), Vinyasa (flow yoga), Iyengar (basic hatha yoga), Kundalini (awareness yoga), and restorative yoga. Most yoga styles incorporate physical poses (asanas), along with breath practice (pranayama), focus (dharana), and meditation (dhyana).

Authors Innes and Selfe believe yoga’s mind-body approach suits it to a multifactorial lifestyle disease like DM2, and the results of their systematic review support that hypothesis. They analyzed 33 papers from 25 original controlled trials investigating the impact of yogic interventions on adults with DM2, and the results overwhelmingly demonstrated the benefits of a regular yoga practice across several health-related outcomes. Trials varied in frequency of practice (from 2-3 times a week to daily) and in duration (from 6 weeks to over a year). In all cases, the trials compared results of the yogic intervention to “standard” DM2 care and, in some cases, to a second control group following a standard exercise program.

Insulin Resistance: 22/24 relevant trials, or 92%, reported statistically and clinically significant improvement in at least one biomarker for insulin resistance (PPBG, insulin, and HbA1c).

Lipid Profiles: 15/16 relevant trials, or 94%, reported significant improvement in one or more lipid indices, including reduction in total cholesterol, LDL cholesterol, VLDL cholesterol, or triglycerides, and/or increases in HDL (“the good cholesterol”).

Body Weight & Composition: 8/9 relevant trials found significant improvement in at least one measure of body weight/composition, including weight, BMI, and waist-hip ratio, even in trials that compared a yoga intervention to standard care with regular exercise.

Blood Pressure: 3/5 relevant trials reported significant drops in both systolic and diastolic blood pressure compared to standard care and standard care with walking.

Oxidative Stress: 5/5 relevant trials indicated positive change in oxidative stress with the yoga intervention groups as measured by increased glutathione/Vitamin C serum levels, increases in superoxide dismutase levels, and reductions in malondialdehyde.

Mood & Sleep Impairment: 3/4 relevant trials reported significant improvements in quality of life, psychological well-being, symptoms of distress, and insomnia. One notable study of 41 adults practicing yoga nidra (yogic sleep) daily saw a decrease in insomnia prevalence from 43% to 5%.

Nervous System Function: 3/3 relevant trials reported improvement in cardiac autonomic function and reductions in heart and respiratory rate, all suggesting that yoga shifts the body’s nervous system from sympathetic to parasympathetic.

Pulmonary Function: 2/2 relevant trials reported improved pulmonary function (increased expiratory volume, forced vital capacity, peak flow rate, and maximum voluntary ventilation).

Medication Use: 3/3 relevant trials reported significant reductions in diabetes medication use in yoga intervention groups compared to standard care and comprehensive exercise programs. In one trial of 154 adults with DM2, there were 26-40% reductions in medication use at a 3-month follow-up.
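For readers who want to sanity-check the summary figures above, the percentages follow directly from the trial counts (a trivial sketch; the counts are those quoted in this article, not a reanalysis of the review):

```python
# Recompute the "X/Y relevant trials, or Z%" figures quoted above.
outcomes = {
    "insulin resistance":      (22, 24),
    "lipid profiles":          (15, 16),
    "body weight/composition": (8, 9),
    "blood pressure":          (3, 5),
    "oxidative stress":        (5, 5),
    "medication use":          (3, 3),
}

for outcome, (favorable, total) in outcomes.items():
    print(f"{outcome}: {favorable}/{total} = {favorable / total:.0%}")
```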

“Overall, findings of these studies suggest that yoga-based practices may have significant beneficial effects on multiple factors important in DM2 management and prevention, including glycemic control, insulin resistance, lipid profiles, body composition, and blood pressure,” Innes and Selfe concluded.

Naturally, any systematic review has its limitations, and this one is no exception. Innes and Selfe found that several of the studies suffered from methodological problems or poor reporting, and the heterogeneity of study design, duration, and subject make it difficult to compare results across trials.

Yet even with these limitations, the takeaway is clear: yoga has real benefits for treating and managing DM2. As awareness of those benefits spreads, one would hope that more health practitioners will incorporate a yoga recommendation into their standard diabetes treatment plans.

Connie Ray is a first-year MNSP student at the Friedman School. She currently lives in Virginia, where she raises her two sons and teaches yoga.