Evaluating the Pinnertest: The Importance of Scientific Evidence

by Erin Child

So, you think you have a food intolerance? What do you do? You could call your doctor and set up an appointment that is inevitably months away, sit through a 10-minute visit in which they only look at their computer and refer you to a specialist, THEN go through more testing, and finally (hopefully!) get some answers. Or, you could order an at-home kit that takes 10 minutes to complete and promises results that will get you feeling better, sooner. Which one do you choose? Read on and decide.

In our current world of food intolerances and hypersensitivities, in which the best path to treatment is often a conundrum, the Pinnertest promises an easy solution to any dietary woes. A few months ago, I started noticing ads for this new test popping up on social media. The Pinnertest is an over-the-counter food intolerance testing kit that uses microarray technology to test for IgG (immunoglobulin G)-mediated sensitivities to 200 common foods.

The classic manifestations of true food allergies (hives, oral discomfort, trouble breathing, anaphylaxis, etc.) are mediated by overproduction of IgE antibodies. Like IgE, IgG is a type of antibody; in fact, it is the most common antibody in the human body. (The immune system produces five types of antibodies: IgA, IgE, IgG, IgD, and IgM.) Instead of testing for IgE-mediated allergies, the Pinnertest producers claim that their microarray technology allows them to test for IgG-mediated intolerances to 200 different foods—including lettuce, quail, and baking powder—using only a few drops of blood. It sounded scientific, but it also seemed too good to be true. Was it?

I started my research by reaching out to the Pinnertest folks directly. My goal? To score a pro-bono test to try it out myself and see the results first hand. I was thrilled when a friendly representative at Pinner immediately reached out to set up a phone interview (calendar evite and everything). When the day came, I called—and was sent to voicemail. Twenty minutes and five tries later, I knew I had been ghosted. My subsequent emails were ignored, and my quest to learn first-hand about the scientific evidence backing their product was squashed.

So, I began researching on my own. The Pinnertest website sports a cluttered page of medical study citations covering food allergies, intolerances, and celiac disease—none of which provides any evidence for using IgG testing to diagnose food intolerances. My own PubMed search [IgG + food intolerance; Immunoglobulin G + food intolerance] yielded little, but did turn up one recently retracted 2016 article linking IgG testing to food allergies. The rest of the Pinnertest website leads you down a rabbit hole of B-list celebrity endorsements and every Friedman student’s favorite—Dr. Oz videos! Interestingly, nowhere on the site does it tell you the cost of the test. To find out pricing, you must first enter your information (“Hi, my name is Newt Trition”) before you discover that the test will run you a whopping $490.

To further explore whether this test has any scientific merit, and whether it is worth the hefty price tag, I reached out to the Boston Food Allergy Center (BFAC). Dr. John Leung, founder and CEO of the BFAC, Director of the Food Allergy Center at Tufts Medical Center, and Co-Director of the Food Allergy Center at Floating Hospital for Children, took some time out of his day to answer my questions. Dr. Leung said, “We have patients coming into our office on a weekly basis with that kind of report [IgG], who pay out of pocket no matter what insurance they have. [Insurance doesn’t cover the test] because there is a statement from both the American and European Societies for Allergy saying that this test has no clinical significance.”

This is something to consider in any area of medicine—if a test is not covered by insurance, it may be the first sign that clinical significance could be lacking.

My conversation with Dr. Leung was brisk, informative, and confirmed my gut reaction that this test was too good to be true. Furthermore, there is a body of literature providing evidence that an IgG response signals exposure to a food, not intolerance of it. In a 2008 European Academy of Allergy and Clinical Immunology (EAACI) Task Force report, the authors wrote, “food-specific IgG4 does not indicate (imminent) food allergy or intolerance, but rather a physiological response of the immune system after exposition to food components.” Simply put, IgG evidence can show that you’ve been eating that food, not that you are intolerant to it. The EAACI has been joined by its Canadian, American, and South African counterparts in clear statements that research does not support the use of IgG-mediated testing for food intolerances at this time.

Having shadowed Dr. Leung at the BFAC, I know that he takes patients’ claims of food intolerances seriously, and is invested in using the best clinical practices and scientific evidence available to make the diagnosis. Concerning IgG-mediated testing, he stated, “There is so little research, so from a clinical view it is not very useful, it doesn’t mean much. It is not diagnostic.” And yet, the Pinnertest website claims that the “Pinnertest is a common procedure in most European countries. In many cases, dietitians and nutritionists will ask for their client’s Pinnertest results before creating any kind of diet plan.” Since this approach directly contradicts the current EAACI recommendation, that claim is highly unlikely.

I also had the opportunity to speak with Rachel Wilkinson, MS, RD, LDN, and Practice Manager of the BFAC. Rachel explained, “If patients come in concerned about food intolerances, we can do the hydrogen breath test for lactose, fructose or fructan [found in some vegetables, fruits and grains]. These are the three main ones [food intolerances] we can test for, because we actually have tests for those.” She went on to state, “What was interesting to me about the Pinnertest, was how they say they can specify one specific food–so not just a category. I honestly don’t understand how they would pinpoint a specific food. It makes more sense to me to listen to patients’ histories and to look at how their intestines are able to break down that particular group of sugars. So, I really would love to know how they [Pinnertest] are coming up with this.”

It is important to note that the Pinnertest is not just marketing itself as a food intolerance test. It is also presenting itself as a weight loss tool. Current Frances Stern dietetic intern and master’s candidate Jocelyn Brault, interning at the BFAC, indicated her concern: “I think this is also being marketed for weight loss, which you can see throughout their website. This is usually a good sign that we should dig deeper. Is this a proven weight loss method? This claim seemed out of nowhere to me.” Indeed, directly on the Pinnertest box it reads, “Discover which foods are making you sick or overweight.” If taken seriously, this test can lead to unnecessary diet restrictions, and to potential malnutrition if too many foods are eliminated. Rachel Wilkinson, RD, noted, “if you’re going to be avoiding certain types of foods, you need to make sure your diet is still adequate. We do not want to see people over-restricting foods for no reason.”

Over the course of my research and conversations with Dr. Leung, Rachel, and Jocelyn, I confirmed that my initial gut reaction was correct: too good to be true. And here’s the kicker: Pinnertest seems to know it, too. In a tiny disclaimer at the bottom of their website, they write: “Quantification of specific IgE antibodies to foods and inhalants is an FDA-accepted diagnostic procedure for the assessment of allergies. However, the assessment of human IgG antibodies specific for individual food and inhalant antigens is not an FDA-recognized diagnostic indicator of allergy.”

It is a noble task to try to design an allergy test that does not require you to doctor-hop or wait months for an appointment, but the scientific evidence needed to back up the Pinnertest is lacking. Perhaps one day this will shift, and the body of evidence will improve. In the meantime, however, anyone who thinks they might have a food intolerance (or food allergy) is best served by going to their clinician (and then a dietitian). This at-home kit promises a quick fix, but is really just an expensive, dangerous distraction.

Erin Child is a second-semester NICBC student in the dual MS-DPD program. She is fascinated by the science of food allergy and intolerances, and will probably keep writing about them until someone tells her to stop.  With two weeks left in the semester, she would really like a nap. Like right now.


Nutrition in a Nutshell: Lessons Learned as a Dietetic Intern

by Katelyn Castro

I was one of those few teenagers who knew exactly what I wanted to be when I grew up. Now, after four years of college and two years of graduate school combined with a dietetic internship, a career as a registered dietitian is not far out of reach. While my passion for nutrition has never dwindled over these last six years, my approach to nutrition has changed significantly.

Nutrition tips on the sidebar of Self magazine, an oversimplified nutrition lesson in a middle school health class, and a quick nutrition lecture from my pediatrician summed up my understanding of nutrition before entering college. Now—six years of coursework and 2,000+ hours of dietetic rotations later—I not only know the nitty-gritty details of nutrition science, but I have also learned some larger truths about nutrition that are not always talked about.

Beyond what you may read as you thumb through your social media feed, or even what you may learn from an introductory nutrition textbook, here are some of the lessons that I have acquired about nutrition along the way:

1- Nutrition is an evolving science.

First, let’s be clear that nutrition is a science that relies on concepts from biology, chemistry, anatomy, physiology, and epidemiology to study how nutrients impact health and disease outcomes. Understanding how diabetes alters carbohydrate metabolism allows people with diabetes to live without fear of dying from diabetic ketoacidosis or seizures due to unsafe blood glucose levels. Understanding how ulcerative colitis impacts mineral absorption and increases protein losses helps those with the condition manage nutrient deficiencies with adequate nutrition supplementation. These are only a few examples of the many ways our knowledge of nutrition science makes it possible to improve individuals’ health outcomes.

However, the more I learn about nutrition, the more I realize that the research still holds many unanswered questions. For example, previous nutrition guidelines, like those on when to introduce allergenic foods to children, are being questioned and revised in light of more recent studies. Meanwhile, research on the gut microbiota is just beginning to uncover how diet interacts with the gut microbiota through hormonal and neural signaling. Staying up-to-date on the latest research and analyzing study results with a critical eye has been crucial as new scientific discoveries challenge our understanding of nutrition and physiology.

Who would have thought a career in nutrition would require so much detective work?

2- Food is medicine, but it can’t cure everything.

The fact that half of the leading causes of death in the U.S. can be influenced by diet and physical activity highlights the importance of nutrition for long-term health. Using medical nutrition therapy for patients with a variety of health problems, ranging from cancer and cardiovascular disease to cystic fibrosis and end-stage renal disease, has also allowed me to see nutrition powerfully impact the management and treatment of many health conditions. High cholesterol? Avoid trans fat and limit saturated fat in foods. Type 2 diabetes? Adjust the timing and type of carbohydrates eaten.

While making simple changes to eating habits can improve lab values and overall health, nutrition is often only one component of treatment, accompanied by medication, surgery, therapy, sleep, and/or stress management. Interacting with patients of all ages and health conditions, and working with health professionals from a range of disciplines, has forced me to step out of my nutrition bubble and take a more comprehensive approach to patient care: improving quality of life and overall health and wellbeing is always going to be more important than striving for a perfect nutrition plan.

3- Nutrition is political and nutrition messages can be misleading.

Back when the Academy of Nutrition and Dietetics was one of many health organizations sponsored by Coca-Cola and PepsiCo, I realized how much influence large food companies have on food advertising, marketing, and lobbying. Given the known health consequences of drinking too many sugary beverages, the concept of health organizations being sponsored by soda companies was perplexing to me. Learning more about the black-box process of developing the government dietary guidelines has also made me more cognizant of government conflicts of interest with industry that can color the way nutrition recommendations are presented to the public.

Industry-funded nutrition research raises another issue with nutrition messaging. For example, a study recently revealed that research funded by the sugar industry 50 years ago downplayed the risks of sugar, influencing the debate over sugar’s relative risks in the years that followed. Unfortunately, industry-sponsored nutrition research continues to bias study results, highlighting positive outcomes, leaving out negative ones, or simply using poor study designs. While sponsorships from big companies can provide a generous source of funding for research, as both a nutrition professional and a consumer, I’ve learned to take a closer look at the motives and potential bias behind any industry-funded nutrition information.

4- Nutrition is not as glamorous as it sounds, but it’s always exciting.

When the media is flooded with nutrition tips for healthy skin, food for a healthy gut, or nutrients to boost mood, the topic of nutrition can seem light and fluffy. With new diets and “superfoods” taking the spotlight in health magazines and websites, it’s easy to think of nutrition as nothing more than a trend.

However, any nutrition student or dietitian will tell you otherwise. In the words of one of my preceptors, “my job [as a dietitian nutritionist] is not as glamorous and sexy as it sounds.” Throughout my dietetic rotations, my conversations with patients and clients have gone into much more depth than aesthetics and trendy nutrition topics. If I’m working with a patient with Irritable Bowel Syndrome, bowel movements (a.k.a. poop) may dominate the conversation. If I’m counseling someone who has been yo-yo dieting, I may be crushing their expectations of fad diets while encouraging more realistic, sustainable health goals. If I’m speaking with a group of teenagers with eating disorders, I may not talk about nutrition at all and focus instead on challenging unhealthy thoughts and behaviors around food. It is these conversations, discussing what really matters when it comes to food, nutrition, and overall health, that make a career in nutrition ever-changing and always exciting.

Katelyn Castro is a second-year student graduating this May from the DI/MS Nutrition program at the Friedman School. She hopes to take advantage of her experiences at Tufts to make a positive impact on individuals’ health and wellbeing through community nutrition outreach. You can follow along on her journey as she blogs on all things relating to food and nutrition at nutritionservedsimply.com.


Finding Common Ground for Nutrition in a World of Alternative Facts

by Rachel Baer

Rachel Baer tackles the implications of the “post-truth” culture for the nutrition profession and poses 3 questions to consider about our response to the unending barrage of nutrition-related “alternative facts.”

As a registered dietitian, I can tell you this: Nutrition professionals know a thing or two about alternative facts. We spend our careers with textbooks and scientific journals in hand, waiting for the next misinformed food fad to go viral. We fight to defend the facts because we have always believed that if we could show people what is true, we could convince them that we have the best answers for their nutrition-related questions. But the concept of truth is losing popularity.

Oxford Dictionaries declared “post-truth” the 2016 word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” Let that sink in for a moment: emotional appeals are more influential than objective facts. While this concept is alarming on many levels, I am particularly concerned about its implications for health professionals who rely on scientific truths as the basis of their credibility.

Don’t get me wrong. I understand the frustration people feel as they watch seemingly contradictory nutrition headlines emerge at the very hint of new research findings. One day people are told to limit egg consumption to three yolks per week; the next, the one-yolk-per-day allowance is back. However, as nutrition professionals, we have a certain appreciation for the fact that science is ever-evolving. We hold our recommendations lightly because we believe in a scientific community that is always growing, and that new discoveries only sharpen our understanding of nutrition and physiology. The public, on the other hand, does not always share this appreciation.

Confusion over wavering nutrition claims is exacerbated by the inundation of un-credentialed, unschooled voices clamoring for attention in popular media. Social media has provided a proverbial soapbox for anyone with a passionate message to share, regardless of qualifications. Meanwhile, dietitians tend to hold back from making bold retorts, often waiting for consensus to catch up with the fads so that our recommendations are supported by the latest research. This imbalance of voices, alongside the emergence of post-truth culture, only perpetuates the proliferation of unfounded claims, or “alternative facts,” as they have become popularly known.

I have no easy answers for this predicament, but here are 3 questions that we could benefit from exploring as nutrition professionals:

1. How do we remain experts while also being compelling?

Dietitians have long been referred to as the “food police.” While I resent this reputation, it highlights a worthy question: Do nutrition professionals present information in a way that is relatable, realistic, and winsome to the people whose respect we want to gain?

We can no longer depend solely on the letters after our names to gain an audience with the public, particularly when we are pitted against wayward blog and media influencers using sensationalized language to win over vast groups of people who blindly follow their passionate advice. The internet is full of examples of people preferring the advice of a persuasive friend or influencer over that of a knowledgeable professional. While this situation is endlessly frustrating to those of us who see through the hyperbolic messages, is there anything we can learn from these blog and media personalities that may help us reach the audience they seem to have hooked? How do we successfully build rapport with the public while maintaining good science?

2. How do we talk about fundamentals in a world that wants controversy?

Let’s face it. Fundamentals don’t make great headlines. For decades, consensus research has shown that a diet full of minimally processed fruits, vegetables, whole grains, nuts/seeds, lean proteins, and healthy fats is unequivocally the best diet for human health. Yet people still search elsewhere for the latest and greatest weight-loss, risk-reducing, and health-enhancing diets. Could it be that balance is more challenging than we thought? Perhaps avoiding certain food groups or ingredients altogether is easier than the amorphous concept of moderation? Our greatest challenge is not getting more people to consume health information; it is finding new and compelling ways to deliver the information we’ve known for decades, and this is no small task.

3. How do we overcome differences within the nutrition profession to present a united front to people lost in the sea of alternative facts?

In 2014, David Katz and Walter Willett co-chaired a conference sponsored by the non-profit Oldways*, titled “Finding Common Ground.” Oldways and the co-chairs assembled what they referred to as “the dream team of nutrition experts,” including Friedman’s own Dariush Mozaffarian, as well as Dean Ornish, creator of the Ornish Diet; David Jenkins, founder of the glycemic index; Boyd Eaton, originator of the Paleolithic diet; T. Colin Campbell, author of The China Study; and a myriad of others. Known most commonly for their differences, this group of scientists gathered for the sole purpose of coming to a consensus on the basic tenets of a healthy diet. In the end, the group agreed on 11 common denominators across their widely differing philosophies, on topics ranging from fruit and vegetable consumption to sustainability to food literacy.

Following the conference, David Katz published an article in Forbes where he said “…it is the controversies at the edge of what we know that interest experts most, but ask [experts] about the fundamentals, and the vast expanse of common ground is suddenly revealed.” The Common Ground committee’s decision to gather around a table, invite open dialogue, and pursue unity is something we could all learn a lesson from. Alternative facts will always provide fodder for hucksters and peddlers of over-simplified nutrition information, but the scientific community has a vast body of research that unites us. As nutrition professionals, we cannot forget that our voices will always be more powerful together than they ever will apart.

Rachel Baer is a registered dietitian and a first-year in the NICBC program at Friedman. Her favorite foods are Brussels sprouts and brownies, and she loves nothing more than cooking great meals and gathering people around a table.

*Editor’s Note, 5/1/17  2:09 PM: An earlier version of this article incorrectly spelled the name of the organization, “OldWays.” The correct spelling is Oldways, and the change has been made above.

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not feed the cycle of self-loathing that “lose weight” resolutions tend to produce year after year.

Right alongside these posts, though, was an overwhelming amount of press extolling the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets perfectly nutritious foods (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. Yet most bodies are perfectly capable of handling these foods, which provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 also eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or on the website do they provide scientific studies showing that removing grains, beans, and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted associations with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a widespread social-media support system, and there is plenty of research backing social support as a major key to success in any lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters had better be ready when hunger hits.

Should the average Joe looking to improve his nutrition really need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, the diet demands that entirely too much brainpower, which could be used in so many other unique and fulfilling ways, be spent thinking, worrying, and obsessing about food. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30: grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be, depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week, eating fully compliant meals. Then comes the weekend, and, oh no, it’s a football weekend, and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!), and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed, and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

Sure, it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat far more than you would have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspects of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food-centric, anyway. Either way, using food rules to isolate oneself from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of the Whole30’s success has come from word of mouth: stories and endorsements from those who made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in the creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is a superior choice. At the end of the day, this is a business, created by two sports nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causes of inflammation and hormonal imbalance is quite an extreme claim to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice, reminds us that “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc. While it’s true that both grains and legumes contain phytates, so do certain nuts allowed on the diet, like almonds, and some permitted vegetables. It is possible to reduce the amount of phytates in a food by soaking, sprouting, or fermenting grains and legumes, but research from the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, the phytates in foods like grains and legumes do not pose a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

The Whole30 eliminates legumes because some of their carbohydrates aren’t well digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates and may experience severe digestive irritation: excessive gas, bloating, constipation, and so on. Strategies such as the low-FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets and eliminate only the foods that cause distress. For everyone else, eliminating these carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and they help feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing body of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting in touch with food beyond the label, reducing added sugar, and cutting back on alcohol are goals everyone should be encouraged to pursue. Focusing on cooking more from scratch, relying less on processed foods, and learning how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not mean eliminating whole food groups like dairy, grains, and legumes, and it should not come with an end date. Rather, it should be a lifelong journey focused on flexibility, moderation, and balance. Lower your intake of processed foods, sugar, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.


Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible, and fulfilling part of life for people from all walks of life. You can find her on Instagram documenting food, fitness, and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

The Dr. Oz Effect

by Julia Sementelli

With the beginning of the new year inevitably comes an onslaught of promotions and advertisements for miracle diets, detoxes, and supplements that vow to help you shed pounds, live longer, etc. And when you think of diets and supplements, most likely two words come to mind: “Dr. Oz.” He is a doctor, but he is also a registered dietitian’s worst nightmare. While dietitians are out there teaching patients and clients that weight loss cannot be healthfully achieved with a pill or a two-week “cleanse,” Dr. Oz preaches the opposite. Read on for the inside scoop on how Dr. Oz further complicates the already messy, ever-changing world of nutrition and health, including an interview with the man himself.

A recent client of mine, Mark (name changed for privacy), eats a fairly healthy diet: Greek yogurt and berries for breakfast, a salad with lean protein for lunch, and something from the Whole Foods salad bar for dinner (he doesn’t like to cook). He says that his major downfalls are cookies and beer. Mark’s goal is to lose 30 pounds and improve his overall health, given his family history of heart disease. “Give me a meal plan and I will follow it,” says Mark. I can work with that. He is actually a dietitian’s dream—someone who already doesn’t mind eating well and is motivated to lose weight. I thought his meal plan would be a breeze, until he said, “Oh—I should tell you about my supplements.” I had expected a multivitamin and some daily vitamin D, but my hopes were dashed as Mark rattled off more than 15 supplements that he is currently taking, only one of them being a multivitamin. Among them were resveratrol, an antioxidant found in red grape skins that he claims takes years off your age, and conjugated linoleic acid (CLA), which supposedly melts body fat. When I asked Mark where he learned about all of these supplements, he said, “Dr. Oz.”

No two words can send angry chills up a dietitian’s spine quicker than “Dr. Oz.” While I am a fairly green registered dietitian, I have interacted with enough patients to see firsthand the power of Dr. Oz. Dr. Mehmet Oz started out as the resident expert on “The Oprah Winfrey Show” for five years before he was given his own spotlight, “The Dr. Oz Show.” He holds three degrees: a B.S. in biology from Harvard and an M.D. and M.B.A. from the University of Pennsylvania. He is vice-chairman of the department of surgery at the Columbia University College of Physicians and Surgeons in New York. He is also likeable. Consequently, he has become one of the most trusted doctors in the world, and yet he uses words like “magical” and “miraculous” to promote supplements that promise to burn fat or prevent cancer. What the public may not understand is that a pill is not a miracle cure for anything. According to Stephanie Clarke, registered dietitian and co-owner of C&J Nutrition in New York City: “Most MDs get very little (or zero) nutrition education and background—so it’s frustrating when they dole out nutrition advice or research without enough details or without thinking about how their messages will be interpreted by the public and related to real life eating.” But Americans continue to believe in the power of nutritional supplements recommended by a doctor who (most likely) has had minimal nutrition education and, more surprisingly, continue to buy them. In fact, Americans spent more than $21 billion on vitamins and herbal supplements in 2015. According to analyses, just the mention of a product on “The Dr. Oz Show” causes a surge in sales.

This phenomenon has been dubbed “The Dr. Oz Effect.” Combine charisma with a few letters after a name and you have someone who is more believable than the thousands of nutrition professionals who use science, not pseudoscience, to back up their recommendations. Even my own father, who has type 2 diabetes and an affinity for soy sauce (read: sodium), and who meets my attempts to improve his diet with stubbornness, listens to Dr. Oz. Meanwhile, I have gone through four years of undergraduate education in nutrition, a competitive application process for dietetic internships (50% acceptance rate), a one-year unpaid dietetic internship, a comprehensive exam, and an additional two years of graduate work to get to where I am. And yet I still don’t have the influence that Dr. Oz does to change my father’s food behaviors.

As a dietitian, I strongly believe in balance. It is my goal to reduce the all-or-nothing thinking that surrounds eating and exercise. The media and people like Dr. Oz perpetuate this mindset, capitalizing on the public’s obsession with weight loss and diets by highlighting drastic regimens and alleged cure-all supplements. Diets do not work because they typically deprive a person of entire food groups (fats or carbohydrates, for example); eventually the individual gives in and eats those foods in excess, having been denied them for so long.

The demonization of food, another spawn of the media, is the belief that particular foods are good or bad. It has resulted in mass confusion and further damage to people’s relationships with food. One of the most infuriating examples of this demonization is fruit. Yes, fruit. “I heard that the sugar in fruit is bad for you” or “I was told not to eat pineapple because it is high in sugar” are actual quotes that I have heard from clients. And not surprisingly, both clients attributed their beliefs to Dr. Oz. After some research, I discovered that, lo and behold, Dr. Oz did a segment titled “Can the Sugar in Fruit Make You Fat?” that most likely influenced these beliefs. Aside from vegetables, fruit is one of the most wholesome food groups, packed with fiber, antioxidants, vitamins, and minerals. Yet fruit cannot even avoid falling victim to the war on food. Conundrums like this exist for nearly every food: eggs, fish, coffee, potatoes…the list goes on. The only way to try to reverse the damage is to tell people that no food is off limits and remind them that there is no replacement for good eating and regular exercise. The only way that I have seen weight loss occur is with gradual and sustainable changes over time. And anyone who promises anything different is lying or, worse, using pseudoscience to make outrageous claims.

Pseudoscience, the basis upon which Dr. Oz has constructed his lucrative empire, involves exaggerated and often contradictory claims that are not supported by reputable research. The media is also culpable, composing articles and news stories from press releases about studies with small sample sizes or mouse subjects. Just because something is effective or safe in mice does not mean it will be safe for humans. Many writers for tabloids and mainstream magazines are stretched for time and more concerned with quantity than quality, given that their main goal is to write headlines that sell papers and magazines. Unfortunately, such writers and apparent health experts like Dr. Oz produce the majority of what the general public sees and uses to shape its food choices. According to a study published in the BMJ in 2014: “Consumers should be skeptical about any recommendations provided on television medical talk shows, as details are limited and only a third to one half of recommendations are based on believable or somewhat believable evidence.” That’s right—half to two-thirds of what Dr. Oz claims on his show is not backed by believable evidence. While the show has seen a dip in ratings, 1.8 million viewers still tune in to “The Dr. Oz Show” and are consequently exposed to information that, according to that same study, is unsupported 50-67% of the time.

Dr. Oz has been criticized by a slew of medical professionals for his scam marketing, most notably in 2015, when ten physicians wrote a letter to the dean of health sciences at Columbia University requesting that Dr. Oz be removed from the faculty due to his “egregious lack of integrity” on his TV show. Dr. Oz defends what he tells the public by claiming that “it’s not a medical show,” despite the fact that the show is titled “The Dr. Oz Show.” He says that freedom of speech gives him the right to say what he wants. But it is difficult to respect that defense when a faculty member at a prestigious university is making false claims on TV.

I reached out to the Dr. Oz team and received a response from Oz himself. When asked where he finds his nutrition information he said, “We obtain nutrition information from a wide variety of sources. We rely heavily on literature published in scientific journals as well as textbooks. In addition we consult a wide variety of experts including medical doctors and nutritionists. Our research staff is made up of myself, a physician trained in preventive medicine, as well as 3 medical students who take a year off to work with us. We evaluate all of the content on our show to ensure that viewers are getting accurate information. One of our researchers this year has a master’s degree in nutrition as well.” I am not sure which scientific journals Dr. Oz and his team are using, but when I researched “curcumin” and “oil of oregano,” two of the supplements that Dr. Oz has promoted on his show and that Mark, my client, is currently taking, the conclusion was that “the existing scientific evidence is insufficient to recommend their safe use.” In our interview, Dr. Oz said: “We also reach out to the Friedman school when we have difficult questions. I spent a day up at the school this summer meeting with a number of your faculty. Most recently I have spoken to an expert about fiber fortified foods and to your Dean about the current opinions on dietary fats.” He included a note saying that he and his team welcome interns to join them every month from September to June, and students from Friedman are welcome to apply. *Insert eye roll*

When I asked about Dr. Oz and his team’s stance on nutritional supplements, he replied: “In general we believe that many have a place in people’s life to enhance nutrition. We always love to see more and better studies conducted on the utility of supplements in promoting health.” This is a nice response, but when I begrudgingly watched a clip from his show in which he says that conjugated linoleic acid (CLA) can help burn body fat, even without diet and exercise, I realized that what he says and what he does do not match. And aside from making empty promises and putting people at risk with questionable pills, he is encouraging people to waste their money. This is what I told Mark in an effort to curb his daily supplement cocktail. If the risk of taking his favorite “fat-melting” supplement won’t stop him, maybe the opportunity to save money will.

Dr. Oz is frustrating for many reasons, but for nutrition professionals the worst is that he uses his credentials as a physician to get away with promoting pseudoscience. Being a dietitian no longer involves simply telling people what to eat. It involves untangling the web of nutrition misinformation that clients have woven over the course of their lives and re-teaching them what a healthy relationship with food should look like. While turning to supplements can seem like an easy fix, science shows that a diet based on whole foods like fruits, vegetables, whole grains, lean protein, and healthy fats is the ideal diet. Science does not show that a pill is the secret to losing those last five pounds that keep hanging on. If scientists had really found a cure for obesity, we would not be hearing about it at 4 p.m. on a Tuesday afternoon. And unfortunately, the supplement industry is not going anywhere. The FDA and FTC regulate it, but not very well. So it is up to trained and licensed nutrition professionals (i.e., registered dietitians) to educate the public about the dangers of supplements and of listening to self-proclaimed “health experts.”

Julia Sementelli is a second-year Nutrition Communication & Behavior Change student and Boston-based registered dietitian who works in a local hospital and also counsels private clients. You can find her on Instagram (@julia.the.rd.eats; follow her!), where she strives to intercept confusing nutrition messages from self-proclaimed health experts with expert nutrition advice and tips (as well as some beautiful food photos, if she does say so herself!).


What’s the Deal with Vitamin D?

by Katelyn Castro

There is always one nutrient that seems to linger in the media for a while. Lately, vitamin D has been the lucky winner! Considering that over 40% of Americans are vitamin D deficient, according to the National Health and Nutrition Examination Survey (NHANES), it’s worth taking a closer look at vitamin D.

Depression, cancer, heart disease, and type 1 diabetes are some of the many health conditions that have been linked to vitamin D deficiency. While it is too soon to point to vitamin D as a cure-all, this vitamin may be more important for our health than previously thought—especially during the winter months in New England!

Why is Vitamin D Important?

Vitamin D is most often known for its role in bone health, increasing calcium absorption and helping with bone mineralization alongside calcium and phosphorus. Historically, rickets in children and osteoporosis and bone fractures in adults have been the most common signs of vitamin D deficiency.

As a fat-soluble vitamin and a hormone, vitamin D is also involved in many other important metabolic processes. Did you know vitamin D regulates over one thousand genes in the human genome? For example, vitamin D is needed for gene transcription within skeletal muscle, which may explain why vitamin D deficiency is associated with poor athletic performance. Vitamin D also regulates blood pressure by suppressing renin gene expression, supporting the possible relationship between vitamin D deficiency and risk of heart disease. Additionally, vitamin D status may alter immunity due to its role in cytokine production; studies have found that vitamin D deficiency is associated with upper respiratory tract infections. While more research is needed to explore these connections, these findings continue to suggest that vitamin D plays an integral role in bone, muscle, cardiac, and immune health.

Where Do You Get Vitamin D?

Only a few foods are natural sources of vitamin D, including eggs and fatty fish like salmon, mackerel, tuna, and sardines. Instead, vitamin D-fortified foods like dairy products, juices, and breakfast cereals make up the majority of Americans’ vitamin D intake.

Sun exposure, on the other hand, can be the greatest source of vitamin D for some people–hence vitamin D’s nickname, the “sunshine vitamin.” Unlike any other vitamin, vitamin D can be synthesized in the body: when the sun’s ultraviolet B rays reach the skin, they convert a cholesterol derivative (7-dehydrocholesterol) into vitamin D3, a precursor of the vitamin’s active form. Vitamin D3 then diffuses through the skin into the blood and is transported to the liver, where it is converted into 25(OH)D, the circulating form measured in blood tests, and then to the kidneys, where it is activated into the hormone 1,25(OH)2D.
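
In outline, the conversion chain looks like this (a simplified sketch; the enzymes and binding proteins involved at each step are omitted):

$$
\text{7-dehydrocholesterol} \xrightarrow{\ \text{UVB, skin}\ } \text{vitamin D}_3 \xrightarrow{\ \text{liver}\ } 25(\text{OH})\text{D} \xrightarrow{\ \text{kidneys}\ } 1{,}25(\text{OH})_2\text{D (active hormone)}
$$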

Research has found that exposing arms, legs, and face to the sun for 15 to 30 minutes twice a week provides about 1000 international units of vitamin D (equal to about 10 cups of milk!). Despite this robust source of vitamin D, deficiency is surprisingly common in the U.S.

Who is at Risk of Vitamin D Deficiency?

Many circumstances can alter vitamin D synthesis and absorption, increasing risk of vitamin D deficiency. Some of the factors that have been found to impact vitamin D status include the following:

  • Season: According to research, during the months of November through February, people living at latitudes more than 37 degrees north or south produce little or no vitamin D from the sun because of the angle of ultraviolet B sunrays. While vitamin D is stored in fat tissue and can be released into the blood when needed, our stores typically last only one to two months.
  • Limited Sun Exposure: Vitamin D synthesis can also be blocked when sunscreen is applied correctly or when long robes or head coverings are worn for religious reasons. For example, sunscreen with a sun protection factor (SPF) of 8 decreased vitamin D synthesis in skin by about 95% in one study.
  • Skin Color: People with darker skin pigmentation have also been found to have lower levels of vitamin D due to decreased synthesis. This is supported by the high prevalence of vitamin D deficiency among certain ethnic groups, with 82% of African Americans and 69% of Hispanics found to be vitamin D deficient, according to NHANES.
  • Weight: Studies also suggest that overweight and obese people may have higher vitamin D requirements. Because vitamin D is fat-soluble, it is more widely distributed into fat tissue in people with more body fat, making it less bioavailable. As a result, more vitamin D may be needed for enough to reach the bloodstream for distribution throughout the body.
  • Age: Older adults have been found to have lower levels of vitamin D, likely due to both decreased sun exposure and less efficient synthesis. One study found that 70-year-olds had only about 25% as much of the vitamin D precursor in their skin as young adults, which decreased their vitamin D synthesis by roughly 75%.
  • Fat Malabsorption: When a gastrointestinal disorder or other health condition impairs fat absorption (e.g., liver disease, cystic fibrosis, celiac disease, or Crohn’s disease), vitamin D is also poorly absorbed and utilized, since it is a fat-soluble vitamin.
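Since the season rule above amounts to a latitude-and-month check, here is a minimal sketch of it in Python. To be clear about the assumptions: the function name is hypothetical, the 37-degree cutoff and November–February window come from the bullet above, the mirrored months for the southern hemisphere are an extrapolation (the article gives only one date range), and real UVB exposure varies continuously with latitude, altitude, and weather.

```python
# Rough "vitamin D winter" check based on the season rule described above:
# beyond ~37 degrees latitude, skin makes little or no vitamin D from the sun
# during the winter months. A simplification, not a clinical tool.

def in_vitamin_d_winter(latitude_deg: float, month: int) -> bool:
    """Return True if sun-driven vitamin D synthesis is likely negligible."""
    northern_winter = month in (11, 12, 1, 2)   # Nov-Feb, per the article
    southern_winter = month in (5, 6, 7, 8)     # assumed mirror image
    if latitude_deg > 37:       # far northern hemisphere
        return northern_winter
    if latitude_deg < -37:      # far southern hemisphere
        return southern_winter
    return False                # closer to the equator: synthesis year-round

print(in_vitamin_d_winter(42.4, 1))   # Boston in January -> True
print(in_vitamin_d_winter(42.4, 7))   # Boston in July -> False
```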

Vitamin D deficiency can be especially concerning because symptoms like bone pain and muscle weakness may go undetected in its early stages. Although physicians do not routinely check vitamin D levels, those at risk of deficiency may benefit from a serum 25(OH)D test. This simple blood test measures the level of vitamin D circulating in the blood, with levels less than 20 nanograms per milliliter commonly used to diagnose deficiency. However, some organizations, like the Endocrine Society, argue that levels greater than 30 nanograms per milliliter should be the target for optimal bone and muscle metabolism.
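For readers who want to see those cutoffs side by side, here is a minimal sketch (the function name is hypothetical, and the thresholds are the two cited above; other organizations draw the lines differently):

```python
# Classify a serum 25(OH)D value using the two cutoffs discussed above:
# < 20 ng/mL is commonly used to diagnose deficiency, while the Endocrine
# Society argues for 30 ng/mL as the target. Illustrative only.

def classify_25ohd(ng_per_ml: float) -> str:
    """Return a rough status label for a serum 25(OH)D level in ng/mL."""
    if ng_per_ml < 20:
        return "deficient (< 20 ng/mL)"
    elif ng_per_ml < 30:
        return "below the Endocrine Society's 30 ng/mL target"
    else:
        return "at or above the 30 ng/mL target"

print(classify_25ohd(15))   # deficient (< 20 ng/mL)
print(classify_25ohd(25))   # below the Endocrine Society's 30 ng/mL target
print(classify_25ohd(35))   # at or above the 30 ng/mL target
```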

How Much Vitamin D Do You Need?

As with serum levels, no ideal vitamin D intake has been firmly established, since many factors contribute to vitamin D status. The U.S. Institute of Medicine recommends 600 to 800 international units (IU) of vitamin D daily for adults, assuming minimal sun exposure. The National Osteoporosis Foundation, on the other hand, recommends larger doses of 1000 to 1200 IU daily for adults to support bone health. Although vitamin D toxicity is rare, the Institute of Medicine has set a tolerable upper intake level of 4000 IU daily, since extremely high levels can lead to calcium buildup and cause poor appetite, nausea, vomiting, weakness, and kidney problems.

With the limited amounts of vitamin D in foods, even fortified foods, diet alone is usually inadequate to meet vitamin D needs. For example, you would need to drink about 8 cups of milk every day to reach 800 IU of vitamin D from diet alone! While sun exposure can supplement food intake, many Americans still fall short of their needs due to the factors outlined above.
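As a back-of-the-envelope check, the article’s own figures imply roughly 100 IU of vitamin D per cup of fortified milk (800 IU across 8 cups). A quick sketch of that arithmetic:

```python
# Back-of-the-envelope check of the "8 cups of milk" figure above.
# The ~100 IU per cup is implied by the article's numbers (800 IU / 8 cups);
# actual fortification varies by product.

IU_PER_CUP_MILK = 100   # approximate IU of vitamin D per cup of fortified milk
target_iu = 800         # upper end of the Institute of Medicine recommendation

cups_needed = target_iu / IU_PER_CUP_MILK
print(f"{cups_needed:.0f} cups of milk per day to reach {target_iu} IU")  # -> 8
```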

For the 40% of Americans who have been found to be vitamin D deficient, vitamin D supplementation can be an effective and safe way to meet needs. Whether you’re an avid sunscreen user or living here in New England during these fall and winter months, a daily vitamin D supplement can ensure that vitamin D stores are adequate. Multivitamins typically provide 400 IU of vitamin D, but a separate vitamin D supplement (D2 or D3) with 800 or 1000 IU may be needed to meet daily intake recommendations.

Katelyn Castro is a second-year student in the Dietetic Internship/MS Nutrition Program at the Friedman School. During the summer, she enjoys soaking up the sun if only for an excuse to get her daily dose of Vitamin D. During the winter, you can find her trekking through the snow, bundled up like the boy in A Christmas Story, and contemplating whether she needs a D supplement.


Timing of Your Meals–Does It Matter?

by Yifan Xia

How would you feel if you were told not to eat dinner for the rest of your life? Skipping dinner every day might sound shocking to most of us, but it was once a common practice in Han Dynasty China. In fact, even today Buddhism and Traditional Chinese Medicine (TCM) promote this practice as a healthier choice than eating three meals per day. But does it have roots in science? Controversy certainly exists around this topic, but one thing we can be certain of today is that the timing of our meals can have a much greater impact on our health than we once thought.

Researchers investigating the circadian system (internal biological clock) have started looking at the effects of mealtime on our health. Surprisingly, preliminary evidence seems to support the claims of Buddhism and TCM, indicating that eating meals earlier in the day might help promote weight loss and reduce the risk of chronic disease.

What are circadian rhythms and the circadian system?

Circadian rhythms are changes in the body that follow a roughly 24-hour cycle in response to external cues such as light and darkness. Our circadian system, or internal biological clock, drives circadian rhythms and prepares us to function according to a 24-hour daily cycle, both physically and mentally.

Why do they matter to our health?

Our internal biological clock is involved in almost every aspect of our daily lives: it influences our sleep-wake cycle, determines when we feel most energetic or calm, and shapes when we want to eat.

These days, people don’t always rely on their biological clocks to tell them when to eat, and many distractions in the environment can influence mealtime. We typically think that how many calories we eat—and what we eat—are the major contributors to our weight and health, but researchers have found that eating at inappropriate times can disrupt the internal biological clock, harm metabolism, and increase the risk of obesity and chronic disease.

What does the research say?

Although the body of research in this area is still relatively small, several human studies are worth highlighting. One randomized, open-label, parallel-arm study, conducted by Jakubowicz et al. and published in 2013, compared the effects of two isocaloric weight-loss diets in 93 overweight and obese women with metabolic syndrome. After 12 weeks, the group with higher caloric intake at breakfast showed greater weight loss and waist-circumference reduction, as well as significantly greater decreases in fasting glucose and insulin levels, than the group with higher caloric intake at dinner. Another study published the same year, with 420 participants, found that a 20-week weight-loss treatment was significantly more effective for early lunch eaters than for late lunch eaters. In 2015, a randomized crossover trial of 32 women, published in the International Journal of Obesity, showed that a late eating pattern resulted in a significant decrease in pre-meal resting energy expenditure, lower pre-meal carbohydrate utilization, and decreased glucose tolerance, confirming that meal timing has differential effects on metabolic health. Few studies reporting negative findings were identified, likely because this is an emerging field; more research is needed to establish a solid relationship.

So when should we eat? Is there a perfect mealtime schedule for everyone?

“There are so many factors that influence which meal schedules may be suitable for an individual (including biological and environmental) that I cannot give a universal recommendation,” says Gregory Potter, a PhD candidate in the Leeds Institute for Genetics, Health and Therapeutics (LIGHT) laboratory at the University of Leeds in the United Kingdom and lead author on the lab’s recent paper reviewing evidence on nutrition and the circadian system, published in The British Journal of Nutrition in 2016. Potter also notes that regular mealtimes seem to matter more than sticking to the same schedule as everyone else: “There is evidence that consistent meal patterns are likely to be superior to variable ones and, with everything else kept constant, it does appear that consuming a higher proportion of daily energy intake earlier in the waking day may lead to a lower energy balance and therefore body mass.”

Aleix Ribas-Latre, a PhD candidate at the Center for Metabolic and Degenerative Diseases at the University of Texas Health Science Center and lead author on another review paper, on the interdependence of nutrient metabolism and the circadian system, published in Molecular Metabolism in 2016, agrees: “To find the appropriate meal time has to be something totally personalized, although [it] should not present [too] much difference.” Ribas-Latre especially noted that people who are born with a tendency to rise late, eat late, and go to bed late (“night owls” versus “early birds”) are more likely to be at risk for metabolic disease.

Do we have to eat three meals a day?

How many meals do you usually have? For that matter, how much food makes a meal, and how much is just a snack? There is no universal definition, which makes these questions difficult to answer.

“To maintain a healthy attitude towards food, I think it is important to avoid being too rigid with eating habits … I do think consistency is important, as more variable eating patterns may have adverse effects on metabolism,” says Potter. “Although there is evidence that time-of-day-restricted feeding (where food availability is restricted to but a few hours each day) has many beneficial effects on health in other animals such as mice, it is as yet unclear if this is true in humans. I’d also add that periodic fasting (going for one 24-hour period each week without energy-containing foods and drinks) can confer health benefits for many individuals,” Potter adds.

[See Hannah Meier’s recent article on intermittent fasting for more.]

Based on their research, Ribas-Latre and his lab have a different opinion. “We should eat something every 3-4 hours (without counting 8 hours at night). Many people complain about that but then consume a huge percentage of calories during lunch or, even worse, at night, because they are very hungry. Eating a healthy snack prevents us [from] eating too [many] calories at once.” He suggests what he considers a healthier mealtime schedule:

  • 6:00 am – Breakfast (30% of total calories)
  • 9:30 am – Healthy snack (10%)
  • 1:00 pm – Lunch (35%)
  • 4:30 pm – Healthy snack (10%)
  • 8:00 pm – Dinner (15%)
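To make those percentages concrete, here is a small sketch that converts the schedule into calories for an assumed 2000-calorie day (the 2000-calorie total is an illustrative figure, not part of Ribas-Latre’s recommendation):

```python
# Convert the suggested meal-time percentages into calories, assuming a
# 2000-kcal day (an illustrative total, not from the interview).

DAILY_KCAL = 2000

schedule = [
    ("6:00 am", "Breakfast",     0.30),
    ("9:30 am", "Healthy snack", 0.10),
    ("1:00 pm", "Lunch",         0.35),
    ("4:30 pm", "Healthy snack", 0.10),
    ("8:00 pm", "Dinner",        0.15),
]

# The five shares should account for the whole day's intake.
assert abs(sum(share for _, _, share in schedule) - 1.0) < 1e-9

for time, meal, share in schedule:
    print(f"{time:>8}  {meal:<13} {share:4.0%} -> {share * DAILY_KCAL:5.0f} kcal")
```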

What if you are a shift worker, or your work requires you to travel across time zones a lot? Ribas-Latre’s advice is “not to impair more their lifestyle… at least it would be great if they are able to do exercise, eat healthy, sleep a good amount of hours.”

What does Traditional Chinese Medicine say?

There are historical reasons behind the no-dinner practice of Han Dynasty China. First, food was not always available. Second, electric lighting hadn’t been invented, so people usually rested after sunset and didn’t need much energy at what we now consider “dinner time.”

However, there are also health reasons behind this practice. In TCM theory, our internal clock has an intimate relationship with our organs. Each organ has its “time” for optimal performance, and we can reap many health benefits by following this clock. For example, TCM considers 1:00 am – 3:00 am the time of “Liver.” The theory says that is when the body should be in deep sleep so that the liver can help rid the body of toxins and make fresh blood. Disruption at this time, such as staying up until 2:00 am, might affect the liver’s ability to dispel toxins, leading to many health problems, according to the theory.

Many Western researchers do not seem to be familiar with the TCM theory. When asked about the practice of skipping dinner, Potter comments, “I think that skipping dinner can be a perfectly healthy practice in some circumstances; in others, however, it may be ill advised if, for example, the individual subsequently has difficulty achieving consolidated sleep.”

On the flip side, Ribas-Latre says that “skipping a meal is not good at all. We should not eat more calories than those we need to [live], and in addition, the quality of these calories should be high… If you can split those calories [to] 5 times a day instead of three, I think this is healthier.”

Even though there is no universal agreement on mealtime, the tradition of “skipping dinner” did come back into style several years ago in China as a healthier way of losing weight, and was quite popular among Chinese college women. Yan, a sophomore from Shanghai and a friend of mine, said that she tried the method for six months but is now back to the three-meal pattern. “The first couple of days were tough, but after that, it was much easier and I felt my body was cleaner and lighter… I did lose weight, but that’s not the main goal anymore… I got up early every day feeling energetic. Maybe it’s because I only ate some fruits in the afternoon, I usually felt sleepy early and went to bed early, which made it easier to get up early the next day with enough sleep… I’m eating three meals now, but only small portions at dinner, and I think I will continue this practice for my health.”

So what’s the take-away?

Mealtime does seem to matter. But exactly how, why, and what we can do to improve our health remains a mystery. Researchers are now looking into the concept of “chrono-nutritional therapy,” or using mealtime planning to help people with obesity or other chronic diseases. When we resolve this mystery, the question of “When do you eat?” will not just be small talk, but perhaps a key to better health.

Yifan Xia is a second-year student studying Nutrition Communication and Behavior Change. She loves reading, traveling, street dancing, trying out new restaurants with friends in Boston, and watching Japanese animations.