Food Label Fear Mongering and its “Toxic” Effects

by Megan Maisano

You know it’s hard out here for a processed food. These days, most consumers want to know what’s in their food and how it’s processed. While that may sound like a promising step toward better food choices and overall health, it may also be feeding a culture of fear-mongering and food discrimination – neither of which is helpful. This month, Megan Maisano investigates common marketing strategies employed by food manufacturers that sow unnecessary fear, doubt, and confusion in the minds of consumers.

Grocery supermarket (Source: pexels.com)

Good news: over half of the U.S. population is paying attention to food labels. Bad news: it might be increasing consumer confusion and contributing to unintended health hysteria.

Whether it’s the latest Netflix documentary demonizing an entire food group, an Instagram feed promoting “clean” eating, or your mother’s cousin Carol pushing her latest detox agenda on Facebook, food fear mongering is real.

The problem is that many claims about “toxic” or “unclean” foods don’t come from health professionals or experts. On top of that, their messages are more accessible to the average consumer than, let’s say, the most recent edition of the American Journal of Clinical Nutrition.

I’ll be the first to admit I read Michael Pollan’s Food Rules a few years ago. I loved it. It was simple, easy to understand, and seemed logical. Nutrition science, however, is not simple, not easy to understand, and evolves with advancing evidence-based research… and nutrition research is hard.

While the desire for food transparency is warranted and can lead to healthier decision-making, the marketing response by the food industry has taken advantage of consumers’ unwarranted fears. Instead of highlighting what’s good in the food we eat, product labels emphasize what’s not in our food, and it’s contributing to the chaos.

I decided to explore the research and science behind common food label claims. The results: practices that range from reasonable transparency to questionable marketing tactics that make us say C’mon Man.

 

Non-GMO Project

The Non-GMO Project, which started in two grocery stores in 2007, now has its iconic butterfly on more than 3,000 brands and 43,000 products. GMOs, or Genetically Modified Organisms, are plants, animals, microorganisms, or other organisms whose DNA has been changed via genetic engineering or transgenic technology. The debate over GMO safety remains highly controversial. Without going into too much detail, skeptics argue that GMOs have not been proven safe and that people have a right to know whether their food contains them. On the other side, bodies like the National Academies of Sciences, Engineering, and Medicine report that GMOs have not been shown to harm humans or the environment.

Regardless of the verdict, the Non-GMO butterfly is landing on more and more products that are naturally GMO-free, such as tomatoes, oranges, and milk. This trend feeds the misconception that tomatoes, oranges, and milk without said butterfly DO contain GMOs and are therefore less safe. This deceptive labeling practice hurts not only consumers, but also competing brands and their farmers.

The Impact – a 2015 nonpartisan analysis reported that only 37 percent of those surveyed felt that GMOs are safe to eat, while 57 percent considered them unsafe. Individuals with higher education, on the other hand, were more likely to consider GMOs safe. Numerous studies also show that consumer knowledge of GMOs is low and that their information comes mainly from the media – insert cousin Carol’s shared Facebook article on GMOs’ toxic effects. The fear continues.

Paleonola grain-free granola (Source: thrivemarket.com)

Gluten Free and Grain Free

In his book Grain Brain, David Perlmutter writes, “Gluten sensitivity represents one of the greatest and most under-recognized health threats to humanity.” The well-known blogger, Wellness Mama, once wrote an article titled “How Grains are Killing You Slowly” (but has since changed the title). The 2015-2020 Dietary Guidelines for Americans, on the other hand, list grains (specifically whole grains) as a part of a healthy eating pattern. How did this extreme divide on gluten and grains come about?

The 1990s brought increased awareness of celiac disease and the effectiveness of treating it with a gluten-free diet. This was a major win and relief for folks with gluten-related disorders. What followed was a surge of research on gluten and its potential effects on other chronic disorders – and that’s when the hysteria hit. Books like Grain Brain and Wheat Belly, both of which have been accused of cherry-picking the literature and overgeneralizing, earned best-selling status and changed the way we looked at a baguette. This frenzy, combined with the highly popular low-carb Atkins Diet, created the recipe for a new villain – gluten and grains.

The food industry responded and so did the media. According to the research firm Packaged Facts, sales of gluten-free products came in around $973 million in 2014 and are expected to exceed $2 billion by 2019 – far more than would be expected from marketing to the less than one percent of individuals with celiac disease. Oh, and these products are about 240% more expensive. Celebrity influences like Gwyneth Paltrow’s book and Miley Cyrus’ tweet have made the gluten-free diet appear more mainstream, swaying consumer perception and downplaying the seriousness of disorders like celiac disease.

While research on non-celiac gluten sensitivity (affecting about six percent of the U.S. population) is still mixed, many studies suggest that gluten may not necessarily be the underlying problem and that symptoms may even be psychological. In his book The Gluten Lie, Alan Levinovitz explains that the significant increase in negative responses to gluten may be due to a phenomenon called Mass Sociogenic Illness – in which a physiological response is provoked by mass anxiety and negative expectations.

The Impact – a 2015 Hartman Group survey found that 35% of respondents adopted a gluten-free lifestyle for “no reason,” 26% followed it because they thought it was a “healthier option,” 19% followed it for “digestive health,” and only 8% followed it because of a “gluten sensitivity.”

A growing body of research suggests that gluten-free diets offer no benefit to the general population and that going gluten-free may even hinder health. Nevertheless, the damage may be done.

 

USDA Organic label (Source: usda.gov)

Going Organic

The USDA Organic label identifies a product that meets federal guidelines for farming and processing. Guidelines cover soil quality, animal-raising practices, pest and weed control, and the use of additives. For organic packaged foods, at least 95% of the ingredients must be organic, and the product must be free of artificial preservatives, colors, and flavors.

The organic movement is a step in the right direction toward encouraging more responsible agricultural practices. However, the social impact of the organic label has created unwarranted confusion about, and fear of, supposedly “chemical-ridden” conventional foods grown with synthetic fertilizers or pesticides. That fear is hurting small farmers and our wallets.

A common source of organic fear-mongering is the infamous Dirty Dozen published by the Environmental Working Group (EWG). This list identifies twelve non-organic produce items reported to have the highest levels of pesticide residue. What the EWG fails to mention, however, is the type of pesticide and how the residue compares to its chronic reference dose (i.e., the maximum daily dose considered safe over a lifetime). A Journal of Toxicology study found that none of the Dirty Dozen products came anywhere close to its reference dose and that the EWG’s methodology lacked scientific credibility. While there is nothing wrong with being mindful of pesticide use, people should know that organic farmers use pesticides too, and that their residue levels are not tested by the USDA.

From a nutrition perspective, research on organic food is mixed. Both organic and conventional practices offer nutritious produce with plenty of phytochemicals; however, organic produce may come out on top in terms of phosphorus and antioxidant levels, along with lower pesticide residue.

From a health-outcome perspective, however, there is no direct evidence that organic diets lead to improved health or lower the risk of disease and cancer. Pesticide residue risk, if a concern, can be reduced by simply washing fresh produce.

Lastly, organic farming, labeling, and products are expensive. If price is keeping consumers from purchasing organic produce and fear is keeping them from purchasing conventional produce, we have a problem.

In a country where less than twenty percent of adults eat their daily recommended fruits and vegetables, all produce should be promoted without adding unnecessary confusion or fear.

 

“All natural” health claim label (Source: topclassactions.com)

“Natural” and “Free of …”

According to a 2014 global health survey, 43% of respondents rate “all-natural” foods as very important in purchasing decisions, so having that green, neutral-colored label considerably influences consumer behavior. For meat and poultry, the USDA defines “natural” as containing no artificial ingredients or added colors and being only minimally processed. Unfortunately, there is no regulated definition of “natural” for all other products – hence the marketing exploitation and further confusion. Below are just a few assumptions consumers make about what “natural” products are free of, and whether that really matters:

Free of Preservatives: Preservatives help delay spoilage, improve quality, and decrease food waste. They decrease the risk of food-borne illness, slow oxidation, and keep us from worrying about things like getting tuberculosis from our milk. Consumers often fear ingredients that have chemical-sounding names; however, lest we forget, we are made of chemical compounds! Many preservatives are harmless and some are even nutritious, like ascorbic acid (vitamin C), alpha-tocopherol (vitamin E), calcium propionate, niacin (vitamin B3), lysozyme, and tertiary butylhydroquinone (TBHQ). Other preservatives, however, may have questionable effects on health when consumed in high doses, so more research is needed on their safety.

No Antibiotics Ever: This term’s tricky. For a long time, many farmers used antibiotics not just to treat sick animals but also to promote growth. The FDA has since banned their use for growth promotion, and sales of antibiotics for food animals have fallen considerably. However, sick animals still need treatment, and withholding antibiotics from them would be unethical and pose a risk to food safety. So, here’s the deal with the label: Farm A raises a “No Antibiotics Ever” flock; when one of its chickens gets sick and is treated with antibiotics, that bird is removed from the antibiotic-free group for sale (and who knows what that means). Farm B treats its sick chicken with antibiotics, then puts the bird through a withdrawal period and tests it before it can be processed, often with the oversight of a licensed veterinarian. Only Farm A can use the “No Antibiotics Ever” label. Is Farm A’s chicken healthier than Farm B’s? Probably not.

No Hormones Added: Fun fact: adding hormones or steroids to poultry and pork is illegal in the U.S. Just like tomatoes with a Non-GMO label, chicken and pork products with a “No Hormones Added” label are simply playing into consumer fears.

Free of High Fructose Corn Syrup (HFCS): Great! But keep in mind that sugar, molasses, agave nectar, cane juice, and honey are “natural” sources of added sugars too. HFCS is essentially a mix of fructose, glucose, and water. It contains either 42% fructose (often found in processed foods) or 55% fructose (often found in soft drinks) – not too different from table sugar’s 50:50 mix or your $10 organic agave nectar.

 

Chicken breast labeled “no antibiotics,” non-GMO, and organic (Source: target.com)

Conclusion: Fear Mongering Isn’t Helping

When it comes to promoting healthy eating behaviors, fear tactics aren’t helping and may even be harmful. Unlike tobacco or drug use, two issues where fear campaigns were successfully used to impact behavior, we need to eat to live. Instilling unnecessary anxiety about foods that are not Non-GMO, gluten-free, certified organic, or “free from” whatever may keep us from consuming a nutritious, well-balanced diet.

Unfortunately, the U.S. hasn’t learned its lesson from the anti-fat and anti-cholesterol era: we continue to look for something simple to blame for health problems, and the media and food industry continue to take advantage of that desire. Moderation just isn’t sexy.

Whether it’s the latest one-dimensional diet, a food blogger’s recent witch hunt, or a misleading food label in an earthy color tone, fear-induced messages are not helping. They are harming consumer knowledge, self-efficacy, health, and, ultimately, trust in the food industry and nutrition science. It’s time to stop the food fear mongering and encourage the good in foods that will lead to our “natural” wellbeing.

 

Megan Maisano is a second year NICBC student and an RD-to-be. She has a Wheat Belly and a Grain Brain, but is doing okay. She’s got no beef with Non-GMO, Gluten-free, or Organic products, only their use in scare-tactics that aren’t based in science.

From Blue to Green, and Everything in Between: The Evolution of Saint Patrick’s Day

by Megan Maisano

Saint Patrick’s Day—when wearing green, eating corned beef and cabbage, and drinking beer has nothing to do with Saint Patrick himself. This month, Megan Maisano explains the history behind the holiday and the American influence on its evolution and popularity.

Somewhere behind the green shamrocks, the Kiss Me, I’m Irish attire, the corned beef, and the pints of Guinness lies the fascinating history of Saint Patrick’s Day. The holiday, which began as a religious observance for the patron saint of Ireland in the early 10th century, has evolved into an international celebration of Irish culture.

But when it comes to the modern practices we often associate with the holiday … they may as well be as Irish as Saint Patrick himself (hint – he’s not Irish).

Saint Patrick (Photo: history.com)

History of Saint Patrick

The story of Saint Patrick dates back to the fifth century. Originally named Maewyn Succat, the saint was born to a wealthy family in Britain. When he was a teenager, he was taken as a slave to Ireland and put to work as a shepherd. A religious experience inspired him to become a priest after his escape, and eventually return to the island as a missionary. Legend has it that Saint Patrick banished all the snakes from Ireland. While that may sound impressive, truth-be-told there were never any snakes on the island to begin with. The story is often used as an allegory to explain how he converted the Irish from Paganism to Catholicism. March 17th marks the date of Saint Patrick’s supposed death and has remained a holy day ever since.

American Influence

Until the 1700s, Saint Patrick’s Day was a holy, and quite somber, day for Irish Christians. But as more Irish immigrated to the U.S., particularly during the Great Potato Famine of the mid-1800s, the holy day became a time of connection among Irish immigrants and an outlet to celebrate their shared history. Irish organizations and societies arose, and in 1848, New York City held the first official Saint Patrick’s Day Parade. What was once a holy day of obligation slowly transformed into a patriotic one-day reprieve from Lent, allowing indulgences like meat, alcohol, music, and festivity. Today, Saint Patrick’s Day is one of the most globally celebrated national holidays.

Saint Patrick’s Day: Things Explained

The Wearing Of Green (Lyrics: irishmusicdaily.com)

Green Everything

Up until the 18th century, the color associated with the Order of Saint Patrick was actually blue. The significance of green stems from the supporters of the Irish Rebellion of 1798, who wore it to set themselves apart from the British in red and to show their loyalty to the native Irish shamrock.

 

Shamrocks (Photo: Pixabay)

Shamrocks

A popular legend about Saint Patrick is that he used the shamrock to symbolize the Holy Trinity and win over Irish converts. Each leaf of the native clover represented the Father, the Son, and the Holy Spirit. The shamrock became an even greater symbol of Irish nationalism when it was worn during the 1798 rebellion.

 

Corned Beef and Cabbage

Unfortunately, this Saint Patrick’s Day dish did not originate in Ireland. For hundreds of years, the meal had no ties to a specific country; rather, it was a practical dish for many European immigrants in the U.S.

The term “corned beef” is British slang for using corn-sized salt crystals to cure meat. In the U.S., few Irish immigrants could find or afford the bacon they grew up with. Instead, they purchased meat from their Jewish neighbors. The kosher butchers made more affordable corned beef from brisket. As a result, the Irish swapped their traditional meal of boiled bacon and potatoes for corned beef and cabbage. Why cabbage? It was one of the cheapest vegetables available.

While the sweet and salty dish gained popularity in immigrant neighborhoods throughout the 19th century, its association with the Irish might not have stuck if it weren’t for George McManus’ popular American comic, “Bringing Up Father.” The comic featured Jiggs and Maggie, an Irish couple who win the lottery and become millionaires. Even as a millionaire, Jiggs retains his fondness for playing cards, drinking, smoking, and, of course, eating corned beef and cabbage – a portrayal that became an influential stereotype of Irish immigrants and outlasted the comic strip’s run.

In the mood for this hearty Irish(ish) dish? Try this Genius Kitchen recipe for a crowd-pleasing meal this Saint Patrick’s Day. You’ll get a hefty serving of protein, vitamin B-12, iron, and zinc, as well as vitamin C (boosting that iron absorption), vitamin K, and fiber. The cabbage may even improve your running performance! Bonus—Guinness beer is one of the ingredients, so you can still technically call it Irish.

Soda Bread

Did you know that there is a Society for the Preservation of Irish Soda Bread? Well there is. And just like corned beef and cabbage, soda bread cannot be credited to the Irish either. Sigh. The dish, which is a variety of quick bread that uses baking soda instead of yeast, traces back to Native Americans who used pearl ash (potassium carbonate) to make bread on hot rocks. The bread became associated with the Irish because of its use during the Great Potato Famine. The recipe required few ingredients, didn’t take long, was dense and hearty, and reduced food waste.

Need a simple, go-to, quick bread recipe? This epicurious recipe for Irish soda bread can be made by even the most novice baker. The added raisins (or Craisins, your choice) can boost its fiber content too.

Beer

While alcohol is often tied to religious feasts, it wasn’t until the 1970s that pubs in Ireland were open on Saint Patrick’s Day. As mentioned before, the holiday falls during Lent – a period of Christian repentance. According to a New York Times journalist, Dublin was once “the dullest place on earth to spend St. Patrick’s Day.” But because of the economic and tourism opportunity, Ireland adopted the American way, and the rest is drunk history.

Are you a fan of statistics? Then grab yourself a Guinness and learn about how the Student’s T-test was created. Yes, there was Guinness involved.

If baking is on your mind more than confidence intervals, grab yourself a Guinness AND some Baileys, and make these unforgettable Saint Patrick’s Day cupcakes with this Tide & Thyme recipe (my mother and I made these a few years ago, and I still dream of that Baileys cream frosting).

Conclusion: Who Cares?

Alright, so many of our beloved Saint Patrick’s Day traditions may not necessarily trace back to Ireland or to the saint himself. But at the end of the day, who cares? Saint Patrick’s Day offers a nostalgic opportunity for people from all over the world to come together and be Irish for a day. So, go ahead and proudly rock that green t-shirt on March 17th; chances are you won’t be alone.

Megan Maisano is a second year NICBC student and an RD-to-be. Irish by blood and frugal by school, cabbage is a staple food in her fridge. In 2012, she earned her Perfect Pint Pour Certificate from the Guinness factory in Dublin and is available for individual lessons.

References:

  1. Allan, Patrick. The Real History of St. Patrick’s Day. Lifehacker.com. March 2017. Internet: https://lifehacker.com/the-real-history-of-st-patrick-s-day-1793354674 (accessed February 2018).
  2. Binchy, Maeve. A Pint for St. Patrick in the New Ireland. The New York Times. March 2001. Internet: http://www.nytimes.com/2001/03/17/opinion/a-pint-for-st-patrick-in-the-new-ireland.html (accessed February 2018).
  3. Binder, Julian. The ‘historical’ Saint Patrick. Approaching the Life of Ireland’s Patron Saint. GRIN Verlag. Norderstedt, Germany. 2015.
  4. Cronin, M. and Adair, D. Wearing of green: A history of St. Patrick’s Day. Routledge. London. 2002.
  5. History.com. The History of Saint Patrick’s Day. 2009. Internet: http://www.history.com/topics/st-patricks-day/history-of-st-patricks-day (accessed February 2018).
  6. O’Dwyer, Edward. The History of Soda Bread. The Society for the Preservation of Irish Soda Bread. Internet: http://www.sodabread.info/history/ (accessed February 2018).
  7. Ruby, Jeff. Even the Irish Hate Corned Beef and Cabbage. The Daily Beast. March 2017. Internet: https://www.thedailybeast.com/even-the-irish-hate-corned-beef-and-cabbage (accessed February 2018).

Evaluating the Pinnertest: The Importance of Scientific Evidence

by Erin Child

So, you think you have a food intolerance? What do you do? You could call your doctor and set up an appointment that is inevitably months away, have a 10-minute meeting in which they only look at their computer and refer you to a specialist, THEN go through more testing, and finally (hopefully!) get some answers. Or, you could order an at-home kit that takes 10 minutes to complete and promises results that will get you feeling better, sooner. Which one do you choose? Read on and decide.

In our current world of food intolerances and hypersensitivities in which the best path to treatment is often a conundrum, the Pinnertest promises an easy solution to any dietary woes.  A few months ago, I started noticing ads for this new test popping up on social media. The Pinnertest is an over-the-counter food intolerance testing kit that uses microarray technology to test for IgG (Immunoglobulin G) mediated sensitivities for 200 common foods.

The classic manifestations of true food allergies (hives, oral discomfort, trouble breathing, anaphylaxis, etc.) are mediated by overproduction of IgE antibodies. Like IgE, IgG is a type of antibody; in fact, it is the most common antibody in the human body. (The immune system releases five types of antibodies: IgA, IgE, IgG, IgD, and IgM.) Instead of testing IgE-mediated allergies, the Pinnertest producers claim that microarray technology allows them to test for IgG-mediated intolerances to 200 different foods—including lettuce, quail, and baking powder—using only a few drops of blood. It sounds scientific, but it also seemed too good to be true. Was it?

I started my research by reaching out to the Pinnertest folks directly. My goal? To score a pro-bono test to try it out myself and see the results first hand. I was thrilled when a friendly representative at Pinner immediately reached out to set up a phone interview (calendar evite and everything). When the day came, I called—and was sent to voicemail. Twenty minutes and five tries later, I knew I had been ghosted. My subsequent emails were ignored, and my quest to learn first-hand about the scientific evidence backing their product was squashed.

So, I began researching on my own. The Pinnertest website sports a cluttered page of medical study citations covering food allergies, intolerances, and celiac disease—but none of them provides evidence for using IgG testing to detect food intolerances. My own PubMed search [IgG + food intolerance; Immunoglobulin G + food intolerance] yielded little, but did include one recently retracted 2016 article linking IgG testing to food allergies. The rest of the Pinnertest website leads you down a rabbit hole of B-list celebrity endorsements and every Friedman student’s favorite—Dr. Oz videos! Interestingly, nowhere on the site does it tell you the cost of the test. To find out pricing, you must first enter your information (“Hi, my name is Newt Trition”) before you discover that the test will run you a whopping $490.

To further explore whether this test has any scientific merit, and whether it is worth the hefty price tag, I reached out to the Boston Food Allergy Center (BFAC). Dr. John Leung, MD, founder and CEO of the BFAC, Director of the Food Allergy Center at Tufts Medical Center, and Co-Director of the Food Allergy Center at Floating Hospital for Children, took some time out of his day to answer my questions. Dr. Leung said, “We have patients coming into our office on a weekly basis with that kind of report [IgG], who pay out of pocket no matter what insurance they have. [Insurance doesn’t cover the test] because there is a statement from both the American and European Societies for Allergy saying that this test has no clinical significance.”

This is something to consider in any area of medicine—if a test is not covered by insurance, it may be the first sign that clinical significance could be lacking.

My conversation with Dr. Leung was brisk, informative, and confirmed my gut reaction that this test was too good to be true. Furthermore, there is a body of literature providing evidence that IgG-mediated reactions are a sign that a food intolerance does not exist, not the other way around. In a 2008 European Academy of Allergy and Clinical Immunology (EAACI) Task Force report, the authors wrote, “food-specific IgG4 does not indicate (imminent) food allergy or intolerance, but rather a physiological response of the immune system after exposition to food components.” Simply put, IgG evidence can show that you’ve been eating that food, not that you are intolerant to it. The EAACI has been joined by its Canadian, American, and South African counterparts in clear statements that research does not support the use of IgG-mediated testing for food intolerances at this time.

Having shadowed Dr. Leung at the BFAC, I know that he takes patients’ claims of food intolerances seriously, and is invested in using the best clinical practices and scientific evidence available to make the diagnosis. Concerning IgG-mediated testing, he stated, “There is so little research, so from a clinical view it is not very useful, it doesn’t mean much. It is not diagnostic.” And yet, the Pinnertest website claims that the “Pinnertest is a common procedure in most European countries. In many cases, dietitians and nutritionists will ask for their client’s Pinnertest results before creating any kind of diet plan.” Since this approach directly contradicts the current EAACI recommendation, that’s highly unlikely.

I also had the opportunity to speak with Rachel Wilkinson, MS, RD, LDN, and Practice Manager of the BFAC. Rachel explained, “If patients come in concerned about food intolerances, we can do the hydrogen breath test for lactose, fructose or fructan [found in some vegetables, fruits and grains]. These are the three main ones [food intolerances] we can test for, because we actually have tests for those.” She went on to state, “What was interesting to me about the Pinnertest, was how they say they can specify one specific food–so not just a category. I honestly don’t understand how they would pinpoint a specific food. It makes more sense to me to listen to patient’s histories and to look at how their intestines are able to break down that particular group of sugars. So, I really would love to know how they [Pinnertest] are coming up with this.”

It is important to note that the Pinnertest is not just marketing itself as a food intolerance test. It is also presenting itself as a weight loss tool. Current Frances Stern dietetic intern and master’s candidate Jocelyn Brault, interning at BFAC, indicated her concern: “I think this is also being marketed for weight loss, which you can see throughout their website. This is usually a good sign that we should dig deeper. Is this a proven weight loss method? This claim seemed out of nowhere to me.” Indeed, directly on the Pinnertest box it reads, “Discover which foods are making you sick or overweight.” If taken seriously, this test can lead to unnecessary dietary restrictions, and to potential malnutrition if too many foods are eliminated. Rachel Wilkinson, RD, noted, “if you’re going to be avoiding certain types of foods, you need to make sure your diet is still adequate. We do not want to see people over-restricting foods for no reason.”

Over the course of my research and conversations with Dr. Leung, Rachel, and Jocelyn, I confirmed that my initial gut reaction was correct: too good to be true. And here’s the kicker: so does Pinnertest itself. In a tiny disclaimer at the bottom of their website, they write: “Quantification of specific IgE antibodies to foods and inhalants is an FDA-accepted diagnostic procedure for the assessment of allergies. However, the assessment of human IgG antibodies specific for individual food and inhalant antigens is not an FDA-recognized diagnostic indicator of allergy.”

It is a noble task to try to design an allergy test that does not require you to doctor hop, or wait months for an appointment, but the scientific evidence needed to back up the Pinnertest is lacking. Perhaps one day this will shift, and the body of evidence will improve. In the meantime, however, anyone who thinks they might have a food intolerance (or food allergy) is best served by going to their clinician (and then a dietitian). This at-home kit promises a quick fix, but is really just an expensive, dangerous distraction.

Erin Child is a second-semester NICBC student in the dual MS-DPD program. She is fascinated by the science of food allergy and intolerances, and will probably keep writing about them until someone tells her to stop.  With two weeks left in the semester, she would really like a nap. Like right now.

Nutrition in a Nutshell: Lessons Learned as a Dietetic Intern

by Katelyn Castro

I was one of those few teenagers who knew exactly what I wanted to be when I grew up. Now, after four years of college and two years of graduate school combined with a dietetic internship, a career as a registered dietitian is not far out of reach. While my passion for nutrition has never dwindled over these last six years, my approach to nutrition has changed significantly.

Nutrition tips on the sidebar of Self magazine, an oversimplified nutrition lesson in a middle school health class, and a quick nutrition lecture from my pediatrician summed up my understanding of nutrition before entering college. Now—six years of coursework and 2000+ hours of dietetic rotations later—I not only know the nitty-gritty details of nutrition science, but have also learned some larger truths about nutrition that are not always talked about.

Beyond what you may read as you thumb through your social media feed, or even what you may learn from an introductory nutrition textbook, here are some of the lessons that I have acquired about nutrition along the way:

1- Nutrition is an evolving science.

First, let’s be clear that nutrition is a science that relies on concepts from biology, chemistry, anatomy, physiology, and epidemiology to study how nutrients impact health and disease outcomes. Understanding how diabetes alters carbohydrate metabolism allows people with diabetes to live without fear of dying from diabetic ketoacidosis or seizures due to unsafe blood glucose levels. Understanding how ulcerative colitis impacts mineral absorption and increases protein losses helps those with the condition manage nutrient deficiencies with adequate nutrition supplementation. These are only a few examples of the many ways our knowledge of nutrition science makes it possible to improve individuals’ health outcomes.

However, the more I learn about nutrition, the more I realize that the research still holds many unanswered questions. For example, previous nutrition guidelines, like those on when to introduce allergenic foods to children, are being questioned and overturned by more recent studies. Meanwhile, research on the gut microbiota is just beginning to uncover how one’s diet interacts with the gut microbiota through hormonal and neural signaling. Staying up to date on the latest research and analyzing study results with a critical eye has been crucial as new scientific discoveries challenge our understanding of nutrition and physiology.

Who would have thought a career in nutrition would require so much detective work?

 2- Food is medicine, but it can’t cure everything.

The fact that half of the leading causes of death in the U.S. can be influenced by diet and physical activity highlights the importance of nutrition for long-term health. Using medical nutrition therapy for patients with a variety of health problems, ranging from cancer and cardiovascular disease to cystic fibrosis and end-stage renal disease, has also allowed me to see how powerfully nutrition can impact the management and treatment of many health conditions. High cholesterol? Avoid trans fat and limit saturated fat. Type 2 diabetes? Adjust the timing and type of carbohydrates eaten.

While making simple changes to eating habits can improve lab values and overall health, nutrition is often only one component of treatment, accompanied by medication, surgery, therapy, sleep, and/or stress management. Interacting with patients of all ages and with all kinds of health problems, and working with health professionals from a range of disciplines, has forced me to step out of my nutrition bubble and take a more comprehensive approach to patient care: improving quality of life and overall health and wellbeing is always going to be more important than striving for a perfect nutrition plan.

3- Nutrition is political and nutrition messages can be misleading.

Back when the Academy of Nutrition and Dietetics was one of many health organizations sponsored by Coca-Cola and PepsiCo, I realized how much influence large food companies have on food advertising, marketing, and lobbying. Given the known health consequences of drinking too many sugary beverages, the concept of health organizations being sponsored by soda companies was perplexing to me. Learning more about the black-box process of developing the government dietary guidelines has also made me more cognizant of government-related conflicts of interest with industry that can color the way nutrition recommendations are presented to the public.

Industry-funded nutrition research raises another issue with nutrition messaging. For example, a study only recently revealed that sugar industry-funded research 50 years ago downplayed the risks of sugar, influencing the debate over sugar’s relative risks for years afterward. Unfortunately, industry-sponsored nutrition research continues to bias study results by highlighting positive outcomes, leaving out negative ones, or simply using poor study designs. While sponsorships from big companies can provide a generous source of funding for research, as both a nutrition professional and a consumer, I’ve learned to take a closer look at the motives and potential bias behind any industry-funded nutrition information.

4- Nutrition is not as glamorous as it sounds, but it’s always exciting.

When the media is flooded with nutrition tips for healthy skin, food for a healthy gut, or nutrients to boost mood, the topic of nutrition can seem light and fluffy. With new diets and “superfoods” taking the spotlight in health magazines and websites, it’s easy to think of nutrition as nothing more than a trend.

However, any nutrition student or dietitian will tell you otherwise. In the words of one of my preceptors, “my job [as a dietitian nutritionist] is not as glamorous and sexy as it sounds.” Throughout my dietetic rotations, my conversations with patients and clients have gone into much more depth than aesthetics and trendy nutrition topics. If I’m working with a patient with irritable bowel syndrome, bowel movements (a.k.a. poop) may dominate the conversation. If I’m counseling someone who has been yo-yo dieting, I may be crushing their expectations of fad diets while encouraging more realistic, sustainable health goals. If I’m speaking with a group of teenagers with eating disorders, I may not talk about nutrition at all and instead focus on challenging unhealthy thoughts and behaviors around food. It is these conversations, about what really matters when it comes to food, nutrition, and overall health, that make a career in nutrition ever-changing and always exciting.

Katelyn Castro is a second-year student graduating this May from the DI/MS Nutrition program at the Friedman School. She hopes to take advantage of her experiences at Tufts to make a positive impact on individuals’ health and wellbeing through community nutrition outreach. You can follow her journey as she blogs about all things food and nutrition at nutritionservedsimply.com.

 

 

Finding Common Ground for Nutrition in a World of Alternative Facts

by Rachel Baer

Rachel Baer tackles the implications of the “post-truth” culture for the nutrition profession and poses 3 questions to consider about our response to the unending barrage of nutrition-related “alternative facts.”

As a registered dietitian, I can tell you this: Nutrition professionals know a thing or two about alternative facts. We spend our careers with textbooks and scientific journals in hand, waiting for the next misinformed food fad to go viral. We fight to defend the facts because we have always believed that if we could show people what is true, we could convince them that we have the best answers for their nutrition-related questions. But the concept of truth is losing popularity.

The Oxford English Dictionary declared the term “post-truth” to be the 2016 word-of-the-year. Post-truth is defined as “related to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” Let that sink in for a moment: emotional appeals are more influential than objective facts. While this concept is alarming on many levels, I am particularly concerned about its implications for health professionals who rely on scientific truths as the basis of their credibility.

Don’t get me wrong. I understand the frustration people feel as they watch seemingly contradictory nutrition headlines emerge at the very hint of new research findings. One day people are told to limit egg consumption to 3 yolks per week; the next, the one-yolk-per-day allowance is back. However, as nutrition professionals, we have a certain appreciation for the fact that science is ever-evolving. We hold our recommendations lightly because we believe in a scientific community that is always growing, and that new discoveries only sharpen our understanding of nutrition and physiology. The public, on the other hand, does not always share this appreciation.

Confusion over wavering nutrition claims is exacerbated by the inundation of un-credentialed, unschooled voices clamoring for attention in popular media. Social media has provided a proverbial soapbox for anyone with a passionate message to share, regardless of qualifications. Simultaneously, dietitians tend to hold back on making bold retorts, often waiting for consensus to catch up with the fads so that our recommendations are supported with the latest research. This seeming imbalance of voices alongside the emergence of the post-truth culture only perpetuates the proliferation of unfounded claims, or “alternative facts,” as they have become popularly known.

I have no easy answers for this predicament, but here are 3 questions that we could benefit from exploring as nutrition professionals:

1. How do we remain experts while also being compelling?

Dietitians have long been referred to as the “food police.” While I resent this reputation, it highlights a worthy question: Do nutrition professionals present information in a way that is relatable, realistic, and winsome to the people whose respect we want to gain?

We can no longer depend solely on the letters after our names to gain an audience with the public, particularly when we are pitted against wayward blog and media influencers using sensationalized language to win over vast groups of people who blindly follow their passionate advice. The internet is full of examples of people preferring the advice of a persuasive friend or influencer over that of a knowledgeable professional. While this situation is endlessly frustrating to those of us who see through their hyperbolic messages, is there anything we can learn from these blog and media personalities that may help us reach the audience they seem to have hooked? How do we successfully build rapport with the public while maintaining good science?

2. How do we talk about fundamentals in a world that wants controversy?

Let’s face it. Fundamentals don’t make great headlines. For decades, consensus research has revealed that a diet full of minimally-processed fruits, vegetables, whole grains, nuts/seeds, lean proteins, and healthy fats is unequivocally and unanimously the best diet for human health. Yet, people still search elsewhere looking for the latest and greatest weight-loss, risk-reducing, and health-enhancing diets. Could it be that balance is more challenging than we thought? Perhaps avoiding certain food groups or food ingredients altogether is easier than the amorphous concept of moderation? Our greatest challenge is not getting more people to consume health information, it is finding new and compelling ways to deliver the information we’ve known for decades, and this is no small task.

3. How do we overcome differences within the nutrition profession to present a united front to people lost in the sea of alternative facts?

In 2014, David Katz and Walter Willett co-chaired a conference sponsored by the non-profit Oldways*, titled “Finding Common Ground.” Oldways and the co-chairs assembled what they referred to as “the dream team of nutrition experts,” including Friedman’s own Dariush Mozaffarian, as well as Dean Ornish, creator of the Ornish Diet; David Jenkins, founder of the glycemic index; Boyd Eaton, originator of the Paleolithic diet; T. Colin Campbell, author of The China Study; and a myriad of others. Known most commonly for their differences, this group of scientists gathered for the sole purpose of coming to a consensus on the basic tenets of a healthy diet. In the end, the group agreed on 11 common denominators of the widely differing philosophies they espouse. The topics ranged from fruit and vegetable consumption to sustainability to food literacy.

Following the conference, David Katz published an article in Forbes where he said “…it is the controversies at the edge of what we know that interest experts most, but ask [experts] about the fundamentals, and the vast expanse of common ground is suddenly revealed.” The Common Ground committee’s decision to gather around a table, invite open dialogue, and pursue unity is something we could all learn a lesson from. Alternative facts will always provide fodder for hucksters and peddlers of over-simplified nutrition information, but the scientific community has a vast body of research that unites us. As nutrition professionals, we cannot forget that our voices will always be more powerful together than they ever will apart.

Rachel Baer is a registered dietitian and a first-year in the NICBC program at Friedman. Her favorite foods are Brussels sprouts and brownies, and she loves nothing more than cooking great meals and gathering people around a table.

*Editor’s Note, 5/1/17  2:09 PM: An earlier version of this article incorrectly spelled the name of the organization, “OldWays.” The correct spelling is Oldways, and the change has been made above.

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not encourage the cycle of self-loathing that the “losing weight” resolutions tend to result in year after year.

Right alongside these posts, though, was an overwhelming amount of press extolling the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control are in the driver’s seat and could potentially steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets perfectly nutritious foods for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. And most bodies are perfectly capable of handling these foods. They provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies that show removing grains, beans and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in nut milks and frozen desserts; conflicting evidence has both suggested and refuted associations with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message a hard-and-fast, black-and-white “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a fairly widespread social media support system. There is plenty of research backing social support as a key to success in any major lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters had better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, I think that invites spending entirely too much brainpower thinking, worrying, and obsessing about food when it could be used in so many other unique and fulfilling ways. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat far more than you would have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of The Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is a superior choice. At the end of the day, this is a business, created by two Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causing inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice, reminds us that “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do foods allowed on the diet, such as almonds and other nuts and some vegetables. It is possible to reduce the phytate content of grains and legumes by soaking, sprouting, or fermenting them, but research from the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, the phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

The Whole30 eliminates legumes because some of their carbohydrates aren’t as well digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates and may experience severe digestive distress such as excessive gas, bloating, and constipation. Strategies such as the FODMAP approach are used with these individuals under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets and eliminate only those foods that cause distress. For everyone else, eliminating these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and they help feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing body of scientific evidence surrounding the microbiota.

For those without an allergy or intolerance, dairy has been shown to provide many benefits, including weight stabilization and blood sugar control, when incorporated into a balanced and varied diet. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting more in touch with food beyond a label and cutting back on added sugar and alcohol are things everyone should be encouraged to do. Focusing on cooking more from scratch, relying less on processed foods, and learning about how food influences your mood and energy levels are also habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not mean eliminating whole food groups like dairy, grains, and legumes. It should not have an end date; rather, it should be a lifelong journey focused on flexibility, moderation, and balance. Lower your intake of processed foods, sugar, and alcohol and increase the variety of whole foods you eat. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible, and fulfilling part of life for people from all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

The Dr. Oz Effect

by Julia Sementelli

With the beginning of the new year inevitably comes an onslaught of promotions and advertisements for miracle diets, detoxes, and supplements that vow to help you shed pounds, live longer, and more. And when you think of diets and supplements, most likely two words come to mind: “Dr. Oz.” He is a doctor, but he is also a registered dietitian’s worst nightmare. While dietitians are out there teaching patients and clients that weight loss cannot be healthfully achieved with a pill or a two-week “cleanse,” Dr. Oz is preaching the opposite. Read on for the inside scoop on how Dr. Oz further complicates the already messy, ever-changing world of nutrition and health, including an interview with the man himself.

A recent client of mine, Mark (name changed for privacy), eats a fairly healthy diet: Greek yogurt and berries for breakfast, a salad with lean protein for lunch, and something from the Whole Foods salad bar for dinner (he doesn’t like to cook). He says that his major downfalls are cookies and beer. Mark’s goal is to lose 30 pounds and improve his overall health given his family history of heart disease. “Give me a meal plan and I will follow it,” says Mark. I can work with that. He is actually a dietitian’s dream—someone who already doesn’t mind eating well and is motivated to lose weight. I thought his meal plan would be a breeze, until he said, “Oh—I should tell you about my supplements.” I had expected a multivitamin and some daily vitamin D, but my hopes were dashed as Mark rattled off more than 15 supplements that he is currently taking, only one of them being a multivitamin. Among these supplements were resveratrol, an antioxidant found in red grape skins that he claims takes years off your age, and Conjugated Linoleic Acid (CLA), which apparently melts body fat. When I asked Mark where he learned about all of these supplements, he said, “Dr. Oz.”

No two words can send angry chills up a dietitian’s spine quicker than “Dr. Oz.” While I am a fairly green registered dietitian, I have interacted with enough patients to see firsthand the power of Dr. Oz. Dr. Mehmet Oz started out as the resident expert on “The Oprah Winfrey Show” for five years before he was given his own spotlight, “The Dr. Oz Show.” He holds three degrees: a B.S. in biology from Harvard and an M.D. and M.B.A. from the University of Pennsylvania. He is vice-chairman of the department of surgery at the Columbia University College of Physicians and Surgeons in New York. He is also likeable. Consequently, he has become one of the most trusted doctors in the world, and yet he uses words like “magical” and “miraculous” to promote supplements that promise to burn fat or prevent cancer. What the public may not understand is that a pill is not a miracle cure for anything. According to Stephanie Clarke, registered dietitian and co-owner of C&J Nutrition in New York City: “Most MDs get very little (or zero) nutrition education and background—so it’s frustrating when they dole out nutrition advice or research without enough details or without thinking about how their messages will be interpreted by the public and related to real life eating.” But Americans continue to believe in the power of nutritional supplements recommended by a doctor who (most likely) has had minimal nutrition education and, more surprisingly, continue to buy them. In fact, Americans spent more than $21 billion on vitamins and herbal supplements in 2015. Analyses show that just the mention of a product on The Dr. Oz Show causes a surge in sales.

This phenomenon has been dubbed “The Dr. Oz Effect.” Combine charisma with a few letters after a name and you have someone who is more believable than the thousands of nutrition professionals who use science, not pseudoscience, to back up their recommendations. Even my own father, who has type 2 diabetes and an affinity for soy sauce (read: sodium), and who meets my attempts to improve his diet with stubbornness, listens to Dr. Oz. Meanwhile, I have gone through four years of undergraduate education in nutrition, a competitive application process for dietetic internships (50% acceptance rate), a one-year unpaid dietetic internship, a comprehensive registration exam, and an additional two years of graduate work to get to where I am. And yet I still don’t have the influence that Dr. Oz does to change my father’s food behaviors.

As a dietitian, I strongly believe in balance. It is my goal to reduce the all-or-nothing thinking that surrounds eating and exercise. The media and people like Dr. Oz perpetuate this mindset, capitalizing on the public’s obsession with weight loss and diets by highlighting drastic regimens and alleged cure-all supplements. Diets do not work because they typically deprive a person of entire food groups (fats or carbohydrates, for example), and eventually the individual gives in and eats those foods in excess after denying themselves for so long.

The demonization of food, another spawn of the media, is the belief that particular foods are inherently good or bad. It has resulted in mass confusion and further damage to people’s relationship with food. One of the most infuriating examples of this demonization is fruit. Yes, fruit. “I heard that the sugar in fruit is bad for you” or “I was told not to eat pineapple because it is high in sugar” are actual quotes that I have heard from clients. And not surprisingly, both clients attributed their beliefs to Dr. Oz. After some research, I discovered that, lo and behold, Dr. Oz did a segment titled “Can the Sugar in Fruit Make You Fat?” that most likely influenced these beliefs. Aside from vegetables, fruit is one of the most wholesome food groups, packed with fiber, antioxidants, vitamins, and minerals. Yet fruit cannot even avoid falling victim to the war on food. Conundrums like this exist for nearly every food: eggs, fish, coffee, potatoes…the list goes on. The only way to try to reverse the damage is to tell people that no food is off limits and remind them that there is no replacement for good eating and regular exercise. The only way that I have seen weight loss occur is through gradual and sustainable changes over time. And anyone who promises anything different is lying or, worse, using pseudoscience to make outrageous claims.

Pseudoscience, the basis upon which Dr. Oz has constructed his lucrative empire, involves exaggerated and often contradictory claims that are not supported by reputable research. The media is also a culprit, composing articles and news stories from press releases of studies with small sample sizes or that use mice as their subjects. Just because something is effective or safe in mice does not mean it will be safe for humans. Many writers for tabloids and mainstream magazines are stretched for time and are more concerned with quantity than quality, given that their main goal is to make headlines that sell papers and magazines. Unfortunately, such writers and apparent health experts like Dr. Oz produce the majority of what the general public sees and uses to shape its food choices. However, according to a study published in the BMJ in 2014, “consumers should be skeptical about any recommendations provided on television medical talk shows, as details are limited and only a third to one half of recommendations are based on believable or somewhat believable evidence.” That’s right: half or more of what Dr. Oz claims on his show regarding nutrition is not based on sound science. While the show has seen a dip in ratings, 1.8 million viewers still tune in to The Dr. Oz Show and are consequently exposed to information that, per that same 2014 BMJ study, lacks believable evidence 50 to 67 percent of the time.

Dr. Oz has been criticized by a slew of medical professionals for promoting scams, most notably in 2015, when ten physicians wrote a letter to the dean of health sciences at Columbia University requesting that Dr. Oz be removed as a faculty member due to his “egregious lack of integrity” on his TV show. Dr. Oz defends what he tells the public by claiming that “it’s not a medical show,” despite the fact that the show is titled The Dr. Oz Show. Dr. Oz says that freedom of speech gives him the right to say what he wants. But it is difficult to respect that freedom when a faculty member at a prestigious university is making false claims on TV.

I reached out to the Dr. Oz team and received a response from Oz himself. When asked where he finds his nutrition information, he said, “We obtain nutrition information from a wide variety of sources. We rely heavily on literature published in scientific journals as well as textbooks. In addition we consult a wide variety of experts, including medical doctors and nutritionists. Our research staff is made up of myself, a physician trained in preventive medicine, as well as 3 medical students who take a year off to work with us. We evaluate all of the content on our show to ensure that viewers are getting accurate information. One of our researchers this year has a master’s degree in nutrition as well.” I am not sure which scientific journals Dr. Oz and his team are using, but when I researched “curcumin” and “oil of oregano,” two of the supplements that Dr. Oz has promoted on his show and that Mark, my client, is currently taking, the conclusion was that “the existing scientific evidence is insufficient to recommend their safe use.” In our interview, Dr. Oz said: “We also reach out to the Friedman school when we have difficult questions. I spent a day up at the school this summer meeting with a number of your faculty. Most recently I have spoken to an expert about fiber fortified foods and to your Dean about the current opinions on dietary fats.” He included a note saying that he and his team welcome interns every month from September to June, and that students from Friedman are welcome to apply. *Insert eye roll*

When I asked about Dr. Oz and his team’s stance on nutritional supplements, he replied: “In general we believe that many have a place in people’s life to enhance nutrition. We always love to see more and better studies conducted on the utility of supplements in promoting health.” This is a nice response, but when I begrudgingly watched a clip from The Dr. Oz Show in which he says that Conjugated Linoleic Acid (CLA) can help burn body fat, even without diet and exercise, I realized that what he says and what he does do not match. Aside from making empty promises and putting people at risk with questionable pills, he is encouraging people to waste their money. This is what I told Mark in an effort to curb his daily supplement cocktail. If the risk of taking his favorite “fat-melting” supplement won’t stop him, maybe the opportunity to save money will.

Dr. Oz is frustrating for many reasons, but for nutrition professionals the biggest is that he uses his credentials as a physician to get away with promoting pseudoscience. Being a dietitian no longer involves simply telling people what to eat. It means untangling the web of misinformation surrounding nutrition that clients have woven over the course of their lives and re-teaching them what a healthy relationship with food should look like. While turning to supplements can seem like an easy fix, the science shows that a diet based on whole foods like fruits, vegetables, whole grains, lean protein, and healthy fats is ideal. Science does not show that a pill is the secret to losing those last five pounds that keep hanging on. If scientists had really found a cure for obesity, we would not be hearing about it at 4 p.m. on a Tuesday afternoon. And unfortunately, the supplement industry is not going anywhere. The FDA and FTC regulate it, but not very well. So it is up to trained and licensed nutrition professionals (i.e., registered dietitians) to educate the public about the dangers of supplements and of listening to self-proclaimed “health experts.”

Julia Sementelli is a second-year Nutrition Communication & Behavior Change student and Boston-based registered dietitian who works in a local hospital and also counsels private clients.  You can find her on Instagram (@julia.the.rd.eats- Follow her!) where she strives to intercept confusing nutrition messages from self-proclaimed health experts with expert nutrition advice and tips (as well as some beautiful food photos if she does say so herself!).