Balance, Variety, and Moderation: What Do They Really Mean?

by Katelyn Castro

Balance, variety, and moderation have been referenced in the Dietary Guidelines for Americans for decades. Yet over time, the ambiguity of these terms has clouded their importance and left their meaning open to interpretation—often misinterpretation.

“Everything in moderation.”

“It’s all about balance.”

“I eat a variety of foods… well, a variety of ice-cream flavors!”

These words are often used to justify our food choices or to make us feel better when our diet is not 100% nutritious. Not anymore! Instead of using these words to rationalize our eating habits (which is completely unnecessary and counterproductive), let’s talk about how these nutrition concepts can be interpreted with a more intuitive approach to healthy eating.

Variety

Fruits and vegetables are usually the food groups that we focus on when we talk about variety in our diet. However, variety is encouraged within all the major food groups and among the food groups.

Besides making meals more colorful, eating a variety of fruits, vegetables, dairy, proteins, and grains provides a wider range of vitamins, minerals, antioxidants, prebiotics, and probiotics—keeping our heart, mind, skin, eyes, and gut functioning optimally. Varying protein with a combination of eggs, dairy, legumes, grains, and nuts is especially important for vegetarians to receive adequate amounts of all essential amino acids.

In addition to the benefits of variety at the biochemical level, a varied diet can also make eating more satisfying and flexible. While it can be easy to rely on your food staples for meals, introducing new ingredients can bring attention back to the flavor and enjoyment of eating, preventing you from eating on autopilot. Swap out an apple for a grapefruit or peach; have turkey or fish in place of chicken; substitute barley or quinoa for pasta. Choosing local and seasonal foods will also keep your diet varied throughout the year. Giving yourself permission to eat a variety of foods within all food groups can be freeing, helping you overcome rigid eating habits and food rules and appreciate the range of foods that satisfy your hunger and cravings.

Photo credit: https://stocksnap.io

Moderation

Sweets, fatty meats, fried food, fast food, soda… these are all foods recommended to “eat in moderation,” or limit, in some cases. Whether it is unwanted weight gain or increased risk of type 2 diabetes, the negative health effects of eating excess added sugars and solid fats have been identified in the literature. However, cutting out sugary and fatty foods completely can be just as damaging for our emotional health, leaving us disconnected from friends and family and preoccupied with thoughts about food. Food is a huge part of our culture; it’s social, celebratory, and meant to be enjoyed in good company. That’s why moderation—not restriction or overindulgence—is the secret to healthy, happy eating habits.

But what does moderation really mean? Technically, the most recent dietary guidelines recommend limiting added sugars to less than 10% of total calories per day, saturated fat to less than 10% of total calories per day, and trans fat to as little as possible. Realistically, this may translate into having more added sugars one day (e.g., when you’re eating cake at a family birthday party), and more saturated fat another day (e.g., when you eat pizza with friends on a weekend).
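For readers who like concrete numbers, here is a back-of-envelope sketch of what those percentage limits work out to in grams. It assumes a 2,000-calorie reference diet (adjust for your own needs) and uses the standard energy densities of roughly 4 kcal per gram of carbohydrate and 9 kcal per gram of fat:

```python
# Rough gram equivalents of the "<10% of calories" guideline limits.
# Assumes a 2,000-calorie reference diet; illustrative arithmetic only.

def daily_limits(total_kcal):
    sugar_kcal = 0.10 * total_kcal            # <10% of calories from added sugars
    sat_fat_kcal = 0.10 * total_kcal          # <10% of calories from saturated fat
    return {
        "added_sugar_g": sugar_kcal / 4,      # carbohydrate: ~4 kcal per gram
        "saturated_fat_g": sat_fat_kcal / 9,  # fat: ~9 kcal per gram
    }

limits = daily_limits(2000)
print(limits)  # roughly 50 g added sugar and 22 g saturated fat per day
```

On a 2,000-calorie diet, that works out to about 50 grams of added sugar (around 12 teaspoons) and about 22 grams of saturated fat per day.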

Moderation is about being open to day-to-day variations in your diet depending on your appetite, cravings, and activity level. Sometimes a big bowl of ice cream is just what you need to satisfy your sweet tooth; other times a small square of chocolate may be enough to keep sweet cravings at bay. Savoring the flavor of sugary and fatty foods and becoming aware of how your body responds can help you determine what “eating in moderation” means for you.

Photo credit: https://stocksnap.io

Balance

Out of all three of these terms, balance probably has the most interpretations. A balanced diet is often defined as a balance of protein, carbohydrates, and fat within the Acceptable Macronutrient Distribution Ranges set by the Institute of Medicine. A balanced meal, on the other hand, refers to a balance of food groups consistent with MyPlate or Harvard’s Healthy Eating Plate: fill half your plate with fruits and vegetables, one fourth with lean protein, and one fourth with whole grains. Together, creating a balance of food groups and macronutrients can make meals and snacks more filling (from protein and fiber) and provide more sustained energy (from carbohydrates in whole grains, beans, fruits, and vegetables).

Beyond balance within our food choices, energy balance looks more broadly at the balance between energy intake (calories from food) and energy expenditure (calories used for exercise and metabolic processes). Energy balance is associated with weight maintenance, while energy imbalance can contribute to weight loss or weight gain. However, this concept is often oversimplified, because energy expenditure cannot be precisely calculated: many factors like stress, hormones, genetics, and gut microbiota (bacteria in our digestive tract) can alter our metabolism. For example, chronic stress can lead to high levels of cortisol, which signal the body to store fat, contributing to weight gain. In contrast, a diverse composition of gut microbiota may enhance metabolism and promote weight loss, according to preliminary research.

Considering the multiple factors influencing our metabolism, listening to our bodies’ hunger and fullness cues can often guide food intake better than relying on calculated formulas and food trackers. Creating balance, variety, and moderation in our diets can help us meet our nutritional needs and achieve energy balance, while preserving the joy and connection that food brings to our lives.

Photo credit: https://stocksnap.io

Katelyn Castro is a second-year student in the DI/MS Nutrition program at the Friedman School. She’s a foodie, runner, and part-time yogi on a mission to make healthy eating easy, sustainable, and enjoyable. You can find her thoughts on all things relating to food and nutrition at nutritionservedsimply.com

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not feed the cycle of self-loathing that “lose weight” resolutions tend to produce year after year.

Right alongside these posts, though, was an overwhelming amount of press exonerating the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could potentially steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets foods that are perfectly nutritious for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. Yet most bodies are perfectly capable of handling these foods, which provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 also eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies showing that removing grains, beans, and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted an association with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a pretty widespread social-media support system. There is plenty of research to back up social support in any major lifestyle change as a major key to success. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning about cooking, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, I think it invites people to spend entirely too much brainpower thinking, worrying, and obsessing about food, brainpower that could be used in so many other unique and fulfilling ways. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is to buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

It is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat much more than you might have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of The Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is indeed a superior choice. At the end of the day, this is a business, created by Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causing inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice reminds us that, “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts and some vegetables allowed on the diet, like almonds. It is possible to reduce the amount of phytates in an eaten food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. The concept of getting more in touch with food beyond a label, and of reducing added sugars and alcohol, is a good one that everyone should be encouraged to embrace. Focusing on cooking more from scratch, relying less on processed foods, and learning about how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not eliminate whole food groups like dairy, grains, and legumes. It should not have a time stamp on its end date, and rather, should be a lifelong journey focusing on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible and fulfilling part of life for people in all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram-worthy oatmeal, or, if you’re in grad school, it’s part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason why I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. And there is now an EpiPen® in our kitchen. Just as they are on some airlines, peanuts are now banned from the general living areas of my house, and thus I and my beloved jar of peanut butter have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been informed not to consume any peanut products on your flight. Peanut allergy rates in children in the United States have quadrupled, from less than 0.5% in the late 1990s to about 2% today, and peanut allergies are the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day, some peanut butter dripped from my sandwich to the seat. I panicked, thinking, “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

Coupled with my new roommate’s arrival, I was fascinated to find that peanut allergies have been back in the news. On January 5th, 2017, the National Institute of Allergy and Infectious Disease (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants with severe eczema and/or an egg allergy should be introduced to peanuts between 4 to 6 months. Medium-risk infants with mild eczema should be introduced to peanuts by 6 months, and low-risk infants without eczema or other allergies can be introduced to peanuts any time after they have been introduced to solid foods.

These guidelines fit in with the dual-allergen exposure hypothesis, which suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because their immune system has had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens, which increases the chance of an allergic reaction once they eat the food. Current research supports this hypothesis, and also suggests that shortening the time between skin exposure and ingestion will reduce the number of acute allergic reactions. The sooner an infant starts eating an allergen, the more likely the body will adjust to it without having time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests done on hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, it appears that this advice was ill-advised. My roommate, n=1, was born in the mid-1990s, when delaying peanut exposure was coming into vogue. She had severe eczema as an infant, and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated by the new research, and wishes there were some way to know whether the outcome would have been different had she tried them at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack, called Bamba, as one of their first foods. In many developing countries, too, infants are exposed to peanuts early on—both in their environment and in their food. These countries also have much lower allergy rates.

In 2015, NIAID funded the Learning Early About Peanut Allergy (LEAP) study to determine whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups (based on skin prick test results for peanuts) and then randomized to either eat or avoid peanuts until 60 months old (5 years). For infants in the negative skin prick test group, 13.7% of those who avoided peanuts had developed an allergy, while only 1.9% of those who ate peanuts developed an allergy (P<0.001). For infants in the positive skin prick test group, 35.3% of those who avoided peanuts had developed an allergy, compared with 10.6% of those who ate peanuts (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
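To make those figures concrete, here is a quick sketch of the relative risk reductions implied by the LEAP percentages reported above (illustrative arithmetic only, using the rates as published):

```python
# Allergy rates reported in the LEAP trial, as percentages of each arm.
leap_results = {
    "negative skin prick test": {"avoided": 13.7, "ate": 1.9},
    "positive skin prick test": {"avoided": 35.3, "ate": 10.6},
}

def relative_reduction(avoid_pct, eat_pct):
    """Percent drop in allergy rate among early eaters versus avoiders."""
    return (avoid_pct - eat_pct) / avoid_pct * 100

for group, r in leap_results.items():
    print(f"{group}: {relative_reduction(r['avoided'], r['ate']):.0f}% relative reduction")
```

Early peanut eaters saw roughly an 86% relative reduction in allergy in the negative skin prick group and about a 70% reduction in the positive group, which is why the trial was considered so striking.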

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. For now, research on reducing peanut allergies in adults is limited, making it unlikely that we will be eliminating any allergies anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!

Coming Back to Common Sense

by Danièle Todorov and Delphine Van Roosebeke

Ever wish the question of what to eat could be, well, simple? In an interview with cardiologist Dr. Jacques Genest, we discuss themes in “common sense nutrition:” the research behind it, the barriers to adherence, and its evolving definition.

New trends in popular nutrition seem to pop up every day. This fervor for novelty has distracted us from what Dr. Jacques Genest simply calls “common sense nutrition.” Dr. Genest is a clinician in cardiovascular disease at The Research Institute of the McGill University Health Centre and a former researcher at the HNRCA. We had the pleasure of speaking with him last November during the 5th International Symposium on Chylomicrons in Disease. (For brevity and clarity, the questions from our original interview have been paraphrased.)


From left to right: Delphine, Dr. Genest, & Danièle

Q: Supplements are immensely popular and it looks like they are here to stay. Is this frustrating to you as a practitioner?

I’m old enough to have given up. What I tell my patients is that I have no trouble with vitamin supplements, but nutrition will be far more imperative. I tell my patients to purchase [fish oils] in the original container. In other words: Eat fish. And to have a good diet as recommended by a food guide—fruits, vegetables, and no added salt. They are simple recommendations people love to forget.

Q: Such as?

Take a 46 year-old, blue-collar working male. He comes home and he will tell you that a nice piece of meat with a potato, brown gravy, and salt is like the elixir of the gods. If you put in front of him a regular salad with endives, he will not like that. So how do you change a mindset in which the palatability of food gives so much pleasure?

Q: As we have seen in the course Macronutrients [NUTR 370], there is a link between the carbohydrate intake and lipogenesis [the metabolic formation of fat]; however, there are still many people who put emphasis on minimizing dietary fat. Do you agree?

From a public health perspective, I think maybe it’s not as relevant as caloric intake. I have some patients that come back from France and they apologize because they’ve been eating some Camembert and some foie gras. I say, look, your lipids have never looked better. I think it’s portion size far more than anything else. Compare a steak that you would get in Europe—you’d get about a 3 oz. steak. Here, you’d get basically a quarter of a brontosaurus. Now, I’m a huge believer in no saturated fat. I tell my patients, if you want to eat meat, eat meat that flies and that swims.

About thirty years ago, we went from a fat-rich diet to recommending a switch to carbohydrates. My personal impression is that this has been a huge mistake. The insulinemia you get with a high-carb diet is probably deleterious. Whereas a protein-rich, fat-rich diet is much more slowly absorbed, doesn’t produce hyperinsulinemia, and probably gives a better sense of satiety. I think we’ll look back and say that this might have been one of the biggest nutrition errors in the late 20th century.

We had forgotten about the covariates that come with a low-fat diet. Move to Japan where there is a relatively low-fat diet but you also have an incredibly good lifestyle. If you turn to more northern populations where you need the fat for some reason, you don’t necessarily correlate fat intake with cardiovascular disease. You don’t correlate caloric intake with cardiovascular disease.

Q: When you see patients, would you first talk about diet rather than prescribing medication?

My primary prevention patient—the 46-year-old man—I will often give up to two years to fix his bad habits. [If there is no lifestyle change in that time], then he is middle-aged, has high blood pressure, high cholesterol, and high blood glucose. He’ll need two pills for blood pressure, two pills for diabetes, a pill for cholesterol… Five pills when he’s 46; imagine how many pills he’s going to have when he’s really sick. And my success rate is probably less than 10%. The biggest threat [to long term health] is the insulin needle. It’s not having a heart attack, it’s going on the needle.

Q: What is the biggest gap in our knowledge that’s impairing how patients are treated?

You’re again a 46-year-old man. You have a bit of hypertension and your cholesterol is high. I put you on a statin and a blood pressure-lowering drug. At your next visit, your blood pressure is extremely normal and your cholesterol is extremely low. Why should you stay on an exercise program and a diet? The perverse effect of our outstanding medication may be that we’re not making the lifestyle effort to treat ourselves naturally.

Authors’ note: We can’t quite explain how we got onto this tangent about low-density lipoprotein (LDL), but it has been fascinating to think about and it would be a shame to exclude it.

What was your diet [50,000 years ago]? Tuberous vegetables, berries, and very little meat. Then something happens to you—you started domesticating animals. You got something you never had in your diet before, two things you rarely found in nature—cholesterol and saturated fats. It takes about a million years to change your genes through evolution. In 50,000 years, we haven’t had time to adapt to a huge influx of saturated fat and cholesterol.

How many animals do you think have LDL? Zero. Maybe the hamster if you feed it an extreme Western-style diet. But animals do not make LDL. In times of starvation, we developed the VLDL [very low-density lipoprotein] system. In my view, VLDL is unidirectional. [After removal of triglycerides by lipases], the particle should be completely taken up by the liver with no cholesterol on it. Where does the cholesterol go? It should go to HDL [high-density lipoprotein], which is the main source of cholesterol for most cells, rather than making or incorporating it. It might not be such a bad thing to say that we’re not meant to have LDL and that any technique to prevent it will be good, especially lifestyle nutrition.

Bottom Line

Surprisingly, there is a lot standing in the way of “common sense nutrition.” Adding a supplement or a medication is relatively easy compared to changing deep-rooted eating behaviors like food preferences and portion size. Recommendations around fat intake have changed dramatically and are still being hotly debated. Even the inclusion of animal products in these recommendations is questionable from an evolutionary point of view. Dietitians and clinicians certainly have their work cut out for them.

A big thank you to Dr. Genest for taking the time to speak with us! It was a fascinating conversation and hopefully an equally enjoyable read.

Danièle Todorov is a first-year student in Nutritional Epidemiology with a focus on maternal nutrition and a minor obsession with lipid metabolism, a holdover from her biochemistry days.

Delphine Van Roosebeke is a master’s graduate in the Biochemical and Molecular Nutrition program with a background in biochemical engineering. Delphine has a crush on nutrients and the magic they perform in our body, and loves to share her knowledge with anyone who wants to hear it in a fun and approachable way! 

AFE Students Visit University of New Hampshire’s Fairchild Dairy and Organic Research Farms

by Kathleen Nay

On Saturday, October 22, students from the Fundamentals of U.S. Agriculture and Agriculture, Science and Policy II classes visited two dairy farms at the University of New Hampshire. Kathleen Nay documented the field trip for the Friedman Sprout.

The maternity barn at the University of New Hampshire’s Fairchild Dairy Teaching and Research Center. Fairchild is a conventionally-run dairy operation, typical of those seen across New England. Photo: Kathleen Nay

Saturday mornings are normally for sleeping in—the one rare day a week I can afford a leisurely wake-up time. Not today. Today my alarm is set for 5:30 am; I’m joining my fellow Agriculture, Food and Environment students for a day trip to visit two dairy farms at the University of New Hampshire: the Fairchild Dairy Teaching and Research Center in Durham, NH, and the Organic Dairy Research Farm in Lee, NH. Hot tea in hand and warm oatmeal in my belly, I join the group as we make our way up I-95, taking in the beautiful fall colors along the drive.

Dr. Pete Erickson, professor of biological sciences and extension dairy specialist, meets us at the Fairchild Dairy where he introduces us to his doctoral student, Kayla Aragona, who manages several pregnant cows and calves in her research on colostrum quality. (Colostrum, the first milk produced after a cow gives birth, is key in supporting the health of her young calf.) They give us a tour of the Fairchild Dairy, a typical New England dairy operation that is home to about 90 milking-age Holsteins and Jerseys and 70 young replacement heifers. The facility relies heavily on undergraduate student labor, including students participating in the CREAM program (Cooperative for Real Education in Agricultural Management).

Before entering any of the barns at Fairchild Dairy, we slip plastic disposable boots over our footwear. This is a biosecurity measure meant to prevent the spread of pathogens to or from the farm animals. Photo: Kathleen Nay

Students begin the tour of Fairchild’s maternity barn. Photo: Kathleen Nay

Dr. Pete Erickson leads us on a tour of the facility and answers students’ questions about the New England dairy industry. Photo: Kathleen Nay

Second-year AFE/UEP student Tessa Salzman makes friends with a mama cow. Milk production from these mamas averages 26,000-27,000 pounds per cow per year. Photo: Kathleen Nay

Dr. Erickson passes samples of corn silage around for students to feel and smell. Silage, a fermented, high-moisture stored fodder, is a primary ingredient in ruminant feed. Photo: Kathleen Nay

As a bovine nutrition specialist, Dr. Erickson knows a lot about dairy cows’ diets. Here, he shows us a mixture of dried citrus pulp and beet pulp pellets. Beet pulp pellets are a byproduct of sugar manufacturing. Photo: Kathleen Nay

Blood meal, a byproduct derived from the poultry industry, is a high-protein supplement added to cow feed. Photo: Kathleen Nay

Pictured: Friedman professor Tim Griffin. This is the sixth time Tim Griffin and Chris Peters have brought AFE students on this field trip to the UNH dairies. Photo: Kathleen Nay

Dr. Erickson shows students the dairy’s stores of animal bedding. Photo: Kathleen Nay

A young Jersey calf reaches to scratch an itch. Fairchild houses approximately 90 milking-age Holsteins and Jerseys, and 70 young replacement animals, which will become the new stock of milking cows once they reach maturity. Photo: Kathleen Nay

Students observe the Jersey herd up close. The milk from the Jerseys and Holsteins at Fairchild is sold to consumers as fluid milk and—everyone’s favorite dairy treat—ice cream. Photo: Kathleen Nay

Holstein cows watch as we peel off our protective boots and get ready to head to UNH’s organic farm. Photo: Kathleen Nay

After an extensive tour of Fairchild, we head seven miles down the road to the university’s Organic Dairy Research Farm. Established in 2005, this facility was the country’s first organic dairy operation at a land grant university. The farm houses roughly 100 organic Jersey cows, heifers and calves, and the property includes 275 acres of woodlands, crop and forage production, and land for pasture.

Brand-new calves greet us upon arrival at the Organic Dairy Research Farm. Photo: Kathleen Nay

Students pose for a feeding photo-op. Photo: Kathleen Nay

The organic herd is exclusively Jersey cows. As a breed, Jerseys are prized for the high butterfat content of their milk. These cows average 43 pounds of milk production per day. Photo: Kathleen Nay

The milk from UNH’s organic herd supplies Stonyfield Yogurt, an organic yogurt company located in Londonderry, New Hampshire. Photo: Kathleen Nay

UNH’s organic dairy was the first of its kind to be established at a land grant university. Primary areas of research include dairy nutrition and feeds, pasture quality, forage production, compost production, and natural resource management. Photo: Kathleen Nay

Friedman professor Chris Peters (in yellow) walks the pasture with Dr. Erickson and UNH graduate student Kayla Aragona. UNH manages 55 acres of pasture, in addition to 120 acres of woodlands and 100 acres of crops and forage. Photo: Kathleen Nay

The University of New Hampshire’s Fairchild Dairy is open to the public seven days a week between 8:00 am and 6:00 pm. Visitors can observe milking at 3:30 pm.

Kathleen Nay is a second-year AFE/UEP student and has a Bachelor of Fine Arts in Photography. In undergrad, she spent a semester photographing life on a small organic raw-milk dairy in Baroda, Michigan.

Book Review: The Dorito Effect: The Surprising New Truth About Food and Flavor by Mark Schatzker

by Hannah Meier

Grocery store shelves are teeming with products that cater to every sense of flavor. New flavor combinations seem to appear out of thin air every day. Even meat and produce sections increasingly offer pre-seasoned and flavor-enhanced options. What happened to real flavor, and what does all of this have to do with the obesity epidemic? Mark Schatzker, a New York Times food journalist, hypothesizes the connection is stronger than cayenne pepper.

Published in 2015 and riding the wave of other big-name titles in food journalism (The Omnivore’s Dilemma, Food Politics, and Soda Politics, to name a few), The Dorito Effect takes a similar investigational look at the food industry, with author Mark Schatzker aiming to reveal just how much we, as consumers, don’t know about what we’re eating. In The Dorito Effect, Schatzker introduces the current state of emergency regarding obesity by recounting, in intimate (and condescending) detail, the despair a woman named Jean Nidetch felt about her weight and her relationship with food that led her to found what is now Weight Watchers. I’m not sure how he knew Nidetch had “visions of jelly beans […] dancing in her head,” but the picture he paints is one arguing that the biggest problem with obesity is an addiction to, or an obsession with, junk food.

Despite this sweeping generalization, Schatzker does illuminate the discrepancy between Americans’ obsession with fad diets and diet foods and obesity’s continued rise in prevalence. He notes that we have flip-flopped between villains du jour for decades (is it the salt or the sugar that’s slowly killing you today?). We invent reformulated food products to combat each new “food danger,” but have yet to turn the dial on the burden of obesity. By Schatzker’s reasoning, something in our food environment has certainly changed, but we have taken too reductionist an approach in addressing it. Food is complicated, he aptly admits. As “we keep mistaking the mechanism of obesity for the cause” (eating too many calories), we dig ourselves deeper into a hole filled with a surplus of nutrient-poor, flavor-enhanced, unsatisfying, and addictive junk food.

Schatzker summarizes what is common knowledge for many in the Friedman community—our agricultural system, by emphasizing production capacity and ignoring taste, has vastly reduced the nutritional quality and flavor of plant and animal products. The nutrients and plant “secondary compounds”—bioactives, as you may know them—are really what constitute the flavor of food in its natural state. He argues that our senses evolved to recognize the various flavors and aromas inherent to particular foods, and that we are “wired” to want foods that fulfill particular physiological needs within our bodies.

Citing a Utah State professor’s experiments in which goats developed aversions to toxic plants and learned to prefer flavors associated with nutrients they were deficient in, Schatzker concludes that if humans interacted with food in the same way—choosing to eat particular plants based on the nutritional demands of the body—obesity would not be the epidemic it is today. We have confused ourselves, he claims, by ridding our food supply of plant secondary compounds, thereby stripping it of flavor and handicapping our innate ability to recognize key qualities and self-regulate our nutrition. Instead, according to Schatzker, we never get full from manufactured, flavor-added products because they don’t truly fulfill their purpose. We keep eating and eating and have ultimately found ourselves in the deep pit of an obesity epidemic.

The idea that our bodies innately respond to our food environment is a convincing hypothesis with some scientific backing. For one thing, it’s long been accepted that humans have this same kind of post-ingestive feedback for high-calorie foods because we evolved to seek out the most energy-dense foods. Schatzker dug up a study conducted by a pediatrician in the 1920s, who fostered 15 babies and let them grow up eating whatever they wanted from a list of 34 foods (including potatoes, corn, barley, carrots, peaches, and brains, among others) and found that these babies were excellent at adopting balanced diets and choosing foods to meet their needs as those needs changed over time. One baby with rickets, Schatzker recounts, drank cod liver oil in varying amounts over the course of his illness until he was better.

In a brief search of the literature, I failed to find similar studies to back this up. But this may be due to the heightened ethical considerations of involving humans in experimental studies rather than to a lack of effect.

Realistically, the kind of food exposure created by that dedicated pediatrician in the 1920s isn’t what most people in the 21st century experience. We live within cultures that enforce food norms, and we are subject to unbridled media influence. Schatzker would argue that the relationship between nutrients and flavors has been adulterated by twin forces: the dwindling nutritional quality of our food supply and the abundance of synthetic flavor enhancements we now associate more with meeting emotional needs than biological ones.

Schatzker goes on to spend an inordinate amount of time oscillating between revering the flavor industry for its chemical ingenuity and condemning it for perpetuating the disconnect between nutrition and flavor. His sometimes unrefined writing style blames both the overweight individual (often identified as “fat” so-and-so) and the food system at large for failing to reverse obesity. Though he does a good job of addressing the complexity of the associations among food, flavor, and nutrition, he stops short of identifying other key issues that cannot be overlooked when confronting obesity. Financial instability, social inequality, food policy and availability, and cultural norms are among the larger issues, with emotional and psychological influences also playing a huge role in what food ends up on individual and family tables.

Schatzker’s grand resolution at the end of the book is to entrust food technology with the task of bringing us back to foods with flavors true to their nutrient content. He believes that if food technology can harness genetic modification to improve yield and durability, surely it can modify genes to enhance nutrient quality. While certainly a good idea, will genetically modifying food to be more nutritious reverse obesity on its own? Hardly.

It will be up to experts like us to dig deeper, and tackle each level of the complex food system with the Friedman understanding that everything is connected, everything is important.

Hannah Meier is a first-year, second-semester NUTCOM student, registered dietitian, and food lover enamored with the complexity of the food system and the way individuals interact with it. Reading The Dorito Effect had no impact on her liberal use of herbs and spices in the kitchen.

 

Beyond Bulking Up on Bugs: Are Insects a Sustainable Solution for Future Protein Needs?

By Michelle Pearson

Eating insects is nothing new. Cultures across the globe have incorporated these creatures into their dishes for centuries, and, as evidenced by early foraging tools and the chemical composition of preserved feces, bugs helped improve diet quality during human evolution. Insects remain staples in Asian, Australian, African, and South American cuisines, and they can be consumed live as well as cooked, often roasted, boiled, or fried. These creatures are also a rich source of nutrition: high in fiber, fat, protein, vitamins, and minerals, bugs are a nutrient powerhouse, especially rich in zinc and iron. In the Amazon, insects contribute as much as 70% of the population’s dietary protein. Perhaps bugs will be the new vegetarian alternative; there is quite a bit of buzz as to whether they will be the sustainable protein source of the future.

Of course, many Americans take issue with the idea: there certainly is an “ick” factor in our culture. We tend to be more squeamish about our food than many other cultures. Just look at the way meat is prepared: prepackaged, indiscernible cuts of pink flesh completely devoid of evidence of the creatures whence they came. Yet what many people do not know is that they have already eaten insects. Being so ubiquitous, bugs are an unavoidable contaminant, and the FDA has set defect tolerances: up to 60 insect fragments for every 3.5 ounces of chocolate and 5 fruit flies for every cup of juice. Bugs are simply part of the food system; it has been estimated that Americans consume about 1 pound of insects a year! Acceptability of insect protein may therefore change with knowledge and preparation: people may be more likely to try insect protein ground up into protein bars or baked goods, and swapping whey protein for an insect version may feel just fine to some. However, reducing the “ick” factor may not be the main issue at play.


Fried insects in Cambodia; photo by Steve Baragona

Is insect protein nutritious and sustainable?

The proteins found in insects are comparable to animal proteins in nutritional quality and digestibility. Some species, such as crickets and mealworms, are more nutritious than others, and, as with any food, the method of cooking also affects nutritional content. Less well understood is the environmental impact of raising insects in mass quantities. Insects already feed a significant share of the world’s animals, contributing 70% of the food supply for all land birds and 40% for all fish.

What would increasing insect populations do to these animals or, for that matter, to any other part of the ecosystem? Many insects are already cultivated on a mass scale for pest control and for feeding pet birds and reptiles. Insects can be raised on waste products and other low-grade substrates that could not otherwise feed livestock or human populations. However, one study by Lundy and Parrella found that crickets produced at a large scale required grain feeding to reach the necessary protein yield, making them no more sustainable than chickens. When crickets are raised in their natural habitat at a small scale, protein yields are high and the insects have a very low impact on the environment. Currently, insect-rich diets are sustained in regions of the world that are either less densely populated or more integrated with the land, allowing insects to maintain a symbiotic relationship with the environment and a balanced ecosystem. In general, the nutritional content of insects is highly variable, depending on the season, population, species, and geographic area. For insects to be sustainable on a mass scale, the challenge will be incorporating them into a balanced environment, essentially dedicating large areas of land to creating a new ecosystem.

More research is needed to determine the environmental impact of mass-produced insects, as well as ways to maximize their protein content. Looking beyond bugs, other sustainable protein sources should also be considered, such as red algae. Sources currently being incorporated into American diets include single-cell protein, soy protein, and fish protein concentrate. As with anything else, insects cannot solve the problem alone; a patchwork of sources may be required to meet the nutritional needs of future populations.

What would an insect food movement look like?

Many passionate people are using insects to address problems in the food system. The FAO has reported on the future prospects of insect consumption, while independent groups of enthusiasts promote insect culture in hopes of a bug-friendly future. Little Herds, based out of Austin, Texas, is dedicated to educating communities about the benefits of eating insects, while startups like Crickers and Chirps use cricket flour to create sustainable food products. Insects are also being used to address malnutrition: Aspire, also from Austin, seeks to promote the farming and consumption of insects to help malnourished populations, using palm weevil larvae, which are naturally high in iron, to reduce anemia in Ghana. Austin is even host to an annual insect-eating contest, complete with cooking demonstrations and local artists.

Despite the passion of these groups, the movement is not without its challenges. Beyond sustainability or novelty, there is no real intrigue or hook for consumer buy-in; insect foods need a brand. The ick factor may be the biggest hurdle, but sound research and developed methodology is another. For one, transforming bugs into generally accepted food products like chips can mess with the flavor profile. Though crickets are considered to have a mild, nutty flavor, once they are ground into a powder the flavor becomes a potent tribute to wet dog food. Highly innovative as it is, the “wet dog food” flavor likely won’t win over the hearts and taste buds of foodies. The lack of clear FDA regulation also poses a challenge for developing insect supply chains: funders are reluctant to invest in an industry without knowing the regulations that will be placed on it in the future.

Lack of corporate backing is part of what gives the insect movement its charm. It is an underdog movement (pun intended). So far, public demand rather than big business has pushed the industry forward, starting on the back of crowdfunding, with startup companies putting out protein bars and crackers. If the insect movement is to gain momentum, it will be up to us as consumers: if you would like to see a change, vote with your dollar, support the startups, and spread the word.

Michelle Pearson is a second-year Master’s student studying Biochemical and Molecular Nutrition.