5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not encourage the cycle of self-loathing that “lose weight” resolutions tend to produce year after year.

Right alongside these posts, though, was an overwhelming amount of press extolling the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could potentially steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets foods that are perfectly nutritious for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. Most bodies are perfectly capable of handling these foods, which provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 also eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies showing that removing grains, beans, and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted an association with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a pretty widespread social-media support system, and there is plenty of research identifying social support as a major key to success in any lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, I think that asks people to spend entirely too much brainpower, energy that could be used in so many other unique and fulfilling ways, on thinking, worrying, and obsessing about food. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no dry staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best people can do for themselves and their families is buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat far more than you would have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food-centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of the Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in the creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. The Whole30 is not a pattern of eating that is replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is indeed a superior choice. At the end of the day, this is a business, created by Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) and part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causing inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice, reminds us that “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts allowed on the diet, like almonds, as well as some permitted vegetables. It is possible to reduce the amount of phytates in a food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting more in touch with food beyond a label and reducing added sugar and alcohol are goals everyone should be encouraged to pursue. Focusing on cooking more from scratch, relying less on processed foods, and learning about how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not require eliminating whole food groups like dairy, grains, and legumes. It should not have an end date; rather, it should be a lifelong journey focused on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible, and fulfilling part of life for people from all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram-worthy oatmeal, or, if you’re in grad school, make it part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason why I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. And there is now an EpiPen® in our kitchen. Just as they are on some airlines, peanuts are now banned from the general living areas of my house, and thus my beloved jar of peanut butter and I have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been informed not to consume any peanut products on your flight. Peanut allergy rates in children in the United States have quadrupled from the late 1990s (less than 0.5%) to about 2% today, and peanut allergy is the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day some peanut butter dripped from my sandwich to the seat. I panicked, thinking, “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

Coupled with my new roommate’s arrival, I was fascinated to find that peanut allergies have been back in the news. On January 5th, 2017, the National Institute of Allergy and Infectious Diseases (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants with severe eczema and/or an egg allergy should be introduced to peanuts between 4 and 6 months. Medium-risk infants with mild eczema should be introduced to peanuts by 6 months, and low-risk infants without eczema or other allergies can be introduced to peanuts any time after they have been introduced to solid foods.

These guidelines fit in with the dual-allergen exposure hypothesis. This suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because their immune system has had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens. This leads to an increased chance of an allergic reaction once they eat the food. Current research now supports this hypothesis, and also suggests that by shortening the time between skin exposure and ingestion, we will reduce the number of acute allergic reactions. The sooner an infant starts eating an allergen, the more likely the body will adjust to it without having time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests done on hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, it appears that this advice was misguided. My roommate, n=1, was born in the mid-1990s when delaying peanut exposure was coming into vogue. She had severe eczema as an infant and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated with the new research, and wishes there were some way to know if the outcome would have been different had she tried them at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack called Bamba as one of their first foods. In many other countries, infants are exposed to peanuts early on—both in their environment and in their food—and these countries also have much lower allergy rates.

In 2015, NIAID funded the Learning Early About Peanut Allergy (LEAP) study to determine whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups (based on skin prick test results for peanuts) and then randomized to either eat or avoid peanuts until 60 months old (5 years). Among infants in the negative skin prick test group, 13.7% of those who avoided peanuts developed an allergy, while only 1.9% of those who ate peanuts did (P<0.001). Among infants in the positive skin prick test group, 35.3% of those who avoided peanuts developed an allergy, versus 10.6% of those who ate peanuts (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
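
To put those percentages in perspective, here is a minimal back-of-the-envelope sketch in Python that turns the reported group rates into absolute and relative risk reductions. It simply restates the published figures quoted above; it is not a re-analysis of any trial data.

# Group-level allergy rates reported by the LEAP study, as quoted above.
# Each entry is (avoidance-group rate, early-introduction-group rate), in percent.
leap_results = {
    "negative skin prick test": (13.7, 1.9),
    "positive skin prick test": (35.3, 10.6),
}

for group, (avoid, eat) in leap_results.items():
    absolute = avoid - eat              # percentage-point difference between groups
    relative = absolute / avoid * 100   # % reduction relative to the avoidance group
    print(f"{group}: {absolute:.1f}-point absolute reduction, "
          f"about {relative:.0f}% relative reduction with early introduction")

Run as written, this prints roughly an 11.8-point (about 86% relative) reduction for the negative skin prick group and a 24.7-point (about 70% relative) reduction for the positive skin prick group.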

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. Research on reducing peanut allergies in adults is limited, making it unlikely that we will be eliminating any existing allergies anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!

Microalgae: Do They Have a Place in Your Diet or Should They Be Left in the Pond?

by Julia Sementelli

If you have an Instagram account, chances are you’ve seen a slew of blue-green smoothies pop up on your feed. That vibrant color comes from adding some form of powdered algae to the smoothie. High in antioxidants, healthy fats, and protein, microalgae are the latest superfood to take over the nutrition world. The most popular types of algae include chlorella, spirulina, Aphanizomenon flos-aquae (AFA), Blue Majik…the list goes on. Microalgae are claimed to boost your energy, decrease stress, and reduce your risk for diabetes and heart disease. The question, of course, is whether these microalgae have any science-based health benefits beyond the nutrients they provide. I’ve asked consumers, health food companies, and nutrition experts to weigh in on whether algae should be added to your daily regimen or if they’re better off as fish food.

What are algae? And why are we eating them?

Microalgae are very small photosynthetic organisms rich in chlorophyll, which is where the green comes from (hello, flashbacks to high school biology class). According to research, algae types differ in the nutrients they provide but all share one characteristic: they are high in antioxidants. (See “Get To Know Your Blue-Green Algae” in the sidebar to learn more about individual microalgae.) While some microalgae have been on the market for years, they have just recently risen to fame in the nutrition world as social media, blogs, and magazines advertise their purported benefits. One microalga in particular, spirulina, has received a significant amount of attention. Companies have jumped on the microalgae bandwagon by adding spirulina to their products and even selling it in pure form. Abby Schulman, vegan and nutrition enthusiast, says that her fascination with superfood culture is what led her to hear about microalgae, in particular spirulina. “It is sort of billed as this amazing nutrient-dense secret pill,” she states. “I was actually concerned about my iron levels and nutrition generally when I first started using it, since it was right when I transitioned to veganism. It felt like a good way of packing in some vitamins was to try the spirulina.” As a vegan who eats a diet rich in fresh produce, Abby states that adding spirulina to her diet is “a more shelf stable way of getting in greens at the level I eat them than having to buy huge tubs of greens all the time.”


Photo credit: Julia Sementelli

Microalgae’s time in the sun

Blue-green microalgae have become a nutritional celebrity thanks to their prevalence in popular health food spots across the United States. Juice Generation, a national juice and smoothie chain, has jumped on the algae bandwagon by selling products that tout their supposed benefits. Products range from “Holy Water,” which contains Blue Majik, tulsi, coconut water, and pineapple, to concentrated shots of E3Live. These products claim to boost energy, enhance focus, and balance blood sugar. However, research to support these claims is lacking.

Infographic credit: Julia Sementelli

Health food businesses that use social media and blogs to advertise their products have also played a significant role in making microalgae famous. Sun Potion, an online medicinal plants and superfoods company, sells a slew of supplements, including chlorella. Sky Serge, Sun Potion spokesperson, is a big proponent of the power of chlorella. “Sun Potion chlorella is a single-celled green algae that is different than others, and is grown indoors and processed using an advanced sound frequency technology to crack the cell wall, making its many nutrients available for us to enjoy,” she explains. She says that she enjoys consuming chlorella in a glass of spring water each morning. “I have personally felt its detoxification benefits and have noticed healthier skin, better digestion and overall, a better wellbeing. Whether I am drinking it in my water in the morning or adding it to a salad dressing, I try and want to consume it every day!”

To further bolster Sun Potion’s belief in the power of its chlorella, founder Scott Linde claims that chlorella “contains all eight essential amino acids, which could allow a person to live solely on chlorella and clean drinking water.” Not surprisingly, he too consumes chlorella daily. “Upon waking in the morning, I enjoy an eight-ounce glass of water with a teaspoon of chlorella mixed in,” he says. “This simple action can punctuate the start of a great day. The body is slightly dehydrated after sleep, meaning the nutrients from the chlorella are absorbed almost immediately into the blood stream.” When asked about the nutrition benefits of chlorella, Linde claims that drinking chlorella offers much more than just antioxidants. “It helps to oxygenate the blood, waking up the brain; nourish the organs; aid in healthy elimination; and assist the body in moving toxins out of the system.” Not only have Serge and Linde experienced excellent results, but their customers have as well. “At Sun Potion, we have actually had customers tell us that they have forgotten to make their coffee in the morning because they were feeling so good from their morning chlorella ritual. This is a perfect example of potent nutrition and best quality plant materials helping to saturate the body with positive influence, leading to looking, feeling, and operating at one’s best.”

The good, the bad, and the blue-green

Although many health claims about microalgae, such as increasing energy and regulating blood sugar, are not supported by science, research has shown some promising, more realistic benefits. A 2013 study showed that adding 3,600 milligrams per day of chlorella to the diets of 38 chronic smokers for six weeks helped to improve their antioxidant status and reduce their risk of developing cancer. Another study found that daily intake of 5 grams of chlorella reduced cholesterol and triglyceride levels in patients with high cholesterol. Research has even found that supplementing with chlorella can improve the symptoms of depression when used in conjunction with antidepressant therapy. Still, many of these studies are the first of their kind, and more evidence is needed regarding the long-term effects on cholesterol, cancer prevention, and depression, in addition to other conditions microalgae are claimed to help alleviate.

While the supposed benefits of microalgae typically receive all of the attention, microalgae also come with their own list of caveats. According to New York City-based registered dietitian Willow Jarosh, “Some people can have allergic reactions to both spirulina and chlorella, so take that into consideration when trying. In addition, spirulina can accumulate heavy metals from contaminated waters.” She also states that microalgae can actually be too high in certain nutrients. “If you have high iron levels, have gone through menopause, or are a man, be aware of the high iron levels in microalgae—especially if you use them regularly.”

So what’s the verdict?

There is certainly a lot of hype surrounding microalgae in the media, from companies that sell products containing them to preliminary supporting research. But when it comes to recommending that you add chlorella to your daily diet, experts are hesitant.

According to Jarosh, “There are some really major health claims, with very little scientific evidence/research to back up the claims, for both chlorella and spirulina.” As the co-owner of a nutrition consulting business, C&J Nutrition, she finds that her clients are frequently asking her about her thoughts on microalgae. “We’re always reluctant to recommend taking something when the long-term safety is unknown,” Jarosh says. “And since there’s not much research in humans to provide strong reasons to take these supplements (yet!), and the long-term research is also lacking, we’d recommend not using either on a regular basis.”

Microalgae are packed with antioxidants, and those are always a good addition to your daily eats. But although the colors of microalgae appear supernatural and their effects are often advertised as having the ability to give you superpowers, research is currently inadequate to say whether microalgae have more benefits than other antioxidant-rich foods. If you do decide to try them based on their antioxidant content, make sure they do not replace other fruits and vegetables in your diet. Remember: Whole foods are always better than a powder.

Julia Sementelli is a second-year Nutrition Communication & Behavior Change student and registered dietitian.  Follow her on Instagram at @julia.the.rd.eats


Timing of your Meals–Does it Matter?

by Yifan Xia

How would you feel if you were told to not have dinner for the rest of your life? Skipping dinner every day might sound shocking to most of us, but it was once a very common practice in ancient China in the Han Dynasty. In fact, even today Buddhism and Traditional Chinese Medicine (TCM) promote this practice as a healthier choice than eating three meals per day. But does this practice have roots in science? Of course, controversy exists around this topic, but one thing that we can be certain of today is that the timing of our meals can have a much greater impact on our health than we originally thought.

Researchers investigating the circadian system (internal biological clock) have started looking at the effects of mealtime on our health. Surprisingly, preliminary evidence seems to support the claims of Buddhism and TCM, indicating that eating meals earlier in the day might help promote weight loss and reduce the risk of chronic disease.

What are circadian rhythms and the circadian system?

Circadian rhythms are changes in the body that follow a roughly 24-hour cycle in response to external cues such as light and darkness. Our circadian system, or internal biological clock, drives circadian rhythms and prepares us to function according to a 24-hour daily cycle, both physically and mentally.

Why do they matter to our health?

Our internal biological clock is involved in almost every aspect of our daily lives: it influences our sleep-and-wake cycle, determines when we feel most energetic or calm, and when we want to eat.

These days people don’t always rely on their biological clocks to tell them when to eat, and there are many distractions in the environment that can influence mealtime. We typically think how many calories we eat—and what we eat—are the major contributors to our weight and health, but researchers have found that eating at inappropriate times can disrupt the internal biological clock, harm metabolism, and increase the risk of obesity and chronic disease.

What does the research say?

Although the body of research evidence in this area is currently relatively small, there are several human studies worth highlighting. One randomized, open-label, parallel-arm study, conducted by Jakubowicz et al. and published in 2013, compared the effects of two isocaloric weight-loss diets in 93 overweight or obese women with metabolic syndrome. After 12 weeks, the group with higher caloric intake at breakfast showed greater weight loss and waist circumference reduction, as well as a significantly greater decrease in fasting glucose and insulin levels, than the group with higher caloric intake at dinner. Another study published in the same year, with 420 participants, noted that a 20-week weight-loss treatment was significantly more effective for early lunch eaters than late lunch eaters. In 2015, a randomized crossover trial conducted in 32 women and published in the International Journal of Obesity showed that a late eating pattern resulted in a significant decrease in pre-meal resting energy expenditure, lower pre-meal utilization of carbohydrates, and decreased glucose tolerance, confirming the differential effects of meal timing on metabolic health. Few studies have reported negative findings, probably because this is an emerging field; more research is needed to establish a solid relationship.

So when should we eat? Is there a perfect mealtime schedule for everyone?

“There are so many factors that influence which meal schedules may be suitable for an individual (including biological and environmental) that I cannot give a universal recommendation,” says Gregory Potter, a PhD candidate in the Leeds Institute for Genetics, Health and Therapeutics (LIGHT) laboratory at the University of Leeds in the United Kingdom and lead author on the lab’s recent paper reviewing evidence of nutrition and the circadian systems, published in The British Journal of Nutrition in 2016. Potter also comments that regular mealtime seems to be more important than sticking to the same schedule as everyone else: “There is evidence that consistent meal patterns are likely to be superior to variable ones and, with everything else kept constant, it does appear that consuming a higher proportion of daily energy intake earlier in the waking day may lead to a lower energy balance and therefore body mass.”

Aleix Ribas-Latre, a PhD candidate at the Center for Metabolic and Degenerative Diseases at the University of Texas Health Science Center and lead author on another review paper investigating the interdependence of nutrient metabolism and the circadian systems, published in Molecular Metabolism in 2016, also agrees: “To find the appropriate meal time has to be something totally personalized, although [it] should not present [too] much difference.” Aleix especially noted that people who are born with a tendency to rise late, eat late, and go to bed late (“night owls” versus “early birds”) are more likely to be at risk for metabolic disease.

Do we have to eat three meals a day?

How many meals do you usually have? In fact, how much food makes a meal and how much is a snack? There is no universal definition, which makes these difficult questions to answer.

“To maintain a healthy attitude towards food, I think it is important to avoid being too rigid with eating habits … I do think consistency is important as more variable eating patterns may have adverse effects on metabolism,” says Potter. “Although there is evidence that time-of-day-restricted feeding (where food availability is restricted to but a few hours each day) has many beneficial effects on health in other animals such as mice, it is as yet unclear if this is true in humans. I’d also add that periodic fasting (going for one 24 hour period each week without energy containing foods and drinks) can confer health benefits for many individuals,” Potter comments.

[See Hannah Meier’s recent article on intermittent fasting for more.]

Based on their research, Ribas-Latre and his lab have a different opinion. “We should eat something every 3-4 hours (without counting 8 hours at night). Many people complain about that but then consume a huge percentage of calories during lunch or even worse at night, because they are very hungry. Eating a healthy snack prevents us [from] eating too [many] calories at once.” He suggests what he considers a healthier mealtime schedule (a quick calorie breakdown follows the list):

  • 6:00 am – Breakfast (30% of total calories)
  • 9:30 am – Healthy snack (10%)
  • 1:00 pm – Lunch (35%)
  • 4:30 pm – Healthy snack (10%)
  • 8:00 pm – Dinner (15%)
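
As a rough illustration, here is a minimal Python sketch that converts that percentage split into calories per meal. The 2,000-kcal daily total is an assumption for the example only; the article does not specify a calorie target.

# A sketch of the suggested split, assuming a hypothetical 2,000-kcal day.
DAILY_KCAL = 2000

schedule = [
    ("6:00 am breakfast", 0.30),
    ("9:30 am snack",     0.10),
    ("1:00 pm lunch",     0.35),
    ("4:30 pm snack",     0.10),
    ("8:00 pm dinner",    0.15),
]

for meal, share in schedule:
    print(f"{meal}: {share:.0%} -> {DAILY_KCAL * share:.0f} kcal")

# Sanity check: the shares should account for the full day's intake.
assert abs(sum(share for _, share in schedule) - 1.0) < 1e-9

For a 2,000-kcal day, that works out to 600 kcal at breakfast, 700 at lunch, 300 at dinner, and 200 at each snack.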

What if you are a shift worker, or your work requires you to travel across time zones a lot? Ribas-Latre’s advice is “not to impair more their lifestyle… at least it would be great if they are able to do exercise, eat healthy, sleep a good amount of hours.”

What does Traditional Chinese Medicine say?

There are historical reasons behind the no-dinner practice in ancient China in the Han Dynasty. First, food was not always available. Second, electricity hadn’t been invented, so people usually rested after sunset and they didn’t need much energy at what we now consider “dinner time.”

However, there are also health reasons behind this practice. In TCM theory, our internal clock has an intimate relationship with our organs. Each organ has its “time” for optimal performance, and we can reap many health benefits by following this clock. For example, TCM considers 1:00 am – 3:00 am the time of “Liver”. The theory says that is when the body should be in deep sleep so that the liver can help to rid toxins from our body and make fresh blood. Disruption at this time, such as staying up until 2:00 am, might affect the liver’s ability to dispel toxins, leading to many health problems, according to the theory.

Many Western researchers do not seem to be familiar with the TCM theory. When asked about the practice of skipping dinner, Potter comments, “I think that skipping dinner can be a perfectly healthy practice in some circumstances; in others, however, it may be ill advised if, for example, the individual subsequently has difficulty achieving consolidated sleep.”

On the flip side, Ribas-Latre says that “skipping a meal is not good at all. We should not eat more calories than those we need to [live], and in addition, the quality of these calories should be high… If you can split those calories [to] 5 times a day instead of three, I think this is healthier.”

Even though there is no universal agreement on mealtime, the tradition of “skipping dinner” did come back into style several years ago in China as a healthier way of losing weight, and was quite popular among Chinese college women. Yan, a sophomore from Shanghai and a friend of mine, said that she tried the method for six months but is now back to the three-meal pattern. “The first couple of days were tough, but after that, it was much easier and I felt my body was cleaner and lighter… I did lose weight, but that’s not the main goal anymore… I got up early every day feeling energetic. Maybe it’s because I only ate some fruits in the afternoon, I usually felt sleepy early and went to bed early, which made it easier to get up early the next day with enough sleep… I’m eating three meals now, but only small portions at dinner, and I think I will continue this practice for my health.”

So what’s the take-away?

Mealtime does seem to matter. But exactly how, why, and what we can do to improve our health remains a mystery. Researchers are now looking into the concept of “chrono-nutritional therapy,” or using mealtime planning to help people with obesity or other chronic diseases. When we resolve this mystery, the question of “When do you eat?” will not just be small talk, but perhaps a key to better health.

Yifan Xia is a second-year student studying Nutrition Communication and Behavior Change. She loves reading, traveling, street dancing, trying out new restaurants with friends in Boston, and watching Japanese animations.


Lemon Preserve: Lemons + Salt + Patience

By Jennifer Huang

Have you ever seen “patience” listed as a recipe ingredient? No? Well you’ll need it, as this simple recipe promises a unique and versatile flavor burst that is well worth the wait.

I have seen lemon preserves in Middle Eastern grocery stores before—usually as unappealing lemons floating in questionable liquid—and never gave them a second thought. Thankfully, my brother recently enlightened me on what lemon preserves are after he saw a recipe posted by our favorite Taiwanese food blogger, Karen Hsu.

Lemon preserves aren’t new to the scene: The earliest reference to this ingredient was in an Arab Mediterranean recipe from the 11th century, according to the Encyclopedia of Kitchen History. Today, you will find lemon preserves in many Moroccan and Middle Eastern cuisines.

So what’s the hype? Well, many recipe bloggers describe how the fermenting process brings the flavor and fragrance of lemon to an unimaginable level. Used in recipes for salad dressing, couscous, chicken and many other dishes, lemon preserve is a versatile ingredient sure to liven up any dish. Another beautiful thing about lemon preserve—it is simple to make and requires few ingredients. However, one of them, as Serious Eats has put it, is patience.

Here is the recipe translated and adapted from Karen Hsu’s blog:

Lemon Preserve

Duration: 15 minutes

Ingredients:

  • 12 lemons
  • 200 g salt (approximately ¾ cup)
  • 3-4 whole pieces of bay leaves (optional)
  • Some black pepper (optional)
  • Patience


Steps:

  • Sterilize glass jars. (My brother and I boiled mason jars in water for 10 minutes.)
  • Cut six lemons into ¼-inch slices.
  • Layer salt and lemon slices in the jar. Put some salt into the jar first, then a lemon slice, then salt, etc.
  • After layering, crush the bay leaves and sprinkle both the bay leaves and black pepper into the jar (optional).
  • Squeeze juice from the other six lemons into the jar.
  • Seal the jar and put it in the refrigerator.
  • Wait for a month. (Yes. A month, but it will transform your life after that month.)

Disclaimer: I have used this recipe, but have not tasted it myself (it won’t be ready until September 14, and is currently fermenting in Houston). However, after reading a myriad of articles about lemon preserve, I think it is a promising addition to anyone’s shelf.

But… since my patience is wearing thin, I have found Moroccan restaurants in Somerville and Charleston that have lemon preserve dishes I am dying to try. Join me if you are interested, because I shall be going there, very soon.

Jennifer Huang is a first-year FPAN student. She worked as a dietitian in Houston and is interested in the economics and trade of food and food safety at the international level.

What is Intermittent Fasting, and Does It Really Work?

By Hannah Meier

You may have heard of caloric restriction and the myriad benefits it supposedly brings to the metabolic table. New research suggests that intermittent fasting could be a safe way for people to improve their health, but before you adopt this eating pattern, read up on six common mistakes to avoid.

The newest diet to gain popular attention isn’t much of a diet at all. It is something that most people who adhere to a traditional sleeping and waking cycle are already primed to do—and, proponents would argue, is something humans have been doing successfully for centuries. Intermittent Fasting (IF) has garnered support in the fitness community as a weight management tool for bodybuilders and other fitness enthusiasts. Recently, a growing portion of the scientific community has begun to also regard IF as a feasible way to improve metabolic health and perhaps even extend one’s lifespan.

Instead of eating many times throughout the day, between 6:00 am and 10:00 pm for example, Intermittent Fasters will couple periods of extended fasting (from 14 to 24 hours) with shorter periods of eating. This can be achieved by a change as simple as lengthening the overnight fast by a few hours each day. Different variations of IF propose reducing intake to 500-600 calories for just two days of the week; others recommend one full, 24-hour weekly fast. There are no particular restrictions on the type of foods allowed to be consumed, as long as meals are kept within the “eating window” and consumption does not surpass the feeling of comfortable fullness.
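
To make the schedule arithmetic concrete, here is a minimal Python sketch of the “eating window” math described above. The 8:00 pm dinner and 16-hour fast are illustrative assumptions for the example, not a protocol from the article.

from datetime import datetime, timedelta

def next_window_opens(last_meal_end: str, fast_hours: float) -> str:
    """Return the clock time at which eating resumes after a fast of fast_hours."""
    end = datetime.strptime(last_meal_end, "%H:%M")
    return (end + timedelta(hours=fast_hours)).strftime("%H:%M")

# Example: finish dinner at 8:00 pm and fast 16 hours -> the eating window
# opens at noon the next day, leaving an 8-hour window before the next fast.
print(next_window_opens("20:00", 16))   # prints "12:00"

The same arithmetic covers the other variations mentioned above: a 14-hour fast ending at 8:00 pm opens the window at 10:00 am, and a 24-hour fast simply returns you to the same clock time the next day.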

Experimental studies in rats have suggested that providing the body with an extended fast (up to 24 hours) is physiologically beneficial, potentially improving insulin sensitivity, decreasing resting heart rate and blood pressure, and reducing body-wide inflammation—all of which could contribute to a longer expected lifespan. Further, adapting to a shorter eating window may help to moderate overall calorie intake. Randomized controlled trials demonstrating these benefits in humans have yet to be published. Because humans share an evolutionary adaptation to generations of unpredictable periods of fasting and feasting, however, scientists are eager to tease out this connection in future studies.

Still, many nutrition professionals are hesitant to advocate IF as superior to other diets or as a safe and effective approach to weight loss. At the end of the day, reducing calories consumed and increasing energy expended through physical activity is what matters for losing weight, and there are many ways to achieve this goal that do not require adopting a rigid eating schedule. It is important to consider your lifestyle, motivation, and sacrifices you are willing (or not willing) to make in order to reap the potential benefits of intermittent fasting. Like any diet, adherence is key to success. Here are six common mistakes to avoid if you think intermittent fasting sounds like something you want to try.

Six Mistakes Most People Make When They Begin Intermittent Fasting

  1. Giving up too soon

It is normal to feel more irritable or sluggish as the body adapts to a longer fasting period and adjusts its hormonal signaling (most scientists believe this adaptation underlies many of the health benefits of IF). Intermittent Fasters will likely find that true hunger feels different than the hunger pangs and uptick in heartbeat associated with fluctuating blood sugar, which we experience when we are used to frequent eating—learn to recognize it.

  2. Forgetting about quality

The “Basic Seven,” developed by the USDA in 1943

Even though IF does not restrict the type of foods allowed to be consumed during the eating period, it’s essential to maintain proper nutrition. Metabolic improvements like insulin sensitivity and reduced inflammation could very well be negated if fasters neglect nutritional balance and decide to eat foods high in salt, saturated fat, and refined carbohydrates exclusively, avoiding fruits, vegetables, whole grains and lean proteins. You may still lose weight if you’re consuming fewer calories overall, but the efficiency of your body systems will suffer—and you probably won’t feel too well, either.

  3. Forgetting to hydrate

Hydration is key, especially during periods of fasting. Adequate hydration is necessary for pretty much every function in the body and will keep you feeling energized and alert. During fasting periods, water, tea, coffee, and other no- or low-calorie beverages are allowed (just watch out for added cream and sugar). Keep tabs on the color of your urine as a gauge for hydration status: if it is darker than the light yellow of hay, you need to drink more fluid.

  4. Exercising too much

Some athletes swear by intermittent fasting as a means to improve performance, burn more fat, and even increase endurance. However, none of these benefits have consistently been backed up with controlled human studies. In fact, many observational studies of Muslim athletes during Ramadan show evidence of decreased performance (though some athletes practicing IF might not follow a pattern that requires them to train in a fasted state, so these experimental differences could be important in interpreting results). Moderate and consistent exercise is encouraged for general health, but excessive exercise on top of prolonged fasting may send the body into a state of chronic stress, which can lead to inflammation, lean tissue breakdown, insulin resistance, and injury.

  5. Not working with your schedule

There are different variations of IF and the only thing that makes one program more effective than the next is whether or not you can stick to it. For example, don’t decide to fast for 24 hours if you know missing your nightly family dinner will cause mental and social strain. There are many methods for reducing calorie intake for weight loss, and intermittent fasting may not be right for you if it leads to feelings of isolation and reduced quality of life.

  6. Believing that if some is good, more is better

Just because a little bit of fasting may be healthy does not mean that a lot of fasting is healthy. Going too long without food can lead the body into a state known as “starvation mode,” which greatly slows the metabolic rate, begins breaking down muscle for energy, and stores a greater proportion of consumed calories as fat. Further, fasting for too long can lead to severe feelings of deprivation and preoccupation with food, culminating in uncontrollable or disordered eating behavior including binging and even anorexia. If you sense your relationship with food is becoming abnormal because of IF, make necessary adjustments and seek help if needed.

Because IF can represent a major shift in metabolism and routine, most nutrition professionals are hesitant to recommend it as an intervention for just anyone. It is important to work with a licensed professional who understands your needs and who can help you maintain optimal nutrition, physical activity, and mental health during periods of prolonged fasting. Preliminary studies show that IF, when done right, may be a great tool for improving health, but it is not the only option to boost endurance and lose weight.

Hannah Meier is a first-year, second-semester NUTCOM student, registered dietitian and aspiring spokesperson for honest, scientifically driven and individualized nutrition. 

Beyond Bulking Up on Bugs: Are Insects a Sustainable Solution for Future Protein Needs?

By Michelle Pearson

Eating insects is nothing new. Cultures across centuries have incorporated these creatures into dishes across the globe. As evidenced by early foraging tools and the chemical composition of feces, bugs helped improve diet quality during human evolution. Insects are staples in Asian, Australian, African, and South American cuisines. They can be consumed live as well as cooked, often roasted, boiled, or fried. In fact, these creatures are a rich source of nutrition. High in fiber, fat, protein, vitamins, and minerals, bugs are a nutrient powerhouse, especially high in zinc and iron. In the Amazon, insects contribute as much as 70% of the population’s dietary protein needs. Perhaps bugs will be the new vegetarian alternative. There is quite a bit of buzz as to whether or not bugs will be the sustainable protein source of the future.

Of course, many Americans take issue with the idea: there certainly is an “ick” factor present in our culture. We tend to be more squeamish about our food than other cultures. Just look at the way meat is prepared: prepackaged, indiscernible cuts of pink flesh completely devoid of evidence of the creatures whence they came. Yet what many people do not know is that they have already eaten insects. Being so ubiquitous, bugs are an unavoidable contaminant. The FDA has set allowable limits: 60 insect fragments for every 3.5 ounces of chocolate and 5 fruit flies for every cup of juice. Bugs are simply part of the food system. It has been estimated that Americans consume about 1 pound of insects a year! Thus, acceptability of insect protein may change with knowledge and preparation; people may be more likely to try insect protein ground up into protein bars or baked goods. Substituting an insect version for whey protein may feel just fine for some. However, reducing the “ick” factor may not be the main issue at play.


Fried insects in Cambodia; photo by Steve Baragona

Is insect protein nutritious and sustainable?

The types of proteins found in insects are comparable to those in animal meat in nutritional quality and digestibility. Some species of bugs are more nutritious than others, such as crickets and mealworms, and, as with any food, the method of cooking will also impact nutritional content. Less well known is the impact on the environment of raising mass amounts of insects. For example, insects already feed a significant population of animals, contributing 70% of food for all land birds in the world and 40% for all fish.

What would increasing insect populations do to these animals or, for that matter, to any aspect of the ecosystem? Many insects are already cultivated on a mass scale for pest control and for feeding pet birds and reptiles. It has been found that insects can be raised on waste products and other low-value substrates, rather than on feed that livestock or human populations could use. However, one study conducted by Lundy and Parrella found that crickets produced at a large scale required grain feeding to reach the necessary protein yield, making them no more sustainable than chickens. When crickets are raised in their natural habitat at a small scale, protein yields are high, and the insects have a very low impact on the environment. Currently, insect diets are supported in regions of the world that are either less population dense or more integrated, allowing insects to have a symbiotic relationship with the environment and maintaining a balanced ecosystem. In general, the nutritional content of insects is highly variable, depending on the season, population, species, and geographic area. For insects to be sustainable on a mass scale, it will be a challenge to incorporate them into a balanced environment, essentially dedicating large areas of land to creating a new ecosystem.

More research is needed to determine the environmental impact of mass produced insects, as well as ways to maximize protein content. Looking beyond bugs, other sustainable sources of protein should be considered such as red algae. Sources currently being incorporated into American diets include single cell protein, soy protein, and fish protein concentrate. As with anything else, insects cannot solve the problem alone. Sustainable protein sources may require a patchwork of various sources to provide future populations with necessary nutritional needs.

What would an insect food movement look like?

Many passionate people are utilizing insects as a way to address issues in the food system. The FAO has reported on the future prospects of insect consumption, while independent groups of enthusiasts are promoting insect culture in hopes of a bug-friendly future. Little Herds, based out of Austin, Texas, is dedicated to educating communities about the benefits of eating insects, while startups like Crickers and Chirps use cricket flour to create sustainable food products. Insects are also being utilized to address malnutrition. Aspire, also from Austin, Texas, seeks to promote the farming and consumption of insects to help malnourished populations, utilizing palm weevil larvae that are naturally high in iron to reduce anemia in Ghana. Austin is even host to an annual insect-eating contest, complete with cooking demonstrations and local artists.

Despite the passion of these groups, the movement is not without its challenges. Other than sustainability or novelty, there is no real intrigue or hook for consumer buy-in; insect foods need a brand. The ick factor may be the biggest hurdle, but the need for good research and developed methodology is another. For one, transforming bugs into generally accepted food products like chips can mess with the flavor profile. Though crickets are considered to have a mild, nutty flavor, once they are ground into a powder the flavor becomes a potent tribute to wet dog food. Though highly innovative, the “wet dog food” flavor likely won’t win over the hearts and taste bugs of foodies. The lack of clear FDA regulation also poses a challenge for developing insect supply chains. Funders are reluctant to invest in an industry without knowing the regulations that will be placed on it in the future.

Lack of corporate backing is part of what gives the insect movement its charm. It is an underdog movement (pun intended). So far, public desire rather than big business has pushed the industry. It has started on the back of crowdfunding, with startup companies putting out protein bars and crackers. If an insect movement is to gain momentum, it will be up to us as consumers. If you would like to see a change, vote with your dollar, support the startups, and spread the word.

Michelle Pearson is a second-year Master’s student studying Biochemical and Molecular Nutrition.