“Food Will Win the War!” American Food Policies During World War I

by Jennifer Pustz

“The consumption of sugar sweetened drinks must be reduced” . . . “use less meat and wheat” . . . “buy local foods.” These are familiar phrases at the Friedman School in 2017. But these slogans and many others could be found on posters one hundred years ago, after the United States officially entered World War I in April 1917. Friedman student Jennifer Pustz shares a story from food history that may offer inspiration for the promotion of gardening, conservation, and sustainability in the twenty-first century.

One hundred years ago, on April 6, 1917, the United States ended over two years of neutrality and officially entered World War I. Although the war ended in November of the next year, the nineteen-month period of involvement had an enormous impact on everyday life in the U.S., especially when it came to food and government engagement in food supply and distribution. In Victory Gardens, canning clubs, and kitchens all over America, women engaged in a massive effort to produce, preserve, and conserve food to support the war effort.

By the time the United States entered the war, food production and conservation had become a top priority: American soldiers and European civilians alike would need to be fed. After nearly three years of constant ground war, Europe’s agricultural fields were ravaged, much of the labor force had joined the military, and trade was disrupted both on land and at sea. The result was a humanitarian crisis that required the assistance of the United States, whose policy of neutrality and geographic distance from the front lines had protected agricultural production from serious harm.

President Wilson established the United States Food Administration by executive order on August 10, 1917, and Congress passed the Food and Fuel Control Act, also known as the Lever Act. Herbert Hoover, a former mining engineer with prior experience in facilitating food aid to Europe, was appointed as its administrator. The Food Administration’s goals were broad—from regulating exports and managing the domestic food supply, to preventing hoarding and profiteering, to promoting agriculture and food conservation. In addition to the federal program, state branches of the Food Administration promoted programs that met the needs of their residents and responded to their own unique food production and consumption issues.

Food will win the war. Wheat is needed for the allies, 1917. Charles Edward Chambers, illustrator. Boston Public Library Prints Department. http://ark.digitalcommonwealth.org/ark:/50959/ft848v37p

 

Hoover took no salary to provide a model of self-sacrifice that he hoped to see in other Americans. One remarkable aspect of the World War I Food Administration story is the overwhelming success of a voluntary effort. In a report about the Massachusetts Committee on Public Safety, published shortly after the war’s conclusion, the author noted the following:

“At no point, even in the most intense shortage of sugar, did the Food Administration establish any legally effective system of rationing for householders; and in the case of both sugar and wheat substitutes, the selfish disregard of Food Administration requests, shown by a few, was much more than offset by the voluntary efforts of that great majority who went well beyond the requested measures, and brought about a total saving far greater than would have been possible by a mechanical rationing program” (311).

Efforts to increase food production targeted everyone from large-scale farmers to homeowners with very little land, and almost everyone in between. Even industrial sites engaged in food production. At the American Woolen Company’s 50 mills, over 500 acres were cultivated; factory workers produced over 45,000 bushels of potatoes, 40,000 ears of sweet corn, and thousands of bushels of root crops and summer vegetables. The industrial production was so successful that it was “recognized by many manufacturers that such provision for their employees is of great value, not only in contributing to the support of families, but in its bearing on permanence of occupation and on contentment of mind” (339).

Household Victory Gardens sprouted up in “all manner of unheard-of-places” and allowed homeowners to reduce their dependence on the national food supply by growing their own produce for immediate consumption and canning the surplus for winter months. The U. S. Food Administration advocated for raising livestock as well and promoted “Pig Clubs” for boys and girls. Pigs could aid in reduction of food waste by eating the family’s household scraps. In Massachusetts, the supply of pigs was unable to meet the demand for them.

A massive publicity and communication campaign supported the public adoption of conservation methods. Posters that promoted reduced consumption of sugar, wheat, and meat played upon emotions of patriotism and guilt. Literature on food conservation was translated into at least eleven languages in Massachusetts: Armenian, Finnish, French, Greek, Italian, Lithuanian, Polish, Portuguese, Swedish, Syrian, and Yiddish. More than 800,000 of these leaflets were distributed. A group of five cottages, surrounded by demonstration gardens, stood on Boston Common between May and October 1918, where visitors could hear lectures, see demonstrations, and pick up educational materials.

War garden entrance on Boston Common during war with Germany, 1918. Leslie Jones, photographer. Boston Public Library Print Department. http://ark.digitalcommonwealth.org/ark:/50959/5h73qd62f

Americans who participated in home gardening and preserving their harvests took some burden off of the general food supply. In Topsfield, Massachusetts, a canning club provided facilities and services for fruit and vegetable preservation. For a 50-cent membership, a member could order and buy from the club’s stock at a 4 percent discount, send her vegetables and fruits to be preserved for the cost of labor plus overhead, or do her own canning using the club’s facilities, which were open four days per week. In one season, the canning club produced 3,000 jars of fruits and vegetables, 1,800 glasses of jelly, and 500 pounds of jam.

Americans voluntarily adopted practices such as “Wheatless Mondays” and “Meatless Tuesdays,” as did hotels and restaurants, which participated in “No White Bread Week” from August 6 to 12, 1917. Recipes that conserved sugar, wheat, fats, and meat dominated women’s publications and cookbooks of the time. The 1918 book Foods that Will Win the War and How to Cook Them included this recipe for “War Bread”:

2 cups boiling water

2 tablespoons sugar

1 ½ teaspoons salt

¼ cup lukewarm water

2 tablespoons fat                 

6 cups rye flour

1 ½ cups whole wheat flour

1 cake yeast 

To the boiling water, add the sugar, fat and salt. When lukewarm, add the yeast which has been dissolved into the lukewarm water. Add the rye and whole wheat flour. Cover and let rise until twice its bulk, shape into loaves; let rise until double and bake about 40 minutes in a moderately hot oven.

Young people were not exempt from “doing their bit.” The U. S. Food Administration published books, including some for use in schools, to influence young readers who would pass the message on to their parents. Home economics textbooks for college classes applied lessons on macro- and micronutrients and energy metabolism to the state of the food supply in the United States and abroad.

After the war ended on November 11, 1918, the activities of the Food Administration slowed and the agency was eliminated in August 1920. The government implemented mandatory rationing during World War II, but since then, Americans have experienced little to no government interference with their food consumption. Many of the voluntary efforts promoted in the name of patriotism in 1917 and 1918 resonate with some of the food movements of today, such as reducing the amount of added sugar in foods and increasing consumption of whole grains. One would hope it would not take a war and a national propaganda campaign to change behaviors, but perhaps it is worth looking back one hundred years for inspiration to promote gardening, healthier and more sustainable eating habits, and reduced food waste.

Jennifer Pustz is a first-year NICBC student in the MS-MPH dual degree program. In her previous professional work as a historian, Jen’s research interests focused on the history of domestic life, especially the lives of domestic workers, the history of kitchens, domestic technology, and of course, food.

Works Cited:

C. Houston Goudiss and Alberta M. Goudiss. Foods that Will Win the War and How to Cook Them. New York: World Syndicate Co., 1918.

George Hinckley Lyman. The Story of the Massachusetts Committee On Public Safety: February 10, 1917–November 21, 1918. Boston: Wright & Potter Printing Co., 1919.


Balance, Variety, and Moderation: What Do They Really Mean?

by Katelyn Castro

Balance, variety, and moderation have been referenced in the Dietary Guidelines for Americans for decades. Yet over time, the ambiguity of these terms has clouded their importance and left their meaning open for interpretation—often misinterpretation.

“Everything in moderation.”

“It’s all about balance.”

“I eat a variety of foods… well, a variety of ice-cream flavors!”

These words are often used to justify our food choices or to make us feel better when our diet is not 100% nutritious. Not anymore! Instead of using these words to rationalize our eating habits (which is completely unnecessary and counterproductive), let’s talk about how these nutrition concepts can be interpreted with a more intuitive approach to healthy eating.

Variety

Fruits and vegetables are usually the food groups that we focus on when we talk about variety in our diet. However, variety is encouraged within each of the major food groups as well as among them.

Besides making meals more colorful, eating a variety of fruits, vegetables, dairy, proteins, and grains provides a wider range of vitamins, minerals, antioxidants, prebiotics, and probiotics—keeping our heart, mind, skin, eyes, and gut functioning optimally. Varying protein with a combination of eggs, dairy, legumes, grains, and nuts is especially important for vegetarians to receive adequate amounts of all essential amino acids.

In addition to the benefits of variety at the biochemical level, a varied diet can also make eating more satisfying and flexible. While it can be easy to rely on your food staples for meals, introducing new ingredients can bring attention back to the flavor and enjoyment of eating, preventing you from eating on autopilot. Swap out an apple for a grapefruit or peach; have turkey or fish in place of chicken; substitute barley or quinoa for pasta. Choosing local and seasonal foods will also keep your diet varied throughout the year. Giving yourself permission to eat a variety of foods within all food groups can be freeing, helping you overcome rigid eating habits and food rules and appreciate the range of foods that satisfy your hunger and cravings.

Photo credit: https://stocksnap.io

Moderation

Sweets, fatty meats, fried food, fast food, soda… these are all foods recommended to “eat in moderation,” or limit, in some cases. Whether it is unwanted weight gain or increased risk of type 2 diabetes, the negative health effects of eating excess added sugars and solid fats have been identified in the literature. However, cutting out sugary and fatty foods completely can be just as damaging for our emotional health, leaving us disconnected from friends and family and preoccupied with thoughts about food. Food is a huge part of our culture; it’s social, celebratory, and meant to be enjoyed in good company. That’s why moderation—not restriction or overindulgence—is the secret to healthy, happy eating habits.

But what does moderation really mean? Technically, the most recent dietary guidelines recommend limiting added sugars to less than 10% of total calories per day, saturated fat to less than 10% of total calories per day, and trans fat to as little as possible. Realistically, this may translate into having more added sugars one day (e.g., when you’re eating cake at a family birthday party), and having more saturated fat another day (e.g., when you eat pizza with friends on a weekend).
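
To make those percentages concrete, here is a minimal back-of-the-envelope sketch. It assumes a 2,000-calorie reference diet and the standard conversion factors of roughly 4 calories per gram of sugar and 9 calories per gram of fat; your own calorie needs will shift the numbers.

```python
# Rough illustration of the "less than 10% of calories" guideline.
# Assumes a 2,000-calorie reference diet; sugar ~4 kcal/g, fat ~9 kcal/g.

def added_sugar_limit_grams(daily_calories, percent_limit=0.10):
    return daily_calories * percent_limit / 4  # sugar: ~4 calories per gram

def saturated_fat_limit_grams(daily_calories, percent_limit=0.10):
    return daily_calories * percent_limit / 9  # fat: ~9 calories per gram

daily_calories = 2000
print(f"Added sugar limit: {added_sugar_limit_grams(daily_calories):.0f} g/day")      # ~50 g
print(f"Saturated fat limit: {saturated_fat_limit_grams(daily_calories):.0f} g/day")  # ~22 g
```

Fifty grams of added sugar works out to roughly twelve teaspoons, which helps explain why a single celebratory dessert can use up most of a day’s allowance.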

Moderation is about being open to day-to-day variations in your diet depending on your appetite, cravings, and activity level. Sometimes a big bowl of ice cream is just what you need to satisfy your sweet tooth; other times a small square of chocolate may be enough to keep sweet cravings at bay. Savoring the flavor of sugary and fatty foods and becoming aware of how your body responds can help you determine what “eating in moderation” means for you.

Photo credit: https://stocksnap.io

Balance

Out of all three of these terms, balance probably has the most interpretations. A balanced diet is often defined as a balance of protein, carbohydrates, and fat within the Acceptable Macronutrient Distribution Ranges set by the Institute of Medicine. A balanced meal, on the other hand, refers to a balance of food groups consistent with MyPlate or Harvard’s Healthy Eating Plate: fill half your plate with fruits and vegetables, one fourth with lean protein, and one fourth with whole grains. Together, creating a balance of food groups and macronutrients can make meals and snacks more filling (from protein and fiber) and provide more sustained energy (from carbohydrates in whole grains, beans, fruits, and vegetables).

Beyond balance within our food choices, energy balance looks more broadly at the balance between energy intake (calories from food) and energy expenditure (calories used for exercise and metabolic processes). Energy balance is associated with weight maintenance, while energy imbalance can contribute to weight loss or weight gain. However, this concept is often oversimplified, because energy expenditure cannot be precisely calculated: many factors like stress, hormones, genetics, and gut microbiota (bacteria in our digestive tract) can alter our metabolism. For example, chronic stress can lead to high levels of cortisol, which signal the body to store fat, contributing to weight gain. In contrast, a diverse composition of gut microbiota may enhance metabolism and promote weight loss, according to preliminary research.
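
At its core, the relationship described above boils down to a one-line calculation. The sketch below is a concept illustration only, with made-up numbers; as noted above, real-world expenditure shifts with stress, hormones, genetics, and the gut microbiota and cannot be pinned down this precisely.

```python
# A deliberately simple sketch of the energy-balance idea described above.
# The numbers are invented for illustration; actual energy expenditure
# varies with stress, hormones, genetics, and the gut microbiota.

def energy_balance(calories_in, calories_out):
    """Positive = surplus (tends toward weight gain); negative = deficit."""
    return calories_in - calories_out

print(energy_balance(calories_in=2200, calories_out=2000))  # +200 calorie surplus
print(energy_balance(calories_in=1800, calories_out=2000))  # -200 calorie deficit
```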

Considering the multiple factors influencing our metabolism, listening to our bodies’ hunger and fullness cues can often guide food intake better than relying on calculated formulas and food trackers. Creating balance, variety, and moderation in our diets can help us meet our nutritional needs and achieve energy balance, while preserving the joy and connection that food brings to our lives.

Photo credit: https://stocksnap.io

Katelyn Castro is a second-year student in the DI/MS Nutrition program at the Friedman School. She’s a foodie, runner, and part-time yogi on a mission to make healthy eating easy, sustainable, and enjoyable. You can find her thoughts on all things relating to food and nutrition at nutritionservedsimply.com

Soul of the Louisiana Kitchen

by Katie Moses

When the only remnants of Mardi Gras are plastic beads hanging from the oaks along St. Charles Avenue, Louisiana still draws people from around the world for the lively music and incredible food. Discover the secret to the depth of flavor in Cajun and Creole cuisine and recreate a classic Louisiana dish, red beans and rice, in your own kitchen.

Celery, Onion, Bell Peppers. Photo credit: Flickr

Imagine early afternoon in southern Louisiana. The sweltering heat is held at bay by the air conditioner running on full blast; your grandmother begins to quarter onions, seed bell peppers, and break celery stalks. This “southern symphony” begins to swell with the sudden whir of an old food processor finely mincing onion, celery, and green bell pepper, while a layer of oil in a large pot warms on the stovetop. The sizzle crescendos in the Cajun kitchen as she adds the onions, then celery and bell pepper to the hot oil, and their aroma wafts through the 1960s ranch-style house.

Growing up in my grandmother’s home in the heart of Cajun country, this is how homemade dinners began. No matter if it’s red beans, gumbo, or jambalaya, every Cajun dinner starts with a little oil in a heavy-bottomed pot and the Cajun holy trinity – onion, celery, and green bell pepper.

History of the Trinity

The influences of French and Spanish occupation of Cajun and Creole country add to the “southern symphony” with echoes of Catholic church bells in every town. Naming the aromatic trio the Cajun holy trinity in this predominantly Catholic region reflects how food traditions are as fundamental to the identity of the residents of south Louisiana as their faith.

The practice of sautéing aromatic vegetables in oil as the foundation of flavor in Cajun and Creole cuisine is mirrored in the many cultures that have influenced its traditions: the mirepoix in France; the sofrito in Spanish-speaking countries; and the sacred flavor trinities of West African cuisines. The mirepoix combines onions, celery, and carrots; the slightly sweet carrot adds a different flavor profile compared to the bitter notes of the green bell pepper. A typical sofrito in Spain mixes tomatoes, bell peppers, onions, and garlic, while the Cuban sofrito is the Cajun trinity with garlic added as an official fourth ingredient in the seasoning mix. West African dishes typically begin with tomatoes, onions, and chili peppers. While West African and Spanish cuisines influenced both Cajun and Creole cooking, the presence of tomatoes in Creole gumbo but not in Cajun gumbo illustrates the stronger influence of West African and Spanish cuisines on the Creoles of New Orleans than on the Cajuns of Acadiana.

Mirepoix: Celery, Onion, and Carrots. Photo credit: Flickr

Spanish Sofrito: Tomato, Onion, Bell Pepper, and Garlic. Photo credit: Flickr

West African Trinity: Tomato, Onion, and Chili Pepper. Photo credit: Flickr

A Kitchen Staple

Louisiana grocers make the lives of Cajun cooks easier by always stocking the produce section with celery, green bell peppers, onions, and garlic, and by making the ingredients available in many formats. For those who seek even more convenience, there is Guidry’s Fresh Cuts Creole Seasoning, a container of finely chopped yellow onions, green bell pepper, celery, green onions, parsley, and garlic. Every freezer section has the holy trinity pre-chopped and frozen at the peak of freshness, if you’re willing to have your oil pop a little extra from the moisture of the frozen vegetables.

How to Prep the Trinity

If you prefer to prep your own vegetables, the perfect ratio of aromatics is 2 parts yellow onion to 1 part celery to 1 part green bell pepper. Every Cajun cookbook will tell you to pair the trinity with garlic for extra depth of flavor. The goal is to mince the onion, celery, and bell pepper so finely that they almost disintegrate while cooking. If you’re far from southern grocery stores that offer the Cajun trinity pre-chopped, you can follow the steps below to hand chop or emulate my grandmother and save time using a food processor. Note: a key to preparing the trinity is to keep the onion separate from the celery and bell pepper; you don’t want to overcrowd the onions, so you always add the celery and bell pepper later.

Hand Chopped:

Supplies needed are a large stable cutting board, two prep bowls, and a sharp non-serrated knife.

  1. Peel and cut the onion into a small dice, a ¼ inch square cut, and place in a prep bowl.
    1. If using garlic, peel and crush or finely mince and mix with the diced onions.
  2. Cut off the stemmed top of the green bell pepper to remove seeds and create a flat surface. Slice pepper into planks and then cut into a small dice. Place in a separate prep bowl.
  3. Chop off white base of celery then halve stalks. Bunch together the halved stalks with your free hand and cut into a small dice. Mix celery with the bell pepper in the prep bowl.

Food Processor:

Supplies needed are a food processor with chopping blade and two prep bowls.

  1. If using garlic, peel and add 4 cloves into the processor first, pulsing until finely chopped. Peel and quarter the onion, then add to the processor and pulse until finely chopped. Remove the onion and garlic mixture and place in a separate prep bowl.
  2. Cut off the stemmed top of the green bell pepper, remove seeds and quarter. Remove white base of celery then roughly chop stalks into 3-inch pieces. Add to the food processor and pulse until finely chopped. Then, place in a prep bowl separate from onion and garlic.

Louisiana Kitchen

Among those who live far from the shade of the magnolias, my home state of Louisiana is known for three things – Mardi Gras, music, and good food. When the only remnants of Mardi Gras are plastic beads hanging from the oaks along St. Charles Avenue, Louisiana still draws people from around the world for the lively music and incredible food. While those outside the kitchen may assume the rich depth of flavor in Cajun and Creole cuisine is thanks to a heavy hand with butter and cayenne pepper, the soul of the unique flavor is the holy trinity. The recipe below blends Cajun, Spanish, and African flavors with a few culinary shortcuts to showcase the holy trinity in a delicious pot of the Louisiana Cajun classic: red beans and rice.

Red Beans and Rice with Louisiana Hot Sauce. Photo credit: Flickr

Good Friday Red Beans and Rice

Servings: 8

This classic south-Louisiana dish saves on time without cutting back on flavor by using canned beans and chipotle peppers in adobo instead of ham hock or tasso. This recipe is perfect for a Lenten Friday or a vegetarian potluck.

Ingredients   

4 cloves garlic, peeled

2 medium yellow onions, peeled (2 cups chopped)*

3 ribs of celery (1 cup chopped)*

1 green bell pepper (1 cup chopped)*

1 tbsp olive or vegetable oil

4 (15-oz) cans dark red kidney beans, drained and rinsed with hot water**

1½ tsp Better than Bouillon Vegetable (or No Chicken) Base***

3 bay leaves

½ tsp cayenne pepper

½ tsp freshly ground black pepper

1 chipotle pepper canned in adobo, chopped

2-4 cups of water (just enough to cover the beans)

Optional: 2 tsp dried thyme and 2 tsp ground oregano

1 lb long grain brown (or white) rice, prepared according to package directions

Instructions
  1. Prep the vegetables, then combine the finely minced onion and garlic in one bowl and the celery and green bell pepper in a separate bowl.
  2. Heat a large heavy-bottomed pot over medium heat and add the olive or vegetable oil.
  3. Add the onion and garlic to oil and sauté until onions are translucent and garlic is golden.
  4. Add the green bell pepper and celery to the pot and sauté until soft. Be careful not to let the garlic burn.
  5. Add the remaining ingredients to the pot: the kidney beans, bouillon base, bay leaves, cayenne pepper, black pepper, adobo chipotle pepper, and enough water to cover it all by an inch.
  6. Stir until all ingredients are well combined then simmer, covered, over low to medium-low heat for at least 45 minutes. Check and stir occasionally, adding water as needed if beans begin to stick.
  7. The red beans are ready when most of them have begun to fall apart.
  8. Serve on top of an equal portion of rice.

Tip!

Balance your plate by pairing this fiber- and protein-rich dish with collard greens, stewed okra and tomatoes, or a simple cucumber and onion salad.

Substitutions:

*4 cups pre-chopped frozen trinity

**1 lb dry kidney beans, soaked overnight, drained, and brought to a boil, then simmered in lightly salted water with the bay leaves.

***1½ tsp favorite bouillon/stock base. Alternatively, replace water with your favorite vegetable stock or chicken broth (if not vegetarian).

Southern Serving Suggestion:

If you’re left pining to recreate the Louisiana restaurant experience, turn up a Louis Armstrong record, pour yourself an iced tea, and top those red beans with some thin-sliced, pan-fried andouille sausage. Laissez les bons temps rouler!

Katie Moses is a Registered Dietitian Nutritionist who has worked as a culinary nutrition educator for over 5 years. Starting life with a unique culinary upbringing in the heart of Cajun country with Sicilian, Syrian, and French grandparents, she finds ways to adapt traditional dishes to fit current nutrition recommendations. Katie is currently enrolled in the Master’s Degree Program in Nutrition Interventions, Communication, and Behavior Change at the Friedman School. Connect with her at linkedin.com/in/mkatiemoses.

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not feed the cycle of self-loathing that “losing weight” resolutions tend to produce year after year.

Right alongside these posts, though, was an overwhelming amount of press extolling the Whole30—a 30-day food and beverage “clean eating” diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could potentially steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets foods that are perfectly nutritious for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. Most bodies are perfectly capable of handling these foods. They provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies showing that removing grains, beans, and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted an association with cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back into old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG, and alcohol is an unsound health goal, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a pretty widespread social-media support system, and there is plenty of research backing up social support as a major key to success in any major lifestyle change. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So, dieters had better be ready when hunger hits.

Should the average Joe looking to improve his nutrition really need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, I think that invites entirely too much brainpower to be spent thinking, worrying, and obsessing about food when it could be used in so many other unique and fulfilling ways. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate, anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30. Grass-fed beef, free-range chicken, clarified butter, organic produce…no inexpensive staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is to buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise the alternative is running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated after a month of restraint that you drink and eat so much more than you might have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of The Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in the creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. The Whole30 is not a pattern of eating that is replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is indeed a superior choice. At the end of the day, this is a business, created by Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of them has) as part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causing inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice, reminds us that “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts and some vegetables allowed on the diet, like almonds. It is possible to reduce the amount of phytates in an eaten food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. The concept of getting more in touch with food beyond a label, and of reducing added sugars and alcohol, is a good one and something that everyone should be encouraged to do. Focusing on cooking more from scratch, relying less on processed foods, and learning about how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not eliminate whole food groups like dairy, grains, and legumes. It should not have a time stamp on its end date, and rather, should be a lifelong journey focusing on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible and fulfilling part of life for people in all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram worthy oatmeal, or, if you’re in grad school, it’s part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J, and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason why I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. And there is now an EpiPen® in our kitchen. Just as they are on some airlines, peanuts are now banned from the general living areas of my house, and thus my beloved jar of peanut butter and I have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been asked not to consume any peanut products on your flight. Peanut allergy rates in children in the United States have quadrupled from the late 1990s (less than 0.5%) to about 2% today, and peanut allergies are the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day some peanut butter dripped from my sandwich to the seat. I panicked, thinking “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

Coupled with my new roommate’s arrival, I was fascinated to find that peanut allergies have been back in the news. On January 5th, 2017, the National Institute of Allergy and Infectious Disease (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants with severe eczema and/or an egg allergy should be introduced to peanuts between 4 to 6 months. Medium-risk infants with mild eczema should be introduced to peanuts by 6 months, and low-risk infants without eczema or other allergies can be introduced to peanuts any time after they have been introduced to solid foods.
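
Summarizing the three tiers as a tiny decision rule can make the guideline easier to parse. The sketch below simply restates the categories described above; the function name is my own invention, and it is obviously no substitute for a conversation with a pediatrician.

```python
# A toy restatement of the NIAID risk tiers summarized above.
# Illustration only -- not medical advice; consult a pediatrician.

def peanut_introduction_window(severe_eczema_or_egg_allergy, mild_eczema):
    if severe_eczema_or_egg_allergy:   # high-risk infants
        return "introduce peanut-containing foods at 4 to 6 months"
    if mild_eczema:                    # medium-risk infants
        return "introduce peanut-containing foods by 6 months"
    return "introduce any time after starting solid foods"  # low-risk infants

print(peanut_introduction_window(severe_eczema_or_egg_allergy=False, mild_eczema=True))
```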

These guidelines fit in with the dual-allergen exposure hypothesis. This suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because their immune system had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens. This leads to an increased chance of an allergic reaction once they eat the food. Current research now supports this hypothesis, and also suggests that by shortening the time between skin exposure and ingestion, we will reduce the number of acute allergic reactions. The sooner an infant starts eating an allergen, the more likely the body will adjust to it without having time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests done on hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, it appears that this advice was misguided. My roommate, n=1, was born in the mid-1990s, when delaying peanut exposure was coming into vogue. She had severe eczema as an infant and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated with the new research, and wishes there was some way to know if the outcome would have been different had she tried them at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack called Bamba as one of their first foods. In many developing countries, infants are likewise exposed to peanuts early on—both in their environment and in their food. These countries also have much lower allergy rates.

In 2015, researchers published results from the NIAID-funded Learning Early About Peanut Allergy (LEAP) study, which tested whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups (based on skin prick test results for peanuts) and then randomized to either eat or avoid peanuts until 60 months old (5 years). For infants in the negative skin prick test group, 13.7% of those who avoided peanuts had developed an allergy and only 1.9% of those who ate peanuts developed an allergy (P<0.001). For infants in the positive skin prick test group, 35.3% of those who avoided peanuts had developed an allergy and 10.6% of those who ate peanuts developed an allergy (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
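
To put those percentages in perspective, here is a quick back-of-the-envelope comparison using only the figures quoted above; the risk_reduction helper is just for illustration.

```python
# Back-of-the-envelope comparison of the LEAP allergy rates quoted above.

def risk_reduction(avoidance_rate, eating_rate):
    absolute = avoidance_rate - eating_rate      # percentage points
    relative = absolute / avoidance_rate * 100   # percent
    return absolute, relative

groups = {"negative skin prick test": (13.7, 1.9),
          "positive skin prick test": (35.3, 10.6)}

for group, rates in groups.items():
    absolute, relative = risk_reduction(*rates)
    print(f"{group}: {absolute:.1f} percentage points lower with early peanut eating "
          f"(~{relative:.0f}% relative reduction)")
```

In both groups, early eaters developed allergies at a small fraction of the rate of avoiders, which is why the findings were so striking.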

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. Research on reducing peanut allergies in adults is limited, making it unlikely that existing allergies will be eliminated anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!

Microalgae: Do They Have a Place in Your Diet or Should They Be Left in the Pond?

by Julia Sementelli

If you have an Instagram account, chances are you’ve seen a slew of blue-green smoothies pop up on your feed. That vibrant color comes from adding some form of powdered algae to the smoothie. High in antioxidants, healthy fats, and protein, microalgae are the latest superfood to take over the nutrition world. The most popular types of algae include chlorella, spirulina, Aphanizomenon flos-aquae (AFA), Blue Majik…the list goes on. Microalgae are claimed to boost your energy, decrease stress, and reduce your risk for diabetes and heart disease. The question, of course, is whether these microalgae have any science-based health benefits beyond the nutrients they provide. I’ve asked consumers, health food companies, and nutrition experts to weigh in on whether algae should be added to your daily regimen or if they’re better off as fish food.

What are algae?  And why are we eating them?

Microalgae are very small photosynthetic organisms rich in chlorophyll, which is where the green color comes from (hello, flashbacks to high school biology class). According to research, algae types differ in the nutrients they provide but all share one characteristic: they are high in antioxidants. (See “Get To Know Your Blue-Green Algae” in the sidebar to learn more about individual microalgae.) While some microalgae have been on the market for years, they have just recently risen to fame in the nutrition world as social media, blogs, and magazines advertise the purported benefits. One microalga in particular, spirulina, has received a significant amount of attention. Companies have jumped on the microalgae bandwagon by adding spirulina to their products and even selling it in pure form. Abby Schulman, vegan and nutrition enthusiast, says that her fascination with superfood culture generally led to hearing about microalgae, in particular spirulina. “It is sort of billed as this amazing nutrient-dense secret pill,” she states. “I was actually concerned about my iron levels and nutrition generally when I first started using it, since it was right when I transitioned to veganism. It felt like a good way of packing in some vitamins was to try the spirulina.” As a vegan who eats a diet rich in fresh produce, Abby states that adding spirulina to her diet is “a more shelf stable way of getting in greens at the level I eat them than having to buy huge tubs of greens all the time.”


Photo credit: Julia Sementelli

Microalgae’s time in the sun

Blue-green microalgae have become nutritional celebrities thanks to their prevalence in popular health food spots across the United States. Juice Generation, a national juice and smoothie chain, has jumped on the algae bandwagon by selling products that tout their supposed benefits. Products range from “Holy Water,” which contains Blue Majik, tulsi, coconut water, and pineapple, to concentrated shots of E3Live. These products claim to boost energy, enhance focus, and balance blood sugar. However, research to support these claims is lacking.

Infographic credit: Julia Sementelli

Infographic credit: Julia Sementelli

Health food businesses that use social media and blogs to advertise their products have also played a significant role in making microalgae famous. Sun Potion, an online medicinal plants and superfoods company, sells a slew of supplements, including chlorella. Sky Serge, Sun Potion spokesperson, is a big proponent of the power of chlorella. “Sun Potion chlorella is a single-celled green algae that is different than others, and is grown indoors and processed using an advanced sound frequency technology to crack the cell wall, making its many nutrients available for us to enjoy,” she explains. She says that she enjoys consuming chlorella in a glass of spring water each morning. “I have personally felt its detoxification benefits and have noticed healthier skin, better digestion and overall, a better wellbeing. Whether I am drinking it in my water in the morning or adding it to a salad dressing, I try and want to consume it every day!”

To further bolster Sun Potion’s belief in the power of its chlorella, founder Scott Linde claims that chlorella “contains all eight essential amino acids, which could allow a person to live solely on chlorella and clean drinking water.” Not surprisingly, he too consumes chlorella daily. “Upon waking in the morning, I enjoy an eight ounce glass of water with a teaspoon of chlorella mixed in,” he says. “This simple action can punctuate the start of a great day. The body is slightly dehydrated after sleep, meaning the nutrients from the chlorella are absorbed almost immediately into the blood stream.” When asked about the nutrition benefits of chlorella, Linde claims that drinking chlorella offers much more than just antioxidants. “It helps to oxygenate the blood, waking up the brain; nourish the organs; aid in healthy elimination; and assist the body in moving toxins out of the system.” Not only have Serge and Linde experienced excellent results, but their customers have as well. “At Sun Potion, we have actually had customers tell us that they have forgotten to make their coffee in the morning because they were feeling so good from their morning chlorella ritual. This is perfect example of potent nutrition and best quality plant materials helping to saturate the body with positive influence, leading to looking, feeling, and operating at one’s best.”

The good, the bad, and the blue-green

Although many health claims about microalgae, such as increasing energy and regulating blood sugar, are not supported by science, research has shown some promising, more realistic benefits. A 2013 study showed that adding 3600 milligrams per day of chlorella to the diets of 38 chronic smokers for six weeks helped to improve their antioxidant status and reduce their risk of developing cancer. Another study found that daily intake of 5 grams of chlorella reduced cholesterol and triglyceride levels in patients with high cholesterol. Research has even found that supplementing chlorella can improve the symptoms of depression, when used in conjunction with antidepressant therapy. Still, many of these studies are the first of their kind and more evidence is needed regarding the long-term effects on cholesterol, cancer prevention, and depression, in addition to other conditions microalgae are claimed to help to alleviate.

While the supposed benefits of microalgae typically receive all of the attention, microalgae also have their own list of caveats. According to New York City-based registered dietitian Willow Jarosh, “Some people can have allergic reactions to both spirulina and chlorella, so take that into consideration when trying. In addition, spirulina can accumulate heavy metals from contaminated waters.” She also states that microalgae can actually be too high in certain nutrients. “If you have high iron levels, have gone through menopause, or are a man, be aware of the high iron levels in microalgae—especially if you use them regularly.”

So what’s the verdict?

While there is certainly a lot of hype surrounding microalgae in the media, from companies that sell products containing them to preliminary supporting research, experts are hesitant to recommend adding chlorella to your daily diet.

According to Jarosh, “There are some really major health claims, with very little scientific evidence/research to back up the claims, for both chlorella and spirulina.” As co-owner of the nutrition consulting business C&J Nutrition, she says clients frequently ask for her thoughts on microalgae. “We’re always reluctant to recommend taking something when the long-term safety is unknown,” Jarosh says. “And since there’s not much research in humans to provide strong reasons to take these supplements (yet!), and the long-term research is also lacking, we’d recommend not using either on a regular basis.”

Microalgae are packed with antioxidants, and those are always a good addition to your daily eats. But although the colors of microalgae appear supernatural and their effects are often advertised as granting superpowers, current research is inadequate to say whether microalgae offer more benefits than other antioxidant-rich foods. If you do decide to try them for their antioxidant content, make sure they do not replace other fruits and vegetables in your diet. Remember: whole foods are always better than a powder.

Julia Sementelli is a second-year Nutrition Communication & Behavior Change student and registered dietitian.  Follow her on Instagram at @julia.the.rd.eats


Timing of Your Meals: Does It Matter?

by Yifan Xia

How would you feel if you were told never to eat dinner again for the rest of your life? Skipping dinner every day might sound shocking to most of us, but it was once a very common practice in ancient China during the Han Dynasty. In fact, even today Buddhism and Traditional Chinese Medicine (TCM) promote this practice as a healthier choice than eating three meals per day. But does it have roots in science? Controversy certainly surrounds the topic, but one thing we can be certain of today is that the timing of our meals can have a much greater impact on our health than we once thought.

Researchers investigating the circadian system (internal biological clock) have started looking at the effects of mealtime on our health. Surprisingly, preliminary evidence seems to support the claims of Buddhism and TCM, indicating that eating meals earlier in the day might help promote weight loss and reduce the risk of chronic disease.

What are circadian rhythms and the circadian system?

Circadian rhythms are changes in the body that follow a roughly 24-hour cycle in response to external cues such as light and darkness. Our circadian system, or internal biological clock, drives circadian rhythms and prepares us to function according to a 24-hour daily cycle, both physically and mentally.

Why do they matter to our health?

Our internal biological clock is involved in almost every aspect of our daily lives: it influences our sleep-wake cycle, determines when we feel most energetic or calm, and shapes when we want to eat.

These days people don’t always rely on their biological clocks to tell them when to eat, and many distractions in the environment can influence mealtime. We typically think of how many calories we eat, and what we eat, as the major contributors to our weight and health, but researchers have found that eating at inappropriate times can disrupt the internal biological clock, harm metabolism, and increase the risk of obesity and chronic disease.

What does the research say?

Although the body of research in this area is still relatively small, several human studies are worth highlighting. One randomized, open-label, parallel-arm study, conducted by Jakubowicz et al. and published in 2013, compared the effects of two isocaloric weight-loss diets in 93 overweight or obese women with metabolic syndrome. After 12 weeks, the group with higher caloric intake at breakfast showed greater weight loss and waist circumference reduction, as well as significantly greater decreases in fasting glucose and insulin levels, than the group with higher caloric intake at dinner. Another study published the same year, with 420 participants, found that a 20-week weight-loss treatment was significantly more effective for early lunch eaters than for late lunch eaters. In 2015, a randomized crossover trial of 32 women, published in the International Journal of Obesity, showed that a late eating pattern resulted in a significant decrease in pre-meal resting energy expenditure, lower pre-meal carbohydrate utilization, and decreased glucose tolerance, confirming that meal timing has differential effects on metabolic health. Few studies reporting negative findings were identified, however, likely because this is an emerging field; more research is needed to establish a solid relationship.

So when should we eat? Is there a perfect mealtime schedule for everyone?

“There are so many factors that influence which meal schedules may be suitable for an individual (including biological and environmental) that I cannot give a universal recommendation,” says Gregory Potter, a PhD candidate in the Leeds Institute for Genetics, Health and Therapeutics (LIGHT) laboratory at the University of Leeds in the United Kingdom and lead author on the lab’s recent review of the evidence on nutrition and the circadian system, published in The British Journal of Nutrition in 2016. Potter adds that consistent mealtimes seem to matter more than sticking to the same schedule as everyone else: “There is evidence that consistent meal patterns are likely to be superior to variable ones and, with everything else kept constant, it does appear that consuming a higher proportion of daily energy intake earlier in the waking day may lead to a lower energy balance and therefore body mass.”

Aleix Ribas-Latre, a PhD candidate at the Center for Metabolic and Degenerative Diseases at the University of Texas Health Science Center and lead author on another review investigating the interdependence of nutrient metabolism and the circadian system, published in Molecular Metabolism in 2016, agrees: “To find the appropriate meal time has to be something totally personalized, although [it] should not present [too] much difference.” Ribas-Latre notes in particular that people who are born with a tendency to rise late, eat late, and go to bed late (“night owls” as opposed to “early birds”) are more likely to be at risk for metabolic disease.

Do we have to eat three meals a day?

How many meals do you usually have? In fact, how much food makes a meal and how much is a snack? There is no universal definition, which makes these difficult questions to answer.

“To maintain a healthy attitude towards food, I think it is important to avoid being too rigid with eating habits … I do think consistency is important as more variable eating patterns may have adverse effects on metabolism,” says Potter. “Although there is evidence that time-of-day-restricted feeding (where food availability is restricted to but a few hours each day) has many beneficial effects on health in other animals such as mice, it is as yet unclear if this is true in humans. I’d also add that periodic fasting (going for one 24-hour period each week without energy-containing foods and drinks) can confer health benefits for many individuals,” he adds.

[See Hannah Meier’s recent article on intermittent fasting for more.]

Based on their research, Ribas-Latre and his lab have a different opinion. “We should eat something every 3-4 hours (without counting 8 hours at night). Many people complain about that but then consume a huge percentage of calories during lunch or even worse at night, because they are very hungry. Eating a healthy snack prevents us [from] eating too [many] calories at once.” He suggests what he considers a healthier mealtime schedule (a sample calorie breakdown follows the list):

– 6:00 am: Breakfast (30% of total calories)
– 9:30 am: Healthy snack (10%)
– 1:00 pm: Lunch (35%)
– 4:30 pm: Healthy snack (10%)
– 8:00 pm: Dinner (15%)
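To make those percentages concrete, here is how they would work out on a hypothetical 2,000-calorie day (the 2,000-calorie target is chosen purely for illustration and is not part of Ribas-Latre’s recommendation):

– Breakfast: 30% of 2,000 = 600 calories
– Morning snack: 10% = 200 calories
– Lunch: 35% = 700 calories
– Afternoon snack: 10% = 200 calories
– Dinner: 15% = 300 calories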

What if you are a shift worker, or your work requires you to travel across time zones a lot? Ribas-Latre’s advice for those workers is “not to impair more their lifestyle… at least it would be great if they are able to do exercise, eat healthy, [and] sleep a good amount of hours.”

What does Traditional Chinese Medicine say?

There are historical reasons behind the no-dinner practice of ancient China’s Han Dynasty. First, food was not always available. Second, electric lighting had not yet been invented, so people usually rested after sunset and did not need much energy at what we now consider “dinner time.”

However, there are also health reasons behind this practice. In TCM theory, our internal clock has an intimate relationship with our organs. Each organ has its “time” of optimal performance, and we can reap many health benefits by following this clock. For example, TCM considers 1:00 am – 3:00 am the time of the “Liver,” when the body should be in deep sleep so that the liver can rid the body of toxins and make fresh blood. Disruption at this time, such as staying up until 2:00 am, might impair the liver’s ability to dispel toxins and lead to many health problems, according to the theory.

Many Western researchers do not seem to be familiar with the TCM theory. When asked about the practice of skipping dinner, Potter comments, “I think that skipping dinner can be a perfectly healthy practice in some circumstances; in others, however, it may be ill advised if, for example, the individual subsequently has difficulty achieving consolidated sleep.”

On the flip side, Ribas-Latre says that “skipping a meal is not good at all. We should not eat more calories than those we need to [live], and in addition, the quality of these calories should be high… If you can split those calories [to] 5 times a day instead of three, I think this is healthier.”

Even though there is no universal agreement on mealtime, the tradition of “skipping dinner” did come back into style several years ago in China as a supposedly healthier way to lose weight, and it was quite popular among Chinese college women. Yan, a sophomore from Shanghai and a friend of mine, said that she tried the method for six months but is now back to the three-meal pattern. “The first couple of days were tough, but after that, it was much easier and I felt my body was cleaner and lighter… I did lose weight, but that’s not the main goal anymore… I got up early every day feeling energetic. Maybe it’s because I only ate some fruits in the afternoon, I usually felt sleepy early and went to bed early, which made it easier to get up early the next day with enough sleep… I’m eating three meals now, but only small portions at dinner, and I think I will continue this practice for my health.”

So what’s the take-away?

Mealtime does seem to matter. But exactly how, why, and what we can do to improve our health remains a mystery. Researchers are now looking into the concept of “chrono-nutritional therapy,” or using mealtime planning to help people with obesity or other chronic diseases. When we resolve this mystery, the question of “When do you eat?” will not just be small talk, but perhaps a key to better health.

Yifan Xia is a second-year student studying Nutrition Communication and Behavior Change. She loves reading, traveling, street dancing, trying out new restaurants with friends in Boston, and watching Japanese animation.