Nutrition in a Nutshell: Lessons Learned as a Dietetic Intern

by Katelyn Castro

I was one of those few teenagers who knew exactly what I wanted to be when I grew up. Now, after four years of college and two years of graduate school combined with a dietetic internship, a career as a registered dietitian is not far out of reach. While my passion for nutrition has never dwindled over these last six years, my approach to nutrition has changed significantly.

Nutrition tips on the sidebar of Self magazine, an over-simplified nutrition lesson in a middle school health class, and a quick nutrition lecture from my pediatrician summed up my understanding of nutrition before entering college. Now—six years of coursework and 2,000+ hours of dietetic rotations later—I not only know the nitty-gritty details of nutrition science, but I have also learned some larger truths about nutrition that are not always talked about.

Beyond what you may read as you thumb through your social media feed, or even what you may learn from an introductory nutrition textbook, here are some of the lessons that I have acquired about nutrition along the way:

1- Nutrition is an evolving science.

First, let’s be clear that nutrition is a science that relies on concepts from biology, chemistry, anatomy, physiology, and epidemiology to study how nutrients impact health and disease outcomes. Understanding how diabetes alters carbohydrate metabolism allows people with diabetes to live without fear of dying from diabetic ketoacidosis or seizures due to unsafe blood glucose levels. Understanding how ulcerative colitis impacts mineral absorption and increases protein losses helps those with the condition manage nutrient deficiencies with adequate nutrition supplementation. These are only a few examples of the many ways our knowledge of nutrition science makes it possible to improve individuals’ health outcomes.

However, the more I learn about nutrition, the more I realize that the research still holds many unanswered questions. For example, previous nutrition guidelines, like those on when to introduce allergenic foods to children, are being questioned and even disproven by more recent studies. Meanwhile, research on the gut microbiota is just beginning to uncover how one’s diet interacts with the gut microbiota through hormonal and neural signaling. Staying up-to-date on the latest research and analyzing study results with a critical eye has been crucial as new scientific discoveries challenge our understanding of nutrition and physiology.

Who would have thought a career in nutrition would require so much detective work?

2- Food is medicine, but it can’t cure everything.

The fact that half of the leading causes of death in the U.S. can be influenced by diet and physical activity highlights the importance of nutrition for long-term health. Using medical nutrition therapy for patients with a variety of health problems, ranging from cancer and cardiovascular disease to cystic fibrosis and end-stage renal disease, has also allowed me to see nutrition powerfully impact the management and treatment of many health conditions. High cholesterol? Avoid trans fat and limit saturated fat in foods. Type 2 diabetes? Adjust the timing and type of carbohydrates eaten.

While making simple changes to eating habits can improve lab values and overall health, nutrition is often only one component of treatment, accompanied by medication, surgery, therapy, sleep, and/or stress management. Interacting with patients of all ages and health conditions, and working with health professionals from a range of disciplines, has forced me to step out of my nutrition bubble and take a more comprehensive approach to patient care: Improving quality of life and overall health and wellbeing is always going to be more important than striving for a perfect nutrition plan.

3- Nutrition is political and nutrition messages can be misleading.

Back when the Academy of Nutrition and Dietetics was one of many health organizations sponsored by Coca-Cola and PepsiCo, I realized how much influence large food industries have on food advertising, marketing, and lobbying. With known health consequences of drinking too many sugary beverages, the concept of health organizations being sponsored by soda companies was perplexing to me. Learning more about the black box process of developing the government dietary guidelines has also made me more cognizant of government-related conflicts of interest with industries that can color the way nutrition recommendations are presented to the public.

Industry-funded nutrition research raises another issue with nutrition messaging. For example, a study published only recently revealed that research funded by the sugar industry 50 years ago downplayed the risks of sugar, shaping the debate over sugar’s relative risks for years afterward. Unfortunately, industry-sponsored nutrition research continues to bias study results by highlighting positive outcomes, leaving out negative ones, or simply using poor study designs. While sponsorships from big companies can provide a generous source of funding for research, as both a nutrition professional and a consumer, I’ve learned to take a closer look at the motives and potential bias of any industry-funded nutrition information.

4- Nutrition is not as glamorous as it sounds, but it’s always exciting.

When the media is flooded with nutrition tips for healthy skin, food for a healthy gut, or nutrients to boost mood, the topic of nutrition can seem light and fluffy. With new diets and “superfoods” taking the spotlight in health magazines and websites, it’s easy to think of nutrition as nothing more than a trend.

However, any nutrition student or dietitian will tell you otherwise. In the words of one of my preceptors, “my job [as a dietitian nutritionist] is not as glamorous and sexy as it sounds.” Throughout my dietetic rotations, my conversations with patients and clients have gone into much more depth than aesthetics and trendy nutrition topics. If I’m working with a patient with irritable bowel syndrome, bowel movements (a.k.a. poop) may dominate the conversation. If I’m counseling someone who has been yo-yo dieting, I may be crushing their expectations of fad diets while encouraging more realistic, sustainable health goals. If I’m speaking with a group of teenagers with eating disorders, I may not talk about nutrition at all and focus instead on challenging unhealthy thoughts and behaviors around food. It is these conversations, discussing what really matters when it comes to food, nutrition, and overall health, that make a career in nutrition ever-changing and always exciting.

Katelyn Castro is a second-year student graduating this May from the DI/MS Nutrition program at the Friedman School. She hopes to take advantage of her experiences at Tufts to make a positive impact on individuals’ health and wellbeing through community nutrition outreach. You can follow her journey as she blogs about all things food and nutrition at nutritionservedsimply.com.

 

 


Finding Common Ground for Nutrition in a World of Alternative Facts

by Rachel Baer

Rachel Baer tackles the implications of the “post-truth” culture for the nutrition profession and poses 3 questions to consider about our response to the unending barrage of nutrition-related “alternative facts.”

As a registered dietitian, I can tell you this: Nutrition professionals know a thing or two about alternative facts. We spend our careers with textbooks and scientific journals in hand, waiting for the next misinformed food fad to go viral. We fight to defend the facts because we have always believed that if we could show people what is true, we could convince them that we have the best answers for their nutrition-related questions. But the concept of truth is losing popularity.

Oxford Dictionaries declared the term “post-truth” to be the 2016 word of the year. Post-truth is defined as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” Let that sink in for a moment: emotional appeals are more influential than objective facts. While this concept is alarming on many levels, I am particularly concerned about its implications for health professionals who rely on scientific truths as the basis of their credibility.

Don’t get me wrong. I understand the frustration people feel as they watch seemingly contradictory nutrition headlines emerge at the very hint of new research findings. One day people are told to limit egg consumption to three yolks per week; the next, the one-yolk-per-day allowance is back. However, as nutrition professionals, we have a certain appreciation for the fact that science is ever-evolving. We hold our recommendations lightly because we believe in a scientific community that is always growing and in new discoveries that only sharpen our understanding of nutrition and physiology. The public, on the other hand, does not always share this appreciation.

Confusion over wavering nutrition claims is exacerbated by the inundation of un-credentialed, unschooled voices clamoring for attention in popular media. Social media has provided a proverbial soapbox for anyone with a passionate message to share, regardless of qualifications. Simultaneously, dietitians tend to hold back on making bold retorts, often waiting for consensus to catch up with the fads so that our recommendations are supported with the latest research. This seeming imbalance of voices alongside the emergence of the post-truth culture only perpetuates the proliferation of unfounded claims, or “alternative facts,” as they have become popularly known.

I have no easy answers for this predicament, but here are 3 questions that we could benefit from exploring as nutrition professionals:

1. How do we remain experts while also being compelling?

Dietitians have long been referred to as the “food police.” While I resent this reputation, it highlights a worthy question: Do nutrition professionals present information in a way that is relatable, realistic, and winsome to the people whose respect we want to gain?

We can no longer depend solely on the letters after our names to gain an audience with the public, particularly when we are pitted against wayward blog and media influencers who use sensationalized language to win over vast groups of people who blindly follow their passionate advice. The internet is full of examples of people preferring the advice of a persuasive friend or influencer over that of a knowledgeable professional. While this situation is endlessly frustrating to those of us who see through the hyperbolic messages, is there anything we can learn from these blog and media personalities that may help us reach the audience they seem to have hooked? How do we successfully build rapport with the public while maintaining good science?

2. How do we talk about fundamentals in a world that wants controversy?

Let’s face it. Fundamentals don’t make great headlines. For decades, consensus research has shown that a diet full of minimally processed fruits, vegetables, whole grains, nuts and seeds, lean proteins, and healthy fats is unequivocally the best diet for human health. Yet people still search elsewhere for the latest and greatest weight-loss, risk-reducing, and health-enhancing diets. Could it be that balance is more challenging than we thought? Perhaps avoiding certain food groups or ingredients altogether is easier than the amorphous concept of moderation? Our greatest challenge is not getting more people to consume health information; it is finding new and compelling ways to deliver the information we’ve known for decades, and this is no small task.

3. How do we overcome differences within the nutrition profession to present a united front to people lost in the sea of alternative facts?

In 2014, David Katz and Walter Willett co-chaired a conference sponsored by the non-profit Oldways*, titled “Finding Common Ground.” Oldways and the co-chairs assembled what they referred to as “the dream team of nutrition experts,” including Friedman’s own Dariush Mozaffarian, as well as Dean Ornish, creator of the Ornish Diet; David Jenkins, founder of the glycemic index; Boyd Eaton, originator of the Paleolithic diet; T. Colin Campbell, author of The China Study; and a myriad of others. Known most commonly for their differences, this group of scientists gathered for the sole purpose of coming to a consensus on the basic tenets of a healthy diet. In the end, the group agreed on 11 common denominators of the widely differing philosophies they espouse. The topics ranged from fruit and vegetable consumption, to sustainability, to food literacy.

Following the conference, David Katz published an article in Forbes where he said “…it is the controversies at the edge of what we know that interest experts most, but ask [experts] about the fundamentals, and the vast expanse of common ground is suddenly revealed.” The Common Ground committee’s decision to gather around a table, invite open dialogue, and pursue unity is something we could all learn a lesson from. Alternative facts will always provide fodder for hucksters and peddlers of over-simplified nutrition information, but the scientific community has a vast body of research that unites us. As nutrition professionals, we cannot forget that our voices will always be more powerful together than they ever will apart.

Rachel Baer is a registered dietitian and a first-year in the NICBC program at Friedman. Her favorite foods are Brussels sprouts and brownies, and she loves nothing more than cooking great meals and gathering people around a table.

*Editor’s Note, 5/1/17  2:09 PM: An earlier version of this article incorrectly spelled the name of the organization, “OldWays.” The correct spelling is Oldways, and the change has been made above.

“Food Will Win the War!” American Food Policies During World War I

by Jennifer Pustz

“The consumption of sugar sweetened drinks must be reduced” . . . “use less meat and wheat” . . . “buy local foods.” These are familiar phrases at the Friedman School in 2017. But these slogans and many others could be found on posters one hundred years ago, after the United States officially entered World War I in April 1917. Friedman student Jennifer Pustz shares a story from food history that may offer inspiration for the promotion of gardening, conservation, and sustainability in the twenty-first century.

One hundred years ago, on April 6, 1917, the United States ended over two years of neutrality and officially entered World War I. Although the war ended in November of the next year, the nineteen-month period of involvement had an enormous impact on everyday life in the U.S., especially when it came to food and government engagement in food supply and distribution. In Victory Gardens, canning clubs, and kitchens all over America, women engaged in a massive effort to produce, preserve, and conserve food to support the war effort.

By the time the United States entered the war, the issue of food production and conservation had become a top priority for American soldiers and European civilians. After nearly three years of constant ground war, Europe’s agricultural fields were ravaged, much of the labor force had joined the military, and trade was disrupted both on land and at sea. The result was a humanitarian crisis that required the assistance of the United States, whose policy of neutrality and geographic distance from the front lines had protected agricultural production from serious harm.

President Wilson established the United States Food Administration by executive order on August 10, 1917, and Congress passed the Food and Fuel Control Act, also known as the Lever Act. Herbert Hoover, a former mining engineer with prior experience in facilitating food aid to Europe, was hired to serve as the administrator. The Food Administration’s goals were broad—from regulating exports and managing the domestic food supply, to preventing hoarding and profiteering, to promoting agriculture and food conservation. In addition to the federal program, state branches of the Food Administration promoted programs that met the needs of their residents and responded to their own unique food production and consumption issues.

Food will win the war. Wheat is needed for the allies, 1917. Charles Edward Chambers, illustrator. Boston Public Library Prints Department. http://ark.digitalcommonwealth.org/ark:/50959/ft848v37p

 

Hoover took no salary to provide a model of self-sacrifice that he hoped to see in other Americans. One remarkable aspect of the World War I Food Administration story is the overwhelming success of a voluntary effort. In a report about the Massachusetts Committee on Public Safety, published shortly after the war’s conclusion, the author noted the following:

“At no point, even in the most intense shortage of sugar, did the Food Administration establish any legally effective system of rationing for householders; and in the case of both sugar and wheat substitutes, the selfish disregard of Food Administration requests, shown by a few, was much more than offset by the voluntary efforts of that great majority who went well beyond the requested measures, and brought about a total saving far greater than would have been possible by a mechanical rationing program” (311).

Efforts to increase food production targeted everyone from large-scale farmers to homeowners with very little land, and almost everyone in between. Even industrial sites engaged in food production. At the American Woolen Company’s 50 mills, over 500 acres were cultivated; factory workers produced over 45,000 bushels of potatoes, 40,000 ears of sweet corn, and thousands of bushels of root crops and summer vegetables. The industrial production was so successful that it was “recognized by many manufacturers that such provision for their employees is of great value, not only in contributing to the support of families, but in its bearing on permanence of occupation and on contentment of mind” (339).

Household Victory Gardens sprouted up in “all manner of unheard-of-places” and allowed homeowners to reduce their dependence on the national food supply by growing their own produce for immediate consumption and canning the surplus for winter months. The U. S. Food Administration advocated for raising livestock as well and promoted “Pig Clubs” for boys and girls. Pigs could aid in reduction of food waste by eating the family’s household scraps. In Massachusetts, the supply of pigs was unable to meet the demand for them.

A massive publicity and communication campaign supported the public adoption of conservation methods. Posters that promoted reduced consumption of sugar, wheat, and meat played upon emotions of patriotism and guilt. Literature on food conservation was translated into at least eleven languages in Massachusetts: Armenian, Finnish, French, Greek, Italian, Lithuanian, Polish, Portuguese, Swedish, Syrian, and Yiddish. More than 800,000 of these leaflets were distributed. A group of five cottages, surrounded by demonstration gardens, was set up on Boston Common between May and October 1918, where visitors could hear lectures, see demonstrations, and pick up educational materials.

War garden entrance on Boston Common during war with Germany, 1918. Leslie Jones, photographer. Boston Public Library Print Department. http://ark.digitalcommonwealth.org/ark:/50959/5h73qd62f

Americans who participated in home gardening and preserving their harvests took some of the burden off the general food supply. In Topsfield, Massachusetts, a canning club provided facilities and services for fruit and vegetable preservation. For a 50-cent membership, a member could buy from the club’s stock at a 4 percent discount, send her vegetables and fruits to be preserved in exchange for the cost of labor plus overhead, or do her own canning using the club’s facilities, which were open four days per week. In one season, the canning club produced 3,000 jars of fruits and vegetables, 1,800 glasses of jelly, and 500 pounds of jam.

Americans voluntarily adopted practices such as “Wheatless Mondays” and “Meatless Tuesdays,” as did hotels and restaurants, which participated in “No White Bread Week” from August 6 to 12, 1917. Recipes that conserved sugar, wheat, fats, and meat dominated women’s publications and cookbooks of the time. The 1918 book Foods that Will Win the War and How to Cook Them included this recipe for “War Bread”:

2 cups boiling water

2 tablespoons sugar

1 ½ teaspoons salt

¼ cup lukewarm water

2 tablespoons fat                 

6 cups rye flour

1 ½ cups whole wheat flour

1 cake yeast 

To the boiling water, add the sugar, fat and salt. When lukewarm, add the yeast which has been dissolved into the lukewarm water. Add the rye and whole wheat flour. Cover and let rise until twice its bulk, shape into loaves; let rise until double and bake about 40 minutes in a moderately hot oven.

Young people were not exempt from “doing their bit.” The U. S. Food Administration published books, including some for use in schools, to influence young readers who would pass the message on to their parents. Home economics textbooks for college classes applied lessons on macro- and micronutrients and energy metabolism to the state of the food supply in the United States and abroad.

After the war ended on November 11, 1918, the activities of the Food Administration slowed and the agency was eliminated in August 1920. The government implemented mandatory rationing during World War II, but since then, Americans have experienced little to no government interference with their food consumption. Many of the voluntary efforts promoted in the name of patriotism in 1917 and 1918 resonate with some of the food movements of today, such as reducing the amount of added sugar in foods and increasing consumption of whole grains. One would hope it would not take a war and a national propaganda campaign to change behaviors, but perhaps it is worth looking back one hundred years for inspiration to promote gardening, healthier and more sustainable eating habits, and reduced food waste.

Jennifer Pustz is a first-year NICBC student in the MS-MPH dual degree program. In her previous professional work as a historian, Jen’s research interests focused on the history of domestic life, especially the lives of domestic workers, the history of kitchens, domestic technology, and of course, food.

Works Cited:

C. Houston Goudiss and Alberta M. Goudiss. Foods that Will Win the War and How to Cook Them. New York: World Syndicate Co., 1918.

George Hinckley Lyman. The Story of the Massachusetts Committee on Public Safety: February 10, 1917–November 21, 1918. Boston: Wright & Potter Printing Co., 1919.

 

 

Balance, Variety, and Moderation: What Do They Really Mean?

by Katelyn Castro

Balance, variety, and moderation have been referenced in the Dietary Guidelines for Americans for decades. Yet over time, the ambiguity of these terms has clouded their importance and left their meaning open for interpretation—often misinterpretation.

“Everything in moderation.”

“It’s all about balance.”

“I eat a variety of foods… well, a variety of ice-cream flavors!”

These words are often used to justify our food choices or to make us feel better when our diet is not 100% nutritious. Not anymore! Instead of using these words to rationalize our eating habits (which is completely unnecessary and counterproductive), let’s talk about how these nutrition concepts can be interpreted with a more intuitive approach to healthy eating.

Variety

Fruits and vegetables are usually the food groups that we focus on when we talk about variety in our diet. However, variety is encouraged within all the major food groups and among the food groups.

Besides making meals more colorful, eating a variety of fruits, vegetables, dairy, proteins, and grains provides a wider range of vitamins, minerals, antioxidants, prebiotics, and probiotics—keeping our heart, mind, skin, eyes, and gut functioning optimally. Varying protein with a combination of eggs, dairy, legumes, grains, and nuts is especially important for vegetarians to receive adequate amounts of all essential amino acids.

In addition to the benefits of variety at the biochemical level, a varied diet can also make eating more satisfying and flexible. While it can be easy to rely on your food staples for meals, introducing new ingredients can bring attention back to the flavor and enjoyment of eating, preventing you from eating on autopilot. Swap out an apple for a grapefruit or peach; have turkey or fish in place of chicken; substitute barley or quinoa for pasta. Choosing local and seasonal foods will also keep your diet varied throughout the year. Giving yourself permission to eat a variety of foods within all food groups can be freeing, helping you overcome rigid eating habits and food rules and appreciate the range of foods that satisfy your hunger and cravings.

Photo credit: https://stocksnap.io

Moderation

Sweets, fatty meats, fried food, fast food, soda… these are all foods recommended to “eat in moderation,” or limit, in some cases. Whether it is unwanted weight gain or increased risk of type 2 diabetes, the negative health effects of eating excess added sugars and solid fats have been identified in the literature. However, cutting out sugary and fatty foods completely can be just as damaging for our emotional health, leaving us disconnected from friends and family and preoccupied with thoughts about food. Food is a huge part of our culture; it’s social, celebratory, and meant to be enjoyed in good company. That’s why moderation—not restriction or overindulgence—is the secret to healthy, happy eating habits.

But, what does moderation really mean? Technically, the most recent dietary guidelines recommend limiting added sugars to less than 10% of total calories per day, limiting saturated fat to less than 10% of total calories per day, and keeping trans fat as low as possible. Realistically, this may translate into having more added sugar one day (e.g., when you’re eating cake at a family birthday party) and more saturated fat another day (e.g., when you eat pizza with friends on a weekend).
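
For readers who like to see the arithmetic, here is a minimal sketch of how a percent-of-calories guideline turns into a gram budget. The 2,000-calorie day is an illustrative assumption (your needs will differ), and the function is just a convenience, not part of the guidelines; it relies only on the standard energy values of 4 calories per gram of sugar and 9 per gram of fat.

```python
# Convert "<10% of calories" limits into rough gram budgets.
# Assumes a hypothetical 2,000-kcal day; adjust total_kcal for your own intake.

KCAL_PER_GRAM = {"added_sugar": 4, "saturated_fat": 9}  # energy per gram of each nutrient

def daily_limit_grams(total_kcal, nutrient, pct_limit=0.10):
    """Translate a percent-of-calories guideline into grams per day."""
    return total_kcal * pct_limit / KCAL_PER_GRAM[nutrient]

for nutrient in ("added_sugar", "saturated_fat"):
    print(nutrient, round(daily_limit_grams(2000, nutrient)), "g per day")
# added_sugar 50 g per day
# saturated_fat 22 g per day
```

At 2,000 calories, that works out to roughly 50 grams of added sugar and 22 grams of saturated fat per day, which helps explain why moderation plays out over days rather than over any single meal.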

Moderation is about being open to day-to-day variations in your diet depending on your appetite, cravings, and activity level. Sometimes a big bowl of ice cream is just what you need to satisfy your sweet tooth; other times a small square of chocolate may be enough to keep sweet cravings at bay. Savoring the flavor of sugary and fatty foods and becoming aware of how your body responds can help you determine what “eating in moderation” means for you.

Photo credit: https://stocksnap.io

Balance

Out of all three of these terms, balance probably has the most interpretations. A balanced diet is often defined as a balance of protein, carbohydrates, and fat within the Acceptable Macronutrient Distribution Ranges set by the Institute of Medicine. A balanced meal, on the other hand, refers to a balance of food groups consistent with MyPlate or Harvard’s Healthy Eating Plate: fill half your plate with fruits and vegetables, one fourth with lean protein, and one fourth with whole grains. Together, creating a balance of food groups and macronutrients can make meals and snacks more filling (from protein and fiber) and provide more sustained energy (from carbohydrates in whole grains, beans, fruits, and vegetables).

Beyond balance within our food choices, energy balance looks more broadly at the balance between energy intake (calories from food) and energy expenditure (calories used for exercise and metabolic processes). Energy balance is associated with weight maintenance, while energy imbalance can contribute to weight loss or weight gain. However, this concept is often oversimplified, because energy expenditure cannot be precisely calculated: many factors like stress, hormones, genetics, and the gut microbiota (the bacteria in our digestive tract) can alter our metabolism. For example, chronic stress can lead to high levels of cortisol, which signals the body to store fat, contributing to weight gain. In contrast, a diverse gut microbiota may enhance metabolism and promote weight loss, according to preliminary research.
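
To make the bookkeeping concrete, here is a back-of-the-envelope sketch using the Mifflin-St Jeor equation, one common way to estimate resting energy expenditure. The example person, intake, and activity factor are assumptions for illustration only, and, as the paragraph above stresses, no formula captures the effects of stress, hormones, genetics, or the microbiome.

```python
# Back-of-the-envelope energy balance using the Mifflin-St Jeor estimate of
# resting energy expenditure. All inputs below are illustrative assumptions;
# real expenditure also depends on stress, hormones, genetics, and gut microbiota.

def resting_energy(weight_kg, height_cm, age_yr, sex):
    """Mifflin-St Jeor resting energy expenditure (kcal/day)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + (5 if sex == "male" else -161)

def energy_balance(intake_kcal, weight_kg, height_cm, age_yr, sex, activity_factor=1.4):
    """Positive = estimated surplus, negative = estimated deficit (kcal/day)."""
    expenditure = resting_energy(weight_kg, height_cm, age_yr, sex) * activity_factor
    return intake_kcal - expenditure

# Example: a lightly active 30-year-old woman, 165 cm, 60 kg, eating ~2,100 kcal/day
print(round(energy_balance(2100, 60, 165, 30, "female", 1.4)))  # ~ +252 kcal, a rough estimate at best
```

The output is an estimate at best, which is exactly the point: hunger and fullness cues are often a better day-to-day guide than the arithmetic.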

Considering the multiple factors influencing our metabolism, listening to our bodies’ hunger and fullness cues can often guide food intake better than relying on calculated formulas and food trackers. Creating balance, variety, and moderation in our diets can help us meet our nutritional needs and achieve energy balance, while preserving the joy and connection that food brings to our lives.

Photo credit: https://stocksnap.io

Katelyn Castro is a second-year student in the DI/MS Nutrition program at the Friedman School. She’s a foodie, runner, and part-time yogi on a mission to make healthy eating easy, sustainable, and enjoyable. You can find her thoughts on all things relating to food and nutrition at nutritionservedsimply.com

Soul of the Louisiana Kitchen

by Katie Moses

When the only remnants of Mardi Gras are plastic beads hanging from the oaks along St. Charles Avenue, Louisiana still draws people from around the world for the lively music and incredible food. Discover the secret to the depth of flavor in Cajun and Creole cuisine and recreate a classic Louisiana dish, red beans and rice, in your own kitchen.

Celery, Onion, Bell Peppers. Photo credit: Flickr

Imagine early afternoon in southern Louisiana. The sweltering heat is held at bay by the air conditioner running on full blast as your grandmother begins to quarter onions, seed bell peppers, and break celery stalks. This “southern symphony” begins to swell with the sudden whir of an old food processor finely mincing onion, celery, and green bell pepper, while a layer of oil in a large pot warms on the stovetop. The sizzle crescendos in the Cajun kitchen as she adds the onions, then the celery and bell pepper, to the hot oil, and their aroma wafts through the 1960s ranch-style house.

Growing up in my grandmother’s home in the heart of Cajun country, this is how homemade dinners began. No matter if it’s red beans, gumbo, or jambalaya, every Cajun dinner starts with a little oil in a heavy-bottomed pot and the Cajun holy trinity – onion, celery, and green bell pepper.

History of the Trinity

The influence of French and Spanish occupation on Cajun and Creole country adds to the “southern symphony” with echoes of Catholic church bells in every town. Naming the aromatic trio the Cajun holy trinity in this predominantly Catholic region reflects how food traditions are as fundamental to the identity of the residents of south Louisiana as their faith.

The use of aromatic vegetables sautéed in oil as the foundation of flavor in Cajun and Creole cuisine is mirrored in the many cultures that have influenced its traditions: the mirepoix in France, the sofrito in Spanish-speaking countries, and the sacred flavor trinities of West African cuisines. The mirepoix combines onions, celery, and carrots; the slightly sweet carrot adds a different flavor profile compared to the bitter notes of green bell pepper. A typical sofrito in Spain mixes tomatoes, bell peppers, onions, and garlic, while the Cuban sofrito is the Cajun trinity with garlic added as an official fourth ingredient in the seasoning mix. West African dishes typically begin with tomatoes, onions, and chili peppers. While West African and Spanish cuisines influenced both Cajun and Creole cooking, the presence of tomatoes in Creole gumbo but not in Cajun gumbo illustrates the stronger West African and Spanish influence on the Creoles in New Orleans than on the Cajuns in Acadiana.

Mirepoix: Celery, Onion, and Carrots. Photo credit: Flickr

Spanish Sofrito: Tomato, Onion, Bell Pepper, and Garlic. Photo credit: Flickr

West African Trinity: Tomato, Onion, and Chili Pepper. Photo credit: Flickr

A Kitchen Staple

Louisiana grocers make the lives of Cajun cooks easier by always stocking the produce section with celery, green bell peppers, onions, and garlic, and by making the ingredients available in many formats. For even more convenience, cooks can grab Guidry’s Fresh Cuts Creole Seasoning, a container of finely chopped yellow onions, green bell pepper, celery, green onions, parsley, and garlic. Every freezer section has the holy trinity pre-chopped and frozen at the peak of freshness, if you’re willing to have your oil pop a little extra from the moisture of the frozen vegetables.

How to Prep the Trinity

If you prefer to prep your own vegetables, the perfect ratio of aromatics is 2 parts yellow onion : 1 part celery : 1 part green bell pepper. Every Cajun cookbook will tell you to pair the trinity with garlic for extra depth of flavor. The goal is to mince the onion, celery, and bell pepper so finely that they almost disintegrate while cooking. If you’re far from southern grocery stores that offer the Cajun trinity pre-chopped, you can follow the steps below to hand chop, or emulate my grandmother and save time using a food processor. Note: a key to preparing the trinity is to keep the onion separate from the celery and bell pepper; you don’t want to overcrowd the onions, so you always add the celery and bell pepper later.
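
If it helps to see the ratio as a formula, here is a tiny sketch that scales the 2:1:1 trinity from however much onion you start with. The function name and the cup-based units are my own convenience for illustration, not a rule from any Cajun cookbook.

```python
# Scale the Cajun trinity from the amount of onion, using the 2:1:1 ratio above.
# Units are whatever you measure the onion in (cups here, purely for illustration).

def trinity(onion_cups):
    """Return matching celery and green bell pepper amounts for a given amount of onion."""
    return {"yellow onion": onion_cups,
            "celery": onion_cups / 2,
            "green bell pepper": onion_cups / 2}

print(trinity(2))  # {'yellow onion': 2, 'celery': 1.0, 'green bell pepper': 1.0}
```

Scaled to 2 cups of onion, this matches the quantities in the red beans recipe below.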

Hand Chopped:

Supplies needed are a large stable cutting board, two prep bowls, and a sharp non-serrated knife.

  1. Peel and cut the onion into a small dice, a ¼ inch square cut, and place in a prep bowl.
    1. If using garlic, peel and crush or finely mince and mix with the diced onions.
  2. Cut off the stemmed top of the green bell pepper to remove seeds and create a flat surface. Slice pepper into planks and then cut into a small dice. Place in a separate prep bowl.
  3. Chop off white base of celery then halve stalks. Bunch together the halved stalks with your free hand and cut into a small dice. Mix celery with the bell pepper in the prep bowl.

Food Processor:

Supplies needed are a food processor with chopping blade and two prep bowls.

  1. If using garlic, peel and add 4 cloves into the processor first, pulsing until finely chopped. Peel and quarter the onion, then add to the processor and pulse until finely chopped. Remove the onion and garlic mixture and place in a separate prep bowl.
  2. Cut off the stemmed top of the green bell pepper, remove the seeds, and quarter. Remove the white base of the celery, then roughly chop the stalks into 3-inch pieces. Add to the food processor and pulse until finely chopped. Then place in a prep bowl separate from the onion and garlic.

Louisiana Kitchen

Among those who live far from the shade of the magnolias, my home state of Louisiana is known for three things – Mardi Gras, music, and good food. When the only remnants of Mardi Gras are plastic beads hanging from the oaks along St. Charles Avenue, Louisiana still draws people from around the world for the lively music and incredible food. While those outside the kitchen may assume the rich depth of flavor in Cajun and Creole cuisine is thanks to a heavy hand with butter and cayenne pepper, the soul of the unique flavor is the holy trinity. The recipe below blends Cajun, Spanish, and African flavors with a few culinary shortcuts to showcase the holy trinity in a delicious pot of the Louisiana Cajun classic: red beans and rice.

Red Beans and Rice with Louisiana Hot Sauce. Photo credit: Flickr

Good Friday Red Beans and Rice

Servings: 8

This classic south-Louisiana dish saves on time without cutting back on flavor by using canned beans and chipotle peppers in adobo instead of ham hock or tasso. This recipe is perfect for a Lenten Friday or a vegetarian potluck.

Ingredients   

4 cloves garlic, peeled

2 medium yellow onions, peeled (2 cups chopped)*

3 ribs of celery (1 cup chopped)*

1 green bell pepper (1 cup chopped)*

1 tbsp olive or vegetable oil

4 (15-oz) cans dark red kidney beans, drained and rinsed with hot water**

1½ tsp Better than Bouillon Vegetable (or No Chicken) Base***

3 bay leaves

½ tsp cayenne pepper

½ tsp freshly ground black pepper

1 chipotle pepper canned in adobo, chopped

2-4 cups of water (just enough to cover the beans)

Optional: 2 tsp dried thyme and 2 tsp ground oregano

1 lb long grain brown (or white) rice, prepared according to package directions

Instructions
  1. Prep the vegetables: combine the finely minced onion and garlic in one bowl, and the celery and green bell pepper in a separate bowl.
  2. Heat a large heavy-bottomed pot over medium heat and add the olive or vegetable oil.
  3. Add the onion and garlic to oil and sauté until onions are translucent and garlic is golden.
  4. Add the green bell pepper and celery to the pot and sauté until soft. Be careful not to let the garlic burn.
  5. Add the remaining ingredients to the pot: the kidney beans, bouillon base, bay leaves, cayenne pepper, black pepper, adobo chipotle pepper, and enough water to cover it all by an inch.
  6. Stir until all ingredients are well combined then simmer, covered, over low to medium-low heat for at least 45 minutes. Check and stir occasionally, adding water as needed if beans begin to stick.
  7. The red beans are ready when most of them have begun to fall apart.
  8. Serve on top of an equal portion of rice.

Tip!

Balance your plate by pairing this fiber- and protein-rich dish with collard greens, stewed okra and tomatoes, or a simple cucumber and onion salad.

Substitutions:

*4 cups pre-chopped frozen trinity

**1 lb dry kidney beans, soaked overnight, drained, and brought to a boil, then simmered in lightly salted water with the bay leaves.

***1½ tsp favorite bouillon/stock base. Alternatively, replace water with your favorite vegetable stock or chicken broth (if not vegetarian).

Southern Serving Suggestion:

If you’re left pining to recreate the Louisiana restaurant experience, turn up a Louis Armstrong record, pour yourself an iced tea, and top those red beans with some thin-sliced, pan-fried andouille sausage. Laissez les bons temps rouler!

Katie Moses is a Registered Dietitian Nutritionist who has worked as a culinary nutrition educator for over 5 years. Having grown up with a unique culinary heritage in the heart of Cajun country, with Sicilian, Syrian, and French grandparents, she finds ways to adapt traditional dishes to fit current nutrition recommendations. Katie is currently enrolled in the Master’s Degree Program in Nutrition Interventions, Communication, and Behavior Change at the Friedman School. Connect with her at linkedin.com/in/mkatiemoses.

5 Reasons the Whole30 is Not the Anti-Diet It Claims to Be

by Hannah Meier, RD, LDN

How does the Whole30 Diet hold up from a dietitian’s perspective? Hannah Meier breaks it down.

I’m calling it: 2017 is the year of the non-diet.

As a dietitian who ardently discourages short-term dieting, I was thrilled to read many articles posted around the new year with titles like “Things to Add, Not Take Away in 2017,” and “Why I’m Resolving Not to Change This Year.” Taking a step more powerful than simply abstaining from resolution season, influencers like these authors resolved to embrace the positive, stay present, and not encourage the cycle of self-loathing that the “losing weight” resolutions tend to result in year after year.

Right alongside these posts, though, was an overwhelming amount of press extolling the Whole30—a 30-day “clean eating” food and beverage diet.

The founders of the Whole30, however, adamantly claim it is not a diet. Even though participants are advised to “cut out all the psychologically unhealthy, hormone-unbalancing, gut-disrupting, inflammatory food groups for a full 30 days” (including legumes, dairy, all grains, sugar, MSG, and additives like carrageenan), followers are encouraged to avoid the scale and focus on learning how food makes them feel rather than how much weight they gain or lose.

But our culture is still hungry for weight loss. The possibility of losing weight ahead of her sister’s wedding was “the deciding factor” for my friend Lucy (name changed for privacy), who read the entire Whole30 book cover to cover, and fought her “sugar dragon” for 30 days in adherence to the Whole30 protocol (only to eat M&M’s on day 31, she admits).

“Whole30 focuses on foods in their whole forms which is positive for people who are learning how to incorporate more unprocessed foods in their diet,” Allison Knott, registered dietitian and Friedman alum (N12) explains. “However, the elimination of certain groups of foods like beans/legumes and grains may have negative health implications if continued over the long-term.”

Diets like these trick consumers into thinking they are forming a healthier relationship with food. Though weight loss is de-emphasized, a trio of restriction, fear, and control is in the driver’s seat and could steer dieters toward a downward, disordered-eating spiral.

I still think 2017 is the year of the non-diet, but before we get there we need to unmask the Whole30 and call it what it is: an unsustainable, unhealthy, fad diet.

1: It is focused on “can” and “cannot”

The Whole30 targets perfectly nutritious foods for most people (grains, beans and legumes, and dairy) as foods to avoid entirely, relegating them to the same level of value as boxed mac and cheese, frozen pizza, and Kool-Aid. And most bodies are perfectly capable of handling these foods. They provide a convenient, affordable, and satisfying means of getting calcium, vitamin D, potassium, phosphorus, and nutrient-dense protein. The Whole30 eliminates almost all the plant-based protein options for vegans and vegetarians. While the point of eliminating these foods, creators Hartwig and Hartwig explain, is to reduce inflammation and improve gut health, nowhere in the book or website do they provide scientific studies that show removing grains, beans and dairy does this for most people. But we’ll get to that later.

The Whole30 also instructs participants not to eat any added sugar or sweeteners (real or artificial), MSG (monosodium glutamate, a flavor enhancer that has been weakly linked to brain and nervous system disruption), carrageenan (a thickener derived from seaweed that is plentiful in the world of nut milks and frozen desserts; conflicting evidence has both suggested and refuted links to cancer and inflammatory diseases), sulfites (like those in wine), or alcohol. Not even a lick, as they are very clear to explain, or you must start the entire 30-day journey from the beginning once more.

“I couldn’t go longer than 30 days without a hit of chocolate,” Lucy told me, explaining why she was dedicated to following the program exactly.

Why take issue with focusing on “good” and “bad,” “can” and “cannot” foods? As soon as a moral value is assigned, the potential for establishing a normal relationship to food and eating is disrupted. “The diet encourages following the restrictive pattern for a solid 30 days. That means if there is a single slip-up, as in you eat peanut butter (for example), then you must start over. I consider this to be a punishment which does not lend itself to developing a healthy relationship with food and may backfire, especially for individuals struggling with underlying disordered eating patterns,” Knott argues.

How will a person feel on day 31, adding brown rice alongside their salmon and spinach salad after having restricted it for a month? Likely not neutral. Restrictive dietary patterns tend to lead to overconsumption down the road, and it is not uncommon for people to fall back in to old habits, like my friend Lucy. “People often do several Whole30 repetitions to reinforce healthier eating habits,” she explained.

Knott relates the diet to other time-bound, trendy cleanses. “There’s little science to support the need for a ‘cleansing diet,’” she says. “Unless there is a food intolerance, allergy, or other medical reason for eliminating food groups, then it’s best to learn how to incorporate a balance of foods in the diet in a sustainable, individualized way.”

While no one is arguing that consuming less sugar, MSG and alcohol are unsound health goals, making the message one of hard-and-fast, black-and-white, “absolutely don’t go near or even think about touching that” is an unsustainable, unhealthy, and inflexible way to relate to food for a lifetime.

2: It requires a lot of brainpower

After eight years of existence, the Whole30 now comes with a pretty widespread social-media support system. There is plenty of research to back up social support in any major lifestyle change as a major key to success. Thanks to this, more people than ever before (like my friend Lucy, who participated alongside her engaged sister) can make it through the 30 days without “failing.”

But the Whole30 turns the concept of moderation and balance on its head. Perfection is necessary and preparation is key. Having an endless supply of chopped vegetables, stocks for soups, meat and eggs by the pound, and meals planned and prepared for the week, if not longer, is pretty much required if you don’t want to make a mistake and start over. The Whole30 discourages between-meal snacking (why?), and cutting out sugar, grains, and dairy eliminates many grab-and-go emergency options that come in handy on busy days. So dieters had better be ready when hunger hits.

Should the average Joe looking to improve his nutrition need to scour the internet for “compliant” recipes and plan every meal of every day in advance? While the Whole30 may help those unfamiliar with cooking wholesome, unprocessed meals at home jumpstart a healthy habit, learning to cook, especially for beginners, should be flexible. It doesn’t have to come with a rule book. In fact, I think that invites spending entirely too much brainpower thinking, worrying, and obsessing about food, brainpower that could be used in so many other unique and fulfilling ways. Food is important, but it is only one facet of wellness. The Whole30 seems to brush aside the intractable and significant influence of stress in favor of a “perfect” diet, which may or may not be nutritionally adequate anyway.

The language used by Whole30 creators to rationalize the rigidity of the diet could make anyone feel like a chastised puppy in the corner. “It’s not hard,” they say, and then proceed to compare its difficulty to losing a child or a parent. Okay, sure, compared to a major life stressor, altering one’s diet is a walk in the park. But changing habits is hard work that requires mental energy every single day. Eating, and choosing what to eat, is a constant battle for many people and it doesn’t have to be. Life is hard enough without diet rules. The last thing anyone needs is to transform a natural and fulfilling component of it (read: food) into a mental war zone with contrived rules and harsh consequences.

3: It is elitist

When was the last time you overheard a stranger complain about healthy eating being expensive? Most likely, the protester was envisioning a diet akin to the Whole30: grass-fed beef, free-range chicken, clarified butter, organic produce, and no dry staples like beans, rice, or peanut butter. Healthy eating does not exist on a pedestal. It does not have to be expensive, but it certainly can be, depending on where you choose to (or can) shop. Let’s set a few things straight: You don’t need grass-fed gelatin powder in your smoothies to be healthy. You don’t need organic coconut oil to be healthy. You don’t need exotic fruits and free-range eggs to be healthy. Maybe these foods mean more than just nutrition, signifying important changes to be made within our food system. But in terms of nutrition, sometimes the best a person can do for himself and his family is buy conventional produce, whole grains in bulk, and Perdue chicken breast on sale, because otherwise they would be running to the drive-thru or microwaving a packet of ramen noodles for dinner. A diet like the Whole30, which emphasizes foods of the “highest quality,” does nothing more than shame and isolate those who can’t sustain the standard it imposes, further cementing their belief that healthy eating is unattainable.

4: It is socially isolating

Imagine with me: I am participating in the Whole30 and doing great for the first week eating fully compliant meals. Then comes the weekend, and “oh no” it’s a football weekend and all I want to do is relax with my friends like I love to do. For me, that typically involves a beer or two, shared appetizers (even some carrots and celery!) and lots of laughs. The Whole30 creators would likely laugh in my face and tell me to suck it up for my own good and just munch on the veggies and maybe some meatballs. (“But are those grass-fed and did you use jarred sauce to make them? I bet there’s a gram of sugar hiding in there somewhere.”)

But it is just a month—certainly anyone can abstain from these types of events for a mere 30 days (remember, “it’s not hard”)—but then what? Do you just return to your normal patterns? Or do you, more likely, go back to them feeling so cheated from a month of restraint that you drink and eat so much more than you might have if you’d maintained a sense of moderation?

Of course, there are people comfortable with declining the food-centric aspect of social life, for whom turning down a glass of wine with cheese in favor of seltzer and crudités is no big deal. And perhaps our social events have become a bit too food centric, anyway. Either way, using food rules to isolate one’s self from friends and family sounds an awful lot like the pathway to an eating disorder, and the sense of deprivation most people likely feel in these situations can snowball into chronic stress that overshadows any short-term, nutrition-related “win.”

Although, maybe we should get all our friends to drink seltzer water and eat crudités at football games.

5: It is not scientifically sound

Most of the Whole30’s success has come from word of mouth, stories, and endorsements from those who successfully made it through the program and felt “better” afterwards. The website, dismayingly, does not house a single citation or study referenced in the creation of the diet.

It’s important to note that the Whole30 did not exist 20 years ago. It is not a pattern of eating that is replicated in any society on earth, and it doesn’t seem to be based on any research suggesting that it is indeed a superior choice. At the end of the day, this is a business, created by Sports Nutritionists (a credential anyone can get by taking an online test, regardless of one’s background in nutrition—which neither of the founders has), and it is part of the multi-billion-dollar diet industry. Pinpointing three major food groups as causes of inflammation and hormonal imbalance is quite an extreme statement to make without any research to back it up.

What does the science actually show? Knott, who counsels clients in her Tennessee-based private practice reminds us that, “consuming a plant-based diet, including grains and beans/legumes, is known to contribute to a lower risk for chronic disease like heart disease, cancer, and diabetes. Grains and beans/legumes are a source of fiber, protein, and B vitamins such as folate. They’re also a source of phytochemicals which may play a role in cancer prevention.”

The Whole30 proposes eliminating grains because they contain phytates, plant chemicals that reduce the absorbability of nutrients like magnesium and zinc in our bodies. While it’s true that both grains and legumes contain phytates, so do certain nuts and some vegetables allowed on the diet, like almonds. It is possible to reduce the amount of phytates in an eaten food by soaking, sprouting, or fermenting grains and legumes, but research from within the last 20 years suggests that phytates may actually play a key role as antioxidants. In a diverse and balanced diet, phytates in foods like grains and legumes do not present a major micronutrient threat. Further, new findings from Tufts scientists provide more evidence that whole grains in particular improve immune and inflammatory markers related to the microbiome.

Legumes in the Whole30 are eliminated because some of their carbohydrates aren’t as well-digested and absorbed in the small intestine. Some people are highly sensitive to these types of carbohydrates, and may experience severe digestive irritation like excessive gas, bloating, constipation, etc. Strategies such as the FODMAP approach are used with these folks under professional supervision to ensure they continue to get high-quality, well-tolerated fiber in their diets, and only eliminate those foods which cause distress. For others, elimination of these types of carbohydrates is unsound. Undigested fibers like those in legumes are also known as prebiotics, and help to feed the healthy bacteria in our gut. Eliminating this beneficial food group to improve gut health goes directly against the growing base of scientific evidence surrounding the microbiota.

Dairy, for those without an allergy or intolerance, has been shown to provide many benefits when incorporated into a balanced and varied diet, including weight stabilization and blood sugar control. The diet also fails to recognize the important health benefits associated with fermented dairy products like yogurt.

In terms of the diet’s long-term sustainability, Knott adds, “There’s plenty of research to support that restrictive diets fail. Many who adopt this way of eating will likely lose weight only to see it return after the diet ends.”

Let’s not forget its few redeeming qualities

For everything wrong with the Whole30, there are a few aspects of the diet that should stick. Getting more in touch with food beyond a label and cutting back on added sugars and alcohol are goals everyone should be encouraged to pursue. Focusing on cooking more from scratch, relying less on processed foods, and learning how food influences your mood and energy levels are habits everyone should work to incorporate into a healthy life.

Knott agrees, adding, “I do like that the diet emphasizes the importance of not weighing yourself. We know that weight is a minor piece to the puzzle and other metrics are more appropriate for measuring health such as fitness, lean muscle mass, and biometric screenings.”

Improving the nutritional quality of your diet should not eliminate whole food groups like dairy, grains, and legumes. It should not have a time stamp on its end date, and rather, should be a lifelong journey focusing on flexibility, moderation, and balance. Lower your intake of processed foods, sugars, and alcohol and increase the variety of whole foods. Et voilà! A healthy diet that won’t yell at you for screwing up.

—–

Thanks to Allison Knott MS, RDN, LDN for contributing expertise. Knott is a private practice dietitian and owner of ANEWtrition, LLC based in Tennessee. She graduated from the Nutrition Communications program at Friedman in 2012.

 

Hannah Meier is a second-year, part-time Nutrition Interventions, Communication & Behavior Change student and registered dietitian interested in learning more about non-diet approaches to wellness. She aspires to make proper nutrition a simple, accessible and fulfilling part of life for people in all walks of life. You can find her on Instagram documenting food, fitness and fun @abalancepaceRD, as well as on her (budding) blog of the same title: http://www.abalancedpace.wordpress.com

Putting a Pause on Peanut Butter Panic: New Guidelines Seek to Reduce Peanut Allergy Risk

by Erin Child

Do you like peanut butter? So do I. I’m kind of obsessed. Perhaps you add it to your smoothie bowl, drizzle it artfully on your Instagram worthy oatmeal, or, if you’re in grad school, it’s part of your PB&J. After all, that is the cheapest, easiest thing to make. But what if you had to take the PB out of the PB&J, and eliminate it from your diet and your life? This is a growing reality for many in the United States, with outdated, misinformed guidelines being blamed for the recent spike in peanut allergies. Read on to explore the revolutionary research that has spurred the creation of new guidelines, and why Americans need to change how we handle peanut exposure in childhood.

I recently stopped eating peanut butter in any way that could be deemed pretty or practical. Instead, you can find me in my room, with the door shut, maniacally shoveling peanut butter into my mouth with a plastic spoon.

This all started at the beginning of 2017. No, it is not some bizarre New Year’s resolution or diet trend. Rather, a new roommate moved in. She’s a great girl – kind, thoughtful, willing to learn how to properly load a dishwasher – and massively, catastrophically allergic to peanuts. She is also allergic to tree nuts and soy, but peanuts are THE BIG BAD. They are the reason why I spent the week before her arrival scrubbing my kitchen from top to bottom and running every dish and utensil (even the wooden ones, to my chagrin) through the dishwasher. And there is now an EpiPen® in our kitchen. Just as they are on some airlines, peanuts are now banned from the general living areas of my house, and thus my beloved jar of peanut butter and I have been sequestered to my room.

Many of you have probably dealt with peanut-free schools or day cares, or been told not to consume any peanut products on your flight. Peanut allergy rates in children in the United States have quadrupled from less than 0.5% in the late 1990s to about 2% today, and peanut allergies are the leading cause of anaphylaxis and death from food allergies. Thanks to my new-found awareness, I have become extremely self-conscious about eating peanut butter in public spaces. On the bus the other day some peanut butter dripped from my sandwich onto the seat. I panicked, thinking, “What is the chance this spill is going to wind up hurting some little kid?” (I hope they are not licking the seats on the bus, but still.)

Around the time of my new roommate’s arrival, I was fascinated to find that peanut allergies were back in the news. On January 5th, 2017, the National Institute of Allergy and Infectious Diseases (NIAID) published new guidelines for practitioners about when to introduce peanuts to high-risk, medium-risk, and low-risk infants. High-risk infants with severe eczema and/or an egg allergy should be introduced to peanuts between 4 and 6 months. Medium-risk infants with mild eczema should be introduced to peanuts by 6 months, and low-risk infants without eczema or other allergies can be introduced to peanuts any time after they have been introduced to solid foods.
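
Summarized as a lookup, the three categories above look something like the sketch below. The dictionary simply restates this article's summary of the guidance; the function and its labels are hypothetical conveniences, not an official screening tool, and any decision for a high-risk infant belongs with a pediatrician or allergist.

```python
# A restatement of the NIAID introduction windows summarized above, as a lookup.
# The wording mirrors the article's summary; this is an illustrative helper only.

PEANUT_INTRODUCTION = {
    "high":   "4 to 6 months (severe eczema and/or egg allergy; introduce under medical guidance)",
    "medium": "around 6 months (mild eczema)",
    "low":    "any time after starting solid foods (no eczema or other allergies)",
}

def introduction_window(risk_level):
    """Return the suggested age window for first peanut exposure for a risk category."""
    return PEANUT_INTRODUCTION[risk_level.lower()]

print(introduction_window("high"))  # 4 to 6 months (severe eczema and/or egg allergy; ...)
```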

These guidelines fit with the dual-allergen exposure hypothesis, which suggests that children are first exposed to food particles through their skin as infants. This exposure primes their immune systems to treat the food proteins like invaders and build up defenses against them. If the food is eaten years later, the child has an acute allergic reaction because the immune system has had ample time to prepare. Children with eczema have weakened skin barriers and are much more likely to experience repeated skin exposure to food allergens, which leads to an increased chance of an allergic reaction once they eat the food. Current research supports this hypothesis and also suggests that by shortening the time between skin exposure and ingestion, we can reduce the number of acute allergic reactions. The sooner an infant starts eating an allergen, the more likely the body will adjust to it without having had time to build up strong defenses against it.

These new guidelines on peanut exposure from NIAID seek to correct guidelines set by the American Academy of Pediatrics in 2000. The 2000 guidelines were based on only a few tests of hypoallergenic infant formula feeding, yet conclusively recommended that infants at high risk for peanut allergies wait until 3 years of age to first try peanuts. Based on the newest findings, that advice appears to have been ill advised. My roommate, n=1, was born in the mid-1990s, when delaying peanut exposure was coming into vogue. She had severe eczema as an infant and, following doctors’ recommendations, wasn’t introduced to peanuts until somewhere between 18 and 24 months old. She is equally fascinated by the new research, and wishes there were some way to know whether the outcome would have been different had she tried peanuts at a younger age.

Peanut allergies are more common in the US, UK, and Australia, which are also the countries that have historically had the most stringent recommendations around peanut introduction. As doctors and researchers sought to figure out why peanut allergies were ballooning, they looked to countries with very low peanut allergy rates, like Israel, where infants are introduced to peanuts at early ages. In Israel, instead of Cheerios, infants are given a peanut-based snack called Bamba as one of their first foods. In many developing countries, infants are likewise exposed to peanuts early on, both in their environment and in their food, and these countries also have much lower allergy rates.

In 2015, NIAID funded the Learning Early About Peanut Allergy (LEAP) study to determine whether early exposure to peanuts would decrease the incidence of peanut allergies. The UK study was a randomized controlled trial including 640 infants between 4 and 11 months of age with severe eczema and/or egg allergy. The infants were split into two groups based on skin prick test results for peanuts and then randomized to either eat or avoid peanuts until 60 months (5 years) of age. Among infants in the negative skin prick test group, 13.7% of those who avoided peanuts developed an allergy, while only 1.9% of those who ate peanuts did (P<0.001). Among infants in the positive skin prick test group, 35.3% of those who avoided peanuts developed an allergy, compared with 10.6% of those who ate peanuts (P=0.004). These results were significant and stunning, prompting the formulation of the current NIAID guidelines.
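
To put those percentages in perspective, here is a short illustrative sketch that re-expresses them as absolute and relative reductions in allergy risk. The percentages come straight from the study results quoted above; the function itself is just arithmetic for illustration, not part of the LEAP analysis.

```python
# The LEAP percentages quoted above, re-expressed as absolute and relative risk
# reductions for early peanut eating versus avoidance.

def risk_reduction(risk_avoid, risk_eat):
    """Return (absolute, relative) reduction in allergy risk from early eating."""
    absolute = risk_avoid - risk_eat
    relative = absolute / risk_avoid
    return absolute, relative

for group, avoid, eat in [("negative skin prick test", 0.137, 0.019),
                          ("positive skin prick test", 0.353, 0.106)]:
    arr, rrr = risk_reduction(avoid, eat)
    print(f"{group}: {arr:.1%} absolute, {rrr:.0%} relative reduction")
# negative skin prick test: 11.8% absolute, 86% relative reduction
# positive skin prick test: 24.7% absolute, 70% relative reduction
```

In both groups, early eating cut the risk of developing a peanut allergy by roughly 70% to 86% relative to avoidance, which is why the findings were described as stunning.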

So, should we all start slathering our babies in peanut butter? Maybe. (As always, talk to your pediatrician). Food allergy science is an evolving field, and what is true today may not hold true a decade down the line. But based on the significance of the current research and the lower peanut allergy rates in cultures and countries that do not limit peanut exposure, the evidence strongly indicates that parents in the United States should change their approach.

Only 20% of children diagnosed with peanut allergies will grow out of them. The vast majority, like my roommate, are allergic for life. For now, research on reducing peanut allergies in adults is limited, making it unlikely that we will be eliminating any allergies anytime soon. So for now, I will continue to eat my peanut butter in my room. Alone.

Erin Child is a second semester NICBC student in the dual MS-DPD program and this is her first article for the Sprout. She loves cooking (usually with Friends or Parks & Rec on in the background). She hates brownies. (Seriously.) As the Logistics Co-Chair for the Student Research Conference, she looks forward to seeing everyone there on April 8th!