The Resurrection of the Body: Evidence, Reasoning, and Belief

This is my term paper for my Christianity and Evidence course. The topic is the philosophical basis for the Christian belief in the Resurrection of the Body. It’s pretty theological and intellectual, so read at your own risk. (P.S. I got an A 🙂)

            N. T. Wright’s argument for the bodily resurrection of Christ is compelling, providing substantial evidence for the philosophical framework of Christianity.  But for Christians, Jesus’s resurrection is the beginning of the story.  An essential tenet of Christian faith is that the faithful will rise bodily on the last day, just as Jesus did.  This belief was controversial from the beginning among disciples and nonbelievers alike.  None demonstrated this dissonance better than St. Augustine when he commented, “On no point does the Christian faith encounter more opposition than on the resurrection of the body” (CCC, 996).  Though a significant point of contention, this doctrine deals with a future event (the end of the world) rather than a past event; because of this, questions about the resurrection of the body classically fall under eschatology rather than philosophy.  Wright’s abductive argument fails to address this issue in great depth, restricting any claims about the bodily resurrection of believers to discussion of 1 Corinthians 15.  For the sake of this argument, let’s suppose Wright’s conclusion is true.  After considering this and exploring evidence from multiple disciplines, I will attempt to demonstrate that belief in bodily resurrection is logically sound and extends benefits to those who accept it.  Indeed, this precept provides meaning to the Christian life while increasing comfort, hope, and an urgency to live a meaningful life.  To begin this discussion, the origins of resurrection belief must be explored.

           Resurrection has always been a radical concept, largely because it seems to oppose the laws of nature.  Though the tradition was promulgated by first-century followers of Jesus, it has Jewish roots and was most popular among the Pharisees, a major group during the time of Jesus.  The Jewish people had been victims of oppression for thousands of years, but the Pharisees held that one day oppressors, both Jewish and foreign, would be overthrown by a revolt of resurrected righteous believers (Wright, 2003).  All Jews believed that creation was good (Gen. 1:1–2:4a); however, the world God created is filled with choices.  Some people chose to live a life pleasing to God, while others rejected God and lived for their own gain.  Resurrection meant that, somehow, those favored by God would one day rise from the dead bodily, overthrow the selfish, oppressive rulers, and live in the glorified city of Israel (which was in the created world).  While there was no consensus throughout Judaism, this tradition stems from interpretations of various Old Testament texts.  Such passages include the psalmist’s soul being saved from Sheol (Ps. 49:16), Ezekiel’s account of dry bones coming to life (Ezek. 37:1–14), and the accounts of persecuted Jewish leaders receiving their bodies again (2 Macc. 7).  Though these passages can be understood in different ways, they are the original inspiration for belief in bodily resurrection.

            In essence, Jews who believed in resurrection believed all righteous people would rise at once and lead a revolt.  This perspective shifted when people started believing in Christ’s resurrection.  The best accounts of the early Christian view on resurrection are within Saint Paul’s epistles, specifically 1 Corinthians 15.  Paul’s resurrection accounts clarify concerns among early followers, namely that resurrection will be bodily, transformative, and have some continuance of personal identities (Mercer, 2017).  As opposed to Jewish belief, the Christian understanding became that Christ was the “firstfruits”—the first portion of harvest, sacrificed to consecrate the rest of the harvest—of the resurrection (1 Cor. 15:20ff).  So Christ rose first, and all others would rise, as he did, at his second coming, the consummation of history (CCC, 1001).  Though Paul admits the how of the resurrection cannot be fully known, he describes what he believes the “resurrection body” will be like.  He compares the resurrection body to a seed, maintaining that our earthly bodies, which are sown in rebellion to God, will be transformed to “spiritual bodies” (in Greek, soma pneumatikon) that will be raised in harmony with God (Wright, 2003).  Paul believes that our resurrection must logically follow Christ’s, for God showed that it is possible and we are made in His image, both earthly and spiritually (1 Cor. 15:49).

            Belief in resurrection was drastically different from the pagan convictions at the time, with the Platonic hypothesis of an immortal soul continuing in a disembodied afterlife being the predominant idea (Rausch, 2008).  Plato’s dualism generally held that the material world inhibits the accessibility of the world of the forms.  This concept of devaluing the material world—and even seeing created reality as evil—was adopted by the gnostic tradition as well (Brown, 2017).  Acceptance of a non-material afterlife has sustained popularity throughout history, including early Christianity.  In fact, Wright argues that Paul believes in a two-stage afterlife: a spiritual afterlife with God after death, and a bodily resurrection at the end of time (Wright, 2003).  Joseph Ratzinger, one of the most prominent eschatologists of the modern age, contends that this first stage isn’t merely the continuance of our once corporeal existence; rather, one’s soul transforms into a new creation (Gavin, 2017).  So, in a way, the Christian belief in resurrection is an amalgamation of previous traditions.  It honors the goodness of the material world, hoping that the faithful can one day enjoy the fruits of creation once again.  At the same time, it accepts Plato’s world of the forms and the possibilities of a disembodied, spiritual reality after death.

            This progressive revelation of resurrection is important.  It posits that early Christians didn’t merely adapt Jewish interpretations of resurrection; rather, they developed a mature, thoughtful concept about afterlife existence.  However, though Paul’s accounts are thorough, they are eschatological in nature and based more in faith than in reason.  Luckily, philosophers like Saint Thomas Aquinas have considered the resurrection of the body.  Montague Brown’s analysis of the Commentary on the First Epistle to the Corinthians can shed light on how this belief, namely the philosophical basis for bodily resurrection and transformation, can be logically established.

             Grounded in Aristotelian physics, Aquinas’s life’s work was to justify how the Christian faith can exist in the natural world.  One point of dispute between Aquinas and Plato is the relation of body to soul in the human being.  Plato postulates that the body hinders the soul, while Aquinas argues that the body is good for the soul because it allows the soul to seek, learn, and grow within the human community (Brown, 2017).  He furthers this discussion by holding that the soul isn’t one’s primary substance, but the individual is.  Put another way, the soul is a part of the individual: just as I use my eyes to see, I also use my soul to direct my will (Brown, 2017).  The body, too, is an intrinsic part of the individual.  Aquinas claims that human beings are rational animals, unique throughout creation because we possess an intellect and a will.  This composite nature—a rational animal with both body and soul—is the very essence of humanity (Brown, 2017).  Justification for resurrection comes when Aquinas details the soul’s separation from the body after death.  Though the soul transcends the body after death, it exists in a state where the individual, which exists as a unity of body and soul, is not complete, not in full actuality (Brown, 2017).

           Aquinas proposes three reasons why this separateness cannot exist forever, eventually concluding that there must be a reunion of body and soul through resurrection.  First, Aquinas deems this separated state to be unnatural, and no unnatural state can exist forever (Brown, 2017).  The body has allowed the soul to express and discover one’s individual identity, so a soul devoid of its body—its mode of being—is imperfect and unnatural.  Second, Aquinas talks about happiness.  He says that complete happiness cannot be generalized to a population but must concern the individual (Brown, 2017).  Aquinas argues that happiness is the pursuit of God, so complete happiness would be union with God, something only possible in death (Aquinas, 1975).  Because of this, an individual’s complete happiness can only exist if that individual shares life with God as a complete person, both body and soul.  If just the soul continued, happiness would be incomplete.  Aquinas’s last justification for the resurrection considers final causality.  An individual makes decisions in life through free will, so the individual (body and soul) should be held accountable for his choices if there comes a time of final judgement (Brown, 2017).  In all three of these arguments, Aquinas shows that the unity of body and soul within the individual points to bodily resurrection as a necessity for afterlife existence.

             These defenses add merit to the theoretical possibility of bodily resurrection, but Aquinas must still deal with the questions Paul raises in 1 Corinthians 15, specifically how and what kind of body.  One of the main objections to the resurrection of the body is the lack of an efficient, natural cause to restore body to soul.  To this, Aquinas agrees that there is no natural cause; rather, resurrection can only occur through divine power (Brown, 2017).  Nature doesn’t create the same thing twice, so just as God creates each soul ex nihilo, He will so recreate each union of soma pneumatikon to soul ex nihilo (Brown, 2017).  The specific how and what kind questions of the resurrection cannot be known by human reason but are based in faith in God’s power (CCC, 1000).  If God raised Jesus in a glorified body, he is also able to raise humanity in this way.  Because God is full actuality, God is anything that He can be and God does anything that He can do (Aquinas, 1975).  So our resurrection is an effect of, and is made possible by, Christ’s.  To summarize the remainder of Aquinas’s justification, we must reconsider two aforementioned points: that creation is good and that humans have free will.  Aquinas contends that creation is good, but because the first humans freely chose themselves over God, death entered the picture (Brown, 2017).  It’s only at the consummation of history that creation’s full goodness can be restored and the potential of humanity can be actualized, all through the resurrection of the body.

            So far, based on the acceptance of Wright’s argument, I have discussed the historical and philosophical evidence that suggests bodily resurrection is plausible.  At the very least, it can be logically justified on the grounds that Jesus was raised from the dead bodily.  The conversation to follow will be about what this belief means for the Christian faithful.  To better grasp the significance of this belief, we must explore both the eschatology of the end times and what philosophers have to say about death as part of the human experience.

             Before going any further, I must state the obvious: people are afraid to die.  Human consciousness allows us to be aware of the life we’re living and to question when that life will end.  What is the meaning of life? is one of the most important human questions.  Most of us want the answer to inspire hope, but for many, asking this unanswerable question has yielded a great deal of despair.  When Søren Kierkegaard sought the meaning of life in the 19th century, it made him so distressed that he identified existential angst as a prevalent human problem (O’Brien).  Kierkegaard supposed that having this angst about life and your inevitable death can be a good thing, for it can spur you to live with a sense of urgency.  Later in the 19th century, the esteemed novelist Leo Tolstoy had everything a man could want in life, but suffered a profound crisis of meaning.  He was on the verge of suicide when he discovered what he believed made life meaningful: the faith of unlearned, working-class people (O’Brien).  These people lived in humility and kindness, and a deep connection with their faith was enough for Tolstoy to continue living.

             The work of these two men suggests that life may need meaning to be bearable and that the imminence of death can ignite one to live with urgency and purpose.  For Christians, the meaning of one’s life is deeply interwoven with what happens when we die.  As these events are in the future, they fall under eschatology, where the four last things—death, judgement, heaven, and hell—are theorized.  Though seemingly daunting, these four last things provide personal and communal hope to the faithful.  Christians are encouraged to desire death because it means union with Christ in some form in the afterlife (CCC, 1010).  A final judgement means that what was done on earth, both the good and the bad, matters, all counting amidst God’s justice (Ryan, 2017).  Lastly, heaven offers promise to those who lived for God, while hell helps Christians understand that it’s not too late to take responsibility for their shortcomings (Rausch, 2008).  But if these four last things were merely for the soul, they would be imperfect, for the full expression of a human being is the union of body and soul.  So stands the Christian belief: bodily resurrection enables the continuance of individual identity in a way that a disembodied afterlife couldn’t support.  At the same time, the soma pneumatikon promised in resurrection consoles believers, suggesting that the suffering endured on earth will be worth it in the end (Rausch, 2008).  Indeed, the resurrection promises to bring humanity to its fullness and make the persecuted whole.

             For the Christian faithful, death is to be rejoiced in.  Paul’s letter to the Philippians recounts, “For to me life is Christ and death is gain” (Phil. 1:21).  Likewise, St. Teresa of Avila declares, “I want to see God, and in order to see him, I must die” (CCC, 1011).  Death means union with the creator through a resurrection with and in Christ, but this doesn’t mean that life on earth is meaningless.  On the contrary, Christianity holds that earthly existence is a chance to secure eternity with God through righteous living, generous loving, and compassionate stewardship.  If Christians lived with the regular remembrance of why they are living—for the eventual share in the resurrection with Christ—then this belief would have tremendous power.  A global community that honors the goodness of creation, accepts the suffering of life, and applies meaning to daily interaction has the power to change society at large.  This radically improved reality is what belief in the resurrection of the body could mean if Christians fully believed it.

             The work of Wright, Aquinas, and a number of other philosophers provides evidence suggesting the bodily resurrection of human beings could happen in the future.  Indeed, if God raised Jesus from the dead, then he could raise ordinary humans at the appointed time, too.  I find the evidence presented in this paper to be significant and substantial for belief in bodily resurrection, but not conclusive (no prediction of the future can ever be stated with certainty).  Because this is a future event, it requires belief more than philosophical reasoning alone.  As philosophers of death would concur, it is acceptable—for the betterment and bearableness of one’s life—to make existential conclusions based on faith once given enough thought.  This is often the only way forward given questions of meaning and death.  Coupled with Wright’s argument for the resurrection of Jesus, faith in the resurrection of the body seems reasonable and helpful, if not comforting.  In the end, belief in bodily resurrection can inspire in one a sense of urgency to live a better life, a personal hope for one’s individual continuance after death, and a deepened appreciation for the goodness of creation.  If Christians around the world came to a consensus on this philosophically sound belief, the power of Christian witness may be enough to bring lasting change to society.

Industrialization & Distance in a Culture of More Food, Faster

This was my final exam essay for my Food in American History course. I reviewed course material and crafted a 1,500-word essay about how the industrialization of food processes and systems has distanced Americans from their food. This essay may not be for everyone, but it is if you want to learn about the effects of the industrialization of food. (P.S. I got a 98 🙂)

            As the American system of producing, distributing, and consuming food has advanced throughout the country’s history, so too have American perspectives about food.  This shift can be attributed, in large part, to industrialization.  While new methods of growing and transporting food came out of necessity during wartime, key inventions to accelerate production arose from the capitalist principle of improvement.  Indeed, in time reapers, cans, and chemicals turned the American farm into the American factory, while crop surpluses increased the need for new products and fresh marketing strategies.  Grocery stores changed how food was purchased, automobiles changed the accessibility of restaurants, and the television changed how family dinners were shared.  All the while, the American public was slowly becoming distanced from the food on their tables.  Food that was once grown in backyard gardens and by neighborhood farmers—food that people knew—was progressively phased out by unfamiliar but welcome packaged foods.  As industrialization has produced more food and more opportunity for the country, it has come at the cost of an unsustainable, fragile food system that is hidden from the public eye.  To better understand how this shift happened and what can be done to ameliorate it, the gradual changes throughout the American food system should be examined.

            When considering the production of food, there are two major ways that industrialization changed the food supply: advancements on the farm and innovations in the factory.  Since its inception, the United States has had abundant land with rich soil.  To spur the land’s cultivation, Thomas Jefferson proposed the ideal of the independent yeoman farmer who could supply for himself and contribute to his community (Class Notes, 2/3/20).  These small farmers working on land grants ignited the early economy; however, the introduction of new, expensive technology steepened the barrier to entry.  The McCormick reaper of the 1830s was the first mechanical advancement, allowing farmers to reap fifteen acres a day instead of two (2/3/20).  This expensive machine—and others like it—created a divide between the industrial farmer and the yeoman farmer.  While prosperous farms flourished, unsuccessful farmers had to find other means of sustenance and purchase their food instead of growing it.

           Luckily, a surplus of crops like wheat led to innovations within factories, creating a new sector for jobs.  Flour was now mass-produced at a large mill instead of by small farmers (1/31/20).  At the same time, advancements in food preparation tools and appliances such as the cast iron stove created new opportunities for women cooks.  Making meals and baked goods was now easier, but recipes grew more elaborate, so cooking itself became a skill.  While women worked in the kitchen, men were now able to seek out jobs in factories, thereby furthering the Industrial Revolution (1/31/20).  What began as a boom on the farms ignited a brand-new avenue for processed food products, thereby creating a market that rapidly expanded the economy.

           This same theme of increased agricultural production followed by the invention of novelty foods appeared again in the beginning of the twentieth century.  Perhaps the most significant agricultural development in history came not from the United States, but from Germany.  Working at separate times on the same concept, two German scientists, Fritz Haber and Carl Bosch, discovered a way to turn atmospheric nitrogen into absorbable chemical fertilizer (Pollan, 2004).  The 1909 Haber-Bosch process essentially mechanized the farm, making it a system of inputs and outputs as opposed to a balanced ecology (3/30/20).  Because farmers didn’t have to wait for essential nutrient restoration by fallowing, crop production skyrocketed.  Key chemical companies like Dow and Monsanto entered the fold, furthering the industrialization of American farms (2/19/20).  A greater abundance of crops like wheat, corn, and soy led to the boom in processed food production.  Companies like Heinz, Campbell’s, and Pillsbury grew rapidly because of their investment in continuous process production (2/21/20).  Though processed foods were now abundant and inexpensive, widespread hesitancy about adulteration still lingered.  To mitigate these worries, food marketing grew to persuade consumers of the safety and health of processed foods (2/21/20).  In time, these foods became the norm in American society, furthering the distance between farm and table.

           Before further consideration goes into these consumer trends, we must take a step back and look at how industrialization changed food distribution.  Though early agricultural success fueled the 19th-century economy, the Civil War changed the course of food history.  To feed Union battlefield soldiers, innovative factories began canning foods, especially meat (2/10/20).  As new packaged, preserved foods fueled soldiers, railroads were being built, extending the distance these foods could travel (2/10/20).  Preserved meat and railroad networks collided when the Chicago Union Stockyard opened in 1865.  Here, live cattle and swine were shipped from across the Midwest to be disassembled and processed (2/19/20).  Live animals were shipped into the stockyard and dressed meat was shipped out.  This furthered the need for refrigerated cars to preserve butchered meat, an invention which would eventually lead to widespread distribution of not only meat but other fresh produce (2/19/20).  These new distribution methods were very successful, increasing the possibilities for agricultural and economic growth.

           Though impressive distribution allowed food to cross state lines, not everybody benefitted.  Local butchers, family grocers, and neighborhood farmers suffered from more affordable processed food prices (2/24/20).  Small operations that couldn’t compete with national brand giants were forced to close.  All the while, though food was becoming more available and affordable, Americans were being further separated from their food supply.  This distance grew with the introduction of a new way to buy food: the self-service grocery store.  With Piggly Wiggly leading the charge in 1916, chain grocery stores began opening up throughout the United States (3/2/20).  These self-service stores were “Progressive”: no longer would clerks do the shopping; customers could now choose among products by comparison shopping (3/2/20).  As the aforementioned national brands competed with store brands, price became a crucial factor in buying behaviors.  Local farmers couldn’t compete with grocery store prices, and affordability outweighed quality.  Food became more accessible than ever before, though at a cost.  Factory farms were producing more food than ever but with great impacts on the surrounding environment and the health of animals (3/30/20).  Consciousness of this didn’t reach the public until much later.  In fast-paced America, it was about more food being more available at a more affordable price.

           Lastly, the changing attitudes around food consumption must be explored.  To be brief, the greatest change here was the progressive shift from family meals to “fast” food.  The consumption of meals in the 19th century was largely centered around the dinner table.  Generally speaking, while men worked in factories, women stayed at home.  When the man returned from work, the family would unite over an intentionally prepared family dinner (1/31/20).  Gradually over the next century, industrialization fragmented the dinner table, favoring convenience over home-cooked meals.  The automobile boom synchronized with the emergence of fast food restaurants, making “eating out” a norm for Americans (3/27/20).  Successful franchises like McDonald’s and White Castle worked to accelerate this shift.  Simultaneously, frozen dinners entered the fold, furthering the ideal of convenience over complicated meals (Hamilton, 2003).  Adding fuel to the fast-paced fire, the explosion of television in the late 20th century brought about TV dinners, a development which further split the dinner table (Buford, 2006).  In recent years, services like Uber Eats and DoorDash have made cooking and family meals seem increasingly antiquated.  These changes make the overall trend evident: the more Americans buy into convenient arrangements that support the industrial food system, the less Americans care about where their food comes from.

           Due to the industrialization of food production, distribution, and consumption patterns over the past 200 years, American society has lost touch with its food.  That is, until recently.  The COVID-19 pandemic has shed light on just how fragile our supply chain is.  For the first time in recent memory, our collective society is thinking about the origins and implications of the food we buy at the store.  This is a good thing.  And this is why food history matters.

           Without knowing how we got here, it’s impossible to know where we should go in the future.  For instance, knowing how meatpacking operations began in the 1860s can shed light on the importance of supporting local farms to bolster food security.  Likewise, understanding that frozen foods are a recent phenomenon can spur people to plan for what may happen if nationwide distribution networks shut down (e.g., learning how to grow food may be a helpful skill).  If we use this moment of national crisis to educate ourselves about where our food comes from, perhaps we can reconnect with the land we have been separated from.  The benefits would be a more sustainable agricultural system, a more stable distribution network, and a greater awareness of who is affected by our buying decisions.  This can be the moment that reconnects us with our food.  This can be the next phase in American food history, that is, if we’re willing to change.

A sad essay

I have to write a final exam essay for my Food in American History course. I chose to write about how industrialization of the food system has created separation between people and their food. We’re allowed to use our class notes and readings, but no outside sources.

Luckily, I take good notes and weaving in the key themes of the course will be seamless. Unfortunately, my notes are so good that I vividly remember every lecture, every discussion, and every emotion.

That means I’ve been remembering all those good classes on campus in January and February. And all the not-so-good ones from March to April. My note-taking superpower has become my kryptonite, and this essay, as the kids say, has me in my bag.

Spring of 2020 hasn’t been easy. Not for anybody. Surely one day we’ll look back and feel some emotions. Hopefully, though, we’ll be able to identify key themes and significant moments, both as a collective and as individuals.

Maybe we’ll see that this was just one bad paragraph in a powerful, meaningful essay. A paragraph that was hard to get through, but critical to the development of the following section.

Lunchbox Lemma

I wrote this essay for my Food in American History course. We were tasked with detailing our own personal food history. I had a lot of fun with this assignment and decided to share. Hope you enjoy 🙂

Nearly every day, I inadvertently kick my metal Power Rangers lunchbox and disrupt class. In many ways, loud is the best word to describe this lunchbox: not only does the bright yellow tin box make inconvenient loud noises, but it’s become my unique identifier across campus.  It catches people’s attention.  Many comment on how they loved that show, wondering which Ranger is my favorite and asking if I’ve had it since childhood.  I hate to break the news that I never watched the Power Rangers and that I bought the lunchbox on Amazon, so I tell them the Red Ranger is my favorite and that, indeed, I’ve had it for a long while.  Truth is, this lunchbox is far more functional than fashionable, and its existence, while loud, says an awful lot about my seldom-spoken perspectives on food more generally.  There were several factors influencing my purchase of this lunchbox, but they all relate back to my distaste for unethically raised livestock, unsustainable food systems, and overly processed foods, as well as a devotion to becoming my healthiest self.  My lunchbox reflects the importance I place on my food choices and my recognition that what I choose to consume matters to my health and the health of the planet in a real way.  In many ways, the food protected by the five Rangers symbolizes my love for the process of cooking, a pursuit I hold as a spiritual practice.  But that’s a remarkably long way away from frozen chicken nuggets and marshmallow fluff on white bread, so let’s do this story justice.

I was a picky eater from the moment I discovered that some foods taste better than others.  Simple tastes determined my childhood favorites, foods like bread and butter, strawberries and whipped cream, chicken nuggets and honey.  I have memories of being in my high chair, chowing down on cauliflower and broccoli.  Then I was presented with chicken nuggets, and vegetables were out of the question.  In no time I became a chicken nugget connoisseur. I implemented the “Dinosaur or Don’t Bother” policy in my household, maintaining that dinosaur-shaped nuggets were the only allowable form.  In a desperate attempt to nourish her stubborn child, my mother let me dip the nuggets in honey (I had outlawed ketchup, too).  Eating out was a spectacle: when the dinner rolls were consumed, I would top off the first course with a single packet of butter (I was onto this keto thing long before popular culture).  Thanksgiving used to be my least favorite holiday.  I didn’t like turkey, sweet potatoes, or cranberry sauce, but I loved bread.  So I had bread and topped it off with whipped cream because the adults didn’t want me to cause a fuss.  In grade school, the thought of jelly on bread made me sick, so marshmallow fluff and peanut butter was on the menu every day—except every other Friday when they served triangle pizza with cheesy crust.  To avoid harassment, I made sure to start eating lunch meat by middle school.

As my taste buds matured past early childhood, pasta with parmesan cheese became a staple.  At times, it was without a doubt my favorite food.  At eight years old, pasta was the first thing I learned how to cook.  This wheat-filled pasta, as we all know, is great fuel for physical activities, and it remained central to my life as athletics and “high-performance nutrition” entered the picture.  From chocolate milk after a lift to protein bars and Gatorade after practice, I consumed anything with a “protein” label.  In middle school and high school, my mom packed my lunch in a brown bag with a cold cut sandwich, pretzels, some veggies, fruit, and often a protein bar to be eaten after school.  This diet seemed “healthy” to any outside observer, surely better than the pizza and fries eaten by my friends.  On an unrelated note, I got a stomach ache every day around sixth period.  This trend of simple-tasting, quick-fuel food was the story of my food journey until age sixteen, when I got a job at the Craft Ale House, a gastropub with farm-to-table meals.

For two and a half years I was exposed to different foods and culinary styles in the restaurant world.  As a food runner and bar back, I saw more ahi-tuna variations and memorized more charcuterie plate cheeses than I care to remember.  Although I never worked behind the line, I became a part of the mealtime experience.  Fresh cracked pepper, a topped-off soda, and extra remoulade went a long way for customers eager to enjoy a night out.  Relishing the dining itself, I learned, was as important as the food being served.  When demanding schedules forced me out of food running, I began dishwashing.  Though torturous work, I developed a knack for scrubbing pots and pans.  Knuckles bloodied from steel wool and scalding hot water, I always left work with a sense of accomplishment: with my help, the chefs and cooks were able to prepare elegant meals and memorable dishes.  I played a role in the restaurant experience by being the best dishwasher I could be.  And once I perfected my craft, I actually began to enjoy it.  Not to mention, working in the back of house meant I got to try the chef’s creations.  From coffee ground-rubbed bison to deep fried, crab-stuffed avocado, my palate was expanding by the shift.

My final years of high school engendered a sense of culinary adventure, but the hope of tasting never-before-tasted dishes on a regular basis was squashed when I entered college.  First year dorms don’t have kitchens, so all freshmen are required to have meal plans.  I explored Campion Dining Hall with an open mind and an ambition to make the best of what was offered.  Unfortunately, the best of Campion was omelets for breakfast, wraps for lunch, and pasta for dinner.  Rinse and repeat.  I not only got bored of my options but ended up getting sick with sinus infections, colds, and intense seasonal allergies on a regular basis.  Date nights with my girlfriend were the only reprieve.  We dove head-first into Asian cuisine including Thai, Vietnamese, and Japanese, as well as brunch, America’s greatest tradition.  While Narberth, Ardmore, and Manayunk yielded many new foods, I wanted a change for my daily nutrition.  I wanted to learn how to cook before entering my sophomore year apartment, fully equipped with a kitchen.  That summer I read a book called How to Eat, Move, and Be Healthy! by Paul Chek, a health coach and therapist I’d known about for some time.  As cliché as it sounds, this book changed my life forever.

Paul Chek transformed my perspective on food.  He explained that human beings aren’t evolutionarily designed to thrive on highly processed foods and that whole foods ought to be at the center of our diet.  This book introduced me to simple concepts like eating foods that are alive (or raw), how fat isn’t the enemy, and how added sugar is wreaking havoc on the health of our nation.  Paul explained the telltale signs of gluten intolerance: stomach ache, headache, a weakened immune system, etc.  Remember those post-lunch stomach aches in high school?  I was gluten intolerant, a fact I confirmed by eliminating gluten for a period and watching my symptoms dissipate.  Also introduced in How to Eat, Move, and Be Healthy! was the concept of metabolic typing, that is, that different people fare better on specific diets.  I learned that I do best on a diet higher in fat and protein and lower in carbohydrates, especially refined sugars.  Paul also justified the importance of buying organic produce, grass fed beef, pasture raised chickens, and wild caught fish.  I later learned about the perils of commercial agriculture, from the destruction of ecosystems due to overused chemical fertilizers to the carcinogenic impacts of glyphosate, or RoundUp.  Other concepts like buying from and supporting local farmers rounded out Paul’s work.

Thus, the food I choose to purchase, cook, eat, and share with my loved ones matters.  In a very real way, I am voting with my fork and my knife, with the dollars I spend on groceries.  If I purchase feedlot meat and highly processed, commercial tofu, I am supporting operations that contribute to global climate change, maltreat livestock, destroy our disappearing soils, and put small, local farmers out of business.  However, if I instead purchase local, grass-fed beef and edamame grown on an organic farm in my county, I am supporting people who are doing their part to heal the planet and produce healthy, nourishing food.  This is an intentional process, one that has to do with the whole system of food production and consumption.  From the health of the soil to the health of the meal on my plate, I’ve come to see eating and cooking as a spiritual practice.  I think about it like this: the food I eat literally becomes me.  If I am what I eat, then I want to be the healthiest Me possible, because it is only with my health that I can live out my mission on this earth.  The extra price of maintaining this holistic, nutritional approach is the best investment I could ever make, because sooner or later my health will be my number one concern.

That edict is quite a long way from chicken nuggets and whipped cream.  What began as a desire for simple mouth pleasures has become a quest to discover what food is best for me.  As I realized what true nutrition ought to be, I understood that my commercially stocked dining hall couldn’t meet my health standards.  Cooking came out of necessity to get the simple, whole-foods nutrition I needed without the additives and chemicals of dining hall meals.  So I began sautéing and searing and baking and slow-cooking and calling my mom when I messed up.  I started seasoning with sea salt and pepper, while slowly moving into more complex tastes like rosemary, cayenne, and turmeric.  With chicken and rice as staples, I began to venture into unknown waters.  I experimented with chicken stocks, with cutlets, with vegetable chili, with pork soup dumplings.  Eggs and avocado, eggs and oatmeal, eggs and ground beef, and eggs and kale have all entered the fold.  While my cooking isn’t quite exquisite, I cook almost every day, blending flavors and trying new concoctions.  But every meal I cook begins with the same thing: quality ingredients—organic for sure, local if possible.

I’ve found that many people dislike cooking because they dislike cleaning up.  Luckily for me, my dishwashing stint exposed me to the mental anguish of cleaning, showing me that, in the end, scrubbing pots and pans doesn’t have to be painful.  When my mother cooks a meal, she uses every dish in the house and refuses to clean them (rightfully so).  Out of necessity, I brought the art of dishwashing home, and I have come to enjoy it.  You read that right: I enjoy cleaning up.  I see it as a meditation.  It’s the most peaceful and orderly moment of my day.  Coupled with the spiritual act of combining ingredients that will become me, dishwashing rounds out the experience of eating I have each day.  The dishwasher is helpful, but nothing can outweigh the joy that comes with a clean sink.  I truly believe more people would cook if they didn’t fear cleaning up so much.  It should be cleaning first, then cooking.

And so we’ve arrived back at the metal Power Rangers lunchbox.  In an effort to support sustainable agriculture and local, community farms that produce nourishing whole foods, I lug my lunchbox across campus.  Filled with turmeric-salmon salad, overnight oats, or chicken legs and rice, this trusted tin gives me the freedom to cook and emboldens me with the knowledge of where my food comes from.  That, and my intolerance to gluten rules out sandwiches.  All of this to sustain a healthy body so that I can have a healthy mind so that I can work to create a healthier world.  Now, I by no means follow these principles perfectly.  I still love chocolate and ice cream, and I have a weakness for blue corn tortilla chips.  But an ideal is something to strive towards, and strive I do, day in and day out.

About every other week, you can find me in the dessert section at Whole Foods with my girlfriend (the same one), picking out the perfect cannoli.  I believe life is about balance, not strict adherence to a dietary philosophy.  The goal is to build robust health so that an ice cream cone here or there won’t destroy you.  If I had watched the Power Rangers, I’d make a reference to how the Rangers protected people and fought for the common good.  But I didn’t, so I’ll just go finish cleaning up my dishes.