When Discipline Stops Working

What Women Were Never Told About Weight, Aging, and Control

The Science They Never Told Us

This is the first episode of 2026, and I wanted to start the year by slowing things down and getting a bit personal instead of chasing the latest talking points.

At the end of last year, I spent time reading a few books that genuinely stopped me in my tracks. Not because they offered a new diet or a new protocol, but because they challenged something much deeper: the story we’ve been told about discipline, control, and women’s bodies.

There is a reason women’s bodies change across the lifespan. And it has very little to do with willpower, discipline, or personal failure.

In Why Women Need Fat, evolutionary biologists William Lassek and Steven Gaulin make the case that most modern conversations about women’s weight are fundamentally misinformed. Not because women are doing something wrong, but because we’ve built our expectations on a misunderstanding of what female bodies are actually designed to do.

A major part of their argument focuses on how industrialization radically altered the balance of omega-6 to omega-3 fatty acids in the modern food supply, particularly through seed oils and ultra-processed foods. They make a compelling case that this shift plays a role in rising obesity and metabolic dysfunction at the population level.

I agree that this imbalance matters, and it’s a topic that deserves its own full episode. At the same time, it does not explain every woman’s story. Diet composition can influence metabolism, but it cannot override prolonged stress, illness, hormonal disruption, nervous system dysregulation, or years of restriction. In my own case, omega-6 intake outside of naturally occurring sources is relatively low and does not account for the changes I’ve experienced. That matters, because it reminds us that biology is layered. No single variable explains a complex adaptive system.

One of the most important ideas in the book is that fat distribution matters more than fat quantity.

Women do not store fat the same way men do. A significant portion of female body fat is stored in the hips and thighs, known as gluteofemoral fat. This fat is metabolically distinct from abdominal or visceral fat. It is more stable, less inflammatory, and relatively enriched in long-chain fatty acids, including DHA, which plays a key role in fetal brain development.

From an evolutionary standpoint, this makes sense. Human infants are born with unusually large, energy-hungry brains. Women evolved to carry nutritional reserves that could support pregnancy and lactation, even during times of scarcity. In that context, having fat on your lower body was not a flaw or a failure. It was insurance.

From this perspective, fat is not excess energy. It is deferred intelligence, stored in anticipation of future need. This is where waist-to-hip ratio enters the conversation.

Across cultures and historical periods, a lower waist-to-hip ratio in women has been associated with reproductive health, metabolic resilience, and successful pregnancies. This is not about thinness, aesthetics, or moral worth. It is about fat function rather than fat fear: where fat is stored and how different tissues behave metabolically inside the body.

And in today’s culture, we have lost that distinction.

Instead of asking what kind of fat a woman carries, we became obsessed with how much. Instead of understanding fat as tissue with purpose, we turned it into a moral scoreboard. Hips became a problem. Thighs became something to shrink. Curves became something to discipline.

Another central idea in Why Women Need Fat is biological set point.

The authors argue that women’s bodies tend to defend a natural weight range when adequately nourished and not under chronic stress. When women remain below that range through restriction, over-exercise, or prolonged under-fueling, the body does not interpret that as success. It interprets it as threat.

Over time, the body adapts, not out of defiance, but out of protection.

Metabolism slows. Hunger and fullness cues become unreliable. Hormonal systems compensate. When the pressure finally eases, weight often rebounds, sometimes beyond where it started, because the body is trying to restore safety.

From this perspective, midlife weight gain, post-illness weight gain, or weight gain after years of restriction is not mysterious. It is not rebellion. It is regulation.

None of this is taught to women.

Instead, we are told that if our bodies change, we failed. That aging is optional. That discipline and Botox should override biology. That the number on the scale tells the whole story.

So, before we talk about culture, family, trauma, or personal experience, this matters:

Women’s bodies are not designed to stay static.
They are designed to adapt.

Once you understand that, everything else in this conversation changes.


Why the Body Became the Battlefield

This is where historian Joan Jacobs Brumberg’s work in The Body Project: An Intimate History of American Girls provides essential context, but it requires some precision.

Girls have not always been free from shame. Shame itself is not new. What has changed is what women are taught to be ashamed of, and how that shame operates in daily life.

Brumberg asks a question that still feels unresolved today:
Why is the body still a girl’s nemesis? Shouldn’t sexually liberated girls feel better about themselves than their corseted counterparts a century ago?

Based on extensive historical research, including diaries written by American girls from the 1830s through the 1990s, Brumberg shows that although girls today enjoy more formal freedoms and opportunities, they are also under more pressure and at greater psychological risk. This is due to a unique convergence of biological vulnerability and cultural forces that turned the adolescent female body into a central site of social meaning during the twentieth century.

In the late nineteenth and early twentieth centuries, girls did not typically grow up fixated on thinness, calorie control, or constant appearance monitoring. Their diaries were not filled with measurements or food rules. Instead, they wrote primarily about character, self-restraint, moral development, relationships, and their roles within family and community.

One 1892 diary entry reads:

“Resolved, not to talk about myself or feelings. To think before speaking. To work seriously. To be self-restrained in conversation and in actions. Not to let my thoughts wander. To be dignified. Interest myself more in others.”

In earlier eras, female shame was more often tied to behavior, sexuality, obedience, and virtue. The body mattered, but primarily as a moral symbol rather than an aesthetic project requiring constant surveillance and correction.

That changed dramatically in the twentieth century.

Brumberg documents how the mother-daughter connection loosened, particularly around menstruation, sexuality, and bodily knowledge. Where female relatives and mentors once guided girls through these transitions, doctors, advertisers, popular media, and scientific authority increasingly stepped in to fill that role.

At the same time, mass media, advertising, film, and medicalized beauty standards created a new and increasingly exacting ideal of physical perfection. Changing norms around intimacy and sexuality also shifted the meaning of virginity, turning it from a central moral value into an outdated or irrelevant one. What replaced it was not freedom from scrutiny, but a different kind of pressure altogether.

By the late twentieth century, girls were increasingly taught that their bodies were not merely something they inhabited, but something they were responsible for perfecting.

A 1982 diary entry captures this shift starkly:

“I will try to make myself better in any way I possibly can with the help of my budget and baby-sitting money. I will lose weight, get new lenses, already got a new haircut, good makeup, new clothes and accessories.”

What changed was not the presence of shame, but its location. Shame moved inward.

Rather than being externally enforced through rules and prohibitions, it became self-policed. Girls were taught to monitor themselves constantly, to evaluate their bodies from the outside, and to treat appearance as the primary expression of identity and worth.

Brumberg is explicit on this point. The fact that American girls now make their bodies their central project is not an accident or a cultural curiosity. It is a symptom of historical changes that are only beginning to be fully understood.

This is where more recent work, such as Louise Perry’s The Case Against the Sexual Revolution, helps extend Brumberg’s analysis into the present moment. Perry argues that while sexual liberation promised autonomy and empowerment, it often left young women navigating powerful biological and emotional realities without the social structures that once offered protection, guidance, or meaning. In that vacuum, the body became one of the few remaining sites where control still seemed possible.

The result is a paradox. Girls are freer in theory, yet more burdened in practice. The body, once shaped by communal norms and shared female knowledge, becomes a solitary project, managed under intense cultural pressure and constant comparison.

For many girls, this self-surveillance does not begin with magazines or social media. It begins at home, absorbed through tone, comments, and modeling from the women closest to them.

Brumberg argues that body dissatisfaction is often transmitted from mother to daughter, not out of cruelty, but because those mothers inherited the same aesthetic anxieties. Over time, body shame becomes a family inheritance, passed down quietly and persistently.

Some mothers transmit it subtly.

Others do it bluntly.

This matters not because my experience is unique, but because it illustrates what happens when a body shaped by restriction, stress, and cultural pressure is asked to perform indefinitely. Personal stories are often dismissed as anecdotal, but they are where biological theory meets lived reality.

If you want to dive deeper into this topic:


Where It All Began: The Messages That Shape Us

I grew up in a household where my body was not simply noticed. It was scrutinized, compared, and commented on. Comments like that do not fade with time. They shape how you see yourself in mirrors and photographs. They teach you that your body must be managed and monitored. They plant the belief that staying small is the price of safety.

So, I grew up believing that if I could control my body well enough, I could avoid humiliation. I could avoid becoming the punchline. I could avoid being seen in the wrong way.

For a while, I turned that fear into discipline.


The Years Before the Collapse: A Lifetime of Restriction and Survival

Food never felt simple for me. Long before bodybuilding, chronic pain, or COVID, I carried a strained relationship with eating. Growing up in a near constant state of anxiety meant that hunger cues often felt unpredictable. Eating was something to plan around or push through. It rarely felt intuitive or easy.

Because of this, I experimented with diets that replaced real meals with cereal or shakes. I followed plans like the Special K diet. I relied on Carnation Instant Breakfast instead of full meals. My protein intake was low. My fear of gaining weight was high. Restriction became familiar.

Top left: when I started working out obsessively at age 16. Top right and bottom: middle school, when I was at my “heaviest,” the period that drove the disordered behaviors.

In college, I became a strict vegetarian out of compassion for animals, but I did not understand how to meet my nutritional needs. I was studying dietetics and earning personal training certifications while running frequently and using exercise as a way to maintain control. From the outside, I looked disciplined. Internally, my relationship with food and exercise remained tense and inconsistent.

Later, I became involved in a meal-replacement program through an MLM. I replaced two meals a day with shakes and practiced intermittent fasting framed as “cleanse days.” In hindsight, this was structured under-eating presented as wellness. It fit seamlessly into patterns I had lived in for years.

Eating often felt overwhelming. Cooking felt like a hurdle. Certain textures bothered me. My appetite felt fragile and unreliable. This sensory sensitivity existed long before the parosmia that would come years later. From early on, food was shaped by stress rather than nourishment.

During this entire period, I was also on hormonal birth control, first the NuvaRing and later the Mirena IUD, for nearly a decade. Long-term hormonal modulation can influence mood, inflammation, appetite, and weight distribution. It added another layer of complexity to a system already under strain.

Looking back, I can see that my teens and twenties were marked by near constant restriction. Restriction felt normal. Thriving did not.

The book Why Women Need Fat discusses the idea of a biological weight “set point,” the range a body tends to return to when conditions are stable and adequately nourished. I now understand that I remained below my natural set point for years through force rather than balance. My biology never experienced consistency or safety.

This was the landscape I carried into my thirties.


The Body I Built and the Body That Broke

By the time I entered the bodybuilding world in 2017 and 2018, I already had years of chronic under-eating, over-exercising, and nutrient gaps behind me. Bodybuilding did not create my issues. It amplified them.

I competed in four shows. People admired the discipline and the physique. Internally, my body was weakening. I was overtraining and undereating. By 2019, my immune system began to fail. I developed severe canker sores, sometimes twenty or more at once. I started noticing weight-loss resistance. Everything I had done in the past was no longer working. On my thirty-fifth birthday, I got shingles. My energy crashed. My emotional bandwidth narrowed. My body was asking for rest, but I did not know how to slow down.

Dive deeper into my bodybuilding journey here:

Around this time, I was also navigating eating disorder recovery. Learning how to eat without panic or rigid control was emotionally exhausting even under ideal circumstances… but little did I know things were about to take a massive turn for the worse.


COVID, Sensory Loss, and the Unraveling of Appetite

After getting sick with the ‘vid in late 2020, everything shifted again. I developed parosmia, a smell and taste distortion that made many foods taste rotten or chemical. Protein and cooked foods often tasted spoiled. Herbs smelled like artificial chemicals. Eating became distressing and, at times, impossible.

My appetite dropped significantly. There were periods where my intake was very low, yet my weight continued to rise. This is not uncommon following illness or prolonged stress. The body often shifts into energy conservation, prioritizing survival over weight regulation.

Weight gain became another source of grief: roughly thirty pounds over the next five years. I feel embarrassed and avoid photographs. I often worry about how others will perceive me.

If this experience resonates, it is important to say this clearly: your body is not betraying you. It is responding to stress, illness, and prolonged strain in the way bodies are designed to respond.


Why Women’s Bodies Adapt Instead of “Bounce Back”

When years of restriction, intense exercise, chronic stress, illness, hormonal shifts, and emotional trauma accumulate, the body often enters a protective state. Metabolism slows. Hormonal signaling shifts. Hunger cues become unreliable. Weight gain or resistance to weight loss can occur even during periods of low intake, because energy regulation is being driven by survival physiology rather than simple calorie balance.

This is not failure. It is physiology.

The calories-in, calories-out model does not account for thyroid suppression, nervous system activation, sleep disruption, pain, trauma, or metabolic adaptation. It reduces a complex biological system to arithmetic.

Women are not machines. We are adaptive systems built for survival. Sometimes resilience looks like holding onto energy when the body does not feel safe.


The Systems That Reinforce Shame

Despite this biological reality, we live in a culture that ties women’s value to discipline and appearance. When women gain weight, even under extreme circumstances, we blame ourselves before questioning the system.

Diet culture frames shrinking as virtue.

Toxic positivity encourages acceptance without context.

Industrial food environments differ radically from those our ancestors evolved in.

Medical systems often dismiss women’s pain and metabolic complexity.

Social media amplifies comparison and moralizes body size.

None of this is your fault. And all of it shapes your experience.

This is why understanding the science matters. This is why telling the truth matters. This is why sharing stories matters.


In the book More Than a Body, Lindsay and Lexie Kite describe how women are taught to relate to themselves through constant self-monitoring. Instead of living inside our bodies, we learn to watch ourselves from the outside. We assess how we look, how we are perceived, and whether our bodies are acceptable in a given moment.

This constant self-surveillance does real harm. It pulls attention away from hunger, pain, fatigue, and intuition. It trains women to override bodily signals in favor of appearance management. And over time, it creates a split where the body is treated as a project to control rather than a system to understand or care for.

When you layer this kind of self-objectification on top of chronic stress, restriction, illness, and trauma, the result is not empowerment. It is disconnection. And disconnection makes it even harder to hear what the body needs when something is wrong.

Weight gain is not just a biological response. It becomes a moral verdict. And that is how women end up fighting bodies that are already struggling to keep them alive.

The Inheritance Ends Here

For a long time, I believed that breaking generational cycles only applied to mothers and daughters. I do not have children, so I assumed what I inherited would simply end with me, unchanged.

Brumberg’s work helped me see this differently.

What we inherit is not passed down only through parenting. It moves through tone, silence, and self-talk. It appears in how women speak about their bodies in front of others. It lives in the way shame is normalized.

I inherited a legacy of body shame. Even on the days when I still feel its weight, I am choosing not to repeat it.

For me, the inheritance ends with telling the truth about this journey and refusing to speak to my body with the same cruelty I absorbed growing up. It ends here.


Closing the Circle: Your Body Is Not Broken

I wish I could end this with a simple story of resolution. I cannot. I am still in the middle of this. I still grieve. I still struggle with eating and movement. I am still learning how to inhabit a body that feels unfamiliar.

But I know this: my body is not my enemy. She is not malfunctioning. She is adapting to a lifetime of stress, illness, restriction, and emotional weight.

If you are in a similar place, I hope this offers permission to stop fighting yourself and start understanding the patterns your body is following. Not because everything will suddenly improve, but because clarity is often the first form of compassion.

Your body is not betraying you. She is trying to keep you here.

And sometimes the most honest thing we can do is admit that we are still finding our way.


References

  1. Brumberg, J. J. (1997). The Body Project: An Intimate History of American Girls. Random House.
  2. Lassek, W. D., & Gaulin, S. J. C. (2011). Why Women Need Fat: How “Healthy” Food Makes Us Gain Excess Weight and the Surprising Solution to Losing It Forever. Hudson Street Press.
  3. Kite, L., & Kite, L. (2020). More Than a Body: Your Body Is an Instrument, Not an Ornament. Houghton Mifflin Harcourt.

Scientific and academic sources

  1. Lassek, W. D., & Gaulin, S. J. C. (2006). Changes in body fat distribution in relation to parity in American women. Evolution and Human Behavior, 27(3), 173–185.
  2. Lassek, W. D., & Gaulin, S. J. C. (2008). Waist–hip ratio and cognitive ability. Proceedings of the Royal Society B, 275(1644), 193–199.
  3. Dulloo, A. G., Jacquet, J., & Montani, J. P. (2015). Adaptive thermogenesis in human body-weight regulation. Obesity Reviews, 16(S1), 33–43.
  4. Fothergill, E., et al. (2016). Persistent metabolic adaptation after weight loss. Obesity, 24(8), 1612–1619.
  5. Kyle, U. G., et al. (2004). Body composition interpretation. American Journal of Clinical Nutrition, 79(6), 955–962.
  6. Simopoulos, A. P. (2016). Omega-6/omega-3 balance and obesity risk. Nutrients, 8(3), 128.

Trauma, stress, and nervous system context

  1. Sapolsky, R. M. (2004). Why Zebras Don’t Get Ulcers. Henry Holt and Company.
  2. Walker, P. (2013). Complex PTSD: From Surviving to Thriving. Azure Coyote Books.

Projection, Power, and the Pagan Revival

When Belief Becomes Control

This episode isn’t about religion versus religion.
It’s about power, fear, and what happens inside belief systems when conformity becomes more important than honesty.

In this conversation, I’m joined by Sigrin, founder of Universal Pagan Temple.

She’s a practicing Pagan, a witch, a public educator, and someone who speaks openly about leaving Christianity after experiencing fear-based theology, spiritual control, and shame. I want to pause here, because even as an agnostic, when I hear the word witch, my brain still flashes to the cartoon villain version. Green. Ugly. Evil. That image didn’t come from nowhere. It was taught.

One of the things we get into in this conversation is how morality actually functions in Pagan traditions, and how different that framework is from what most people assume.

She describes leaving Christianity not as rebellion, but as self-preservation. And what pushed her out wasn’t God. It was other Christians.

For many people, Christianity isn’t learned from scripture.
It’s learned from other Christians.

The judgment.
The constant monitoring.
The fear of being seen as wrong, dangerous, or spiritually compromised.

In high-control Christian environments, conformity equals safety. Questioning creates anxiety. And the fear of social punishment often becomes stronger than belief itself.

When belonging is conditional, faith turns into survival.


What We Cover in This Conversation:

Paganism Beyond Aesthetics

A lot of people hear “Paganism” and immediately picture vibes, trends, or cosplay. We spend time breaking that assumption apart.

  • Sigrin explains that many beginners jump straight into ritual without actually invoking or dedicating to the divine.
  • She talks about the difference between aesthetic practice and intentional practice.
  • For people who don’t yet feel connected to a specific god or goddess, she offers grounded guidance on how to approach devotion without forcing it.
  • We talk about the transition she experienced moving from Christianity, to atheism, to polytheism.
  • We explore the role of myth, story, and symbolism in spiritual life.
  • She shares her experience of feeling an energy she couldn’t deny, even after rejecting belief entirely.
  • We touch on the wide range of ways Pagans relate to pantheons, including devotional, symbolic, ancestral, and experiential approaches.

The takeaway here isn’t “believe this.”
It’s that Paganism isn’t shallow, trendy, or uniform. It’s relational.


No Holy Book, No Central Authority

One of the most misunderstood aspects of Paganism is the absence of a single text or governing authority.

  • Sigrin references a line she often uses: “If you get 20 witches in a room, you’ll have 40 different beliefs.”
  • We talk about how Pagan traditions don’t operate under enforced doctrine or centralized belief.
  • She brings up the 42 Negative Confessions from ancient Egyptian tradition as an example of ethical self-statements rather than commandments.
  • These function more like reflections on character than laws imposed from above.
  • We compare this to moral storytelling across different myth traditions rather than rigid rule-following.
  • She emphasizes intuition and empathy as core tools for ethical decision-making.
  • I add the role of self-reflection and introspection in systems without external enforcement.

This points to something important: without a script, responsibility shifts inward.

Why This Can Be Hard After Christianity

We also talk honestly about why this freedom can be uncomfortable, especially for people leaving authoritarian religion.

  • Sigrin notes how difficult it can be to release belief in hell, even after leaving Christianity.
  • Fear doesn’t disappear just because belief changes.
  • When morality was once externally enforced, internal trust has to be rebuilt.
  • Pagan paths often require learning how to sit with uncertainty rather than replacing one authority with another.

This isn’t easier.
It’s quieter.
And it asks more of the individual.

That backdrop matters, because it shapes how Paganism gets misunderstood, misrepresented, and framed as dangerous.


The “Pagan Threat” Narrative

One of the reasons Pagan Threat has gained attention and sparked controversy is not just its content, but whose voice it carries and how it’s framed at the outset.

  • The book was written by Pastor Lucas Miles, a senior director with Turning Point USA Faith and the author of other conservative religious critiques.
  • The project is positioned as a warning about what Miles sees as threats to the church and American society.
  • The foreword was written by Charlie Kirk, founder of Turning Point USA. His introduction positions the book as urgent reading for Christians.

From there, the book makes a striking claim:

  • It describes Christianity as a religion of freedom, while framing Paganism as operating under a hive mind or collective groupthink.

A key problem is which Paganism the book is actually engaging.

  • The examples Miles focuses on overwhelmingly reflect liberal, online, or activist-adjacent Pagan spaces, particularly those aligned with progressive identity politics.
  • That narrow focus gets treated as representative of Paganism as a whole.
  • Conservative Pagans, reconstructionist traditions, land-based practices, and sovereignty-focused communities are largely ignored.

As a result, “wokeness” becomes a kind of explanatory shortcut.

  • Modern political anxieties get mapped onto Paganism.
  • Gender ideology, progressive activism, and left-leaning culture get blamed on an ancient and diverse spiritual category.
  • Paganism becomes a convenient container for everything the author already opposes.

We also talk openly about political realignment, and why neither of us fits cleanly into the right/left binary anymore. I raise the importance of actually understanding Queer Theory, rather than using “queer” as a vague identity umbrella.

To help visualize this, I reference a chart breaking down five tiers of the far left, which I’ll include here for listeners who want context.

Next, in our conversation, Sigrin explains why the groupthink accusation feels completely inverted to anyone who has actually practiced Paganism.

  • Pagan traditions lack central authority, universal doctrine, or an enforcement mechanism.
  • Diversity of belief isn’t a flaw. It’s a defining feature.
  • Pagan communities often openly disagree, practice differently, and resist uniformity by design.

The “hive mind” label ignores that reality and instead relies on a caricature built from a narrow and selective sample.

“Trotter and Le Bon concluded that the group mind does not think in the restricted sense of the word. In place of thoughts, it has impulses, habits, and emotions. Lacking an independent mind, its first impulse is usually to follow the example of a trusted leader. This is one of the most firmly established principles of mass psychology.” (Edward L. Bernays, Propaganda)

We contrast this with Christian systems that rely on shared creeds, orthodoxy, and social enforcement to maintain cohesion.

Accusations of groupthink, in that context, often function as projection from environments where conformity is tied to spiritual safety.

In those systems, agreement is often equated with faithfulness and deviation with danger.

Globalism, Centralization, and Historical Irony

We end the conversation by stepping back and looking at the bigger historical picture.

  • The book positions Christianity as the antidote to globalism.
  • At the same time, it advocates coordinated religious unification, political mobilization, and cultural enforcement.
  • That contradiction becomes hard to ignore once you zoom out historically.

Sigrin points out that pre-Christian Pagan worlds were not monolithic.

  • Ancient polytheist societies were highly localized.
  • City-states and regions had their own gods, rituals, myths, and customs.
  • Religious life varied widely from place to place, even within the same broader culture.

I reference The Darkening Age by Catherine Nixey, which documents this diversity in detail.

  • Pagan societies weren’t unified under a single doctrine.
  • There was no universal creed to enforce across regions.
  • Difference wasn’t a problem to be solved. It was normal.

Christianity, by contrast, became one of the first truly globalizing religious systems.

  • A single truth claim.
  • A centralized authority structure.
  • A mandate to replace local traditions rather than coexist with them.

That history makes the book’s framing ironic.

  • Paganism gets labeled “globalist,” despite being inherently local and decentralized.
  • Christianity gets framed as anti-globalist, while proposing further consolidation of belief, power, and authority.

What This Is Actually About

This isn’t about attacking Christians as people.
And it’s not about defending Paganism as a brand.

It is a critique of how certain forms of Christianity function when belief hardens into certainty and certainty turns into control.

Fear-based religion and fear-based ideology share the same problem.
They promise safety.
They demand conformity.
And they struggle with humility.

That doesn’t describe every Christian.
But it does describe systems that rely on fear, surveillance, and moral enforcement to survive.

What I appreciate about this conversation is the reminder that spirituality doesn’t have to look like domination, hierarchy, or a battle plan.

It can be rooted. Local. Embodied.

It can ask something of you without erasing you.

And whether someone lands in Paganism, Christianity, or somewhere else entirely, the question isn’t “Which side are you on?”

It’s whether your beliefs make you more honest, more grounded, and more responsible for how you live.

That’s what I hope people sit with after listening.

Ways to Support Universal Pagan Temple 

Every bit of support helps keep the temple lights on, create more free content, and maintain our community altar. Thank you from the bottom of my heart! 🖤

☕ Buy me a coffee (one-time support)
https://www.buymeacoffee.com/UniversalPaganTemple

💝 Make a direct donation to the temple
https://www.paypal.com/donate?hosted_button_id=6TMJ4KYHXB36U

🌟 Become a Patreon/Subscribestar member (monthly perks & exclusive content)
https://www.patreon.com/universalpagantemple
https://www.subscribestar.com/the-pagan-prepper

📜 Join our Substack community (articles, rituals & updates)
https://universalpagantemple.substack.com

🔮 Book a Rune or Tarot reading (Etsy)
https://www.etsy.com/shop/RunicGifts

📚 Grab our books on Amazon
  • Wicca & Magick: Complete Beginner’s Guide
    https://www.amazon.com/Wicca-Magick-Complete-Beginners-Guide-ebook/dp/B019MZN8LQ
  • Runes: Healing and Diet by Sigrún and Freya Aswynn
    https://www.amazon.com/dp/B08FP25KH4#averageCustomerReviewsAnchor
  • The Egyptian Gods and Goddesses for Beginners
    https://www.amazon.com/Egyptian-Gods-Goddesses-Beginners-Worshiping/dp/1537100092

Even just watching, liking, commenting, and sharing is a huge help!
Blessed be 🌀

The Older Story Beneath Christmas

A History of Yule and Cultural Amnesia

Every December, the same argument erupts like clockwork.

“Christmas is pagan.”
“No it isn’t, stop lying.”
“Actually, it’s Saturnalia.”
“Actually, it’s Jesus’ birthday.”

Christian Calling others out 😮

And honestly, the argument itself is the least interesting part.

Because Christmas didn’t replace older solstice traditions.
It grew out of them.

Long before doctrine, people were already gathering at midwinter. Lighting fires. Sharing food. Hanging evergreens. Leaving offerings. Watching the sun closely. Trying to survive the longest night of the year.

Most of what we now call “Christmas spirit” (the lights, the feasting, the greenery, the warmth, even the winter gift-giver) is older than Christian theology by centuries.

And yet, when I converted to Christianity in 2022, none of that felt magical.

It felt dangerous.


My First Christian Christmas: Panic, Purging, and Fear

I was only a few months into my short-lived Christian phase when December arrived, and I suddenly found myself terrified that Christmas was pagan, demonic, or spiritually contaminated.

I burned books.
I threw away crystals.
I cleaned my home like I was preparing for divine inspection.
I interrogated every decoration like it might open a portal.

I’m not exaggerating. I recently found an old document I wrote during that time, and reading it now is unsettling. It reads like I took an entire bucket of fundamentalist talking points, sprinkled in some Wikipedia conspiracies, and shook it like a snow globe.

Here are real lines I wrote in 2022:

“Christmas is a religious holiday. But it’s not Christian.”
“Christmas is the birthday of the sun god Tammuz.”
“Mistletoe came from druids who used it for demonic occult powers.”
“Santa Claus is based on Odin and meant to deceive children.”
“Jesus does not want you to celebrate Christmas.”

I believed every word of it.

Because fear-based Christianity works by shrinking your imagination.
It makes symbols dangerous.
History suspicious.
The world a spiritual minefield.

That was my first clue this wasn’t JUST about theology. It was about fear.
And the inability to hold layered meaning.


Why Winter Was Sacred Long Before Religion

For pre-industrial people, winter wasn’t cozy.

It wasn’t aesthetic. It wasn’t symbolic. It was dangerous.

Food stores ran low. Animals died. Illness spread. Darkness swallowed the day.

When the sun disappeared, it wasn’t metaphorical. It was existential.

That’s why midwinter mattered everywhere, not because cultures shared gods, but because they shared bodies, seasons, and risk.

Homes were built from thick logs, stone, and earth. Materials with thermal mass that held heat long after the fire dimmed. Hearths weren’t decorative. They were survival technology. Families and animals gathered together because warmth meant life.

This wasn’t primitive living. It was skilled living. And it shaped belief.

Seasonal rites weren’t abstract spirituality.
They were instructions for how to endure.


This Isn’t Just Capitalism — It’s Cultural Amnesia

It’s tempting to blame modern capitalism for the way winter has been flattened into noise, urgency, and forced cheer. And capitalism absolutely accelerated the problem.

But that explanation skips a much older rupture.

Pre-Christian seasonal traditions already honored limits. Rest. Darkness. Slowness. Winter was understood as a time of contraction, not productivity. You didn’t push harder in December. You pulled inward. You conserved. You waited.

Those rhythms were disrupted long before department stores and advertising campaigns.

First came religious overwrite… seasonal intelligence reframed into theological narratives that demanded certainty and transcendence over embodiment. Then came industrialization, which severed daily life from land, daylight, and season entirely. Artificial light erased night. Clocks replaced the sun. Productivity became moral.

By the time capitalism arrived in its modern form, much of the damage was already done. Capitalism didn’t invent our disconnection from seasonal limits. It inherited it.

What we’re really dealing with isn’t just exploitation.

It’s amnesia.

We forgot how winter works. We forgot how rest works. We forgot how darkness functions as part of a healthy cycle. And once that memory was gone, it became easy to sell us endless brightness in the darkest part of the year.


What Yule Actually Was, Before Christianity Rewrote It

This is where the history gets interesting….

The earliest surviving written reference to Yule comes from the 8th century, recorded by the Venerable Bede, an English monk.

That timing matters.

Like much of what we know about pre-Christian Europe, Yule was documented only after conversion had already begun. Earlier traditions were primarily oral, and many were actively suppressed or destroyed, which means the written record is fragmentary and filtered through Christian authors.

That does not mean the traditions were new.

It means Christianity arrived late to write them down.

Later sources, such as Snorri Sturluson in Heimskringla (12th–13th century), describe Yule as a midwinter feast involving communal drinking, oath-making, ceremonial meals, ancestor honoring, and celebrations lasting multiple days, often twelve. By the time Snorri was writing, Christianity had already reshaped much of Nordic life, yet the seasonal patterns he records remain strikingly consistent.

The record is not pristine. But it is consistent enough to tell us this:
Yule was a land-based, seasonal response to winter, practiced long before Christianity and remembered imperfectly afterward.

So, when people talk about the “Twelve Days of Christmas,” they’re unintentionally echoing Yule, not the Gospels.


Yule Was Never One Thing — or One Date

There was never a single Yule and never a single calendar.

Some communities marked the solstice itself. Others observed the days before it.
Others celebrated after, once the sun’s return was perceptible.

Yule could last days or weeks, depending on latitude, climate, and local conditions. This diversity wasn’t confusion. It was responsiveness.

Seasonal traditions bent to land, not doctrine.
And that flexibility is one reason they survived so long.


Ancestors, Offerings, and the Household

Yule wasn’t only about gods. It was about the dead.

Midwinter was understood as a liminal time when ancestors drew near. The boundary between worlds thinned. Homes became places of hospitality not just for the living, but for those who came before.

Offerings were left. Food. Drink. Light. We still do this…. even if we pretend it’s just for children.

Milk and cookies for Santa didn’t come out of nowhere.
They echo something far older: leaving nourishment overnight, acknowledging unseen visitors, participating in reciprocity.

The modern story makes it cute.
The older story makes it sacred.


Before Santa, the Sky Was Crowded

Across Northern and Eastern Europe, the winter solstice was associated with feminine figures of light, fertility, and renewal — many of whom traveled the sky.

In Baltic traditions, Saule carried the sun across the heavens. Among the Sámi, Beiwe rode through the winter sky in a sleigh pulled by reindeer, restoring fertility to the frozen land.

Darkness wasn’t evil. It was gestational.

The womb is dark. Seeds germinate underground.
Transformation happens unseen. That imagery didn’t disappear.

It migrated.


When Christmas Was Once Illegal

Here’s a part of the story that tends to surprise people.

Christmas was not always embraced by Christianity in America.
In fact, it was once illegal.

In the mid-1600s, Puritan leaders in New England viewed Christmas as pagan, Catholic, and morally corrupt. Everything associated with it raised suspicion.

Evergreens were considered pagan.
Feasting was considered pagan.
Dancing, games, and excess were condemned.
Even taking the day off work was seen as spiritually dangerous.

In 1659, the Massachusetts Bay Colony passed a law banning the celebration of Christmas outright. The statute read:

“Whosoever shall be found observing any such day as Christmas or the like, either by forbearing labour, feasting, or any other way… every such person so offending shall pay for every such offence five shillings.”

Celebrating Christmas was a finable offense.

The ban remained in effect until 1681. And even after it was repealed, many New England towns treated December 25th as an ordinary workday well into the 1700s.

Early American Christianity didn’t preserve Christmas.

It rejected it.

And yet, winter rituals have a way of surviving rejection.


How Christmas Quietly Returned

Christmas didn’t re-enter American life through theology or church decree.

It returned through households.

Throughout the 1700s and early 1800s, winter customs persisted in small, domestic ways. Evergreen branches were brought indoors. Candles were lit in windows. Food was shared. Stories of winter figures and gift-givers circulated quietly within families.

These practices weren’t organized or ideological. They were inherited.

Passed down the way people pass down recipes, songs, and seasonal habits, especially in communities tied to land, season, and home.

They survived because they worked.

They made winter bearable.
They gave rhythm to darkness.
They anchored people to memory and place.

Over time, these household customs accumulated. By the mid-1800s, Christmas re-emerged into public culture, not as a restored Christian holy day, but as a reassembled seasonal festival shaped by folklore, family practice, and winter necessity.

Only later was it fully absorbed, standardized, and commercialized.

That shift, from household memory to mass reproduction…. changed everything.


Santa Claus, Commercialism, and My Mom’s Coca-Cola Bathroom

Santa is one of the clearest examples of what happens when household tradition gives way to mass culture. Early versions of Santa look nothing like the modern mascot. Long robes. Staffs. Hoods. Sometimes thin. Sometimes eerie. Often dressed in green, brown, or deep red.

These figures echo older winter travelers. Odin riding the sky, spirits roaming during Yule, ancestors moving close. This transformation accelerated in the 1800s, when American illustrators and writers began merging European folklore with newly invented holiday imagery.

It was then that Santa began to take shape again.

My husband and I recently found a reproduction Santa figure based on an 1897 illustration. He’s dressed in a long green robe with a staff in hand. This style was common in the 1800s, especially in Germanic and Scandinavian traditions where the winter gift-giver was closer to a folkloric spirit than a cozy grandfather. Seeing him in that deep forest green, with that hooded, old-world posture, makes it obvious how far the modern Santa has drifted from his roots.


In the 1930s, Coca-Cola advertising standardized him. Red suit. White trim. Jolly. Brand-safe. Growing up, this wasn’t abstract for me.

My mom worked for Coca-Cola when the company was based in Richmond, Virginia, in the early 1980s. My first word was “Coke.” Coca-Cola wasn’t just a brand in our house, it was part of the atmosphere.

My mom loved Coca-Cola décor. We had Coca-Cola signs, collectibles, and even a full Coca-Cola bathroom. At the time, it just felt normal. Cozy, even. Americana. Tradition.

I didn’t realize until much later how completely my sense of “holiday spirit” had been shaped by corporate nostalgia rather than ancestral memory. What I thought of as timeless wasn’t old at all. It was manufactured, standardized, and sold back to us as heritage.

That doesn’t make it evil. But it does matter.

Because when branding replaces ritual, something gets flattened. The symbols remain, but the relationship is gone. What was once seasonal, local, and embodied becomes aesthetic. Consumable. Safe.

And for many of us, that’s the only version of winter we were ever given.

That’s not a judgment. It’s just reality. Most of us weren’t raised with ritual.
We were raised with branding.

What was lost in that transformation wasn’t belief. It was relationship — to land, to season, to memory.

And the people who held onto that relationship longest were already labeled for it.


Why “Heathen” Never Meant Godless

The word heathen never originally meant immoral or evil.

It meant rural.

Its earliest known form, haithno, is feminine and means “woman of the heath” — the open, uncultivated land beyond cities and roads. From there it spread through Germanic languages: Anglo-Saxon hǣþen, Old Norse heidinn, Old High German heidan.

Clergy used heathen to describe those who kept ancestral customs while cities converted. The 8th-century monk Paulus Diaconus wrote of heidenin commane (the rural people), calling them “the wild heathen.”

Offerings to trees, springs, and stones were condemned as sacrilege. Over time, heathen merged with Latin paganus, meaning “rural dweller,” and gentilis, meaning “of another tribe.”

What began as a description of people who would not leave the wild became a moral accusation.

Later, the same language was exported outward… applied to colonized lands as uncivilized or heathen.

The fear was never really about gods. It was about land that refused to be controlled.

What Actually Happened, and Why the Old Ways Are Calling Back

The same patterns repeat across centuries: suppression, survival, absorption, and forgetting.

But we need to be honest about what that suppression looked like.

This was not a gentle handoff.
It was not mutual exchange.
It was not respectful evolution.

Christianity did not simply reinterpret older traditions.
It destroyed them where it could.

This is not rhetoric. It is history.

Historian Catherine Nixey documents this process in The Darkening Age. Early Christianity treated pagan traditions not as ancestors, but as enemies. Temples were smashed. Statues were defaced. Sacred groves were cut down. Libraries were burned. Seasonal rites that had structured life for centuries were criminalized.

This destruction was not hidden or accidental. It was celebrated.

Christian writers praised the demolition of temples. They mocked the old gods as demons. Beauty, pleasure, ritual, and joy were reframed as moral danger. Festivals became obscene. Feasting became gluttony. The body itself became suspect.

What could not be eradicated outright was stripped, renamed, and absorbed, while its origins were denied.

The solstice became Christ’s birth.
The returning sun became metaphor.
Evergreens became safe symbols.
Ancestor offerings were reduced to children’s fantasy.

This was not borrowing. It was conquest, followed by selective inheritance.

When that conquest met resistance in rural places, in households, and in women’s hands, it adapted. It waited. It layered itself over what remained.

That is why the seams still show. That is why Christmas has always felt haunted.
Layered. Conflicted. Unstable.

What survived did so despite institutional Christianity, not because of it.

It survived in kitchens and hearths. In fields and forests.
In winter nights and quiet ritual.
In land-based people who refused to forget how the seasons worked.

Centuries later, capitalism finished what religion began. What remained was flattened into nostalgia, branding, and spectacle.

Not because the old ways were weak.
But because they were powerful.


Why the Call Feels Loud Again

The pull people feel now toward solstice, ancestors, darkness, rest, and land is not aesthetic.

It is memory.

It is the body remembering rhythms it was trained to forget.
It is the psyche rejecting constant light, constant productivity, constant cheer.
It is old intelligence resurfacing after centuries of suppression.

The old gods were never gone. They were buried. Winter has a way of thawing buried things.

If something in you responds to the fire, the darkness, the offering, or the pause, that does not mean you are rejecting modern life or indulging fantasy.

It means you are responding to a pattern older than doctrine.
Older than empire. Older than the fear that tried to erase it.

What was destroyed is stirring. What was taken is being remembered.

In a few days, I’ll be sitting down with Sigrún Gregerson of Universal Pagan Temple, Pagan priestess and educator, for a conversation on pagan culture, ritual, history, and lived practice. If this piece brought up questions for you about Yule, Mother’s Night, ancestor work, or what reclaiming these traditions actually looks like, I’d love to carry them into that conversation. Feel free to leave your questions in the comments or send them my way.

This is how the old ways return.
Quietly. Carefully. Through memory, practice, and conversation.

My Mother’s Night Altar 12.20.25

The Historical Jesus: Fact or Fiction?

Nailed: Ten Christian Myths That Show Jesus Never Existed at All

Today’s episode is one I’ve been looking forward to for a long time. I sat down with author and researcher David Fitzgerald, whose book Nailed: Ten Christian Myths That Show Jesus Never Existed at All has stirred up both fascination and controversy in both historical and secular circles.

Before anyone clutches their pearls — or their study Bible — this conversation isn’t about bashing belief. It’s about asking how we know what we think we know, and whether our historical standards shift when faith enters the equation.

Fitzgerald has spent over fifteen years investigating the evidence — or lack of it — surrounding the historical Jesus. In this first part of our series, we cover Myth #1 (“The idea that Jesus being a myth is ridiculous”) and Myth #4 (“The Gospels were written by eyewitnesses”). We also start brushing up against Myth #5, which explores how the Gospels don’t even describe the same Jesus.

We didn’t make it to Myth #7 yet — the claim that archaeology confirms the Gospels… so stay tuned for Part Two.

And for my visual learners!! I’ve got you. Scroll below for infographics, side-by-side Gospel comparisons, biblical quotes, and primary source references that make this episode come alive.

🧩 The 10 Myths About Jesus — According to Nailed

Myth #1: “The idea that Jesus was a myth is ridiculous!”
→ Fitzgerald argues that the assumption of Jesus’ historicity persists more from cultural tradition than actual historical evidence, and that questioning it isn’t fringe. It’s legitimate historical inquiry.

Myth #2: “Jesus was wildly famous — but somehow no one noticed.”
→ Despite claims that Jesus’ miracles and teachings drew massive crowds, there’s an eerie silence about him in the records of contemporaneous historians and chroniclers who documented far lesser figures.

Myth #3: “Ancient historian Josephus wrote about Jesus.”
→ The so-called “Testimonium Flavianum” passages in Josephus’ work are widely considered later Christian insertions, not authentic first-century testimony.

Myth #4: “Eyewitnesses wrote the Gospels.”
→ The Gospels were written decades after the events they describe by unknown authors relying on oral traditions and earlier written sources, not firsthand experience.

Myth #5: “The Gospels give a consistent picture of Jesus.”
→ Each Gospel portrays a strikingly different version of Jesus — from Mark’s suffering human to John’s divine Logos — revealing theological agendas more than biographical consistency.

Myth #6: “History confirms the Gospels.”
→ When examined critically, historical records outside the Bible don’t corroborate the key events of Jesus’ life, death, or resurrection narrative.

Myth #7: “Archaeology confirms the Gospels.”
→ Archaeological evidence supports the general backdrop of Roman-era Judea but fails to verify specific Gospel claims or the existence of Jesus himself.

Myth #8: “Paul and the Epistles corroborate the Gospels.”
→ Paul’s letters — the earliest Christian writings — reveal no awareness of a recent historical Jesus, focusing instead on a celestial Christ figure revealed through visions and scripture.

Myth #9: “Christianity began with Jesus and his apostles.”
→ Fitzgerald argues that Christianity evolved from earlier Jewish sects and mystery religions, with “Jesus” emerging as a mythologized figure around whom older beliefs coalesced.

Myth #10: “Christianity was totally new and different.”
→ The moral teachings, rituals, and savior motifs of early Christianity closely mirror surrounding pagan traditions and Greco-Roman mystery cults.


📘 Myth #1: “The Idea That Jesus Being a Myth Is Ridiculous”

This one sets the tone for the entire book — because it’s not even about evidence at first. It’s about social pressure.

Fitzgerald opens Nailed by calling out how the mythicist position (the idea that Jesus might never have existed) gets dismissed out of hand…even by secular historians. As he points out, the problem isn’t that the evidence disproves mythicism. The problem is that we don’t apply the same historical standards we would to anyone else.

Case in point: Julius Caesar crossing the Rubicon.

Julius Caesar crossing the Rubicon at the head of his army, 49 BC. Illustration from Istoria Romana incisa all’acqua forte da Bartolomeo Pinelli Romano (Presso Giovanni Scudellari, Rome, 1818-1819).

When historians reconstruct that event, we have:

  • Accounts from Caesar’s contemporaries, including Cicero’s letters, as well as later Roman historians like Suetonius, Plutarch, Appian, and Cassius Dio.
  • Physical evidence — coins, inscriptions, and monuments produced during or shortly after Caesar’s lifetime.
  • Political and military documentation aligning with the timeline.

In contrast, for Jesus, we have:

  • No contemporary accounts.
  • No archaeological or physical evidence.
  • Gospels written decades later by anonymous authors who never met him.

That’s the difference between history and theology.

Even historian Bart Ehrman, who does believe Jesus existed, has called mythicists “the flat-earthers of the academic world.” Fitzgerald addresses that in the interview (not defensively, but critically) asking why questioning this one historical figure provokes so much emotional resistance.

As he puts it, if the same level of evidence existed for anyone else, no one would take it seriously.


✍️ Myth #4: “The Gospels Were Written by Eyewitnesses”

We dive into the authorship problem — who actually wrote the Gospels, when, and why it matters.


🔀 Myth #5: “The Gospels Give a Consistent Picture of Jesus”

⚖️ Contradictions Between the Gospels

1. Birthplace of Jesus — Bethlehem or Nazareth?

Matthew 2:1 – “Jesus was born in Bethlehem of Judea in the days of Herod the king.”
Luke 2:4–7 – Joseph travels from Nazareth to Bethlehem for the census, and Jesus is born there.
John 7:41–42, 52 – Locals say, “The Messiah does not come from Galilee, does he?” implying Jesus was known as a Galilean, not from Bethlehem.

🔍 Mythicist take:
Bethlehem was retrofitted into the story to fulfill the Messianic prophecy from Micah 5:2. In early Christian storytelling, theological necessity (“he must be born in David’s city”) trumps biographical accuracy.

2. Jesus’ Genealogy — Two Lineages, Zero Agreement

Matthew 1:1–16 – Jesus descends from David through Solomon.
Luke 3:23–38 – Jesus descends from David through Nathan.
Even Joseph’s father differs: Jacob (Matthew) vs. Heli (Luke).

🔍 Mythicist take:
Two contradictory genealogies suggest not historical memory but theological marketing. Each author tailors Jesus’ lineage to fit symbolic patterns — Matthew emphasizes kingship; Luke, universality.

3. The Timing of the Crucifixion — Passover Meal or Preparation Day?

Mark 14:12–17 – Jesus eats the Passover meal with his disciples before his arrest.
John 19:14 – Jesus is crucified on the day of Preparation — before Passover begins — at the same time lambs are being slaughtered in the Temple.

🔍 Mythicist take:
This isn’t a detail slip; it’s theology. John deliberately aligns Jesus with the Paschal lamb, turning him into the cosmic sacrifice — a theological metaphor, not an eyewitness timeline.

4. Jesus’ Last Words — Four Versions, Four Theologies

Mark 15:34 – “My God, my God, why have you forsaken me?” → human anguish.
Luke 23:46 – “Father, into your hands I commit my spirit.” → serene trust.
John 19:30 – “It is finished.” → divine completion.
Matthew 27:46 – Echoes Mark’s despair, but adds cosmic drama (earthquake, torn veil).

🔍 Mythicist take:
Each Gospel shapes Jesus’ death to reflect its theology — Mark’s suffering human, Luke’s faithful martyr, John’s omniscient divine being. This isn’t eyewitness diversity; it’s evolving mythmaking.

5. Who Found the Empty Tomb — and What Did They See?

Mark 16:1–8 – Three women find the tomb open, see a young man in white, flee in fear, tell no one.
Matthew 28:1–10 – Two women see an angel descend, roll back the stone, and tell them to share the news.
Luke 24:1–10 – Several women find the stone already rolled away; two men in dazzling clothes appear.
John 20:1–18 – Mary Magdalene alone finds the tomb, then runs to get Peter; later she meets Jesus himself.

🔍 Mythicist take:
If this were a consistent historical event, we’d expect some harmony. Instead, we see mythic escalation: from a mysterious empty tomb (Mark) → to heavenly intervention (Matthew) → to divine encounter (John).


6. The Post-Resurrection Appearances — Where and to Whom?

Matthew 28:16–20 – Jesus appears in Galilee to the eleven.
Luke 24:33–51 – Jesus appears in Jerusalem and tells them to stay there.
Acts 1:4–9 – Same author as Luke, now extends appearances over forty days.
Mark 16 (longer ending) – A later addition summarizing appearances found in the other Gospels.

🔍 Mythicist take:
The resurrection narrative grows with time — geographically, dramatically, and theologically. Early silence (Mark) gives way to detailed appearances (Luke/John), mirroring the development of early Christian belief rather than eyewitness memory.


🌿 Final Thought

Whether you end up agreeing with Fitzgerald or not, the point isn’t certainty… it’s curiosity. The willingness to look at history without fear, even when it challenges what we’ve always been told.

And here’s the fun part! David actually wants to hear from you. If you’ve got questions, pushback, or something you want him to unpack next time, drop it in the comments or send it my way. I’ll collect your submissions and bring a few of them into Part Two when we dig into Myth #7 — “Archaeology Confirms the Gospels.”

And as always, maintain your curiosity, embrace skepticism, and keep tuning in. 🎙️

📖 Further Reading 📖 

Foundational Mythicist Works:

  • Richard Carrier – On the Historicity of Jesus
  • Robert M. Price – The Christ-Myth Theory and Its Problems; Judaizing Jesus
  • Earl Doherty – The Jesus Puzzle
  • Randel Helms – Gospel Fictions
  • Joseph Wheless – The Fable of Christ
  • Tom Harpur – The Pagan Christ
  • William Benjamin Smith – The Historical Jesus
  • Thomas L. Thompson – The Mythic Past: Biblical Archaeology and the Myth of Israel

Did Jesus Exist? Jacob Berman and Dr. Jack Bull Versus Dr. Aaron Adair and Neil Godfrey

Mainstream Scholarship & Context

  • Bart Ehrman – Did Jesus Exist?
  • Jonathan Haidt – The Righteous Mind: Why Good People Are Divided by Politics and Religion

Critiques of Bart Ehrman

Broader Philosophical & Cultural Context

  • John G. Jackson – Christianity Before Christ
  • Kersey Graves – The World’s Sixteen Crucified Saviors
  • Acharya S (D.M. Murdock) – The Christ Conspiracy


Sacred or Strategic? Rethinking the Christian Origin Story

The Bible Isn’t History and Trump Isn’t Your Savior

It’s Been a Minute… Let’s Get Real

Hey hey, welcome back to Taste of Truth Tuesdays! It’s been over a month since my last episode, and wow—a lot has happened. Honestly, I’ve been doing some serious soul-searching and education, especially around some political events that shook me up.

I was firmly against Trump’s strikes on Iran. And the more I dug in, the more I realized how blind I’d been: completely uneducated and ignorant about the massive political power Zionism holds in this country. And it’s clear now: Trump is practically bent over the Oval Office for Netanyahu. The Epstein files cover-up only confirms that blackmail and shadow control are the real puppet strings being pulled at the highest levels of power. Our nation has been quietly occupied since Lyndon B. Johnson’s presidency, and that’s a whole other episode I’ll get into later.

But what really cracked something in me was this:

In the 1990s, Trump sponsored Elite’s “Look of the Year” contest—a glitzy, global modeling search that lured teenage girls with promises of fame and fashion contracts. Behind the scenes, it was a trafficking operation. According to The Guardian’s Lucy Osborne and the BBC documentary Scouting For Girls: Fashion’s Darkest Secret, these girls weren’t being scouted—they were being sold to rich businessmen.

This wasn’t just proximity. Trump was part of it.

Once I saw that, the religious right’s worship of him stopped looking like misguided patriotism and started looking like mass delusion. Or complicity. Either way, I couldn’t unsee it.

And that’s when I started asking the bigger questions: What else have we mistaken for holy? What else have we accepted as truth without scrutiny?

For now, I want to cut to the heart of the matter, the major problem at the root of so much chaos: the fact that millions of Christians still believe the Bible is a literal historical document.

This belief doesn’t just distort faith; it fuels political agendas, end-times obsession, and yes, even foreign policy disasters. So, let’s dig into where this all began, how it’s evolved, and why it’s time we rethink everything we thought we knew about Scripture.


For most Christians, the Bible is more than a book; it’s the blueprint of reality, the inspired Word of God, infallible and untouchable. But what if that belief wasn’t original to Christianity? What if it was a reaction… a strategic response to modern doubt, historical criticism, and the crumbling authority of the Church?

In this episode, we’re pulling back the veil on the doctrine of biblical inerrancy, the rise of dispensationalism, and the strange marriage of American politics and prophetic obsession. From the Scofield Bible to the belief that modern-day Israel is a fulfillment of God’s plan, we’re asking hard questions about the origins of these ideas.

As Dr. Mark Gregory Karris said when he joined us on a previous episode: “Can you imagine two different families? In one, the Bible is the absolute inerrant word of God. Every word, every jot and tittle, so to speak, is meant to be in there due to the inspiration of God. And so every story you read, you know, God killing Egyptian babies and God flooding the entire planet and thinking, well yeah, there’s gonna be babies gasping for air and drowning grandmothers and all these animals. And that is seen as absolute objective truth. But then in another family: oh, these are myths. These are sacred myths that people can learn from. No, that wasn’t literally God speaking and smiting them and burning them alive because they touched this particular ark. This is how they thought, given their minds at the time, given their understandings. And then, like you talked about, you can say, ‘look at that aspect of humanity, interesting that they portrayed God that way.’ It becomes ‘wow, that’s cool,’ instead of ‘oh my gosh, I need three or four years of therapy because I was taught the Bible in a particular way.’”

Once you trace these doctrines back to their roots, it’s not divine revelation you find: it’s human agendas.

Let’s get uncomfortable. Was your faith formed by sacred truth… or centuries of strategic storytelling?

How Literalism Took Over

In the 19th century, biblical literalism became a kind of ideological panic room. As science, archaeology, and critical scholarship began to chip away at traditional interpretations, conservative Christians doubled down. Instead of exploring the Bible as a complex, layered anthology full of metaphor, moral instruction, and mythology, they started treating it like a divine press release. Every word had to be accurate. Every timeline had to match. Every contradiction had to be “harmonized” away.

The Myth of Inerrancy

One of the most destructive byproducts of this era was the invention of biblical inerrancy. Yes, invention. The idea that the Bible is “without error in all that it affirms” isn’t ancient; it’s theological propaganda, most notably pushed by B.B. Warfield and his peers at Princeton. Rogers and McKim wrote extensively about how this doctrine was manufactured rather than handed down from the apostles, as many assume. We dive deeper into all that—here.

Inerrancy teaches that the Bible is flawless, even in its historical, scientific, and moral claims. But this belief falls apart under even basic scrutiny. Manuscripts don’t agree. Archaeological timelines conflict with biblical ones. The Gospels contradict each other. And yet this doctrine persists, warping believers’ understanding and demanding blind loyalty to texts written by fallible people in vastly different cultures.

That’s the danger of biblical inerrancy: it treats every verse as historical journalism rather than layered myth, metaphor, or moral instruction. But what happens when you apply that literalist lens to ancient origin stories?

📖 “Read as mythology, the various stories of the great deluge have considerable cultural value, but taken as history, they are asinine and absurd.” — John G. Jackson, Christianity Before Christ

And yet, this is the foundation of belief for millions who think Noah’s Ark was a literal boat and not a borrowed flood myth passed down and reshaped across Mesopotamian cultures. This flattening of myth into fact doesn’t just ruin the poetry; it fuels bad politics, end-times obsession, and yes… Zionism.

And just to be clear, early Christians didn’t read the Bible this way. That kind of rigid literalism didn’t emerge until centuries later…long after the apostles were gone. We’ll get to that.

When we cling to inerrancy, we’re not preserving truth. We’re missing it entirely.

Enter: Premillennial Dispensationalism

If biblical inerrancy was the fuel, C.I. Scofield’s 1909 annotated Bible was the match. His work made premillennial dispensationalism a household belief in evangelical churches. For those unfamiliar with the term, here’s a quick breakdown:

  • Premillennialism: Jesus will return before a literal thousand-year reign of peace.
  • Dispensationalism: History is divided into distinct eras (or “dispensations”) in which God interacts with humanity differently.

When merged, this theology suggests we’re living in the “Church Age,” which will end with the rapture. Then comes a seven-year tribulation, the rise of the Antichrist, and finally, Jesus returns for the ultimate battle after which He’ll rule Earth for a millennium. Sounds like the plot of a dystopian film, right? And yet, this became the dominant lens through which American evangelicals interpret reality.

The result? A strange alliance between American evangelicals and Zionist nationalism. You get politicians quoting Revelation like it’s foreign policy, pastors fundraising for military aid, and millions of Christians cheering on war in the Middle East because they think it’ll speed up Jesus’ return.

But here’s what I want you to take away from this episode today: none of this works unless you believe the Bible is literal, infallible, and historically airtight.

How This Shaped Evangelical Culture and Politics

The Scofield Bible didn’t just change theology. It changed culture. Dispensationalist doctrine seeped into seminaries like Dallas Theological Seminary and Moody Bible Institute, influencing generations of pastors. It also exploded into popular culture through Hal Lindsey’s The Late Great Planet Earth and the Left Behind series. Fiction, prophecy, and fear blurred into one big spiritual panic attack.

But perhaps the most alarming shift came in the political realm. Dispensationalist belief heavily influences evangelical support for the modern state of Israel. Why? Because many believe Israel’s 1948 founding was a prophetic event. Figures like Jerry Falwell turned theology into foreign policy. His organization, the Moral Majority, was built on an unwavering belief that supporting Israel was part of God’s plan. Falwell didn’t just preach this; he traveled to Israel, funded by its government, and made pro-Israel advocacy a cornerstone of evangelical identity.

This alignment between theology and geopolitics hasn’t faded. In the 2024 election cycle, evangelical leaders ranked support for Israel on par with anti-abortion stances. Ralph Reed, founder of the Faith and Freedom Coalition, explicitly said as much. Donald Trump even quipped that “Christians love Israel more than Jews.” Whether that’s true or not, it reveals just how deep this belief system runs.

And the propaganda doesn’t stop there…currently Israel’s Foreign Ministry is funding a week-long visit for 16 prominent young influencers aligned with Donald Trump’s MAGA and America First movements, part of an ambitious campaign to reshape Israel’s image among American youth.

But Let’s Talk About the Red Flags

This isn’t just about belief; it’s about control. Dispensationalist theology offers a simple, cosmic narrative: you’re on God’s winning team, the world is evil, and the end is near. There’s no room for nuance, no time for doubt. Just stay loyal, and you’ll be saved.

This thinking pattern isn’t exclusive to Christianity. You’ll find it in MLMs and some conspiracy theory communities. The recipe is the same: create an in-group with secret knowledge, dangle promises of salvation or success, and paint outsiders as corrupt or deceived. It’s classic manipulation: emotional coercion wrapped in spiritual language.

And let’s not forget the date-setting obsession. Hal Lindsey made a career out of it. People still point to blood moons, earthquakes, and global politics as “proof” that prophecy is unfolding. If you’ve ever been trapped in that mindset, you know how addictive and anxiety-inducing it can be.

BY THE WAY, it’s not just dispensationalism or the Scofield Bible that fuels modern Zionism. The deeper issue is this: if you believe the Bible is historically accurate and divinely orchestrated, you’re still feeding the ideological engine of Zionism. Because at its core, Christianity reveres Jewish texts, upholds Jewish chosenness, and worships a Jewish messiah. That’s not neutrality; it’s alignment.

If this idea intrigued you, you’re not alone. There’s a growing body of work unpacking how Christianity’s very framework serves Jewish supremacy, whether intentionally or not. For deeper dives, check out Adam Green’s work over at Know More News on Rumble, and consider reading The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years. You don’t have to agree with everything to realize: the story you were handed might not be sacred; it might be strategic.

Why This Matters for Deconstruction

For me, one of the most painful parts of deconstruction was realizing I’d been sold a false bill of goods. I was told the Bible was the infallible word of God. That it held all the answers. That doubt was dangerous. But when I began asking real questions, the entire system started to crack.

The doctrine of inerrancy didn’t deepen my faith… it limited it. It kept me from exploring the Bible’s human elements: its contradictions, its cultural baggage, and its genuine beauty. The truth is that these texts were written by people trying to make sense of their world and their experiences with the divine. They are not divine themselves.

Modern Scholarship Breaks the Spell

Modern biblical scholarship has long since moved away from the idea of inerrancy. When you put aside faith-based apologetics and look honestly at the evidence, the traditional claims unravel quickly:

  • Moses didn’t write the Torah. Instead, the Pentateuch was compiled over centuries by multiple authors, each with their own theological agendas (see the JEDP theory).
  • King David is likely a mythic figure. Outside of the Bible, there’s no solid evidence he actually existed, much less ruled a vast kingdom.
  • The Gospels weren’t written by Matthew, Mark, Luke, and John. Those names were added later. The original texts are anonymous and they often contradict each other.
  • John didn’t write Revelation. Not the Apostle John, anyway. The Greek and the writing style are completely different from the Gospel of John. The real author was probably an unknown apocalyptic mystic on Patmos, writing during Roman persecution.

And yet millions still cling to these stories as literal fact, building entire belief systems and foreign policies on myths and fairy tales.


🧠 Intellectual Starvation in Evangelicalism

Here’s the deeper scandal: it’s not just that foundational Christian stories crumble under modern scrutiny. It’s that the church never really wanted you to think critically in the first place.

Mark Noll, a respected evangelical historian, didn’t mince words when he wrote:

“The scandal of the evangelical mind is that there is not much of an evangelical mind.”

In The Scandal of the Evangelical Mind, Noll traces how American evangelicalism lost its intellectual life. It wasn’t shaped by a pursuit of truth, but by populist revivalism, emotionalism, and a hyper-literal obsession with “the end times.” The same movements that embraced dispensationalism and biblical inerrancy also gutted their communities of academic rigor, curiosity, and serious theological reflection.

The result? A spiritually frantic but intellectually hollow faith—one that discourages questions, mistrusts scholarship, and fears nuance like it’s heresy.

Noll shows that instead of grappling with ambiguity or cultural complexity, evangelicals often default to reactionary postures. This isn’t just a relic of the past. It’s why so many modern Christians cling to false authorship claims, deny historical context, and accept prophecy as geopolitical fact. It’s why Revelation gets quoted to justify Zionist foreign policy without ever asking who actually wrote the book or when, or why.

This anti-intellectualism isn’t an accident. It was baked in from the start.

But Noll doesn’t leave us hopeless. He offers a call forward: for a faith that engages the world with both heart and mind. A faith that can live with tension, welcome complexity, and evolve beyond fear-driven literalism.

What Did the Early Church Actually Think About Scripture?

Here’s what gets lost in modern evangelical retellings: the earliest Christians didn’t treat Scripture the way today’s inerrantists do.

For the first few centuries, Christians didn’t even have a finalized Bible. There were letters passed around, oral traditions, a few widely recognized Gospels, and a whole lot of discussion about what counted as authoritative. It wasn’t until the fourth century that anything close to our current canon was even solidified. And even then, it wasn’t set in stone across all branches of Christianity.

Church fathers like Origen, Clement of Alexandria, and Irenaeus viewed Scripture as spiritually inspired but full of metaphor and mystery. They weren’t demanding literal accuracy; they were mining the texts for deeper meanings. Allegory was considered a legitimate, even necessary, interpretive method. Scripture was read devotionally and theologically, not scientifically or historically. In other words, it wasn’t inerrancy that defined early Christian engagement with Scripture, it was curiosity and contemplation.

For a deeper dive, check out The Gnostic Informant’s incredible documentary that uncovers the first hundred years of Christianity, a period that has been systematically lied about and rewritten. It reveals how much of what we take for granted was shaped by political and theological agendas far removed from the original followers of Jesus.

If you’re serious about understanding the roots of your faith or just curious about how history gets reshaped, this documentary is essential viewing. It’s a reminder that truth often hides in plain sight and that digging beneath the surface is how we reclaim our own understanding.

Protestantism: A Heretical Offshoot Disguised as Tradition

The Protestant Reformation shook things up in undeniable ways. Reformers like Martin Luther and John Calvin challenged the Catholic Church’s abuses and rightly demanded reform. But what’s often missed (or swept under the rug) is how deeply Protestantism broke with the ancient, historic Church.

By insisting on sola scriptura—Scripture alone—as the sole authority, the Reformers rejected centuries of Church tradition, councils, and lived community discernment that shaped orthodox belief. They didn’t invent biblical inerrancy as we know it today, but their elevation of the Bible above all else cracked the door wide open for literalism and fundamentalism to storm in.

What began as a corrective movement turned into a theological minefield. Today, Protestantism isn’t a single coherent tradition; it’s a sprawling forest of over 45,000 different denominations, all claiming exclusive access to “the truth.”

This fragmentation isn’t accidental… it’s the logical outcome of rejecting historic continuity and embracing personal interpretation as the final authority.

Far from preserving the faith of the ancient Church, Protestantism represents a fractured offshoot: one that often contradicts the early Church’s beliefs and teachings. It trades the richness of lived tradition and community wisdom for a rigid, literalistic, and competitive approach to Scripture.

The 20th century saw this rigid framework perfected into a polished doctrine demanding total conformity and punishing doubt. Protestant fundamentalism turned into an ideological fortress, where questioning is treated as betrayal, and theological nuance is replaced by black-and-white dogma.

If you want to understand where so much of modern evangelical rigidity and end-times obsession comes from, look no further than this fractured legacy. Protestantism’s break with the ancient Church set the stage for the spiritual and intellectual starvation that Mark Noll so powerfully exposes.

Rethinking the Bible

Seeing the Bible as a collection of human writings about God rather than the literal word from God opens up space for critical thinking and compassion. It allows us to:

  • Study historical context and cultural influences.
  • Embrace the diversity of perspectives in Scripture.
  • Let go of rigid interpretations and seek core messages like love, justice, and humility.
  • Move away from proof-texting and toward spiritual growth.
  • Reconcile faith with science, reason, and modern ethics.

When we stop demanding that the Bible be perfect, we can finally appreciate what it actually is: a complex, messy, beautiful attempt by humans to understand the sacred.

This shift doesn’t weaken faith… I believe it strengthens it.

It moves us away from dogma disguised as certainty and into something deeper… something alive. It opens the door for real relationship, not just with the divine, but with each other. It makes space for growth, for disagreement, for honesty.

And in a world tearing itself apart over whose version of truth gets to rule, that kind of open-hearted spirituality isn’t just refreshing; it’s essential.

Because if your faith can’t stand up to questions, history, or accountability… maybe it was never built on truth to begin with.

Let’s stop worshiping the paper and start seeking the presence.

🔎 Resources Worth Exploring:

  • “The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years” by David Skrbina
  • “Christianity Before Christ” by John G. Jackson
  • “The Scandal of the Evangelical Mind” by Mark Noll – A scathing but sincere critique from within the evangelical tradition itself. Noll exposes how anti-intellectualism, biblical literalism, and cultural isolationism have gutted American Christianity’s ability to engage the world honestly.
  • Check out Adam Green’s work at Know More News on Rumble for more on the political and mythological implications of Christian Zionism
  • And don’t miss my interview with Dr. Mark Gregory Karris, author of The Diabolical Trinity: Wrathful God, Sinful Self, and Eternal Hell, where we dive deep into the psychological damage caused by toxic theology

When “Helping the Homeless” Becomes a Trojan Horse

Why Trump’s new executive order deserves close scrutiny

President Trump signed an executive order on July 24, 2025, calling on states and cities to clear homeless encampments and expand involuntary psychiatric treatment, framed as a move to improve public safety and deliver compassionate care.

At first glance, it seems reasonable: address the homelessness crisis in many progressive cities, restore order, and help those with severe mental illness. But when I read it closely, the language (phrases like “untreated mental illness,” “public nuisance,” and “at risk of harm”) is vague enough and subjective enough to feel ripe for misuse 😳

This goes beyond homelessness. It marks a shift toward normalizing forced institutionalization, a trend with deep roots in American psychiatric history.

We explored this dark legacy in a recent episode, Beneath the White Coats 🥼, and if you listened to that episode, you’ll know that compulsory commitment isn’t new.

Historically, psychiatric institutions in the U.S. served not just medical needs but social control. Early 20th-century asylums housed the poor, the racially marginalized, and anyone deemed “unfit.”

The International Congress of Eugenics’ Logo 1921

The eugenics movement wasn’t a fringe ideology; it was supported by mainstream medical groups, state law, and psychiatry. Forced sterilization, indefinite confinement, and ambiguous diagnoses like “moral defectiveness” were justified under the guise of public health.

Now, an executive order gives local governments incentives (and of course funding 💰 is always tied to compliance) to loosen involuntary commitment laws and to redirect funding toward enforcing anti-camping and drug-use ordinances instead of harm reduction programs.

Once states rewrite their laws to align with the order’s push toward involuntary treatment, and if “public nuisance” or “mental instability” are interpreted broadly, you won’t have to be homeless to be at risk. A public disturbance, a call from a neighbor, even a refusal to comply with treatment may be enough to trigger involuntary confinement.

Is it just me, or does this feel like history is repeating?

We’ve seen where badly defined psychiatric authority leads: disproportionate targeting, loss of civil rights, and institutionalization justified as compassion. Today’s executive order could enable a similar expansion of psychiatric control.

So… what do you think? Is this just a homelessness policy, or is it another slippery slope?

Beneath the White Coats: Psychiatry, Eugenics, and the Forgotten Graves

Dogma in a Lab Coat

We like to believe science is self-correcting—that data drives discovery, that good ideas rise, and bad ones fall. But when it comes to mental health, modern society is still tethered to a deeply flawed framework—one that pathologizes human experience, medicalizes distress, and often does more harm than good.

Psychiatry has long promised progress, yet history tells a different story. From outdated treatments like bloodletting to today’s overprescription of SSRIs, we’ve traded one form of blind faith for another. These drugs—still experimental in many respects—carry serious risks, yet are handed out at staggering rates. And rather than healing root causes, they often reinforce a narrative of victimhood and chronic dysfunction.

The pharmaceutical industry now drives diagnosis rates, shaping public perception and clinical practice in ways that few understand. What’s marketed as care is often a system of control. In this episode, we revisit the dangers of consensus-driven science—how it silences dissent and rewards conformity.

Because science, like religion or politics, can become dogma. Paradigms harden. Institutions protect their power. And the costs are human lives.

But beneath this entire structure lies a deeper, more uncomfortable question—one we rarely ask:

What does it mean to be a person?

Are we just bodies and brains—repairable, programmable, replaceable? Or is there something more?

Is consciousness a glitch of chemistry, or is it a window into the soul?

Modern psychiatry doesn’t just treat symptoms—it defines the boundaries of personhood. It tells us who counts, who’s disordered, who can be trusted with autonomy—and who can’t.

But what if those definitions are wrong?

We’ve talked before about the risks of unquestioned paradigms—how ideas become dogma, and dogma becomes control. In a past episode, How Dogma Limits Progress in Fitness, Nutrition, and Spirituality, we explored Rupert Sheldrake’s challenge to the dominant scientific worldview—his argument that science itself had become a belief system, closing itself off to dissent. TED removed that talk, calling it “pseudoscience.” But many saw it as an attempt to protect the status quo—the high priests of data and empiricism silencing heresy in the name of progress. We will revisit his work later on in our conversation. 

We’ve also discussed how science, more than politics or religion, is often weaponized to control behavior, shape belief, and reinforce social hierarchies. And in a recent Taste Test Thursday episode, we dug into how the industrial food system was shaped not just by profit but by ideology—driven by a merger of science and faith.


This framework—that science is never truly neutral—becomes especially chilling when you look at the history of psychiatry.

To begin this conversation, we’re going back—not to Freud or Prozac, but further. To the roots of American psychiatry. To two early figures—John Galt and Benjamin Rush—whose ideas helped define the trajectory of an entire field. What we find there presents a choice: a path toward genuine hope, or a legacy of continued harm.

This story takes us into the forgotten corners of that history, a place where “normal” and “abnormal” were declared not by discovery, but by decree.

Clinical psychiatrist Paul Minot put it plainly:

“Psychiatry is so ashamed of its history that it has deleted much of it.”

And for good reason.

Psychiatry’s early roots weren’t just tangled with bad science—they were soaked in ideology. What passed for “treatment” was often social control, justified through a veneer of medical language. Institutions were built not to heal, but to hide. Lives were labeled defective. 

We would like to think that medicine is objective, that the white coat stands for healing. But behind those coats was a mission to save society from the so-called “abnormal.”
But who defined normal?
And who paid the price?


The Forgotten Legacy of Dr. John Galt

Lithograph, “Virginia Lunatic Asylum at Williamsburg, Va.” by Thomas Charles Millington, ca.1845. Block & Building Files – Public Hospital, Block 04, Box 07. Image citation: D2018-COPY-1104-001. Special Collections.

Long before DSM codes and Big Pharma, the first freestanding mental hospital in America, the Eastern Lunatic Asylum, opened its doors in 1773—just down the road from where I live, in Williamsburg, Virginia. Though officially declared a hospital, it was commonly known as “The Madhouse.” For most who entered, institutionalization meant isolation, dehumanization, and often treatment worse than what was afforded to livestock. Mental illness was framed as a threat to the social order—those deemed “abnormal” were removed from society and punished in the name of care.

But one man dared to imagine something different.

Dr. John Galt II, appointed as the first medical superintendent of the hospital (later known as Eastern State), came from a family of alienists—an old-fashioned term for early psychiatrists. The word comes from the Latin alienus, meaning “other” or “stranger,” and referred to those considered mentally “alienated” from themselves or society. Today, of course, the word alien has taken on very different connotations—especially in the heated political debates over immigration. It’s worth clarifying: the historical use of alienist had nothing to do with immigration or nationality. It was a clinical label tied to 19th-century psychiatry, not race or citizenship. But like many terms, it’s often misunderstood or manipulated in modern discourse.

Galt, notably, broke with the harsh legacy of many alienists of his time. Inspired by French psychiatrist Philippe Pinel—often credited as the first true psychiatrist—Galt embraced a radically compassionate model known as moral therapy. Where others saw madness as a threat to be controlled, Galt saw suffering that could be soothed. He believed the mentally ill deserved dignity, freedom, and individualized care—not chains or punishment. He refused to segregate patients by race. He treated enslaved people alongside the free. And he opposed the rising belief—already popular among his fellow psychiatrists—that madness was simply inherited, and the mad were unworthy of full personhood.

Credit: The Valentine
Original Author: Cook Collection
Created: Late nineteenth to early twentieth century

Rather than seeing madness as a biological defect to be subdued or “cured,” Galt and Pinel viewed it as a crisis of the soul. Their methods rejected medical manipulation and instead focused on restoring dignity. They believed that those struggling with mental affliction should be treated not as deviants but as ordinary people, worthy of love, freedom, and respect.

Dr. Marshall Ledger, founder and editor of Penn Medicine, once quoted historian Nancy Tomes to summarize this period:

“Medical science in this period contributed to the understanding of mental illness, but patient care improved less because of any medical advance than because of one simple factor: Christian charity and common sense.”

Galt’s asylum was one of the only institutions in the United States to treat enslaved people and free Black patients equally—and even to employ them as caregivers. He insisted that every person, regardless of race, had a soul of equal moral worth. His belief in equality and metaphysical healing put him at odds with nearly every other psychiatrist of his time.

And he paid the price.

The psychiatric establishment, closely allied with state power and emerging medical-industrial interests, rejected his human-centered model. Most psychiatrists of the era endorsed slavery and upheld racist pseudoscience. The prevailing consensus was rooted in hereditary determinism—that madness and criminality were genetically transmitted, particularly among the “unfit.”

This growing belief—that mental illness was a biological flaw to be medically managed—was not just a scientific view, but an ideological one. Had Galt’s model of moral therapy been embraced more broadly, it would have undermined the growing assumption that biology and state-run institutions offered the only path to sanity. It would have challenged the idea that human suffering could—and should—be controlled by external authorities.

Instead, psychiatry aligned with power.

Moral therapy was quietly abandoned. And the field moved steadily toward the medicalized, racialized, and state-controlled version of mental health that would pave the way for both eugenics and the modern pharmaceutical regime.

“The Father of American Psychiatry”

Long before Auschwitz. Long before the Eugenics Record Office. Long before sterilization laws and IQ tests, there was Dr. Benjamin Rush—signer of the Declaration of Independence, professor at the first American medical school, and the man still honored as the “father of American psychiatry.” His portrait hangs today in the headquarters of the American Psychiatric Association.

Though many historians point to Francis Galton as the father of eugenics, it was Rush—nearly a century earlier—who laid much of the ideological groundwork. He argued that mental illness was biologically determined and hereditary. And he didn’t stop there.

Rush infamously diagnosed Blackness itself as a form of disease—what he called “negritude.” He theorized that Black people suffered from a kind of leprosy, and that their skin color and behavior could, in theory, be “cured.” He also tied criminality, alcoholism, and madness to inherited degeneracy, particularly among poor and non-white populations.

These ideas found a troubling ally in Charles Darwin’s emerging theories of evolution and heredity. While Darwin’s work revolutionized biology, it was often misused to justify racist notions of racial hierarchy and biological determinism.

Rush’s medical theories were mainstream and deeply influential, shaping generations of physicians and psychiatrists. Together, these ideas reinforced the belief that social deviance and mental illness were rooted in faulty bloodlines—pseudoscientific reasoning that provided a veneer of legitimacy to racism and social control within medicine and psychiatry.

The tragic irony? While Rush advocated for the humane treatment of the mentally ill in certain respects, his racial theories helped pave the way for the pathologizing of entire populations—a mindset that would fuel both American and European eugenics movements in the next century.

American Eugenics: The Soil Psychiatry Grew From

Before Hitler, there was Cold Spring Harbor. Founded in 1910, the Eugenics Record Office (ERO) operated out of Cold Spring Harbor Laboratory in New York with major funding from the Carnegie Institution, later joined by Rockefeller Foundation money. It became the central hub for American eugenic research, gathering family pedigrees to trace so-called hereditary defects like “feeblemindedness,” “criminality,” and “pauperism.”

Between the early 1900s and 1970s, over 30 U.S. states passed forced sterilization laws targeting tens of thousands of people deemed unfit to reproduce. The justification? Traits like alcoholism, poverty, promiscuity, deafness, blindness, low IQ, and mental illness were cast as genetic liabilities that threatened the health of the nation.

The practice was upheld by the U.S. Supreme Court in 1927 in the infamous case of Buck v. Bell. In an 8–1 decision, Justice Oliver Wendell Holmes Jr. wrote, “Three generations of imbeciles are enough,” greenlighting the sterilization of 18-year-old Carrie Buck, a young woman institutionalized for being “feebleminded”—a label also applied to her mother and child. The ruling led to an estimated 60,000+ sterilizations across the U.S.

And yes—those sterilizations disproportionately targeted African American, Native American, and Latina women, often without informed consent. In North Carolina alone, Black women made up nearly 65% of sterilizations by the 1960s, despite being a much smaller share of the population.

Eugenics wasn’t a fringe pseudoscience. It was mainstream policy—supported by elite universities, philanthropists, politicians, and the medical establishment.

And psychiatry was its institutional partner.

The American Journal of Psychiatry published favorable discussions of sterilization and even euthanasia for the mentally ill as early as the 1930s. American psychiatrists traveled to Nazi Germany to observe and advise, and German doctors openly cited U.S. laws and scholarship as inspiration for their own racial hygiene programs.

In some cases, the United States led—and Nazi Germany followed.

The International Congress of Eugenics’ Logo 1921

This isn’t conspiracy. It’s history. Documented, peer-reviewed, and disturbingly overlooked.


From Ideology to Institution

By the early 20th century, the groundwork had been laid. Psychiatry had evolved from a fringe field rooted in speculation and racial ideology into a powerful institutional force—backed by universities, governments, and the courts. But its foundation was still deeply compromised. What had begun with Benjamin Rush’s biologically deterministic theories and America’s eugenic policies now matured into a formalized doctrine—one that treated human suffering not as a relational or spiritual crisis, but as a defect to be categorized, corrected, or eliminated.

This is where the five core doctrines of modern psychiatry emerge.

The Five Doctrines That Shaped Modern Psychiatry

These five doctrines weren’t abandoned after World War II. They were rebranded, exported, and quietly absorbed into the foundations of American psychiatry.

1. The Elimination of Subjectivity

Patients were no longer seen as people with stories, pain, or meaning—they were seen as bundles of symptoms. Suffering was abstracted into clinical checklists. The Diagnostic and Statistical Manual of Mental Disorders (DSM) became the gold standard, not because it offered clear science, but because it offered utility: a standardized language that served pharmaceutical companies, insurance billing, and bureaucratic control. If you could name it, you could code it—and medicate it.

2. The Eradication of Spiritual and Moral Meaning

Struggles once understood through relational, existential, or moral frameworks were stripped of depth. Grief became depression. Anger became oppositional defiance. Existential despair was reduced to a neurotransmitter imbalance. The soul was erased from the conversation. As Berger notes, suffering was no longer something to be witnessed or explored—it became something to be treated, as quickly and quietly as possible.

3. Biological Determinism

Mental illness was redefined as the inevitable result of faulty genes or broken brain chemistry—even though no consistent biological markers have ever been found. The “chemical imbalance” theory, aggressively marketed throughout the late 20th century, was never scientifically validated. Yet it persists, in part because it sells. Selective serotonin reuptake inhibitors (SSRIs)—still widely prescribed—were promoted on this flawed premise, despite studies showing they often perform no better than placebo and come with serious side effects, including emotional blunting, dependence, and sexual dysfunction.

4. Population Control and Racial Hygiene

In Germany, this meant sterilizing and exterminating those labeled “life unworthy of life.” In the U.S., it meant forced sterilizations of African-American and Native American women, institutionalizing the poor, the disabled, and the nonconforming. These weren’t fringe policies—they were mainstream, upheld by law and supported by leading psychiatrists and journals. Even today, disproportionate diagnoses in communities of color, coercive treatments in prisons and state hospitals, and medicalization of poverty reflect these same logics of control.

5. The Use of Institutions for Social Order

Hospitals became tools for enforcing conformity. Psychiatry wasn’t just about healing—it was about managing the unmanageable, quieting the inconvenient, and keeping society orderly. From lobotomies to electroshock therapy to modern-day involuntary holds, psychiatry has long straddled the line between medicine and discipline. Coercive treatment continues under new names: community treatment orders, chemical restraints, and state-mandated compliance.

These doctrines weren’t discarded after the fall of Nazi Germany. They were imported. Adopted. Rebranded under the guise of “evidence-based medicine” and “public health.” But the same logic persists: reduce the person, erase the context, medicalize the soul, and reinforce the system.


Letchworth Village: The Human Cost

I didn’t simply read this in a textbook. I stood there—on the edge of those woods—next to rows of numbered graves.

In 2020, while waiting to close on our New York house, my husband and I were staying in an Airbnb in Rockland County. One morning, walking the dogs near the end of Call Hollow Road, where a wide path divides thick woodland, we came across a memorial stone:

“THOSE WHO SHALL NOT BE FORGOTTEN.”

We had stumbled upon the entrance to Old Letchworth Village Cemetery, and we instantly felt its somber history. Beyond it stood rows of T-shaped markers, each one a muted testament to the hundreds of nameless victims who perished at Letchworth. Situated just half a mile from the institution, these weathered grave markers reveal only the numbers that were once assigned to forgotten souls—a stark reminder that families once refused to let their names be known. This omission serves as a silent indictment of a system that institutionalized, dehumanized, and ultimately discarded these individuals.

When we researched the history, the truth was staggering.

Letchworth was supposed to be a progressive alternative to the horrors of 19th-century asylums. Instead, it became one of them. By the 1920s, reports described children and adults left unclothed, unbathed, overmedicated, and raped. Staff abused residents—and each other. The dormitories were overcrowded. Funding dried up. Buildings decayed.

The facility was severely overcrowded. Many residents lived in filth, unfed and unattended. Children were restrained for hours. Some were used in vaccine trials without consent. And when they died, they were buried behind the trees—nameless, marked only by small concrete stakes.

I stood among those graves. Over 900 of them. A long row of numbered markers, each representing a life once deemed unworthy of attention, of love, of dignity.

But the deeper horror is what Letchworth symbolized: the idea that certain people were better off warehoused than welcomed, that abnormality was a disease to be eradicated—not a difference to be understood.

This is the real history of psychiatric care in America.


The Problem of Purpose

But this history didn’t unfold in a vacuum. It was built on something deeper—an idea so foundational, it often goes unquestioned: that nature has no purpose. That life has no inherent meaning. That humans are complex machines—repairable, discardable, programmable.

This mechanistic worldview didn’t just shape medicine. It has shaped what we call reality itself.

As Dr. Rupert Sheldrake explains in Science Set Free, the denial of purpose in biology isn’t a scientific conclusion—it’s a philosophical assumption. Beginning in the 17th century, science removed soul and purpose from nature. Plants, animals, and human bodies were understood as nothing more than matter in motion, governed by fixed laws. No pull toward the good. No inner meaning.

By the time Darwin’s Origin of Species arrived in 1859, this mechanistic lens was fully established. Evolution wasn’t creative—it was random. Life wasn’t guided—it was accidental.

Psychiatry, emerging in this same cultural moment, absorbed this worldview. Suffering was pathologized, difference diagnosed, and the soul reduced to faulty genetics and broken wiring.

Today, that mindset is alive in the DSM’s ever-expanding labels, in the belief that trauma is a chemical imbalance, that identity issues must be solved with hormones and surgery, and in the reflex to medicate children who don’t conform.

But what if suffering isn’t a bug in the system?

What if it’s a signal?

What if these so-called “disorders” are cries for meaning in a world that pretends meaning doesn’t exist?

The graves at Letchworth aren’t just a warning about medical abuse. They are a mirror—reflecting what happens when we forget that people are not problems to be solved, but souls to be seen.

Sheldrake writes, “The materialist denial of purpose in evolution is not based on evidence, but is an assumption.” Modern science insists all change results from random mutations and blind forces—chance and necessity. But these claims are not just about biology. They influence how we see human beings: as broken machines to be repaired or discarded.

As we said, in the 17th century, the mechanistic revolution abolished soul and purpose from nature—except in humans. But as atheism and materialism rose in the 19th century, even divine and human purpose were dismissed, replaced by the ideal of scientific “progress.” Psychiatry emerged from this philosophical soup, fueled not by reverence for the human soul but by the desire to categorize, control, and “correct” behavior—by any mechanical means necessary.

What if that assumption is wrong? What if the people we label “disordered” are responding to something real? What if our suffering has meaning—and our biology is not destiny?

“Genetics” as the New Eugenics

Today, psychiatry no longer speaks in the language of race hygiene.

It speaks in the language of genes.

But the message is largely the same:

You are broken at the root.

Your biology is flawed.

And the only solution is lifelong medication—or medical intervention.

We now tell people their suffering is rooted in faulty wiring, inherited defects, or bad brain chemistry—despite decades of inconclusive or contradictory evidence.

We still medicalize behaviors that don’t conform.

We still pathologize pain that stems from trauma, poverty, or social disconnection.

We still market drugs for “chemical imbalances” that have never been biologically verified.

And we still pretend this is science—not ideology.

But as Dr. Rupert Sheldrake argues in Science Set Free, even the field of genetics rests on a fragile and often overstated foundation. In Chapter 6, he challenges one of modern biology’s core assumptions: that all heredity is purely material—that our traits, tendencies, and identities are completely locked in by our genes.

But this isn’t how people have understood inheritance for most of human history.

Long before Darwin or Mendel, breeders, farmers, and herders knew how to pass on traits. Proverbs like “like father, like son” weren’t based on lab results—they were based on generations of observation. Dogs were bred into dozens of varieties. Wild cabbage became broccoli, kale, and cauliflower. The principles of heredity weren’t discovered by science; they were named by science. They were already in practice across the world.

What Sheldrake points out is that modern biology took this folk knowledge, stripped it of its nuance, and then centralized it—until genes became the sole explanation for almost everything.

And that’s a problem.

Because genetics has been crowned the ultimate cause of everything from depression to addiction, from ADHD to schizophrenia. When the outcomes aren’t clear-cut, the answer is simply: “We haven’t mapped the genome enough yet.”

But what if the model is wrong?

What if suffering isn’t locked in our DNA?

What if genes are only part of the story—and not even the most important part?

By insisting that people are genetically flawed, psychiatry sidesteps the deeper questions:

  • What happened to you?
  • What story are you carrying?
  • What environments shaped your experience of the world?

It pathologizes people—and exonerates systems.

Instead of exploring trauma, we prescribe pills.

Instead of restoring dignity, we reduce people to diagnoses.

Instead of healing souls, we treat symptoms.

Modern genetics, like eugenics before it, promises answers. But too often, it delivers a verdict: you were born broken.

We can do better.

We must do better.

Because healing doesn’t come from blaming bloodlines or rebranding biology.

It comes from listening, loving, and refusing to reduce people to a diagnosis or a gene sequence.


The Hidden Truth About Trauma and Diagnosis

As Pete Walker references Dr. John Briere’s poignant observation: if Complex PTSD and the role of early trauma were fully acknowledged by psychiatry, the Diagnostic and Statistical Manual of Mental Disorders (DSM) could shrink from a massive textbook to something no larger than a simple pamphlet.

We’ve previously explored the crucial difference between PTSD and complex PTSD—topics like trauma, identity, neuroplasticity, stress, survival, and what it truly means to come home to yourself. This deeper understanding exposes a vast gap between real human experience and how mental health is often diagnosed and treated today.

Instead of addressing trauma with truth and compassion, the system expands diagnostic categories, medicalizes pain, and silences those who suffer.

The Cost of Our Silence

Many of us know someone who’s been diagnosed, hospitalized, or medicated into submission.

Some of us have been that person.

And we’re told this is progress. That this is compassion. That this is care.

But when I stood at the edge of those graves in Rockland County—row after row of anonymous markers—nothing about this history felt compassionate.

It felt buried. On purpose.

We must unearth it.

Not to deny mental suffering—but to reclaim the right to define it for ourselves.

To reimagine what healing could look like, if we dared to value dignity over diagnosis.

Because psychiatry hasn’t “saved” the abnormal.

It has often silenced, sterilized, and sacrificed them.

It has named pain as disorder.

Difference as defect.

Trauma as pathology.

The DSM is not a Bible.

The white coat is not a priesthood.

And genetics is not destiny.

We need better language, better questions, and better ways of relating to each other’s pain.

And that brings us full circle—to a man most people have never heard of: Dr. John Galt II.

Nearly 200 years ago, in Williamsburg, Virginia, Galt ran the first freestanding mental hospital in America. But unlike many of his peers, he rejected chains, cruelty, and coercion. He embraced what he called moral treatment—an approach rooted in truth, love, and human dignity. Galt didn’t see the “insane” as dangerous or defective. He saw them as souls.

He was influenced by Philippe Pinel, the French physician who famously removed shackles from asylum patients in Paris. Together, these early reformers dared to believe that healing began not with force, but with presence. With relationship. With care.

Galt refused to segregate patients by race. He treated enslaved people alongside the free. And he opposed the rising belief—already popular among his fellow psychiatrists—that madness was simply inherited, and the mad were unworthy of full personhood.

But what does it mean to recognize someone’s personhood?

Personhood is more than just being alive or having a body. It’s about being seen as a full human being with inherent dignity, moral worth, and rights—someone whose inner life, choices, and experiences matter. Recognizing personhood means acknowledging the whole person beyond any diagnosis, disability, or social status.

This question isn’t just philosophical—it’s deeply practical and contested. It’s at the heart of debates over mental health care, disability rights, euthanasia and even abortion. When does a baby become a person? When does someone with a mental illness or cognitive difference gain full moral consideration? These debates all circle back to how we define humanity itself.

In Losing Our Dignity: How Secularized Medicine Is Undermining Fundamental Human Equality, Charles C. Camosy warns that secular, mechanistic medicine can strip people down to biological parts—genes, symptoms, behaviors—rather than seeing them as full persons. This reduction risks denying people their dignity and the respect that comes with being more than the sum of their medical conditions.

Galt’s approach stood against this reduction. He saw patients as complex individuals with stories and struggles, deserving compassion and respect—not just as “cases” to be categorized or “disorders” to be fixed.

To truly recognize personhood is to honor that complexity and to affirm that every individual, regardless of race, mental health, or social status, has an equal claim to dignity and care.

But… Galt’s approach was pushed aside.

Why?

Because it didn’t serve the state.

Because it didn’t serve power.

Because it didn’t make money.

Today, we see a similar rejection of truth and compassion.

When a child in distress is told they were “born in the wrong body,” we call it gender-affirming care.

When a woman, desperate to be understood, is handed a borderline personality disorder label instead.

When medications with severe side effects are pushed as the only solution, we call it science.

But are we healing the person—or managing the symptoms?

Are we meeting the soul—or erasing it?

We’ve medicalized the human condition—and too often, we’ve called that progress.

We’ve spoken before about the damage done by Biblical counseling programs when therapy is replaced with doctrine—how evangelical frameworks often dismiss pain as rebellion, frame anger as sin, and pressure survivors into premature forgiveness.

But the secular system is often no better. A model that sees people as nothing more than biology and brain chemistry may wear a lab coat instead of a collar—but it still demands submission.

Both systems can bypass the human being in front of them.

Both can serve control over compassion.

Both can silence pain in the name of order.

What we truly need is something deeper.

To be seen.

To be heard.

To be honored in our complexity—not reduced to a diagnosis or a moral failing.

It’s time to stop.

It’s time to remember the metaphysical soul, the psyche. It’s time to remember that human suffering is not a clinical flaw.

That our emotional pain is not a chemical defect.

That being different, distressed, or deeply wounded is not a disease.

It’s time to recover the wisdom of Dr. John Galt II.

To treat those in pain—not as problems to be solved—but as people to be seen.

To offer truth and love, not labels, not sterilizing surgeries, not lifelong prescriptions.

Because if we don’t, the graves will keep multiplying—quietly, behind institutions, beneath a silence we dare not disturb.

But we must disturb it.

Because they mattered.

And truth matters.

And the most powerful medicine has never been compliance or chemistry.

It’s being met with real humanity.

Being listened to. Believed.

Not pathologized. Not preached at. Not controlled.

But loved—in the deepest, most grounded sense of the word.

The kind of love that doesn’t look away.

The kind that tells the truth, even when it’s costly.

The kind that says: you are not broken—you are worth staying with.

Because to love someone like that…

is to recognize their personhood.

And maybe that’s the most radical act of all.

SOURCES:

  • “Director of the Kaiser Wilhelm Institute for Anthropology, Human Heredity, and Eugenics from 1927 to 1942, [Eugen] Fischer authored a 1913 study of the Mischlinge (racially mixed) children of Dutch men and Hottentot women in German southwest Africa. Fischer opposed ‘racial mixing,’ arguing that ‘negro blood’ was of ‘lesser value’ and that mixing it with ‘white blood’ would bring about the demise of European culture” (United States Holocaust Memorial Museum, “Deadly Medicine: Creating the Master Race,” USHMM Online: https://www.ushmm.org/exhibition/deadly-medicine/profiles/). See also Richard C. Lewontin, Steven Rose, and Leon J. Kamin, Not in Our Genes: Biology, Ideology, and Human Nature, 2nd edition (Chicago: Haymarket Books, 2017), 207.
  • Gonaver, The Making of Modern Psychiatry
  • Berger, Daniel R., II. Saving Abnormal: The Disorder of Psychiatric Genetics
  • Lost Architecture: Eastern State Hospital – Colonial Williamsburg
  • 📘 General History of American Eugenics
    Lombardo, Paul A.
    Three Generations, No Imbeciles: Eugenics, the Supreme Court, and Buck v. Bell (2008)
    This book is the definitive account of Buck v. Bell and American eugenics law. It documents how widespread sterilizations were and provides legal and historical context.
    Black, Edwin.
    War Against the Weak: Eugenics and America’s Campaign to Create a Master Race (2003)
    Covers the U.S. eugenics movement in depth, including funding by Carnegie and Rockefeller, Cold Spring Harbor, and connections to Nazi Germany.
    Kevles, Daniel J.
    In the Name of Eugenics: Genetics and the Uses of Human Heredity (1985)
    A foundational academic history detailing how early American psychiatry and genetics were interwoven with eugenic ideology.

    🧬 Institutions & Funding
    Cold Spring Harbor Laboratory Archives
    https://www.cshl.edu
    Documents the history of the Eugenics Record Office (1910–1939), its funding by the Carnegie Institution, and its influence on U.S. and international eugenics.
    The Rockefeller Foundation Archives
    https://rockarch.org
    Shows how the foundation funded eugenics research both in the U.S. and abroad, including programs that influenced German racial hygiene policies.

    ⚖️ Sterilization Policies & Buck v. Bell
    Supreme Court Decision: Buck v. Bell, 274 U.S. 200 (1927)
    https://supreme.justia.com/cases/federal/us/274/200/
    Includes Justice Holmes’ infamous quote and the legal justification for forced sterilization.
    North Carolina Justice for Sterilization Victims Foundation
    https://www.ncdhhs.gov
    Reports the disproportionate targeting of Black women in 20th-century sterilization programs.
    Stern, Alexandra Minna.
    Eugenic Nation: Faults and Frontiers of Better Breeding in Modern America (2005)
    Explores race, sterilization, and medical ethics in eugenics programs, with data from states like California and North Carolina.

    🧠 Psychiatry’s Role & Nazi Connections
    Lifton, Robert Jay.
    The Nazi Doctors: Medical Killing and the Psychology of Genocide (1986)
    Shows how American eugenics—including psychiatric writings—helped shape Nazi ideology and policies like Aktion T-4 (the euthanasia program).
    Wahl, Otto F.
    “Eugenics, Genetics, and the Minority Group Mentality” in American Journal of Psychiatry, 1985.
    Traces how psychiatric institutions were complicit in promoting eugenic ideas.
    American Journal of Psychiatry Archives
    1920s–1930s issues include articles in support of sterilization and early euthanasia rhetoric.
    Available via https://ajp.psychiatryonline.org

1984 and The Handmaid’s Tale: Misplaced Parallels and Liberal Delusion

Breaking Free: A Conversation with Yasmine Mohammed on Radical Islam, Empowerment, and the West’s Blind Spots

After finishing George Orwell’s 1984, I noticed its resurgence in popularity, especially after Trump’s election. Ironically, it’s not the conservative right but the progressive left that increasingly mirrors Orwellian themes. Similarly, Margaret Atwood’s The Handmaid’s Tale has become a rallying cry for liberals who claim to be on the brink of a dystopian theocracy. Yet, as Yasmine Mohammed pointed out in this week’s episode, this comparison is not only absurd but deeply insulting to women who live under regimes where Atwood’s fiction is a grim reality.

1984: Rewriting Language and History

The Democratic Party’s obsession with redefining language is straight out of Orwell’s playbook. They tell us biology is bigotry and that there are infinite genders, forcing people to adopt nonsensical pronouns or risk social ostracism. This is not progress—it’s the weaponization of language to control thought, eerily similar to Orwell’s Newspeak.

But it doesn’t stop there. They actively rewrite history by renaming monuments, military bases, and even schools, erasing cultural markers in the name of ideological purity. This is doublespeak in action: the manipulation of truth for political orthodoxy. Orwell’s warning that “orthodoxy is unconsciousness” feels disturbingly apt when observing the modern left.

The Handmaid’s Tale: An Insult to Women Who Actually Suffer

In our conversation, Yasmine highlighted the absurdity of liberal claims that America is The Handmaid’s Tale come to life. Yasmine, who grew up under Islamic theocracy, knows firsthand what it’s like to live in a world where women have no autonomy. These women cannot see a doctor without a male guardian, are forced to cover every inch of their bodies, and are denied basic freedoms like education or the right to drive.

Contrast this with the West, where women have more freedom than at any other point in history. Liberal women can run around naked at Pride parades, freely express their sexuality, and redefine what it means to be a woman altogether. And yet, they cry oppression because they are expected to pay for their own birth control or endure debates over abortion limits. This level of cognitive dissonance—claiming victimhood while living in unprecedented freedom—is a slap in the face to women who actually suffer under real patriarchal oppression.

Liberal Orthodoxy: Lost in the Sauce

What’s truly Orwellian is how the left uses its freedom to strip others of theirs. They shout about inclusivity but cancel anyone who disagrees. They claim to fight for justice while weaponizing institutions to enforce ideological conformity. Meanwhile, they are so consumed with their own victim complex that they fail to see how absurd their comparisons to dystopian fiction really are.

Orwell and Atwood warned against unchecked power and ideological extremism. If liberals actually read these books instead of using them as aesthetic props, they might realize they’re mirroring the very authoritarianism they claim to oppose. Instead, they’re lost in the sauce, preaching oppression in a society where they have more freedom than they can handle.

As Yasmine said, “You want to see The Handmaid’s Tale? Try being a woman in Saudi Arabia, Iran, or Afghanistan.” The left would do well to remember that before playing the victim in their cosplay dystopia.

Understanding the Evolution of Witch Hunts

Welcome to Taste of Truth Tuesdays, where we unravel the strange, the mysterious, and today—the terrifying. This post delves into one of history’s darkest chapters: the witch hunts. We’ll explore how fear, superstition, and control shaped centuries of persecution and how these patterns are still evident in the modern world. Witch hunts aren’t just a thing of the past—they’ve evolved.

The European Witch Hunts – Early Modern Europe

Let’s start in early modern Europe. Scholar Peter Maxwell-Stuart illuminates the rise of demonology, where the fear of magic and the devil became a weapon of control for those in power. Beginning in the 1500s, political and religious leaders manipulated entire populations by tapping into their deep-rooted fears of ‘evil forces.’ The Church, in particular, weaponized these beliefs, positioning itself as the protector against witches—women (and sometimes men) believed to consort with devils or conjure dark forces. As the idea took hold that witches could be behind every famine, illness, or death, this created a perfect storm of paranoia.

Maxwell-Stuart argues that demonology texts—many sanctioned by the Church—fueled mass hysteria, feeding the narrative that witches were not just local troublemakers but cosmic agents of Satan, hell-bent on destroying Christendom. Ordinary people lived in constant fear of betrayal by their neighbors, leading to accusations that could swiftly escalate into brutal trials, with the accused often tortured into confessing their ‘diabolical’ crimes.

To understand how demonology in Europe gained such traction, we need to go back to Augustine of Hippo, whom we have mentioned in previous episodes. His writings in the 4th and 5th centuries laid the foundation for Christian perceptions of the devil and demons. Augustine’s ideas, especially in City of God, emphasized the constant spiritual warfare between good and evil, casting demons as agents of Satan working tirelessly to undermine God’s plan. He argued that humans were caught in this cosmic battle, susceptible to the devil’s temptations and tricks.

[Image: ‘Augustine before a group of demons’, from De civitate Dei by Augustine, trans. by Raoul de Presles, late 15th century]

Augustine’s Doctrine of Demons

According to Augustine, demons were fallen angels who had rebelled and now sought to deceive and destroy humanity. While Augustine didn’t explicitly discuss witches, his interpretation of demons helped fuel the belief that humans could be manipulated by evil spirits—whether through pacts, possession, or magical practices. This idea later influenced medieval and early modern European demonology.

Augustine’s views on original sin—that humanity is inherently flawed and in need of salvation—also intensified fears that people, especially women (who were seen as ‘weaker’ spiritually), were more vulnerable to the devil’s influence.

SIDE NOTE: We have discussed the theological concept of original sin in previous episodes: in Franciscan Wisdom: Navigating Spiritual Growth and Challenges with Carrie Moore, we specifically spun the doctrine of original sin on its head, and again in Unpacking Religious Trauma: Navigating the Dynamics of Faith Deconstruction with Dr. Mark Karris.

In the centuries that followed, these ideas were weaponized to justify witch hunts. Augustine’s legacy is evident in how later theologians and demonologists, such as Heinrich Kramer (author of the infamous Malleus Maleficarum), built upon his ideas of demonic interference to condemn witchcraft as a real, existential threat to Christian society.

Maxwell-Stuart reveals that the creation of demonology wasn’t just religious but deeply political. Kings and clergy alike realized they could consolidate power by stoking the flames of fear, casting witches and sorcerers as a common enemy. The trials served a dual purpose: they reinforced the Church’s supremacy over the spiritual realm and gave ruling elites a tool for maintaining social order. Accusing someone of witchcraft was an effective way to silence dissent or settle personal scores.

Fear as a Tool of Control

Fear wasn’t just manufactured by rulers—it was deeply ingrained in the societal, religious, and legal systems of the time. Scholar Sophie Page reveals how beliefs in magic and the supernatural were not fringe ideas but core components of medieval and early modern life. Magic wasn’t merely a mysterious force; it was a pervasive explanation for any calamity. Failed harvests, plagues, or unexplained illnesses were often attributed to witches or the devil, creating a society constantly on edge, where supernatural forces were believed to lurk behind every misfortune.

By embedding these beliefs into legal codes, authorities could target suspected witches or sorcerers under the guise of protecting the community. Page’s work illustrates how rituals once seen as protective or healing gradually became demonized. Harmless folk practices and herbal remedies, used for centuries, began to be recast as witchcraft, especially when things went wrong. People, particularly those in rural areas, were vulnerable to this thinking because religion and superstition were inseparable from daily life.

Partisan scholars have long debated whether Catholics or Protestants were the “real” witch hunters, but they’ve made little headway. One important change in Christian morality, as discussed by John Bossy, occurred between the 14th and 16th centuries. The moral focus shifted from the Seven Deadly Sins—pride, greed, lust, envy, gluttony, anger, and sloth—to the Ten Commandments. This change, influenced by reform movements that shaped the Protestant Reformation, prioritized sins against God over those against the community. Idolatry and the worship of false gods became viewed as the gravest offenses.

This redefinition of witchcraft followed suit. Instead of being seen as harmful actions toward neighbors, witchcraft was now linked directly to devil worship and regarded as serious heresy. Scholars and church leaders began merging various forms of folk magic and healing into this new narrative, suggesting that practitioners were either knowingly or unknowingly making deals with the devil. Confessions of pacts or attendance at “witch gatherings” were shaped to highlight community failings, like envy and resentment. Consequently, educated society began to see witchcraft as a real threat rather than mere superstition. While traditional beliefs about magic still existed, they were overshadowed by fears of violent backlash from reformers.

The Power of Dualistic Thinking

This dualistic thinking, influenced by St. Augustine, gave rise to a semi-Manichean worldview, where the struggle between good and evil became more pronounced. Manichaeism, an ancient belief system, viewed the world as a battleground between equal forces of good and evil. Although orthodox Christianity rejected this dualism, the focus on the devil’s role in everyday life blurred those lines for many people. By emphasizing the devil’s pervasive influence, religious leaders inadvertently created a belief system in which evil seemed as powerful as good.

In this semi-Manichean view, the devil was not just a tempter of individuals but a corrupting force within communities and even within political and religious practices deemed heretical. Fears of devil-worshipping conspiracies became intertwined with anxieties about witchcraft and moral decay. Reformers, particularly in Protestant movements, fueled these fears by branding idolatry, Catholic rituals, and even folk healing as dangerous openings for the devil’s influence. This perspective transformed witchcraft from a local issue into a broader threat against God and society.

The result was a potent mix of dualistic thinking and an intense focus on spiritual warfare. This not only intensified the persecution of supposed witches but also reinforced the obsession with eliminating anything considered “satanic.” The ideological shift redefined witchcraft as a communal danger, turning innocent healing practices into accusations of demonic pacts.

Every village had its own ‘cunning folk’—individuals skilled in healing and folk magic—yet these very people could easily become scapegoats when something went wrong. The legal structures played a vital role in perpetuating this cycle of fear. Church courts, bolstered by theologians and demonologists, were empowered to try individuals accused of witchcraft, and the accusations quickly spiraled into mass hysteria. Trials often relied on tortured confessions, reinforcing the belief that witches and the devil were real and tangible threats to society. This institutionalized paranoia was a perfect storm of religion, fear, and control.

The Rise of Organized Witch Hunts

Beginning in the late 15th century, witch trials escalated into full-blown hunts, particularly after the publication of the Malleus Maleficarum in 1487. This infamous witch-hunting manual, written by Heinrich Kramer and endorsed by the Pope, offered legal and theological justifications for hunting down witches. It encouraged harsh interrogations and set guidelines for identifying witches based on superficial evidence like birthmarks, behaviors, and confessions extracted under torture. The legal system, which had already started to turn against folk healers, now had a codified method for persecuting them.

In regions like Germany, Scotland, and Switzerland, these legal trials turned into widespread witch hunts. Hundreds, even thousands, of individuals—predominantly women—were accused and executed. What’s particularly fascinating is that these witch hunts often peaked during periods of societal or economic instability when fear and uncertainty made people more susceptible to attributing their misfortunes to external, supernatural forces.

By institutionalizing the persecution of witches, rulers and religious leaders could manage social unrest and solidify their authority. The trials often reinforced the power structures by demonstrating that anyone perceived as a threat to societal order—whether through suspected witchcraft or merely social nonconformity—could be eradicated.

Witch Hunts and Gender

The scapegoating of women played a crucial role in these witch hunts. Owen Davies’ work reveals how the demonization of witches intersected with misogyny, turning the hunts into a gendered form of control. Midwives, healers, or outspoken women were more likely to be targeted, reinforcing patriarchal authority. The very skills that had once been valued, such as healing and midwifery, were redefined as dangerous and linked to dark powers.

As witch hunts spread, the legal frameworks across Europe became more refined and institutionalized, creating a climate where fear of witches and demonic possession became the norm. The trials’ obsession with confessions—often coerced under brutal conditions—further fueled public paranoia, as the more people confessed to witchcraft, the more tangible the ‘threat’ seemed.

The Modern Echoes of Witch Hunts

Fast forward to today, and we find that the legacy of witch hunts lingers. The tactics of fear-mongering, scapegoating, and social control can still be observed in modern contexts. Contemporary movements often mirror historical witch hunts, targeting marginalized groups through accusations and public shaming. Just as witch hunts flourished in times of societal uncertainty, modern societies can succumb to similar dynamics.

In the age of social media, accusations spread like wildfire, and the court of public opinion often acts faster than the courts themselves. Political enemies are dragged through the mud with allegations that may or may not have a basis in fact.

The case of Michael Jackson serves as a poignant example of how media narratives can distort reality. The beloved pop icon faced multiple allegations of child molestation, with the most notable case culminating in a highly publicized trial in 2005. Accusers claimed that Jackson had abused them, yet the defense presented compelling counterarguments, challenging the credibility of the witnesses and highlighting inconsistencies in their testimonies. After a lengthy trial, Jackson was acquitted of all charges, but the media frenzy surrounding the case fueled public debate and sensationalism, cementing the derogatory nickname “Wacko Jacko” that the tabloids had long attached to him. This smear campaign perpetuated false narratives about his character and actions.

Behind the scenes, Jackson was embroiled in a lawsuit against Sony Music, a battle he was reportedly winning at the time of these allegations. Furthermore, his controversial doctor, Conrad Murray, who administered the drugs implicated in Jackson’s death, later faced serious legal consequences, including a conviction for involuntary manslaughter. The intersection of these legal battles and the media frenzy created a complex narrative that ultimately tarnished Jackson’s legacy, and that’s what truly breaks my heart.

By the time these individuals have the chance to clear their names, their reputations—and often their careers—are already in ruins. Davies’ research shows us that while modern witch hunts don’t involve burning at the stake, they do involve trial by media and mob justice.

And we can’t talk about modern-day witch hunts without bringing the CIA into the conversation. Since its inception, the CIA has been at the heart of international political manipulations—using covert methods to shape public perception, interfere in foreign governments, and even influence elections here in the United States. In the 1960s, the agency weaponized the label ‘conspiracy theorist’ to discredit anyone who questioned the official narratives surrounding events like the assassination of JFK. Those who didn’t toe the line were labeled as ‘paranoid’ or ‘dangerous.’ It was the modern version of labeling someone a witch—turning them into a social outcast, not to be trusted.

Fast forward to today: we see similar tactics used against whistleblowers, journalists, and activists who challenge the powerful. Think about Edward Snowden, Julian Assange, and even political figures targeted by intelligence communities. The second they start exposing uncomfortable truths, they are vilified. Whether through leaks, smear campaigns, or selective legal action, these modern-day ‘witches’ face an onslaught of accusations, designed to discredit them before they can fully tell their story.

In many cases, the evidence behind these accusations is shaky at best. The CIA’s involvement in manipulating public perception goes all the way back to Operation Mockingbird, a secret program to influence media narratives; it demonstrated that controlling information was one of the agency’s most powerful tools. During the Cold War, the United States engaged in a concerted effort to influence and control media narratives to align with its interests, recruiting journalists and establishing relationships with major media outlets.

Edward Bernays, often referred to as the father of public relations, played a pivotal role in this story of media manipulation. Working with several major companies, including Procter & Gamble, General Electric, and the American Tobacco Company, Bernays was instrumental in promoting the cigarette brand Lucky Strike, famously linking it to the women’s liberation movement. His connections extended to notable figures like Sigmund Freud, who was his uncle and whose psychoanalytic theories significantly shaped Bernays’ PR strategies. Throughout his career, Bernays leveraged media to influence public perception and political leaders, raising profound questions about the power dynamics of media and its capacity to shape societal narratives. (If you’re intrigued by the intricate interplay of media and propaganda, this is a rabbit hole worth exploring!)

Today, that same fear-mongering tactic is played out on a much larger scale. Accusations, whether of conspiracy, treason, or subversion, become tools to silence anyone questioning the status quo. Just as witches in the past were seen as ‘different’ and thus dangerous, today’s targets are often people who challenge the system.

And just as accused witches in the 1300s-1600s had no due process, today we see something similar in the digital realm. There’s no real accountability or fairness in the court of public opinion. All it takes is a viral accusation—a tweet, a blog post, or a video—and a person’s career, family, and mental health can be obliterated overnight. No evidence required, no trial, no defense.

So, what can we learn from this history? From the witch hunts of early modern Europe to today’s viral accusations and political fearmongering, there’s one key lesson: fear remains one of the most dangerous tools of control. When we allow fear to dictate our actions—whether it’s fear of witches, outsiders, or anyone who doesn’t fit into the mold—we lose sight of reason and humanity.

In closing, I’d like to examine the phenomenon of witch hunts through the lens of amygdala hijacking, a topic we discussed in a previous episode. This term refers to the brain’s immediate response to perceived threats, where the amygdala—the emotional center of the brain—takes control, often resulting in irrational and impulsive actions.

During the witch hunts, communities gripped by fear of the unknown succumbed to a mob mentality whenever someone fell ill or misfortune struck. The amygdala triggered a fight-or-flight response, compelling individuals to find scapegoats, with cunning folk and those deviating from societal norms becoming prime targets. As accusations spiraled, fear dominated decision-making instead of rational thought. Today, we observe similar patterns in how social media can incite panic, leading to modern witch hunts. When fear takes over, reason often fades, resulting in unjust vilification—echoing the dark lessons of history.

As we navigate our modern world, let’s remain vigilant against the echoes of this history, seeking truth and questioning the narratives that shape our beliefs. Fear may be powerful, but curiosity and critical thinking are our greatest allies in maintaining our autonomy and humanity.

Resources:

Briggs, Robin. Witches and Neighbors: The Social and Cultural Context of European Witchcraft. Oxford University Press, 1996.

  • This book provides a comprehensive exploration of the social dynamics surrounding witch hunts in early modern Europe, highlighting the interplay of fear, community, and cultural beliefs.

Maxwell-Stuart, Peter G. Witchcraft in Europe, 1100-1700: A Sourcebook. Palgrave Macmillan, 2010.

  • This sourcebook compiles essential documents related to the history of witchcraft in Europe, providing insights into how fear and persecution were constructed and justified.

Page, Sophie. Magic in the Middle Ages. Cambridge University Press, 2005.

  • This book offers an analysis of the cultural and religious practices surrounding magic during the medieval period, emphasizing how these beliefs shaped societal attitudes toward witchcraft.

Bossy, John. Christianity in the West, 1400-1700. Oxford University Press, 1985.

  • Bossy examines the transformation of Christian morality during the Reformation, providing context for the changing perceptions of witchcraft and heresy.

Davies, Owen. Popular Magic: Cunning Folk in English History. Continuum, 2007.

  • This work explores the role of cunning folk—those who practiced folk magic—and how their practices were perceived within the broader context of witchcraft accusations.

Baroja, J. C. Witches and Witchcraft. University of California Press, 1990.

  • Baroja’s work examines the historical and cultural significance of witchcraft, providing insights into the social conditions that fueled witch hunts and the cultural implications of these beliefs.

The first use of the term “conspiracy theory” is much earlier — and more interesting — than historians have thought.

Is Veganism a Psy-Op? Maybe. The Real Issue is Engineering Ourselves Away from Nature

In today’s complex world of nutrition and health, embracing skepticism and critical thinking is essential. Rather than accepting dominant narratives, challenge them to uncover the truth.

🥕 Veganism vs. Meat: What’s the Real Issue? 🥕

The debate over veganism often gets tangled in oversimplified conspiracies. However, the real concern lies in our growing disconnect from nature’s balance. Our modern lifestyles and diets are increasingly detached from natural ecosystems, which profoundly affects our health and well-being.

To truly grasp the nuances of nutrition and health, especially when it comes to veganism, we must examine how our beliefs have been shaped by science, history, and religion. Over the next few weeks, we will be time traveling through the last century to see how these elements intertwine and influence our perspectives on veganism.

🔬Before Lobbyism: The Golden Age of Nutritional Science 🔬

Before the rise of lobbyism and industrial influence in the mid-20th century, nutritional science was marked by pioneering research that laid the groundwork for our understanding of essential nutrients. One such figure was Elmer McCollum, the vitamin pioneer.

Elmer McCollum, a prominent nutrition researcher in the early 20th century, made groundbreaking discoveries regarding vitamins A, B, and D. His work was instrumental in identifying the role of these vitamins in preventing nutritional deficiencies.

Vitamin A (Retinol): McCollum’s work significantly advanced the understanding of vitamin A, which is crucial for vision, immune function, and skin health. Retinol, the active form of vitamin A, is primarily found in animal-based foods like liver, fish oils, eggs, and dairy products. Unlike plant-based sources, which provide provitamin A carotenoids like beta-carotene that the body must convert into retinol, animal sources deliver this vitamin in its ready-to-use form.

🧬 BCO1 Gene and Vitamin A 🧬

Did you know that about 45% of people have a genetic variation that makes it hard for them to get enough vitamin A from plant foods? This is because of a gene called BCO1.

The BCO1 gene is responsible for converting beta-carotene (found in carrots, sweet potatoes, and other plants) into active vitamin A, also known as retinol. But for almost half of the population, this gene doesn’t work very efficiently, meaning their bodies can’t make enough vitamin A from plants alone.

Vitamin A is crucial for things like good vision, a strong immune system, and healthy skin. If you can’t get enough from plants, you might need to include animal foods like liver, fish oils, or dairy in your diet to make sure you’re meeting your vitamin A needs.

This explains why some people might struggle with a vegan diet—they need the more easily absorbed form of vitamin A that comes from animal products.
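
To make that conversion gap concrete, here is a minimal back-of-the-envelope sketch. It uses the standard dietary reference assumption that roughly 12 µg of beta-carotene from food yields about 1 µg of retinol activity equivalent (RAE); the carrot figure and the 50% "low-efficiency BCO1" penalty are illustrative assumptions, not measured values for any particular person.

```python
# Rough sketch: dietary beta-carotene -> retinol activity equivalents (RAE).
# Assumption: ~12 µg of beta-carotene from food ≈ 1 µg RAE (standard reference factor).
# The 50% "low-efficiency BCO1" penalty below is purely illustrative.

RAE_PER_UG_BETA_CAROTENE = 1 / 12  # µg RAE per µg dietary beta-carotene

def rae_from_beta_carotene(beta_carotene_ug: float, conversion_efficiency: float = 1.0) -> float:
    """Estimate retinol activity equivalents from dietary beta-carotene."""
    return beta_carotene_ug * RAE_PER_UG_BETA_CAROTENE * conversion_efficiency

carrot_beta_carotene_ug = 8_000  # one medium carrot, very roughly

typical = rae_from_beta_carotene(carrot_beta_carotene_ug)
low_bco1 = rae_from_beta_carotene(carrot_beta_carotene_ug, conversion_efficiency=0.5)

print(f"Typical converter:  {typical:.0f} µg RAE")   # ~667 µg RAE
print(f"Low-BCO1 converter: {low_bco1:.0f} µg RAE")  # ~333 µg RAE
# For scale: adult reference intakes are on the order of 700-900 µg RAE per day.
```

Even before any individual variation is counted, the plant form starts with a roughly 12-to-1 handicap; retinol from animal foods needs no conversion at all.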

McCollum’s research emphasized the importance of unprocessed, nutrient-rich foods in maintaining health. Diets high in refined grains can exacerbate nutritional deficiencies by displacing more nutrient-dense foods. This indirectly touches on the issues we see today related to grain consumption, though McCollum’s era was more focused on preventing deficiencies than on inflammation.

The Refinement of Grains: A Double-Edged Sword

As the food industry grew and refined processing techniques became widespread, the nutritional value of grains was compromised. The removal of bran and germ during processing not only reduced the essential vitamins and minerals in grains but also increased their glycemic index. This shift contributed to inflammation and other metabolic issues, like type 2 diabetes, a concern that has become more prominent in later research.
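
Since that paragraph leans on glycemic index, it helps to remember the practical measure, glycemic load: GL = GI × grams of available carbohydrate per serving ÷ 100. Here is a tiny sketch of that arithmetic; the GI and carbohydrate numbers are placeholder values chosen for illustration, not figures from a published table.

```python
# Glycemic load (GL) = GI * available carbohydrate (g) / 100.
# Conventional cut-offs: GL >= 20 is high, 11-19 is medium, <= 10 is low.
# The GI and carbohydrate values below are illustrative placeholders.

def glycemic_load(gi: float, available_carbs_g: float) -> float:
    return gi * available_carbs_g / 100

refined_serving = glycemic_load(gi=75, available_carbs_g=30)  # ~22.5 -> high
intact_serving = glycemic_load(gi=45, available_carbs_g=30)   # ~13.5 -> medium

print(f"Refined-grain serving GL: {refined_serving:.1f}")
print(f"Intact-grain serving GL:  {intact_serving:.1f}")
```

Refining a grain tends to push both numbers the wrong way at once: the GI rises while the fiber that slows absorption is stripped out.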

A Shift in Focus: From Nutritional Science to Industrial Influence

McCollum’s era represents a time when nutritional science was still largely driven by the quest to understand and prevent deficiencies. However, as we moved into the mid-20th century, the influence of lobbyists and industrial interests began to muddy the waters, promoting processed foods and refined grains that strayed from McCollum’s principles of whole, nutrient-rich foods.

🥕 The Influence of Religion and Early Health Movements 🥕

Ellen G. White, a key figure in the Seventh-day Adventist Church, significantly impacted early American dietetics with her advocacy for a plant-based diet and abstinence from alcohol, tobacco, and caffeine. Her health reforms, which emphasized vegetarianism and whole foods, were institutionalized through health institutions like the Battle Creek Sanitarium and figures like Dr. John Harvey Kellogg. The sanitarium’s success and the dissemination of these dietary principles led to the establishment of the American Dietetic Association in 1917, which originally promoted many of these plant-based, whole-food principles. The Adventist emphasis on preventive health care and diet principles laid the groundwork for many modern dietary guidelines and continues to influence discussions around veganism.

🔬 The Role of Science in Shaping Dietary Beliefs 🔬

In the early 20th century, scientific advancements also played a role in shaping nutrition. The Flexner Report pushed medicine toward standardized, science-based education, while new research brought attention to the importance of vitamins and minerals. Meanwhile, innovations like Crisco introduced hydrogenated fats into American diets, shifting culinary practices and influencing our understanding of what constitutes a healthy diet.

In a future episode dropping 9/10, we’ll take a deeper dive into how industrialization, scientific reports, and influential figures like John D. Rockefeller and Ancel Keys have further impacted our dietary beliefs and public health policies. Stay tuned as we explore:

  • The Flexner Report: How it reshaped medical education and its ripple effects on nutrition science.
  • The Rise of Processed Foods: The transformation of our food supply and its long-term health implications.
  • Rockefeller’s Influence: The role of industrial interests in shaping modern dietary guidelines.
  • Ancel Keys: His research, which became highly influential in the field of nutrition, primarily took place during the mid-20th century, particularly in the 1950s and 1960s. His most famous work, the Seven Countries Study, began in 1958 and was published over several decades. This research was pivotal in linking dietary fat, particularly saturated fat, to heart disease and played a significant role in shaping dietary guidelines that emphasized reducing fat intake to prevent cardiovascular disease. Nowadays it is seen as deeply controversial due to several perceived flaws that have been widely discussed by critics over the years.

How does current research define the top nutrient-dense foods?

📰 Spotlight on Micronutrient Density: A Key to Combatting Global Deficiencies

A March 2022 study published in Frontiers in Nutrition titled “Priority Micronutrient Density in Foods” emphasizes the importance of nutrient-dense foods in addressing global micronutrient deficiencies, particularly in vulnerable populations. The research identifies organ meats, small fish, dark leafy greens, shellfish, and dairy products as some of the most essential sources of vital nutrients like vitamin A, iron, and B12. These findings could be instrumental in shaping dietary guidelines and nutritional policies.


🍽️ Plant vs. Animal Nutrients: Understanding Bioavailability 🍽️

When it comes to nutrient absorption, not all foods are created equal. The bioavailability of nutrients—the proportion that our bodies can absorb and use—varies significantly between plant and animal sources.

🌱 Plant-Based Nutrients: While plant foods are rich in essential vitamins and minerals, they also contain anti-nutrients like phytates and oxalates. These compounds can bind to minerals such as iron, calcium, and zinc, inhibiting their absorption. For example, non-heme iron found in plants is less efficiently absorbed compared to the heme iron from animal sources. Similarly, the vitamin A found in plants as beta-carotene requires conversion to retinol in the body, a process that is not always efficient, particularly in certain populations.

🍖 Animal-Based Nutrients: Animal products, on the other hand, often provide nutrients in forms that are more readily absorbed. Heme iron from meat, retinol from animal liver, and vitamin B12 from dairy and eggs are all examples of highly bioavailable nutrients. These forms are directly usable by the body without the need for complex conversions, making animal products a more reliable source for certain essential nutrients.
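
As a rough way to see what bioavailability means in practice, here is a small sketch comparing the same nominal iron intake from a heme and a non-heme source. The absorption fractions are assumed round numbers inside commonly quoted ranges (very roughly 15-35% for heme iron and 2-20% for non-heme iron); actual absorption varies with the meal and the person.

```python
# Same milligrams on the plate, different milligrams absorbed.
# The absorption fractions are assumed illustrative values, not data for a specific meal.

def absorbed_mg(intake_mg: float, absorption_fraction: float) -> float:
    return intake_mg * absorption_fraction

heme_iron = absorbed_mg(intake_mg=3.0, absorption_fraction=0.25)      # e.g. red meat
non_heme_iron = absorbed_mg(intake_mg=3.0, absorption_fraction=0.05)  # e.g. spinach, legumes

print(f"Heme source:     {heme_iron:.2f} mg absorbed of 3.0 mg eaten")
print(f"Non-heme source: {non_heme_iron:.2f} mg absorbed of 3.0 mg eaten")
```

Vitamin C, phytates, and a person’s existing iron status all shift those fractions considerably, which is exactly why the plant-versus-animal comparison is about forms and context, not just totals on a label.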

🌍 Global Property Rights: Gender Inequality 🌍

Promoting veganism can unintentionally undermine the very principles of women’s rights and social justice that the political left often advocates for. In many countries, women face significant legal and cultural barriers that prevent them from owning land, despite laws that may suggest otherwise. However, in these same regions, women often have the ability to own and manage livestock, which serves as a crucial economic resource and a form of wealth.

This disparity highlights the persistent challenges in achieving gender equality in property rights, especially in rural areas where land ownership is key to economic independence and security. While livestock ownership is valuable, it doesn’t offer the same level of security or social status as land ownership. The lack of land rights perpetuates gender inequality, limiting women’s economic power, social status, and access to resources.

🌿 Plant-Based Diets and Environmental Costs 🌿

Plant-based diets are often praised for their environmental benefits, yet it’s crucial to recognize the complexities involved. While the availability of vegan foods has significantly improved, making it easier than ever to follow a plant-based diet, this increased accessibility does not necessarily equate to better environmental outcomes.

Many vegan products rely heavily on industrial agriculture and monocropping practices. These methods can lead to deforestation, soil depletion, and the loss of biodiversity. The production of popular vegan ingredients, such as soy and almonds, often involves large-scale farming that can have detrimental effects on local ecosystems. Additionally, the industrial processes used to produce processed vegan foods, including heavy use of pesticides, fertilizers, and water, also contribute to environmental concerns.

Understanding these trade-offs is crucial for making informed dietary choices. Opting for sustainably farmed, organic produce and supporting local farmers can help mitigate some of these negative impacts. It’s not just about choosing plant-based foods, but also about how they are produced.

🔄 Ethical Food Choices 🔄

Making ethical food choices involves a comprehensive evaluation of your diet’s impact on health, the environment, and animal welfare. While plant-based diets can be a step towards reducing your carbon footprint, it’s important to consider the broader implications of industrial agriculture and monocropping. Strive for a balanced approach that aligns with your values and promotes sustainability. This might include supporting local and organic options, as well as exploring ways to minimize your environmental impact through diverse and responsible food choices.

By being mindful of these factors, you can better navigate the complexities of dietary decisions and work towards a more ethical and sustainable future.

🔍 Listen to Our Podcast for More 🔍

For an in-depth exploration of these topics and more, tune into our podcast. We offer detailed discussions and insights into how history, science, and societal trends shape our understanding of nutrition and health. Stay curious and informed!

The interplay of religion, science, and industry has profoundly influenced our beliefs about veganism and nutrition. By understanding these historical and scientific contexts, we gain insight into the broader impact on our dietary choices and health.

Don’t miss the upcoming episode where we’ll explore these themes in greater depth!

Resources:

Historical and Nutritional Science:

“Nutrition and Physical Degeneration” by Weston A. Price: Examines traditional diets and their impact on health, providing historical context for nutritional science.

“The Adventist Health Study: 30 Years of Research” edited by Gary E. Fraser: Covers the impact of vegetarian diets advocated by the Seventh-day Adventists.

“Food Politics: How the Food Industry Influences Nutrition and Health” by Marion Nestle: Examines how food industries shape dietary guidelines and public perception.

“The Vitamin D Solution” by Michael F. Holick: Offers insights into the importance of Vitamin D, complementing McCollum’s work on essential nutrients.

“Prophetess of Health: A Study of Ellen G. White” by Ronald L. Numbers (Library of Religious Biography, 2008)

Articles:

“Ellen G. White and the Origins of American Vegetarianism” from Journal of the American Dietetic Association: Explores the historical influence of Ellen G. White on American dietetics.

“Elmer McCollum: The Vitamin Pioneer” from The Journal of Nutrition: Provides an overview of McCollum’s contributions to nutritional science.

Genetic Factors and Vitamin A

  • Research Papers:
    • “The Role of Genetic Variability in Vitamin A Metabolism” by Steven A. Arneson et al. (Journal of Nutrition): Discusses the genetic factors affecting Vitamin A conversion.
    • “BCO1 Genetic Variation and Beta-Carotene Conversion” in American Journal of Clinical Nutrition: Explores how genetic differences impact the conversion of beta-carotene to Vitamin A.

The Impact of Industrial Agriculture

  • Books:
    • “The Omnivore’s Dilemma” by Michael Pollan: Investigates the industrial food system and its environmental impact.
    • “The End of Food” by Paul Roberts: Looks at the global food industry and its implications for health and the environment.
  • Articles:
    • “The Hidden Costs of Industrial Agriculture” from Environmental Research Letters: Analyzes the ecological impacts of industrial farming practices.

1. Regenerative Agriculture Principles and Practices

  • Books:
    • “Regenerative Agriculture: How to Create a Self-Sustaining Farm Ecosystem” by Richard Perkins: Provides a comprehensive guide to regenerative farming practices.
    • “The Regenerative Garden: How to Grow Healthy Soil and Manage Your Garden for the Future” by Maria Rodale: Focuses on regenerative techniques for gardening.
    • “Dirt to Soil: One Family’s Journey into Regenerative Agriculture” by Gabe Brown: Shares practical experiences and insights from a farmer who has successfully implemented regenerative practices.
  • Articles:
    • “Regenerative Agriculture: What Is It and Why Does It Matter?” from Regenerative Agriculture Initiative: Provides an overview of regenerative agriculture principles and benefits.
    • “The Benefits of Regenerative Agriculture for Soil Health and Sustainability” from Agronomy Journal: Discusses how regenerative practices impact soil health and sustainability.

2. Sustainable and Ecological Farming

  • Books:
    • “The Soil Will Save Us: How Scientists, Farmers, and Foodies Are Healing the Soil to Save the Planet” by Kristin Ohlson: Explores how soil health can be restored through sustainable practices.
    • “Beyond the Jungle: Regenerative Agroforestry and Resilient Communities” by S. H. Smith: Examines the role of agroforestry in regenerative practices and community resilience.
  • Articles:
    • “Sustainable Agriculture and Its Impact on Environmental Conservation” from Sustainable Agriculture Research: Analyzes how sustainable farming methods contribute to environmental conservation.
    • “Ecological Farming: Benefits Beyond the Farm Gate” from Ecology and Society: Looks at the broader ecological benefits of adopting ecological farming practices.

3. Soil Health and Carbon Sequestration

  • Books:
    • “The Carbon Farming Solution: A Global Toolkit of Perennial Crops and Regenerative Agriculture Practices for Climate Change Mitigation and Food Security” by Eric Toensmeier: Focuses on using regenerative practices to sequester carbon and improve soil health.
    • “Soil: The Incredible Story of What Keeps Us Alive” by David R. Montgomery: Provides an in-depth look at soil science and its crucial role in agriculture and climate stability.
  • Articles:
    • “Carbon Sequestration and Soil Health: The Role of Regenerative Agriculture” from Agricultural Systems: Discusses how regenerative agriculture practices contribute to carbon sequestration and soil health.
    • “Soil Organic Matter and Its Role in Carbon Sequestration” from Journal of Soil and Water Conservation: Explores the importance of soil organic matter in maintaining soil health and sequestering carbon.

4. Food Systems and Regenerative Practices

  • Books:
    • “The Ecology of Food: A Historical Perspective” by Peter M. Smith: Provides historical context on food systems and their ecological impact.
    • “The Omnivore’s Dilemma: A Natural History of Four Meals” by Michael Pollan: While it explores various food systems, it touches on sustainable and regenerative practices in agriculture.
  • Articles:
    • “The Future of Food: Regenerative Agriculture and Its Role in Sustainable Food Systems” from Food Policy: Examines the role of regenerative agriculture in creating sustainable food systems.
    • “Regenerative Agriculture and Food Security: An Integrative Approach” from Journal of Agricultural and Environmental Ethics: Looks at how regenerative practices contribute to food security and sustainability.

Gender Inequality and Property Rights

  • Books:
    • “Women, Work, and Property: Gender Inequality and the Economic Impact of Land Rights” by Elizabeth N. L. Allwood: Analyzes the intersection of gender, land ownership, and economic empowerment.
  • Articles:
    • “Gender and Land Rights: A Global Overview” from World Development: Examines gender disparities in land ownership and its implications for women’s economic status.

“Women in Half the World Still Denied Land, Property Rights Despite Laws.”