The Historical Jesus: Fact or Fiction?

Nailed: Ten Christian Myths That Show Jesus Never Existed at All

Today’s episode is one I’ve been looking forward to for a long time. I sat down with author and researcher David Fitzgerald, whose book Nailed: Ten Christian Myths That Show Jesus Never Existed at All has stirred up fascination and controversy in both religious and secular circles.

Before anyone clutches their pearls — or their study Bible — this conversation isn’t about bashing belief. It’s about asking how we know what we think we know, and whether our historical standards shift when faith enters the equation.

Fitzgerald has spent over fifteen years investigating the evidence — or lack of it — surrounding the historical Jesus. In this first part of our series, we cover Myth #1 (“The idea that Jesus was a myth is ridiculous!”) and Myth #4 (“The Gospels were written by eyewitnesses”). We also start brushing up against Myth #5, which explores how the Gospels don’t even describe the same Jesus.

We didn’t make it to Myth #7 yet — the claim that archaeology confirms the Gospels. So stay tuned for Part Two.

And for my visual learners: I’ve got you. Scroll below for infographics, side-by-side Gospel comparisons, biblical quotes, and primary source references that make this episode come alive.

🧩 The 10 Myths About Jesus — According to Nailed

Myth #1: “The idea that Jesus was a myth is ridiculous!”
→ Fitzgerald argues that the assumption of Jesus’ historicity persists more from cultural tradition than actual historical evidence, and that questioning it isn’t fringe. It’s legitimate historical inquiry.

Myth #2: “Jesus was wildly famous — but somehow no one noticed.”
→ Despite claims that Jesus’ miracles and teachings drew massive crowds, there’s an eerie silence about him in the records of contemporaneous historians and chroniclers who documented far lesser figures.

Myth #3: “Ancient historian Josephus wrote about Jesus.”
→ The so-called “Testimonium Flavianum” passage in Josephus’ work is widely considered a later Christian insertion, not authentic first-century testimony.

Myth #4: “Eyewitnesses wrote the Gospels.”
→ The Gospels were written decades after the events they describe by unknown authors relying on oral traditions and earlier written sources, not firsthand experience.

Myth #5: “The Gospels give a consistent picture of Jesus.”
→ Each Gospel portrays a strikingly different version of Jesus — from Mark’s suffering human to John’s divine Logos — revealing theological agendas more than biographical consistency.

Myth #6: “History confirms the Gospels.”
→ When examined critically, historical records outside the Bible don’t corroborate the key events of Jesus’ life, death, or resurrection narrative.

Myth #7: “Archaeology confirms the Gospels.”
→ Archaeological evidence supports the general backdrop of Roman-era Judea but fails to verify specific Gospel claims or the existence of Jesus himself.

Myth #8: “Paul and the Epistles corroborate the Gospels.”
→ Paul’s letters — the earliest Christian writings — reveal no awareness of a recent historical Jesus, focusing instead on a celestial Christ figure revealed through visions and scripture.

Myth #9: “Christianity began with Jesus and his apostles.”
→ Fitzgerald argues that Christianity evolved from earlier Jewish sects and mystery religions, with “Jesus” emerging as a mythologized figure around whom older beliefs coalesced.

Myth #10: “Christianity was totally new and different.”
→ The moral teachings, rituals, and savior motifs of early Christianity closely mirror surrounding pagan traditions and Greco-Roman mystery cults.


📘 Myth #1: “The Idea That Jesus Was a Myth Is Ridiculous”

This one sets the tone for the entire book — because it’s not even about evidence at first. It’s about social pressure.

Fitzgerald opens Nailed by calling out how the mythicist position (the idea that Jesus might never have existed) gets dismissed out of hand, even by secular historians. As he points out, the problem isn’t that the evidence disproves mythicism; it’s that we don’t apply the same historical standards to Jesus that we would to anyone else.

Case in point: Julius Caesar crossing the Rubicon.

Julius Caesar crossing the Rubicon at the head of his army, 49 BC. Illustration from Istoria Romana incisa all’acqua forte da Bartolomeo Pinelli Romano (Presso Giovanni Scudellari, Rome, 1818-1819).

When historians reconstruct that event, we have:

  • Multiple accounts from major Roman historians, including Suetonius, Plutarch, Appian, and Cassius Dio, plus contemporary references in Cicero’s letters and Caesar’s own writings.
  • Physical evidence — coins, inscriptions, and monuments produced during or shortly after Caesar’s lifetime.
  • Political and military documentation aligning with the timeline.

In contrast, for Jesus, we have:

  • No contemporary accounts.
  • No archaeological or physical evidence.
  • Gospels written decades later by anonymous authors who never met him.

That’s the difference between history and theology.

Even historian Bart Ehrman, who does believe Jesus existed, has called mythicists “the flat-earthers of the academic world.” Fitzgerald addresses that in the interview (not defensively, but critically), asking why questioning this one historical figure provokes so much emotional resistance.

As he puts it, if the same level of evidence existed for anyone else, no one would take it seriously.


✍️ Myth #4: “The Gospels Were Written by Eyewitnesses”

We dive into the authorship problem — who actually wrote the Gospels, when, and why it matters.


🔀 Myth #5: “The Gospels Give a Consistent Picture of Jesus”

⚖️ Contradictions Between the Gospels

1. Birthplace of Jesus — Bethlehem or Nazareth?

Matthew 2:1 – “Jesus was born in Bethlehem of Judea in the days of Herod the king.”
Luke 2:4–7 – Joseph travels from Nazareth to Bethlehem for the census, and Jesus is born there.
John 7:41–42, 52 – Locals say, “The Messiah does not come from Galilee, does he?” implying Jesus was known as a Galilean, not from Bethlehem.

🔍 Mythicist take:
Bethlehem was retrofitted into the story to fulfill the Messianic prophecy from Micah 5:2. In early Christian storytelling, theological necessity (“he must be born in David’s city”) trumps biographical accuracy.

2. Jesus’ Genealogy — Two Lineages, Zero Agreement

Matthew 1:1–16 – Jesus descends from David through Solomon.
Luke 3:23–38 – Jesus descends from David through Nathan.
Even Joseph’s father differs: Jacob (Matthew) vs. Heli (Luke).

🔍 Mythicist take:
Two contradictory genealogies suggest not historical memory but theological marketing. Each author tailors Jesus’ lineage to fit symbolic patterns — Matthew emphasizes kingship; Luke, universality.

3. The Timing of the Crucifixion — Passover Meal or Preparation Day?

Mark 14:12–17 – Jesus eats the Passover meal with his disciples before his arrest.
John 19:14 – Jesus is crucified on the day of Preparation — before Passover begins — at the same time lambs are being slaughtered in the Temple.

🔍 Mythicist take:
This isn’t a detail slip; it’s theology. John deliberately aligns Jesus with the Paschal lamb, turning him into the cosmic sacrifice — a theological metaphor, not an eyewitness timeline.

4. Jesus’ Last Words — Four Versions, Four Theologies

Mark 15:34 – “My God, my God, why have you forsaken me?” → human anguish.
Luke 23:46 – “Father, into your hands I commit my spirit.” → serene trust.
John 19:30 – “It is finished.” → divine completion.
Matthew 27:46 – Echoes Mark’s despair, but adds cosmic drama (an earthquake, rocks splitting, tombs opening).

🔍 Mythicist take:
Each Gospel shapes Jesus’ death to reflect its theology — Mark’s suffering human, Luke’s faithful martyr, John’s omniscient divine being. This isn’t eyewitness diversity; it’s evolving mythmaking.

5. Who Found the Empty Tomb — and What Did They See?

Mark 16:1–8 – Three women find the tomb open, see a young man in white, flee in fear, and tell no one.
Matthew 28:1–10 – Two women see an angel descend and roll back the stone; he tells them to share the news.
Luke 24:1–10 – Several women find the stone already rolled away; two men in dazzling clothes appear.
John 20:1–18 – Mary Magdalene alone finds the tomb, then runs to get Peter; later she meets Jesus himself.

🔍 Mythicist take:
If this were a consistent historical event, we’d expect some harmony. Instead, we see mythic escalation: from a mysterious empty tomb (Mark) → to heavenly intervention (Matthew) → to divine encounter (John).


6. The Post-Resurrection Appearances — Where and to Whom?

Matthew 28:16–20 – Jesus appears in Galilee to the eleven.
Luke 24:33–51 – Jesus appears in Jerusalem and tells them to stay there.
Acts 1:4–9 – Same author as Luke, now extends appearances over forty days.
Mark 16 (longer ending) – A later addition summarizing appearances found in the other Gospels.

🔍 Mythicist take:
The resurrection narrative grows with time — geographically, dramatically, and theologically. Early silence (Mark) gives way to detailed appearances (Luke/John), mirroring the development of early Christian belief rather than eyewitness memory.


🌿 Final Thought

Whether you end up agreeing with Fitzgerald or not, the point isn’t certainty; it’s curiosity: the willingness to look at history without fear, even when it challenges what we’ve always been told.

And here’s the fun part! David actually wants to hear from you. If you’ve got questions, pushback, or something you want him to unpack next time, drop it in the comments or send it my way. I’ll collect your submissions and bring a few of them into Part Two when we dig into Myth #7 — “Archaeology Confirms the Gospels.”

And as always, maintain your curiosity, embrace skepticism, and keep tuning in. 🎙️

📖 Further Reading 📖 

Foundational Mythicist Works:

  • Richard Carrier – On the Historicity of Jesus
  • Robert M. Price – The Christ-Myth Theory and Its Problems; Judaizing Jesus
  • Earl Doherty – The Jesus Puzzle
  • Randel Helms – Gospel Fictions
  • Joseph Wheless – The Fable of Christ
  • Tom Harpur – The Pagan Christ
  • William Benjamin Smith – The Historical Jesus
  • Thomas L. Thompson – The Mythic Past: Biblical Archaeology and the Myth of Israel

Did Jesus Exist? Jacob Berman and Dr. Jack Bull Versus Dr. Aaron Adair and Neil Godfrey

Mainstream Scholarship & Context

  • Bart Ehrman – Did Jesus Exist?
  • Jonathan Haidt – The Righteous Mind: Why Good People Are Divided by Politics and Religion

Critiques of Bart Ehrman

Broader Philosophical & Cultural Context

  • John G. Jackson – Christianity Before Christ
  • Kersey Graves – The World’s Sixteen Crucified Saviors
  • Acharya S (D.M. Murdock) – The Christ Conspiracy


Sacred or Strategic? Rethinking the Christian Origin Story

The Bible Isn’t History and Trump Isn’t Your Savior

It’s Been a Minute… Let’s Get Real

Hey Hey, welcome back to Taste of Truth Tuesdays! It’s been over a month since my last episode, and wow—a lot has happened. Honestly, I’ve been doing some serious soul-searching and education, especially around some political events that shook me up.

I was firmly against Trump’s strikes on Iran. And the more I dug in, the more I realized how blind I’d been: completely uneducated and ignorant about the massive political power Zionism holds in this country. And it’s clear now: Trump is practically bent over the Oval Office for Netanyahu. The Epstein files cover-up only confirms that blackmail and shadow control are the real puppet strings pulling at the highest levels of power. Our nation has been quietly occupied since Lyndon B. Johnson’s presidency, and that’s a whole other episode I’ll get into later.

But what really cracked something in me was this:

In the 1990s, Trump sponsored Elite’s “Look of the Year” contest—a glitzy, global modeling search that lured teenage girls with promises of fame and fashion contracts. Behind the scenes, it was a trafficking operation. According to The Guardian’s Lucy Osborne and the BBC documentary Scouting For Girls: Fashion’s Darkest Secret, these girls weren’t being scouted—they were being sold to rich businessmen.

This wasn’t just proximity. Trump was part of it.

Once I saw that, the religious right’s worship of him stopped looking like misguided patriotism and started looking like mass delusion. Or complicity. Either way, I couldn’t unsee it.

And that’s when I started asking the bigger questions: What else have we mistaken for holy? What else have we accepted as truth without scrutiny?

For now, I want to cut to the heart of the matter, the major problem at the root of so much chaos: the fact that millions of Christians still believe the Bible is a literal historical document.

This belief doesn’t just distort faith; it fuels political agendas, end-times obsession, and yes, even foreign policy disasters. So, let’s dig into where this all began, how it’s evolved, and why it’s time we rethink everything we thought we knew about Scripture.


For most Christians, the Bible is more than a book; it’s the blueprint of reality, the inspired Word of God, infallible and untouchable. But what if that belief wasn’t original to Christianity? What if it was a reaction, a strategic response to modern doubt, historical criticism, and the crumbling authority of the Church?

In this episode, we’re pulling back the veil on the doctrine of biblical inerrancy, the rise of dispensationalism, and the strange marriage of American politics and prophetic obsession. From the Scofield Bible to the belief that modern-day Israel is a fulfillment of God’s plan, we’re asking hard questions about the origins of these ideas.

As Dr. Mark Gregory Karris said when he joined us on a previous episode: “Can you imagine two different families? In one, the Bible is the absolute inerrant word of God; every word, every jot and tittle, so to speak, is meant to be in there due to the inspiration of God. And so every story you read (you know, God killing Egyptian babies, God flooding the entire planet and thinking, well yeah, there’s gonna be babies gasping for air and drowning grandmothers and all these animals) is seen as absolute objective truth. But then in another family: oh, these are myths. These are sacred myths that people can learn from. No, that wasn’t literally God speaking and smiting them and burning them alive because they touched this particular ark; this is how they thought, given their minds at the time, given their understandings. And then, like you talked about: oh, look at that aspect of humanity, interesting how they portrayed God. It becomes like, wow, that’s cool, instead of, oh my gosh, I need three or four years of therapy because I was taught the Bible in a particular way.”

Once you trace these doctrines back to their roots, it’s not divine revelation you find: it’s human agendas.

Let’s get uncomfortable. Was your faith formed by sacred truth… or centuries of strategic storytelling?

How Literalism Took Over

In the 19th century, biblical literalism became a kind of ideological panic room. As science, archaeology, and critical scholarship began to chip away at traditional interpretations, conservative Christians doubled down. Instead of exploring the Bible as a complex, layered anthology full of metaphor, moral instruction, and mythology, they started treating it like a divine press release. Every word had to be accurate. Every timeline had to match. Every contradiction had to be “harmonized” away.

The Myth of Inerrancy

One of the most destructive byproducts of this era was the invention of biblical inerrancy. Yes, invention. The idea that the Bible is “without error in all that it affirms” isn’t ancient; it’s theological propaganda, most notably pushed by B.B. Warfield and his peers at Princeton. Rogers and McKim wrote extensively about how this doctrine was manufactured, not handed down from the apostles as many assume. We dive deeper into all that—here.

Inerrancy teaches that the Bible is flawless, even in its historical, scientific, and moral claims. But this belief falls apart under even basic scrutiny. Manuscripts don’t agree. Archaeological timelines conflict with biblical ones. The Gospels contradict each other. And yet this doctrine persists, warping believers’ understanding and demanding blind loyalty to texts written by fallible people in vastly different cultures.

That’s the danger of biblical inerrancy: it treats every verse as historical journalism rather than layered myth, metaphor, or moral instruction. But what happens when you apply that literalist lens to ancient origin stories?

📖 “Read as mythology, the various stories of the great deluge have considerable cultural value, but taken as history, they are asinine and absurd.” — John G. Jackson, Christianity Before Christ

And yet, this is the foundation of belief for millions who think Noah’s Ark was a literal boat and not a borrowed flood myth passed down and reshaped across Mesopotamian cultures. This flattening of myth into fact doesn’t just ruin the poetry; it fuels bad politics, end-times obsession, and yes… Zionism.

And just to be clear, early Christians didn’t read the Bible this way. That kind of rigid literalism didn’t emerge until centuries later, long after the apostles were gone. We’ll get to that.

When we cling to inerrancy, we’re not preserving truth. We’re missing it entirely.

Enter: Premillennial Dispensationalism

If biblical inerrancy was the fuel, C.I. Scofield’s 1909 annotated Bible was the match. His work made premillennial dispensationalism a household belief in evangelical churches. For those unfamiliar with the term, here’s a quick breakdown:

  • Premillennialism: Jesus will return before a literal thousand-year reign of peace.
  • Dispensationalism: History is divided into distinct eras (or “dispensations”) in which God interacts with humanity differently.

When merged, this theology suggests we’re living in the “Church Age,” which will end with the rapture. Then comes a seven-year tribulation, the rise of the Antichrist, and finally, Jesus returns for the ultimate battle, after which He’ll rule Earth for a millennium. Sounds like the plot of a dystopian film, right? And yet, this became the dominant lens through which American evangelicals interpret reality.

The result? A strange alliance between American evangelicals and Zionist nationalism. You get politicians quoting Revelation like it’s foreign policy, pastors fundraising for military aid, and millions of Christians cheering on war in the Middle East because they think it’ll speed up Jesus’ return.

But here’s what I want you to take away from this episode today: none of this works unless you believe the Bible is literal, infallible, and historically airtight.

How This Shaped Evangelical Culture and Politics

The Scofield Bible didn’t just change theology. It changed culture. Dispensationalist doctrine seeped into seminaries like Dallas Theological Seminary and Moody Bible Institute, influencing generations of pastors. It also exploded into popular culture through Hal Lindsey’s The Late Great Planet Earth and the Left Behind series. Fiction, prophecy, and fear blurred into one big spiritual panic attack.

But perhaps the most alarming shift came in the political realm. Dispensationalist belief heavily influences evangelical support for the modern state of Israel. Why? Because many believe Israel’s 1948 founding was a prophetic event. Figures like Jerry Falwell turned theology into foreign policy. His organization, the Moral Majority, was built on an unwavering belief that supporting Israel was part of God’s plan. Falwell didn’t just preach this; he traveled to Israel, funded by its government, and made pro-Israel advocacy a cornerstone of evangelical identity.

This alignment between theology and geopolitics hasn’t faded. In the 2024 election cycle, evangelical leaders ranked support for Israel on par with anti-abortion stances. Ralph Reed, founder of the Faith and Freedom Coalition, explicitly said as much. Donald Trump even quipped that “Christians love Israel more than Jews.” Whether that’s true or not, it reveals just how deep this belief system runs.

And the propaganda doesn’t stop there. Currently, Israel’s Foreign Ministry is funding a week-long visit for 16 prominent young influencers aligned with Donald Trump’s MAGA and America First movements, part of an ambitious campaign to reshape Israel’s image among American youth.

But Let’s Talk About the Red Flags

This isn’t just about belief; it’s about control. Dispensationalist theology offers a simple, cosmic narrative: you’re on God’s winning team, the world is evil, and the end is near. There’s no room for nuance, no time for doubt. Just stay loyal, and you’ll be saved.

This thinking pattern isn’t exclusive to Christianity. You’ll find it in MLMs and some conspiracy theory communities. The recipe is the same: create an in-group with secret knowledge, dangle promises of salvation or success, and paint outsiders as corrupt or deceived. It’s classic manipulation: emotional coercion wrapped in spiritual language.

And let’s not forget the date-setting obsession. Hal Lindsey made a career out of it. People still point to blood moons, earthquakes, and global politics as “proof” that prophecy is unfolding. If you’ve ever been trapped in that mindset, you know how addictive and anxiety-inducing it can be.

BY THE WAY, it’s not just dispensationalism or the Scofield Bible that fuels modern Zionism. The deeper issue is that if you believe the Bible is historically accurate and divinely orchestrated, you’re still feeding the ideological engine of Zionism. Because at its core, Christianity reveres Jewish texts, upholds Jewish chosenness, and worships a Jewish messiah. That’s not neutrality; it’s alignment.

If this idea intrigued you, you’re not alone. There’s a growing body of work unpacking how Christianity’s very framework serves Jewish supremacy, whether intentionally or not. For deeper dives, check out Adam Green’s work over at Know More News on Rumble, and consider reading The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years. You don’t have to agree with everything to realize: the story you were handed might not be sacred; it might be strategic.

Why This Matters for Deconstruction

For me, one of the most painful parts of deconstruction was realizing I’d been sold a bill of goods. I was told the Bible was the infallible word of God. That it held all the answers. That doubt was dangerous. But when I began asking real questions, the entire system started to crack.

The doctrine of inerrancy didn’t deepen my faith… it limited it. It kept me from exploring the Bible’s human elements: its contradictions, its cultural baggage, and its genuine beauty. The truth is that these texts were written by people trying to make sense of their world and their experiences with the divine. They are not divine themselves.

Modern Scholarship Breaks the Spell

Modern biblical scholarship has long since moved away from the idea of inerrancy. When you put aside faith-based apologetics and look honestly at the evidence, the traditional claims unravel quickly:

  • Moses didn’t write the Torah. Instead, the Pentateuch was compiled over centuries by multiple authors, each with their own theological agendas (see the JEDP theory).
  • King David is likely a mythic figure. Outside of the Bible, there’s no solid evidence he actually existed, much less ruled a vast kingdom.
  • The Gospels weren’t written by Matthew, Mark, Luke, and John. Those names were added later. The original texts are anonymous and they often contradict each other.
  • John didn’t write Revelation. Not the Apostle John, anyway. The Greek and the style are completely different from the Gospel of John. The real author was probably an unknown apocalyptic mystic on Patmos, writing during Roman persecution.

And yet millions still cling to these stories as literal fact, building entire belief systems and foreign policies on myths and fairy tales.


🧠 Intellectual Starvation in Evangelicalism

Here’s the deeper scandal: it’s not just that foundational Christian stories crumble under modern scrutiny. It’s that the church never really wanted you to think critically in the first place.

Mark Noll, a respected evangelical historian, didn’t mince words when he wrote:

“The scandal of the evangelical mind is that there is not much of an evangelical mind.”

In The Scandal of the Evangelical Mind, Noll traces how American evangelicalism lost its intellectual life. It wasn’t shaped by a pursuit of truth, but by populist revivalism, emotionalism, and a hyper-literal obsession with “the end times.” The same movements that embraced dispensationalism and biblical inerrancy also gutted their communities of academic rigor, curiosity, and serious theological reflection.

The result? A spiritually frantic but intellectually hollow faith—one that discourages questions, mistrusts scholarship, and fears nuance like it’s heresy.

Noll shows that instead of grappling with ambiguity or cultural complexity, evangelicals often default to reactionary postures. This isn’t just a relic of the past. It’s why so many modern Christians cling to false authorship claims, deny historical context, and accept prophecy as geopolitical fact. It’s why Revelation gets quoted to justify Zionist foreign policy without ever asking who actually wrote the book, when, or why.

This anti-intellectualism isn’t an accident. It was baked in from the start.

But Noll doesn’t leave us hopeless. He offers a call forward: for a faith that engages the world with both heart and mind. A faith that can live with tension, welcome complexity, and evolve beyond fear-driven literalism.

What Did the Early Church Actually Think About Scripture?

Here’s what gets lost in modern evangelical retellings: the earliest Christians didn’t treat Scripture the way today’s inerrantists do.

For the first few centuries, Christians didn’t even have a finalized Bible. There were letters passed around, oral traditions, a few widely recognized Gospels, and a whole lot of discussion about what counted as authoritative. It wasn’t until the fourth century that anything close to our current canon was even solidified. And even then, it wasn’t set in stone across all branches of Christianity.

Church fathers like Origen, Clement of Alexandria, and Irenaeus viewed Scripture as spiritually inspired but full of metaphor and mystery. They weren’t demanding literal accuracy; they were mining the texts for deeper meanings. Allegory was considered a legitimate, even necessary, interpretive method. Scripture was read devotionally and theologically, not scientifically or historically. In other words, it wasn’t inerrancy that defined early Christian engagement with Scripture; it was curiosity and contemplation.

For a deeper dive, check out The Gnostic Informant’s incredible documentary that uncovers the first hundred years of Christianity, a period that has been systematically lied about and rewritten. It reveals how much of what we take for granted was shaped by political and theological agendas far removed from the original followers of Jesus.

If you’re serious about understanding the roots of your faith or just curious about how history gets reshaped, this documentary is essential viewing. It’s a reminder that truth often hides in plain sight and that digging beneath the surface is how we reclaim our own understanding.

Protestantism: A Heretical Offshoot Disguised as Tradition

The Protestant Reformation shook things up in undeniable ways. Reformers like Martin Luther and John Calvin challenged the Catholic Church’s abuses and rightly demanded reform. But what’s often missed (or swept under the rug) is how deeply Protestantism broke with the ancient, historic Church.

By insisting on sola scriptura—Scripture alone—as the sole authority, the Reformers rejected centuries of Church tradition, councils, and lived community discernment that shaped orthodox belief. They didn’t invent biblical inerrancy as we know it today, but their elevation of the Bible above all else cracked the door wide open for literalism and fundamentalism to storm in.

What began as a corrective movement turned into a theological minefield. Today, Protestantism isn’t a single coherent tradition; it’s a sprawling forest of over 45,000 different denominations, all claiming exclusive access to “the truth.”

This fragmentation isn’t accidental; it’s the logical outcome of rejecting historic continuity and embracing personal interpretation as the final authority.

Far from preserving the faith of the ancient Church, Protestantism represents a fractured offshoot: one that often contradicts the early Church’s beliefs and teachings. It trades the richness of lived tradition and community wisdom for a rigid, literalistic, and competitive approach to Scripture.

The 20th century saw this rigid framework perfected into a polished doctrine demanding total conformity and punishing doubt. Protestant fundamentalism turned into an ideological fortress, where questioning is treated as betrayal, and theological nuance is replaced by black-and-white dogma.

If you want to understand where so much of modern evangelical rigidity and end-times obsession comes from, look no further than this fractured legacy. Protestantism’s break with the ancient Church set the stage for the spiritual and intellectual starvation that Mark Noll so powerfully exposes.

Rethinking the Bible

Seeing the Bible as a collection of human writings about God rather than the literal word from God opens up space for critical thinking and compassion. It allows us to:

  • Study historical context and cultural influences.
  • Embrace the diversity of perspectives in Scripture.
  • Let go of rigid interpretations and seek core messages like love, justice, and humility.
  • Move away from proof-texting and toward spiritual growth.
  • Reconcile faith with science, reason, and modern ethics.

When we stop demanding that the Bible be perfect, we can finally appreciate what it actually is: a complex, messy, beautiful attempt by humans to understand the sacred.

This shift doesn’t weaken faith; I believe it strengthens it.

It moves us away from dogma disguised as certainty and into something deeper, something alive. It opens the door for real relationship, not just with the divine, but with each other. It makes space for growth, for disagreement, for honesty.

And in a world tearing itself apart over whose version of truth gets to rule, that kind of open-hearted spirituality isn’t just refreshing; it’s essential.

Because if your faith can’t stand up to questions, history, or accountability… maybe it was never built on truth to begin with.

Let’s stop worshiping the paper and start seeking the presence.

🔎 Resources Worth Exploring:

  • “The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years” by David Skrbina
  • “Christianity Before Christ” by John G. Jackson
  • “The Scandal of the Evangelical Mind” by Mark Noll – A scathing but sincere critique from within the evangelical tradition itself. Noll exposes how anti-intellectualism, biblical literalism, and cultural isolationism have gutted American Christianity’s ability to engage the world honestly.
  • Check out Adam Green’s work at Know More News on Rumble for more on the political and mythological implications of Christian Zionism
  • And don’t miss my interview with Dr. Mark Gregory Karris, author of The Diabolical Trinity: Wrathful God, Sinful Self, and Eternal Hell, where we dive deep into the psychological damage caused by toxic theology

When “Helping the Homeless” Becomes a Trojan Horse

Why Trump’s new executive order deserves close scrutiny

President Trump signed an executive order on July 24, 2025, calling on states and cities to clear homeless encampments and expand involuntary psychiatric treatment, framed as a move to improve public safety and deliver compassion.

At first glance, it seems reasonable: address the homelessness crisis in many progressive cities, restore order, and help those with severe mental illness. But when I read it closely, the language (phrases like “untreated mental illness,” “public nuisance,” and “at risk of harm”) is vague enough and subjective enough to feel ripe for misuse 😳

This goes beyond homelessness. It marks a shift toward normalizing forced institutionalization, a trend with deep roots in American psychiatric history.

We explored this dark legacy in a recent episode, Beneath the White Coats 🥼, and if you listened to that episode, you’ll know that compulsory commitment isn’t new.

Historically, psychiatric institutions in the U.S. served not just medical needs but social control. Early 20th-century asylums housed the poor, the racially marginalized, and anyone deemed “unfit.”

The International Congress of Eugenics’ logo, 1921

The eugenics movement wasn’t a fringe ideology; it was supported by mainstream medical groups, state law, and psychiatry. Forced sterilization, indefinite confinement, and ambiguous diagnoses like “moral defectiveness” were justified under the guise of public health.

Now, an executive order gives local governments incentives (and of course funding 💰 is always tied to compliance) to loosen involuntary commitment laws and to redirect funding toward enforcing anti-camping and drug-use ordinances instead of harm reduction programs.

Once states rewrite their laws to align with the order’s push toward involuntary treatment, and if “public nuisance” or “mental instability” are interpreted broadly, you won’t have to be homeless to be at risk. A public disturbance, a call from a neighbor, even a refusal to comply with treatment may trigger involuntary confinement.

Is it just me, or does this feel like history is repeating?

We’ve seen where badly defined psychiatric authority leads: disproportionate targeting, loss of civil rights, and institutionalization justified as compassion. Today’s executive order could enable a similar expansion of psychiatric control.

So… what do you think? Is this just a homelessness policy, or is it another slippery slope?

Beneath the White Coats: Psychiatry, Eugenics, and the Forgotten Graves

Dogma in a Lab Coat

We like to believe science is self-correcting—that data drives discovery, that good ideas rise, and bad ones fall. But when it comes to mental health, modern society is still tethered to a deeply flawed framework—one that pathologizes human experience, medicalizes distress, and often does more harm than good.

Psychiatry has long promised progress, yet history tells a different story. From outdated treatments like bloodletting to today’s overprescription of SSRIs, we’ve traded one form of blind faith for another. These drugs—still experimental in many respects—carry serious risks, yet are handed out at staggering rates. And rather than healing root causes, they often reinforce a narrative of victimhood and chronic dysfunction.

The pharmaceutical industry now drives diagnosis rates, shaping public perception and clinical practice in ways that few understand. What’s marketed as care is often a system of control. In this episode, we revisit the dangers of consensus-driven science—how it silences dissent and rewards conformity.

Because science, like religion or politics, can become dogma. Paradigms harden. Institutions protect their power. And the costs are human lives.

But beneath this entire structure lies a deeper, more uncomfortable question—one we rarely ask:

What does it mean to be a person?

Are we just bodies and brains—repairable, programmable, replaceable? Or is there something more?

Is consciousness a glitch of chemistry, or is it a window into the soul?

Modern psychiatry doesn’t just treat symptoms—it defines the boundaries of personhood. It tells us who counts, who’s disordered, who can be trusted with autonomy—and who can’t.

But what if those definitions are wrong?

We’ve talked before about the risks of unquestioned paradigms—how ideas become dogma, and dogma becomes control. In a past episode, How Dogma Limits Progress in Fitness, Nutrition, and Spirituality, we explored Rupert Sheldrake’s challenge to the dominant scientific worldview—his argument that science itself had become a belief system, closing itself off to dissent. TED removed that talk, calling it “pseudoscience.” But many saw it as an attempt to protect the status quo—the high priests of data and empiricism silencing heresy in the name of progress. We will revisit his work later on in our conversation. 

We’ve also discussed how science, more than politics or religion, is often weaponized to control behavior, shape belief, and reinforce social hierarchies. And in a recent Taste Test Thursday episode, we dug into how the industrial food system was shaped not just by profit but by ideology—driven by a merger of science and faith.

To read more:

This framework—that science is never truly neutral—becomes especially chilling when you look at the history of psychiatry.

To begin this conversation, we’re going back—not to Freud or Prozac, but further. To the roots of American psychiatry. To two early figures—John Galt and Benjamin Rush—whose ideas helped define the trajectory of an entire field. What we find there presents a choice: a path toward genuine hope, or a legacy of continued harm.

This story takes us into the forgotten corners of that history, a place where “normal” and “abnormal” were declared not by discovery, but by decree.

Clinical psychiatrist Paul Minot put it plainly:

“Psychiatry is so ashamed of its history that it has deleted much of it.”

And for good reason.

Psychiatry’s early roots weren’t just tangled with bad science—they were soaked in ideology. What passed for “treatment” was often social control, justified through a veneer of medical language. Institutions were built not to heal, but to hide. Lives were labeled defective. 

We would like to think that medicine is objective, that the white coat stands for healing. But behind those coats was a mission to save society from the so-called “abnormal.”
But who defined normal?
And who paid the price?


The Forgotten Legacy of Dr. John Galt

Lithograph, “Virginia Lunatic Asylum at Williamsburg, Va.” by Thomas Charles Millington, ca.1845. Block & Building Files – Public Hospital, Block 04, Box 07. Image citation: D2018-COPY-1104-001. Special Collections.

Long before DSM codes and Big Pharma, the first freestanding mental hospital in America, the Eastern Lunatic Asylum, opened its doors in 1773—just down the road from where I live, in Williamsburg, Virginia. Though officially declared a hospital, it was commonly known as “The Madhouse.” For most who entered, institutionalization meant isolation, dehumanization, and often treatment worse than what was afforded to livestock. Mental illness was framed as a threat to the social order—those deemed “abnormal” were removed from society and punished in the name of care.

But one man dared to imagine something different.

Dr. John Galt II, appointed as the first medical superintendent of the hospital (later known as Eastern State), came from a family of alienists—an old-fashioned term for early psychiatrists. The word comes from the Latin alienus, meaning “other” or “stranger,” and referred to those considered mentally “alienated” from themselves or society. Today, of course, the word alien has taken on very different connotations—especially in the heated political debates over immigration. It’s worth clarifying: the historical use of alienist had nothing to do with immigration or nationality. It was a clinical label tied to 19th-century psychiatry, not race or citizenship. But like many terms, it’s often misunderstood or manipulated in modern discourse.

Galt, notably, broke with the harsh legacy of many alienists of his time. Inspired by French psychiatrist Philippe Pinel—often credited as the first true psychiatrist—Galt embraced a radically compassionate model known as moral therapy. Where others saw madness as a threat to be controlled, Galt saw suffering that could be soothed. He believed the mentally ill deserved dignity, freedom, and individualized care—not chains or punishment. He refused to segregate patients by race. He treated enslaved people alongside the free. And he opposed the rising belief—already popular among his fellow psychiatrists—that madness was simply inherited, and the mad were unworthy of full personhood.

Credit: The Valentine
Original Author: Cook Collection
Created: Late nineteenth to early twentieth century

Rather than seeing madness as a biological defect to be subdued or “cured,” Galt and Pinel viewed it as a crisis of the soul. Their methods rejected medical manipulation and instead focused on restoring dignity. They believed that those struggling with mental affliction should be treated not as deviants but as ordinary people, worthy of love, freedom, and respect.

Dr. Marshall Ledger, founder and editor of Penn Medicine, once quoted historian Nancy Tomes to summarize this period:

“Medical science in this period contributed to the understanding of mental illness, but patient care improved less because of any medical advance than because of one simple factor: Christian charity and common sense.”

Galt’s asylum was one of the only institutions in the United States to treat enslaved people and free Black patients equally—and even to employ them as caregivers. He insisted that every person, regardless of race, had a soul of equal moral worth. His belief in equality and metaphysical healing put him at odds with nearly every other psychiatrist of his time.

And he paid the price.

The psychiatric establishment, closely allied with state power and emerging medical-industrial interests, rejected his human-centered model. Most psychiatrists of the era endorsed slavery and upheld racist pseudoscience. The prevailing consensus was rooted in hereditary determinism—that madness and criminality were genetically transmitted, particularly among the “unfit.”

This growing belief—that mental illness was a biological flaw to be medically managed—was not just a scientific view, but an ideological one. Had Galt’s model of moral therapy been embraced more broadly, it would have undermined the growing assumption that biology and state-run institutions offered the only path to sanity. It would have challenged the idea that human suffering could—and should—be controlled by external authorities.

Instead, psychiatry aligned with power.

Moral therapy was quietly abandoned. And the field moved steadily toward the medicalized, racialized, and state-controlled version of mental health that would pave the way for both eugenics and the modern pharmaceutical regime.

“The Father of American Psychiatry”

Long before Auschwitz. Long before the Eugenics Record Office. Long before sterilization laws and IQ tests, there was Dr. Benjamin Rush—signer of the Declaration of Independence, professor at the first American medical school, and the man still honored as the “father of American psychiatry.” His portrait hangs today in the headquarters of the American Psychiatric Association.

Though many historians point to Francis Galton as the father of eugenics, it was Rush—nearly a century earlier—who laid much of the ideological groundwork. He argued that mental illness was biologically determined and hereditary. And he didn’t stop there.

Rush infamously diagnosed Blackness itself as a form of disease—what he called “negritude.” He theorized that Black people suffered from a kind of leprosy, and that their skin color and behavior could, in theory, be “cured.” He also tied criminality, alcoholism, and madness to inherited degeneracy, particularly among poor and non-white populations.

These ideas found a troubling ally in Charles Darwin’s emerging theories of evolution and heredity. While Darwin’s work revolutionized biology, it was often misused to justify racist notions of racial hierarchy and biological determinism.

Rush’s medical theories were mainstream and deeply influential, shaping generations of physicians and psychiatrists. Together, these ideas reinforced the belief that social deviance and mental illness were rooted in faulty bloodlines—pseudoscientific reasoning that provided a veneer of legitimacy to racism and social control within medicine and psychiatry.

The tragic irony? While Rush advocated for the humane treatment of the mentally ill in certain respects, his racial theories helped pave the way for the pathologizing of entire populations—a mindset that would fuel both American and European eugenics movements in the next century.

American Eugenics: The Soil Psychiatry Grew From

Before Hitler, there was Cold Spring Harbor. Founded in 1910, the Eugenics Record Office (ERO) operated out of Cold Spring Harbor Laboratory in New York with major funding from the Carnegie Institution, later joined by Rockefeller Foundation money. It became the central hub for American eugenic research, gathering family pedigrees to trace so-called hereditary defects like “feeblemindedness,” “criminality,” and “pauperism.”

Between the early 1900s and 1970s, over 30 U.S. states passed forced sterilization laws targeting tens of thousands of people deemed unfit to reproduce. The justification? Traits like alcoholism, poverty, promiscuity, deafness, blindness, low IQ, and mental illness were cast as genetic liabilities that threatened the health of the nation.

The practice was upheld by the U.S. Supreme Court in 1927 in the infamous case of Buck v. Bell. In an 8–1 decision, Justice Oliver Wendell Holmes Jr. wrote, “Three generations of imbeciles are enough,” greenlighting the sterilization of Carrie Buck, a young woman institutionalized for being “feebleminded”—a label also applied to her mother and her daughter. The ruling led to an estimated 60,000+ sterilizations across the U.S.

And yes—those sterilizations disproportionately targeted African American, Native American, and Latina women, often without informed consent. In North Carolina alone, Black women made up nearly 65% of sterilizations by the 1960s, despite being a much smaller share of the population.

Eugenics wasn’t a fringe pseudoscience. It was mainstream policy—supported by elite universities, philanthropists, politicians, and the medical establishment.

And psychiatry was its institutional partner.

The American Journal of Psychiatry published favorable discussions of sterilization and even euthanasia for the mentally ill as early as the 1930s. American psychiatrists traveled to Nazi Germany to observe and advise, and German doctors openly cited U.S. laws and scholarship as inspiration for their own racial hygiene programs.

In some cases, the United States led—and Nazi Germany followed.

The International Congress of Eugenics’ logo, 1921

This isn’t conspiracy. It’s history. Documented, peer-reviewed, and disturbingly overlooked.


From Ideology to Institution

By the early 20th century, the groundwork had been laid. Psychiatry had evolved from a fringe field rooted in speculation and racial ideology into a powerful institutional force—backed by universities, governments, and the courts. But its foundation was still deeply compromised. What had begun with Benjamin Rush’s biologically deterministic theories and America’s eugenic policies now matured into a formalized doctrine—one that treated human suffering not as a relational or spiritual crisis, but as a defect to be categorized, corrected, or eliminated.

This is where the five core doctrines of modern psychiatry emerge.

The Five Doctrines That Shaped Modern Psychiatry

These five doctrines weren’t abandoned after World War II. They were rebranded, exported, and quietly absorbed into the foundations of American psychiatry.

1. The Elimination of Subjectivity

Patients were no longer seen as people with stories, pain, or meaning—they were seen as bundles of symptoms. Suffering was abstracted into clinical checklists. The Diagnostic and Statistical Manual of Mental Disorders (DSM) became the gold standard, not because it offered clear science, but because it offered utility: a standardized language that served pharmaceutical companies, insurance billing, and bureaucratic control. If you could name it, you could code it—and medicate it.

2. The Eradication of Spiritual and Moral Meaning

Struggles once understood through relational, existential, or moral frameworks were stripped of depth. Grief became depression. Anger became oppositional defiance. Existential despair was reduced to a neurotransmitter imbalance. The soul was erased from the conversation. As Berger notes, suffering was no longer something to be witnessed or explored—it became something to be treated, as quickly and quietly as possible.

3. Biological Determinism

Mental illness was redefined as the inevitable result of faulty genes or broken brain chemistry—even though no consistent biological markers have ever been found. The “chemical imbalance” theory, aggressively marketed throughout the late 20th century, was never scientifically validated. Yet it persists, in part because it sells. Selective serotonin reuptake inhibitors (SSRIs)—still widely prescribed—were promoted on this flawed premise, despite studies showing they often perform no better than placebo and come with serious side effects, including emotional blunting, dependence, and sexual dysfunction.

4. Population Control and Racial Hygiene

In Germany, this meant sterilizing and exterminating those labeled “life unworthy of life.” In the U.S., it meant forced sterilizations of African-American and Native American women, institutionalizing the poor, the disabled, and the nonconforming. These weren’t fringe policies—they were mainstream, upheld by law and supported by leading psychiatrists and journals. Even today, disproportionate diagnoses in communities of color, coercive treatments in prisons and state hospitals, and medicalization of poverty reflect these same logics of control.

5. The Use of Institutions for Social Order

Hospitals became tools for enforcing conformity. Psychiatry wasn’t just about healing—it was about managing the unmanageable, quieting the inconvenient, and keeping society orderly. From lobotomies to electroshock therapy to modern-day involuntary holds, psychiatry has long straddled the line between medicine and discipline. Coercive treatment continues under new names: community treatment orders, chemical restraints, and state-mandated compliance.

These doctrines weren’t discarded after the fall of Nazi Germany. They were imported. Adopted. Rebranded under the guise of “evidence-based medicine” and “public health.” But the same logic persists: reduce the person, erase the context, medicalize the soul, and reinforce the system.


Letchworth Village: The Human Cost

I didn’t simply read this in a textbook. I stood there—on the edge of those woods—next to rows of numbered graves.

In 2020, while waiting to close on our New York house, my husband and I were staying in an Airbnb in Rockland County. We were walking the dogs one morning near the end of Call Hollow Road, where a wide path divides thick woodland, when we came across a memorial stone:

“THOSE WHO SHALL NOT BE FORGOTTEN.”

We had stumbled upon the entrance to Old Letchworth Village Cemetery, and we instantly felt its somber history. Beyond it stood rows of T-shaped markers, each one a muted testament to the hundreds of nameless victims who perished at Letchworth. Situated just half a mile from the institution, these weathered grave markers reveal only the numbers that were once assigned to forgotten souls—a stark reminder that families once refused to let their names be known. This omission serves as a silent indictment of a system that institutionalized, dehumanized, and ultimately discarded these individuals.

When we researched the history, the truth was staggering.

Letchworth was supposed to be a progressive alternative to the horrors of 19th-century asylums. Instead, it became one of them. By the 1920s, reports described children and adults left unclothed, unbathed, overmedicated, and raped. Staff abused residents—and each other. The dormitories were overcrowded. Funding dried up. Buildings decayed.

The facility was severely overcrowded. Many residents lived in filth, unfed and unattended. Children were restrained for hours. Some were used in vaccine trials without consent. And when they died, they were buried behind the trees—nameless, marked only by small concrete stakes.

I stood among those graves. Over 900 of them. A long row of numbered markers, each representing a life once deemed unworthy of attention, of love, of dignity.

But the deeper horror is what Letchworth symbolized: the idea that certain people were better off warehoused than welcomed, that abnormality was a disease to be eradicated—not a difference to be understood.

This is the real history of psychiatric care in America.


The Problem of Purpose

But this history didn’t unfold in a vacuum. It was built on something deeper—an idea so foundational, it often goes unquestioned: that nature has no purpose. That life has no inherent meaning. That humans are complex machines—repairable, discardable, programmable.

This mechanistic worldview didn’t just shape medicine. It has shaped what we call reality itself.

As Dr. Rupert Sheldrake explains in Science Set Free, the denial of purpose in biology isn’t a scientific conclusion—it’s a philosophical assumption. Beginning in the 17th century, science removed soul and purpose from nature. Plants, animals, and human bodies were understood as nothing more than matter in motion, governed by fixed laws. No pull toward the good. No inner meaning.

By the time Darwin’s Origin of Species arrived in 1859, this mechanistic lens was fully established. Evolution wasn’t creative—it was random. Life wasn’t guided—it was accidental.

Psychiatry, emerging in this same cultural moment, absorbed this worldview. Suffering was pathologized, difference diagnosed, and the soul reduced to faulty genetics and broken wiring.

Today, that mindset is alive in the DSM’s ever-expanding labels, in the belief that trauma is a chemical imbalance, that identity issues must be solved with hormones and surgery, and in the reflex to medicate children who don’t conform.

But what if suffering isn’t a bug in the system?

What if it’s a signal?

What if these so-called “disorders” are cries for meaning in a world that pretends meaning doesn’t exist?

The graves at Letchworth aren’t just a warning about medical abuse. They are a mirror—reflecting what happens when we forget that people are not problems to be solved, but souls to be seen.

Sheldrake writes, “The materialist denial of purpose in evolution is not based on evidence, but is an assumption.” Modern science insists all change results from random mutations and blind forces—chance and necessity. But these claims are not just about biology. They influence how we see human beings: as broken machines to be repaired or discarded.

As we said, in the 17th century, the mechanistic revolution abolished soul and purpose from nature—except in humans. But as atheism and materialism rose in the 19th century, even divine and human purpose were dismissed, replaced by the ideal of scientific “progress.” Psychiatry emerged from this philosophical soup, fueled not by reverence for the human soul but by the desire to categorize, control, and “correct” behavior—by any mechanical means necessary.

What if that assumption is wrong? What if the people we label “disordered” are responding to something real? What if our suffering has meaning—and our biology is not destiny?

“Genetics” as the New Eugenics

Today, psychiatry no longer speaks in the language of race hygiene.

It speaks in the language of genes.

But the message is largely the same:

You are broken at the root.

Your biology is flawed.

And the only solution is lifelong medication—or medical intervention.

We now tell people their suffering is rooted in faulty wiring, inherited defects, or bad brain chemistry—despite decades of inconclusive or contradictory evidence.

We still medicalize behaviors that don’t conform.

We still pathologize pain that stems from trauma, poverty, or social disconnection.

We still market drugs for “chemical imbalances” that have never been biologically verified.

And we still pretend this is science—not ideology.

But as Dr. Rupert Sheldrake argues in Science Set Free, even the field of genetics rests on a fragile and often overstated foundation. In Chapter 6, he challenges one of modern biology’s core assumptions: that all heredity is purely material—that our traits, tendencies, and identities are completely locked in by our genes.

But this isn’t how people have understood inheritance for most of human history.

Long before Darwin or Mendel, breeders, farmers, and herders knew how to pass on traits. Proverbs like “like father, like son” weren’t based on lab results—they were based on generations of observation. Dogs were bred into dozens of varieties. Wild cabbage became broccoli, kale, and cauliflower. The principles of heredity weren’t discovered by science; they were named by science. They were already in practice across the world.

What Sheldrake points out is that modern biology took this folk knowledge, stripped it of its nuance, and then centralized it—until genes became the sole explanation for almost everything.

And that’s a problem.

Because genetics has been crowned the ultimate cause of everything from depression to addiction, from ADHD to schizophrenia. When the outcomes aren’t clear-cut, the answer is simply: “We haven’t mapped the genome enough yet.”

But what if the model is wrong?

What if suffering isn’t locked in our DNA?

What if genes are only part of the story—and not even the most important part?

By insisting that people are genetically flawed, psychiatry sidesteps the deeper questions:

  • What happened to you?
  • What story are you carrying?
  • What environments shaped your experience of the world?

It pathologizes people—and exonerates systems.

Instead of exploring trauma, we prescribe pills.

Instead of restoring dignity, we reduce people to diagnoses.

Instead of healing souls, we treat symptoms.

Modern genetics, like eugenics before it, promises answers. But too often, it delivers a verdict: you were born broken.

We can do better.

We must do better.

Because healing doesn’t come from blaming bloodlines or rebranding biology.

It comes from listening, loving, and refusing to reduce people to a diagnosis or a gene sequence.


The Hidden Truth About Trauma and Diagnosis

Pete Walker cites Dr. John Briere’s poignant observation: if Complex PTSD and the role of early trauma were fully acknowledged by psychiatry, the Diagnostic and Statistical Manual of Mental Disorders (DSM) could shrink from a massive textbook to something no larger than a simple pamphlet.

We’ve previously explored the crucial difference between PTSD and complex PTSD—topics like trauma, identity, neuroplasticity, stress, survival, and what it truly means to come home to yourself. This deeper understanding exposes a vast gap between real human experience and how mental health is often diagnosed and treated today.

Instead of addressing trauma with truth and compassion, the system expands diagnostic categories, medicalizes pain, and silences those who suffer.

The Cost of Our Silence

Many of us know someone who’s been diagnosed, hospitalized, or medicated into submission.

Some of us have been that person.

And we’re told this is progress. That this is compassion. That this is care.

But when I stood at the edge of those graves in Rockland County—row after row of anonymous markers—nothing about this history felt compassionate.

It felt buried. On purpose.

We must unearth it.

Not to deny mental suffering—but to reclaim the right to define it for ourselves.

To reimagine what healing could look like, if we dared to value dignity over diagnosis.

Because psychiatry hasn’t “saved” the abnormal.

It has often silenced, sterilized, and sacrificed them.

It has named pain as disorder.

Difference as defect.

Trauma as pathology.

The DSM is not a Bible.

The white coat is not a priesthood.

And genetics is not destiny.

We need better language, better questions, and better ways of relating to each other’s pain.

And that brings us full circle—to a man most people have never heard of: Dr. John Galt II.

Nearly 200 years ago, in Williamsburg, Virginia, Galt ran the first freestanding mental hospital in America. But unlike many of his peers, he rejected chains, cruelty, and coercion. He embraced what he called moral treatment—an approach rooted in truth, love, and human dignity. Galt didn’t see the “insane” as dangerous or defective. He saw them as souls.

He was influenced by Philippe Pinel, the French physician who famously removed shackles from asylum patients in Paris. Together, these early reformers dared to believe that healing began not with force, but with presence. With relationship. With care.

Galt refused to segregate patients by race. He treated enslaved people alongside the free. And he opposed the rising belief—already popular among his fellow psychiatrists—that madness was simply inherited, and the mad were unworthy of full personhood.

But what does it mean to recognize someone’s personhood?

Personhood is more than just being alive or having a body. It’s about being seen as a full human being with inherent dignity, moral worth, and rights—someone whose inner life, choices, and experiences matter. Recognizing personhood means acknowledging the whole person beyond any diagnosis, disability, or social status.

This question isn’t just philosophical—it’s deeply practical and contested. It’s at the heart of debates over mental health care, disability rights, euthanasia and even abortion. When does a baby become a person? When does someone with a mental illness or cognitive difference gain full moral consideration? These debates all circle back to how we define humanity itself.

In Losing Our Dignity: How Secularized Medicine Is Undermining Fundamental Human Equality, Charles C. Camosy warns that secular, mechanistic medicine can strip people down to biological parts—genes, symptoms, behaviors—rather than seeing them as full persons. This reduction risks denying people their dignity and the respect that comes with being more than the sum of their medical conditions.

Galt’s approach stood against this reduction. He saw patients as complex individuals with stories and struggles, deserving compassion and respect—not just as “cases” to be categorized or “disorders” to be fixed.

To truly recognize personhood is to honor that complexity and to affirm that every individual, regardless of race, mental health, or social status, has an equal claim to dignity and care.

But… Galt’s approach was pushed aside.

Why?

Because it didn’t serve the state.

Because it didn’t serve power.

Because it didn’t make money.

Today, we see a similar rejection of truth and compassion.

When a child in distress is told they were “born in the wrong body,” we call it gender-affirming care.

When a woman, desperate to be understood, is handed a borderline personality disorder label instead, we call it diagnosis.

When medications with severe side effects are pushed as the only solution, we call it science.

But are we healing the person—or managing the symptoms?

Are we meeting the soul—or erasing it?

We’ve medicalized the human condition—and too often, we’ve called that progress.

We’ve spoken before about the damage done by Biblical counseling programs when therapy is replaced with doctrine—how evangelical frameworks often dismiss pain as rebellion, frame anger as sin, and pressure survivors into premature forgiveness.

But the secular system is often no better. A model that sees people as nothing more than biology and brain chemistry may wear a lab coat instead of a collar—but it still demands submission.

Both systems can bypass the human being in front of them.

Both can serve control over compassion.

Both can silence pain in the name of order.

What we truly need is something deeper.

To be seen.

To be heard.

To be honored in our complexity—not reduced to a diagnosis or a moral failing.

It’s time to stop.

It’s time to remember that human suffering is not a clinical flaw. It’s time to remember the metaphysical soul, the psyche.

That our emotional pain is not a chemical defect.

That being different, distressed, or deeply wounded is not a disease.

It’s time to recover the wisdom of Dr. John Galt II.

To treat those in pain—not as problems to be solved—but as people to be seen.

To offer truth and love, not labels, not sterilizing surgeries, not lifelong prescriptions.

Because if we don’t, the graves will keep multiplying—quietly, behind institutions, beneath a silence we dare not disturb.

But we must disturb it.

Because they mattered.

And truth matters.

And the most powerful medicine has never been compliance or chemistry.

It’s being met with real humanity.

Being listened to. Believed.

Not pathologized. Not preached at. Not controlled.

But loved—in the deepest, most grounded sense of the word.

The kind of love that doesn’t look away.

The kind that tells the truth, even when it’s costly.

The kind that says: you are not broken—you are worth staying with.

Because to love someone like that…

is to recognize their personhood.

And maybe that’s the most radical act of all.

SOURCES:

  • “Director of the Kaiser Wilhelm Institute for Anthropology, Human Heredity, and Eugenics from 1927 to 1942, [Eugen] Fischer authored a 1913 study of the Mischlinge (racially mixed) children of Dutch men and Hottentot women in German Southwest Africa. Fischer opposed ‘racial mixing,’ arguing that ‘negro blood’ was of ‘lesser value’ and that mixing it with ‘white blood’ would bring about the demise of European culture” (United States Holocaust Memorial Museum, “Deadly Medicine: Creating the Master Race,” USHMM Online: https://www.ushmm.org/exhibition/deadly-medicine/profiles/). See also Richard C. Lewontin, Steven Rose, and Leon J. Kamin, Not in Our Genes: Biology, Ideology, and Human Nature, 2nd edition (Chicago: Haymarket Books, 2017), 207.
  • Gonaver, Wendy. The Peculiar Institution and the Making of Modern Psychiatry, 1840–1880
  • Berger, Daniel R., II. Saving Abnormal: The Disorder of Psychiatric Genetics
  • Lost Architecture: Eastern State Hospital – Colonial Williamsburg
  • 📘 General History of American Eugenics
    Lombardo, Paul A.
    Three Generations, No Imbeciles: Eugenics, the Supreme Court, and Buck v. Bell (2008)
    This book is the definitive account of Buck v. Bell and American eugenics law. It documents how widespread sterilizations were and provides legal and historical context.
    Black, Edwin.
    War Against the Weak: Eugenics and America’s Campaign to Create a Master Race (2003)
    Covers the U.S. eugenics movement in depth, including funding by Carnegie and Rockefeller, Cold Spring Harbor, and connections to Nazi Germany.
    Kevles, Daniel J.
    In the Name of Eugenics: Genetics and the Uses of Human Heredity (1985)
    A foundational academic history detailing how early American psychiatry and genetics were interwoven with eugenic ideology.

    🧬 Institutions & Funding
    Cold Spring Harbor Laboratory Archives
    https://www.cshl.edu
    Documents the history of the Eugenics Record Office (1910–1939), its funding by the Carnegie Institution, and its influence on U.S. and international eugenics.
    The Rockefeller Foundation Archives
    https://rockarch.org
    Shows how the foundation funded eugenics research both in the U.S. and abroad, including programs that influenced German racial hygiene policies.

    ⚖️ Sterilization Policies & Buck v. Bell
    Supreme Court Decision: Buck v. Bell, 274 U.S. 200 (1927)
    https://supreme.justia.com/cases/federal/us/274/200/
    Includes Justice Holmes’ infamous quote and the legal justification for forced sterilization.
    North Carolina Justice for Sterilization Victims Foundation
    https://www.ncdhhs.gov
    Reports the disproportionate targeting of Black women in 20th-century sterilization programs.
    Stern, Alexandra Minna.
    Eugenic Nation: Faults and Frontiers of Better Breeding in Modern America (2005)
    Explores race, sterilization, and medical ethics in eugenics programs, with data from states like California and North Carolina.

    🧠 Psychiatry’s Role & Nazi Connections
    Lifton, Robert Jay.
    The Nazi Doctors: Medical Killing and the Psychology of Genocide (1986)
    Shows how American eugenics—including psychiatric writings—helped shape Nazi ideology and policies like Aktion T-4 (the euthanasia program).
    Wahl, Otto F.
    “Eugenics, Genetics, and the Minority Group Mentality” in American Journal of Psychiatry, 1985.
    Traces how psychiatric institutions were complicit in promoting eugenic ideas.
    American Journal of Psychiatry Archives
    1920s–1930s issues include articles in support of sterilization and early euthanasia rhetoric.
    Available via https://ajp.psychiatryonline.org

1984 and The Handmaid’s Tale: Misplaced Parallels and Liberal Delusion

Breaking Free: A Conversation with Yasmine Mohammed on Radical Islam, Empowerment, and the West’s Blind Spots

After finishing George Orwell’s 1984, I noticed its resurgence in popularity, especially after Trump’s election. Ironically, it’s not the conservative right but the progressive left that increasingly mirrors Orwellian themes. Similarly, Margaret Atwood’s The Handmaid’s Tale has become a rallying cry for liberals who claim to be on the brink of a dystopian theocracy. Yet, as Yasmine Mohammed pointed out in this week’s episode, this comparison is not only absurd but deeply insulting to women who live under regimes where Atwood’s fiction is a grim reality.

1984: Rewriting Language and History

The Democratic Party’s obsession with redefining language is straight out of Orwell’s playbook. They tell us biology is bigotry and that there are infinite genders, forcing people to adopt nonsensical pronouns or risk social ostracism. This is not progress—it’s the weaponization of language to control thought, eerily similar to Orwell’s Newspeak.

But it doesn’t stop there. They actively rewrite history by renaming monuments, military bases, and even schools, erasing cultural markers in the name of ideological purity. This is doublespeak in action: the manipulation of truth for political orthodoxy. Orwell’s warning that “orthodoxy is unconsciousness” feels disturbingly apt when observing the modern left.

The Handmaid’s Tale: An Insult to Women Who Actually Suffer

In our conversation, Yasmine highlighted the absurdity of liberal claims that America is The Handmaid’s Tale come to life. Yasmine, who grew up under Islamic theocracy, knows firsthand what it’s like to live in a world where women have no autonomy. These women cannot see a doctor without a male guardian, are forced to cover every inch of their bodies, and are denied basic freedoms like education or the right to drive.

Contrast this with the West, where women have more freedom than at any other point in history. Liberal women can run around naked at Pride parades, freely express their sexuality, and redefine what it means to be a woman altogether. And yet, they cry oppression because they are expected to pay for their own birth control or endure debates over abortion limits. This level of cognitive dissonance—claiming victimhood while living in unprecedented freedom—is a slap in the face to women who actually suffer under real patriarchal oppression.

Liberal Orthodoxy: Lost in the Sauce

What’s truly Orwellian is how the left uses its freedom to strip others of theirs. They shout about inclusivity but cancel anyone who disagrees. They claim to fight for justice while weaponizing institutions to enforce ideological conformity. Meanwhile, they are so consumed with their own victim complex that they fail to see how absurd their comparisons to dystopian fiction really are.

Orwell and Atwood warned against unchecked power and ideological extremism. If liberals actually read these books instead of using them as aesthetic props, they might realize they’re mirroring the very authoritarianism they claim to oppose. Instead, they’re lost in the sauce, preaching oppression in a society where they have more freedom than they can handle.

As Yasmine said, “You want to see The Handmaid’s Tale? Try being a woman in Saudi Arabia, Iran, or Afghanistan.” The left would do well to remember that before playing the victim in their cosplay dystopia.

Understanding the Evolution of Witch Hunts

Welcome to Taste of Truth Tuesdays, where we unravel the strange, the mysterious, and today—the terrifying. This post delves into one of history’s darkest chapters: the witch hunts. We’ll explore how fear, superstition, and control shaped centuries of persecution and how these patterns are still evident in the modern world. Witch hunts aren’t just a thing of the past—they’ve evolved.

The European Witch Hunts – Early Modern Europe

Let’s start in early modern Europe. Scholar Peter Maxwell-Stuart illuminates the rise of demonology, where the fear of magic and the devil became a weapon of control for those in power. Beginning in the 1500s, political and religious leaders manipulated entire populations by tapping into their deep-rooted fears of ‘evil forces.’ The Church, in particular, weaponized these beliefs, positioning itself as the protector against witches—women (and sometimes men) believed to consort with devils or conjure dark forces. As the idea took hold that witches could be behind every famine, illness, or death, this created a perfect storm of paranoia.

Maxwell-Stuart argues that demonology texts—many sanctioned by the Church—fueled mass hysteria, feeding the narrative that witches were not just local troublemakers but cosmic agents of Satan, hell-bent on destroying Christendom. Ordinary people lived in constant fear of betrayal by their neighbors, leading to accusations that could swiftly escalate into brutal trials, with the accused often tortured into confessing their ‘diabolical’ crimes.

To understand how demonology in Europe gained such traction, we need to go back to Augustine of Hippo, whom we have mentioned in previous episodes. His writings in the 4th and 5th centuries laid the foundation for Christian perceptions of the devil and demons. Augustine’s ideas, especially in City of God, emphasized the constant spiritual warfare between good and evil, casting demons as agents of Satan working tirelessly to undermine God’s plan. He argued that humans were caught in this cosmic battle, susceptible to the devil’s temptations and tricks.

‘Augustine before a group of demons’, from ‘De civitate Dei’ by Augustine, trans. by Raoul de Presles, late 15th Century

Augustine’s Doctrine of Demons

According to Augustine, demons were fallen angels who had rebelled and now sought to deceive and destroy humanity. While Augustine didn’t explicitly discuss witches, his interpretation of demons helped fuel the belief that humans could be manipulated by evil spirits—whether through pacts, possession, or magical practices. This idea later influenced medieval and early modern European demonology.

Augustine’s views on original sin—that humanity is inherently flawed and in need of salvation—also intensified fears that people, especially women (who were seen as ‘weaker’ spiritually), were more vulnerable to the devil’s influence.

SIDE NOTE: We have discussed the theological concept of original sin in previous episodes: in Franciscan Wisdom: Navigating Spiritual Growth and Challenges with Carrie Moore, we specifically spun the doctrine of original sin on its head, and in Unpacking Religious Trauma: Navigating the Dynamics of Faith Deconstruction with Dr. Mark Karris.

In the centuries that followed, these ideas were weaponized to justify witch hunts. Augustine’s legacy is evident in how later theologians and demonologists, such as Heinrich Kramer (author of the infamous Malleus Maleficarum), built upon his ideas of demonic interference to condemn witchcraft as a real, existential threat to Christian society.

Maxwell-Stuart reveals that the creation of demonology wasn’t just religious but deeply political. Kings and clergy alike realized they could consolidate power by stoking the flames of fear, casting witches and sorcerers as a common enemy. The trials served a dual purpose: they reinforced the Church’s supremacy over the spiritual realm and gave ruling elites a tool for maintaining social order. Accusing someone of witchcraft was an effective way to silence dissent or settle personal scores.

Fear as a Tool of Control

Fear wasn’t just manufactured by rulers—it was deeply ingrained in the societal, religious, and legal systems of the time. Scholar Sophie Page reveals how beliefs in magic and the supernatural were not fringe ideas but core components of medieval and early modern life. Magic wasn’t merely a mysterious force; it was a pervasive explanation for any calamity. Failed harvests, plagues, or unexplained illnesses were often attributed to witches or the devil, creating a society constantly on edge, where supernatural forces were believed to lurk behind every misfortune.

By embedding these beliefs into legal codes, authorities could target suspected witches or sorcerers under the guise of protecting the community. Page’s work illustrates how rituals once seen as protective or healing gradually became demonized. Harmless folk practices and herbal remedies, used for centuries, began to be recast as witchcraft, especially when things went wrong. People, particularly those in rural areas, were vulnerable to this thinking because religion and superstition were inseparable from daily life.

Partisan scholars have long debated whether Catholics or Protestants were the “real” witch hunters, but they’ve made little headway. One important change in Christian morality, as discussed by John Bossy, occurred between the 14th and 16th centuries. The moral focus shifted from the Seven Deadly Sins—pride, greed, lust, envy, gluttony, anger, and sloth—to the Ten Commandments. This change, influenced by reform movements that shaped the Protestant Reformation, prioritized sins against God over those against the community. Idolatry and the worship of false gods became viewed as the gravest offenses.

This redefinition of witchcraft followed suit. Instead of being seen as harmful actions toward neighbors, witchcraft was now linked directly to devil worship and regarded as serious heresy. Scholars and church leaders began merging various forms of folk magic and healing into this new narrative, suggesting that practitioners were either knowingly or unknowingly making deals with the devil. Confessions of pacts or attendance at “witch gatherings” were shaped to highlight community failings, like envy and resentment. Consequently, educated society began to see witchcraft as a real threat rather than mere superstition. While traditional beliefs about magic still existed, they were overshadowed by fears of violent backlash from reformers.

The Power of Dualistic Thinking

This dualistic thinking, influenced by St. Augustine, gave rise to a semi-Manichean worldview, where the struggle between good and evil became more pronounced. Manichaeism, an ancient belief system, viewed the world as a battleground between equal forces of good and evil. Although orthodox Christianity rejected this dualism, the focus on the devil’s role in everyday life blurred those lines for many people. By emphasizing the devil’s pervasive influence, religious leaders inadvertently created a belief system in which evil seemed as powerful as good.

In this semi-Manichean view, the devil was not just a tempter of individuals but a corrupting force within communities and even within political and religious practices deemed heretical. Fears of devil-worshipping conspiracies became intertwined with anxieties about witchcraft and moral decay. Reformers, particularly in Protestant movements, fueled these fears by branding idolatry, Catholic rituals, and even folk healing as dangerous openings for the devil’s influence. This perspective transformed witchcraft from a local issue into a broader threat against God and society.

The result was a potent mix of dualistic thinking and an intense focus on spiritual warfare. This not only intensified the persecution of supposed witches but also reinforced the obsession with eliminating anything considered “satanic.” The ideological shift redefined witchcraft as a communal danger, turning innocent healing practices into accusations of demonic pacts.

Every village had its own ‘cunning folk’—individuals skilled in healing and folk magic—yet these very people could easily become scapegoats when something went wrong. The legal structures played a vital role in perpetuating this cycle of fear. Church courts, bolstered by theologians and demonologists, were empowered to try individuals accused of witchcraft, and the accusations quickly spiraled into mass hysteria. Trials often relied on tortured confessions, reinforcing the belief that witches and the devil were real and tangible threats to society. This institutionalized paranoia was a perfect storm of religion, fear, and control.

The Rise of Organized Witch Hunts

Beginning in the late 15th century, witch trials escalated into full-blown hunts, particularly after the publication of the Malleus Maleficarum in 1487. This infamous witch-hunting manual, written by Heinrich Kramer and endorsed by the Pope, offered legal and theological justifications for hunting down witches. It encouraged harsh interrogations and set guidelines for identifying witches based on superficial evidence like birthmarks, behaviors, and confessions extracted under torture. The legal system, which had already started to turn against folk healers, now had a codified method for persecuting them.

In regions like Germany, Scotland, and Switzerland, these legal trials turned into widespread witch hunts. Hundreds, even thousands, of individuals—predominantly women—were accused and executed. What’s particularly fascinating is that these witch hunts often peaked during periods of societal or economic instability when fear and uncertainty made people more susceptible to attributing their misfortunes to external, supernatural forces.

By institutionalizing the persecution of witches, rulers and religious leaders could manage social unrest and solidify their authority. The trials often reinforced the power structures by demonstrating that anyone perceived as a threat to societal order—whether through suspected witchcraft or merely social nonconformity—could be eradicated.

Witch Hunts and Gender

The scapegoating of women played a crucial role in these witch hunts. Owen Davies’ work reveals how the demonization of witches intersected with misogyny, turning the hunts into a gendered form of control. Midwives, healers, or outspoken women were more likely to be targeted, reinforcing patriarchal authority. The very skills that had once been valued, such as healing and midwifery, were redefined as dangerous and linked to dark powers.

As witch hunts spread, the legal frameworks across Europe became more refined and institutionalized, creating a climate where fear of witches and demonic possession became the norm. The trials’ obsession with confessions—often coerced under brutal conditions—further fueled public paranoia, as the more people confessed to witchcraft, the more tangible the ‘threat’ seemed.

The Modern Echoes of Witch Hunts

Fast forward to today, and we find that the legacy of witch hunts lingers. The tactics of fear-mongering, scapegoating, and social control can still be observed in modern contexts. Contemporary movements often mirror historical witch hunts, targeting marginalized groups through accusations and public shaming. Just as witch hunts flourished in times of societal uncertainty, modern societies can succumb to similar dynamics.

In the age of social media, legal accusations spread like wildfire, and the court of public opinion often acts faster than the courts themselves. Political enemies are dragged through the mud with allegations that may or may not have a basis in fact.

The case of Michael Jackson serves as a poignant example of how media narratives can distort reality. The beloved pop icon faced multiple allegations of child molestation, the most notable culminating in a highly publicized trial in 2005. Accusers claimed that Jackson had abused them, yet the defense presented compelling counterarguments, challenging the credibility of the witnesses and highlighting inconsistencies in their testimonies. After a lengthy trial, Jackson was acquitted of all charges.

But the media frenzy surrounding the case fueled public debate and sensationalism, cementing the derogatory nickname “Wacko Jacko.” The smear campaign perpetuated false narratives about his character and actions. Behind the scenes, Jackson was embroiled in a lawsuit against Sony Music, a battle he was reportedly winning at the time of these allegations. Furthermore, his controversial doctor, Conrad Murray, who administered drugs to Jackson, later faced serious legal consequences for his role in the singer’s death, including manslaughter charges. The intersection of these legal battles and the media frenzy created a complex narrative that ultimately tarnished Jackson’s legacy, and that’s what truly breaks my heart.

By the time these individuals have the chance to clear their names, their reputations—and often their careers—are already in ruins. Davies’ research shows us that while modern witch hunts don’t involve burning at the stake, they do involve trial by media and mob justice.

And we can’t talk about modern-day witch hunts without bringing the CIA into the conversation. Since its inception, the CIA has been at the heart of international political manipulations—using covert methods to shape public perception, interfere in foreign governments, and even influence elections here in the United States. In the 1960s, the agency popularized the label ‘conspiracy theorist’ to discredit anyone who questioned the official narratives surrounding events like the assassination of JFK. Those who didn’t toe the line were labeled as ‘paranoid’ or ‘dangerous.’ It was the modern version of labeling someone a witch—turning them into a social outcast, not to be trusted.

Fast forward to today: we see similar tactics used against whistleblowers, journalists, and activists who challenge the powerful. Think about Edward Snowden, Julian Assange, and even political figures targeted by intelligence communities. The second they start exposing uncomfortable truths, they are vilified. Whether through leaks, smear campaigns, or selective legal action, these modern-day ‘witches’ face an onslaught of accusations, designed to discredit them before they can fully tell their story.

In many cases, the evidence behind these accusations is shaky at best. The CIA’s involvement in manipulating public perception goes all the way back to Operation Mockingbird, a secret program to influence media narratives that demonstrated how controlling information was one of the agency’s most powerful tools. During the Cold War, the United States engaged in a concerted effort to influence and control media narratives to align with its interests, which involved recruiting journalists and establishing relationships with major media outlets.

Edward Bernays, often referred to as the father of public relations, played a pivotal role in this history of media manipulation. Working with several major companies, including Procter & Gamble, General Electric, and the American Tobacco Company, Bernays was instrumental in promoting the cigarette brand Lucky Strike, famously linking it to the women’s liberation movement. His connections extended to notable figures like Sigmund Freud, who was Bernays’ uncle; Freud’s psychoanalytic theories significantly shaped Bernays’ PR strategies. Throughout his career, Bernays leveraged media to influence public perception and political leaders, raising profound questions about the power dynamics of media and its capacity to shape societal narratives. (If you’re intrigued by the intricate interplay of media and propaganda, this is a rabbit hole worth exploring!)

Today, that same fear-mongering tactic is played out on a much larger scale. Accusations, whether of conspiracy, treason, or subversion, become tools to silence anyone questioning the status quo. Just as witches in the past were seen as ‘different’ and thus dangerous, today’s targets are often people who challenge the system.

And while accused witches of the 1300s–1600s had no due process, today we see something similar in the digital realm. There’s no real accountability or fairness in the court of public opinion. All it takes is a viral accusation—a tweet, a blog post, or a video—and the person’s career, family, and mental health can be obliterated overnight. No evidence required, no trial, no defense.

So, what can we learn from this history? From the witch hunts of early modern Europe to today’s viral accusations and political fearmongering, there’s one key lesson: fear remains one of the most dangerous tools of control. When we allow fear to dictate our actions—whether it’s fear of witches, outsiders, or anyone who doesn’t fit into the mold—we lose sight of reason and humanity.

In closing, I’d like to examine the phenomenon of witch hunts through the lens of amygdala hijacking, a topic we discussed in a previous episode. This term refers to the brain’s immediate response to perceived threats, where the amygdala—the emotional center of the brain—takes control, often resulting in irrational and impulsive actions.

During the witch hunts, communities gripped by fear of the unknown succumbed to a mob mentality whenever someone fell ill or misfortune struck. The amygdala triggered a fight-or-flight response, compelling individuals to find scapegoats, with cunning folk and those deviating from societal norms becoming prime targets. As accusations spiraled, fear dominated decision-making instead of rational thought. Today, we observe similar patterns in how social media can incite panic, leading to modern witch hunts. When fear takes over, reason often fades, resulting in unjust vilification—echoing the dark lessons of history.

As we navigate our modern world, let’s remain vigilant against the echoes of this history, seeking truth and questioning the narratives that shape our beliefs. Fear may be powerful, but curiosity and critical thinking are our greatest allies in maintaining our autonomy and humanity.

Resources:

Briggs, Robin. Witches and Neighbors: The Social and Cultural Context of European Witchcraft. Oxford University Press, 1996.

  • This book provides a comprehensive exploration of the social dynamics surrounding witch hunts in early modern Europe, highlighting the interplay of fear, community, and cultural beliefs.

Maxwell-Stuart, Peter G. Witchcraft in Europe, 1100-1700: A Sourcebook. Palgrave Macmillan, 2010.

  • This sourcebook compiles essential documents related to the history of witchcraft in Europe, providing insights into how fear and persecution were constructed and justified.

Page, Sophie. Magic in the Middle Ages. Cambridge University Press, 2005.

  • This book offers an analysis of the cultural and religious practices surrounding magic during the medieval period, emphasizing how these beliefs shaped societal attitudes toward witchcraft.

Bossy, John. Christianity in the West, 1400-1700. Oxford University Press, 1985.

  • Bossy examines the transformation of Christian morality during the Reformation, providing context for the changing perceptions of witchcraft and heresy.

Davies, Owen. Popular Magic: Cunning Folk in English History. Continuum, 2007.

  • This work explores the role of cunning folk—those who practiced folk magic—and how their practices were perceived within the broader context of witchcraft accusations.

Baroja, J. C. Witches and Witchcraft. University of California Press, 1990.

  • Baroja’s work examines the historical and cultural significance of witchcraft, providing insights into the social conditions that fueled witch hunts and the cultural implications of these beliefs.

Is Veganism a Psy-Op? Maybe. The Real Issue is Engineering Ourselves Away from Nature

In today’s complex world of nutrition and health, embracing skepticism and critical thinking is essential. Rather than accepting dominant narratives, challenge them to uncover the truth.

🥕 Veganism vs. Meat: What’s the Real Issue? 🥕

The debate over veganism often gets tangled in oversimplified conspiracies. However, the real concern lies in our growing disconnect from nature’s balance. Our modern lifestyles and diets are increasingly detached from natural ecosystems, which profoundly affects our health and well-being.

To truly grasp the nuances of nutrition and health, especially when it comes to veganism, we must examine how our beliefs have been shaped by science, history, and religion. Over the next few weeks, we will be time traveling through the last century to see how these elements intertwine and influence our perspectives on veganism.

🔬Before Lobbyism: The Golden Age of Nutritional Science 🔬

Before the rise of lobbyism and industrial influence in the mid-20th century, nutritional science was marked by pioneering research that laid the groundwork for our understanding of essential nutrients. One such figure was Elmer McCollum, a vitamin pioneer.

Elmer McCollum, a prominent nutrition researcher in the early 20th century, made groundbreaking discoveries regarding vitamins A, B, and D. His work was instrumental in identifying the role of these vitamins in preventing nutritional deficiencies.

Vitamin A (Retinol): McCollum’s work significantly advanced the understanding of vitamin A, which is crucial for vision, immune function, and skin health. Retinol, the active form of vitamin A, is primarily found in animal-based foods like liver, fish oils, eggs, and dairy products. Unlike plant-based sources, which provide provitamin A carotenoids like beta-carotene that the body must convert into retinol, animal sources deliver this vitamin in its ready-to-use form.

🧬 BCO1 Gene and Vitamin A 🧬

Did you know that about 45% of people have a genetic variation that makes it hard for them to get enough vitamin A from plant foods? This is because of a gene called BCO1.

The BCO1 gene is responsible for converting beta-carotene (found in carrots, sweet potatoes, and other plants) into active vitamin A, also known as retinol. But for almost half of the population, this gene doesn’t work very efficiently, meaning their bodies can’t make enough vitamin A from plants alone.

Vitamin A is crucial for things like good vision, a strong immune system, and healthy skin. If you can’t get enough from plants, you might need to include animal foods like liver, fish oils, or dairy in your diet to make sure you’re meeting your vitamin A needs.

This explains why some people might struggle with a vegan diet—they need the more easily absorbed form of vitamin A that comes from animal products.
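
To make the arithmetic concrete, here’s a minimal sketch of how vitamin A intake is often estimated in retinol activity equivalents (RAE), assuming the commonly cited ~12:1 conversion factor for dietary beta-carotene. The 50% “low-converter” efficiency below is a hypothetical illustration of a less active BCO1 variant, not a measured clinical value.

```python
# Minimal sketch (not medical advice): estimating vitamin A in retinol
# activity equivalents (RAE), assuming the commonly cited factor that
# ~12 µg of dietary beta-carotene yields ~1 µg of retinol.

DIETARY_BETA_CAROTENE_PER_RAE = 12  # µg beta-carotene per µg RAE (typical assumption)

def estimated_rae(retinol_ug: float, beta_carotene_ug: float,
                  bco1_efficiency: float = 1.0) -> float:
    """Estimate vitamin A intake in µg RAE.

    retinol_ug       -- preformed vitamin A from animal foods (usable as-is)
    beta_carotene_ug -- provitamin A from plants (must be converted)
    bco1_efficiency  -- 1.0 for a typical converter; <1.0 models a less
                        efficient BCO1 variant (hypothetical scaling)
    """
    converted = (beta_carotene_ug / DIETARY_BETA_CAROTENE_PER_RAE) * bco1_efficiency
    return retinol_ug + converted

# The same plant-only meal for two hypothetical converters:
meal_beta_carotene = 6000  # µg, very roughly one large carrot
print(estimated_rae(0, meal_beta_carotene, bco1_efficiency=1.0))  # 500.0 µg RAE
print(estimated_rae(0, meal_beta_carotene, bco1_efficiency=0.5))  # 250.0 µg RAE
```

Same plate of plants, half the usable vitamin A: that’s the BCO1 caveat in a nutshell.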

McCollum’s research emphasized the importance of unprocessed, nutrient-rich foods in maintaining health. Diets high in refined grains can exacerbate nutritional deficiencies by displacing more nutrient-dense foods. This indirectly touches on the issues we see today related to grain consumption, though McCollum’s era was more focused on preventing deficiencies than on inflammation.

The Refinement of Grains: A Double-Edged Sword

As the food industry grew and refined processing techniques became widespread, the nutritional value of grains was compromised. The removal of bran and germ during processing not only reduced the essential vitamins and minerals in grains but also increased their glycemic index. This shift contributed to inflammation and other metabolic issues, like type 2 diabetes, a concern that has become more prominent in later research.

A Shift in Focus: From Nutritional Science to Industrial Influence

McCollum’s era represents a time when nutritional science was still largely driven by the quest to understand and prevent deficiencies. However, as we moved into the mid-20th century, the influence of lobbyists and industrial interests began to muddy the waters, promoting processed foods and refined grains that strayed from McCollum’s principles of whole, nutrient-rich foods.

🥕 The Influence of Religion and Early Health Movements 🥕

Ellen G. White, a key figure in the Seventh-day Adventist Church, significantly impacted early American dietetics with her advocacy for a plant-based diet and abstinence from alcohol, tobacco, and caffeine. Her health reforms, which emphasized vegetarianism and whole foods, were institutionalized through health institutions like the Battle Creek Sanitarium and figures like Dr. John Harvey Kellogg. The sanitarium’s success and the dissemination of these dietary principles led to the establishment of the American Dietetic Association in 1917, which originally promoted many of these plant-based, whole-food principles. The Adventist emphasis on preventive health care and diet principles laid the groundwork for many modern dietary guidelines and continues to influence discussions around veganism.

🔬 The Role of Science in Shaping Dietary Beliefs 🔬

In the early 20th century, scientific advancements also played a role in shaping nutrition. The Flexner Report reshaped medical education, with ripple effects on how nutritional guidelines were standardized, while new research brought attention to the importance of vitamins and minerals. Meanwhile, innovations like Crisco introduced hydrogenated fats into American diets, shifting culinary practices and influencing our understanding of what constitutes a healthy diet.

In a future episode dropping 9/10, we’ll take a deeper dive into how industrialization, scientific reports, and influential figures like John D. Rockefeller and Ancel Keys have further impacted our dietary beliefs and public health policies. Stay tuned as we explore:

  • The Flexner Report: How it reshaped medical education and its ripple effects on nutrition science.
  • The Rise of Processed Foods: The transformation of our food supply and its long-term health implications.
  • Rockefeller’s Influence: The role of industrial interests in shaping modern dietary guidelines.
  • Ancel Keys: His research, which became highly influential in the field of nutrition, took place primarily during the mid-20th century, particularly in the 1950s and 1960s. His most famous work, the Seven Countries Study, began in 1958 and was published over several decades. This research was pivotal in linking dietary fat, particularly saturated fat, to heart disease and played a significant role in shaping dietary guidelines that emphasized reducing fat intake to prevent cardiovascular disease. Nowadays it is seen as deeply controversial due to several perceived flaws that have been widely discussed by critics over the years.

How does current research define the top nutrient-dense foods?

📰 Spotlight on Micronutrient Density: A Key to Combatting Global Deficiencies

A March 2022 study published in Frontiers in Nutrition titled “Priority Micronutrient Density in Foods” emphasizes the importance of nutrient-dense foods in addressing global micronutrient deficiencies, particularly in vulnerable populations. The research identifies organ meats, small fish, dark leafy greens, shellfish, and dairy products as some of the most essential sources of vital nutrients like vitamin A, iron, and B12. These findings could be instrumental in shaping dietary guidelines and nutritional policies.

🔗 Read more here.

🍽️ Plant vs. Animal Nutrients: Understanding Bioavailability 🍽️

When it comes to nutrient absorption, not all foods are created equal. The bioavailability of nutrients—the proportion that our bodies can absorb and use—varies significantly between plant and animal sources.

🌱 Plant-Based Nutrients: While plant foods are rich in essential vitamins and minerals, they also contain anti-nutrients like phytates and oxalates. These compounds can bind to minerals such as iron, calcium, and zinc, inhibiting their absorption. For example, non-heme iron found in plants is less efficiently absorbed compared to the heme iron from animal sources. Similarly, the vitamin A found in plants as beta-carotene requires conversion to retinol in the body, a process that is not always efficient, particularly in certain populations.

🍖 Animal-Based Nutrients: Animal products, on the other hand, often provide nutrients in forms that are more readily absorbed. Heme iron from meat, retinol from animal liver, and vitamin B12 from dairy and eggs are all examples of highly bioavailable nutrients. These forms are directly usable by the body without the need for complex conversions, making animal products a more reliable source for certain essential nutrients.
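
As a rough back-of-the-envelope sketch (the absorption rates below are ballpark figures often used for illustration, not clinical values), here is how bioavailability changes what the body actually gets from the same amount of iron on a label:

```python
# Rough illustration of bioavailability: the fraction absorbed matters as
# much as the milligrams on the label. The rates below are ballpark,
# illustrative figures (heme iron ~25%, non-heme ~10%), not clinical values.

HEME_ABSORPTION = 0.25      # ballpark fraction absorbed from meat
NON_HEME_ABSORPTION = 0.10  # ballpark fraction absorbed from plants

def absorbed_iron_mg(heme_mg: float, non_heme_mg: float) -> float:
    """Estimate milligrams of iron actually absorbed from a meal."""
    return heme_mg * HEME_ABSORPTION + non_heme_mg * NON_HEME_ABSORPTION

# Two meals, each listing 3 mg of iron on paper:
print(f"{absorbed_iron_mg(heme_mg=3, non_heme_mg=0):.2f} mg absorbed")  # 0.75 mg
print(f"{absorbed_iron_mg(heme_mg=0, non_heme_mg=3):.2f} mg absorbed")  # 0.30 mg
```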

🌍 Global Property Rights: Gender Inequality 🌍

Promoting veganism can unintentionally undermine the very principles of women’s rights and social justice that the political left often advocates for. In many countries, women face significant legal and cultural barriers that prevent them from owning land, despite laws that may suggest otherwise. However, in these same regions, women often have the ability to own and manage livestock, which serves as a crucial economic resource and a form of wealth.

This disparity highlights the persistent challenges in achieving gender equality in property rights, especially in rural areas where land ownership is key to economic independence and security. While livestock ownership is valuable, it doesn’t offer the same level of security or social status as land ownership. The lack of land rights perpetuates gender inequality, limiting women’s economic power, social status, and access to resources.

🌿 Plant-Based Diets and Environmental Costs 🌿

Plant-based diets are often praised for their environmental benefits, yet it’s crucial to recognize the complexities involved. While the availability of vegan foods has significantly improved, making it easier than ever to follow a plant-based diet, this increased accessibility does not necessarily equate to better environmental outcomes.

Many vegan products rely heavily on industrial agriculture and monocropping practices. These methods can lead to deforestation, soil depletion, and the loss of biodiversity. The production of popular vegan ingredients, such as soy and almonds, often involves large-scale farming that can have detrimental effects on local ecosystems. Additionally, the industrial processes used to produce processed vegan foods, including heavy use of pesticides, fertilizers, and water, also contribute to environmental concerns.

Understanding these trade-offs is crucial for making informed dietary choices. Opting for sustainably farmed, organic produce and supporting local farmers can help mitigate some of these negative impacts. It’s not just about choosing plant-based foods, but also about how they are produced.

🔄 Ethical Food Choices 🔄

Making ethical food choices involves a comprehensive evaluation of your diet’s impact on health, the environment, and animal welfare. While plant-based diets can be a step towards reducing your carbon footprint, it’s important to consider the broader implications of industrial agriculture and monocropping. Strive for a balanced approach that aligns with your values and promotes sustainability. This might include supporting local and organic options, as well as exploring ways to minimize your environmental impact through diverse and responsible food choices.

By being mindful of these factors, you can better navigate the complexities of dietary decisions and work towards a more ethical and sustainable future.

🔍 Listen to Our Podcast for More 🔍

For an in-depth exploration of these topics and more, tune into our podcast. We offer detailed discussions and insights into how history, science, and societal trends shape our understanding of nutrition and health. Stay curious and informed!

The interplay of religion, science, and industry has profoundly influenced our beliefs about veganism and nutrition. By understanding these historical and scientific contexts, we gain insight into the broader impact on our dietary choices and health.

Don’t miss the upcoming episode where we’ll explore these themes in greater depth!

Resources:

Historical and Nutritional Science:

“Nutrition and Physical Degeneration” by Weston A. Price: Examines traditional diets and their impact on health, providing historical context for nutritional science.

“The Adventist Health Study: 30 Years of Research” edited by Gary E. Fraser: Covers the impact of vegetarian diets advocated by the Seventh-day Adventists.

“Food Politics: How the Food Industry Influences Nutrition and Health” by Marion Nestle: Examines how food industries shape dietary guidelines and public perception.

“The Vitamin D Solution” by Michael F. Holick: Offers insights into the importance of Vitamin D, complementing McCollum’s work on essential nutrients.

“Prophetess of Health: A Study of Ellen G. White” by Ronald L. Numbers (Library of Religious Biography, 2008): Examines Ellen G. White’s health reform visions in their historical context.

Articles:

“Ellen G. White and the Origins of American Vegetarianism” from Journal of the American Dietetic Association: Explores the historical influence of Ellen G. White on American dietetics.

“Elmer McCollum: The Vitamin Pioneer” from The Journal of Nutrition: Provides an overview of McCollum’s contributions to nutritional science.

Genetic Factors and Vitamin A

  • Research Papers:
    • “The Role of Genetic Variability in Vitamin A Metabolism” by Steven A. Arneson et al. (Journal of Nutrition): Discusses the genetic factors affecting Vitamin A conversion.
    • “BCO1 Genetic Variation and Beta-Carotene Conversion” in American Journal of Clinical Nutrition: Explores how genetic differences impact the conversion of beta-carotene to Vitamin A.

The Impact of Industrial Agriculture

  • Books:
    • “The Omnivore’s Dilemma” by Michael Pollan: Investigates the industrial food system and its environmental impact.
    • “The End of Food” by Paul Roberts: Looks at the global food industry and its implications for health and the environment.
  • Articles:
    • “The Hidden Costs of Industrial Agriculture” from Environmental Research Letters: Analyzes the ecological impacts of industrial farming practices.

Regenerative Agriculture Principles and Practices

  • Books:
    • “Regenerative Agriculture: How to Create a Self-Sustaining Farm Ecosystem” by Richard Perkins: Provides a comprehensive guide to regenerative farming practices.
    • “The Regenerative Garden: How to Grow Healthy Soil and Manage Your Garden for the Future” by Maria Rodale: Focuses on regenerative techniques for gardening.
    • “Dirt to Soil: One Family’s Journey into Regenerative Agriculture” by Gabe Brown: Shares practical experiences and insights from a farmer who has successfully implemented regenerative practices.
  • Articles:
    • “Regenerative Agriculture: What Is It and Why Does It Matter?” from Regenerative Agriculture Initiative: Provides an overview of regenerative agriculture principles and benefits.
    • “The Benefits of Regenerative Agriculture for Soil Health and Sustainability” from Agronomy Journal: Discusses how regenerative practices impact soil health and sustainability.

Sustainable and Ecological Farming

  • Books:
    • “The Soil Will Save Us: How Scientists, Farmers, and Foodies Are Healing the Soil to Save the Planet” by Kristin Ohlson: Explores how soil health can be restored through sustainable practices.
    • “Beyond the Jungle: Regenerative Agroforestry and Resilient Communities” by S. H. Smith: Examines the role of agroforestry in regenerative practices and community resilience.
  • Articles:
    • “Sustainable Agriculture and Its Impact on Environmental Conservation” from Sustainable Agriculture Research: Analyzes how sustainable farming methods contribute to environmental conservation.
    • “Ecological Farming: Benefits Beyond the Farm Gate” from Ecology and Society: Looks at the broader ecological benefits of adopting ecological farming practices.

Soil Health and Carbon Sequestration

  • Books:
    • “The Carbon Farming Solution: A Global Toolkit of Perennial Crops and Regenerative Agriculture Practices for Climate Change Mitigation and Food Security” by Eric Toensmeier: Focuses on using regenerative practices to sequester carbon and improve soil health.
    • “Soil: The Incredible Story of What Keeps Us Alive” by David R. Montgomery: Provides an in-depth look at soil science and its crucial role in agriculture and climate stability.
  • Articles:
    • “Carbon Sequestration and Soil Health: The Role of Regenerative Agriculture” from Agricultural Systems: Discusses how regenerative agriculture practices contribute to carbon sequestration and soil health.
    • “Soil Organic Matter and Its Role in Carbon Sequestration” from Journal of Soil and Water Conservation: Explores the importance of soil organic matter in maintaining soil health and sequestering carbon.

Food Systems and Regenerative Practices

  • Books:
    • “The Ecology of Food: A Historical Perspective” by Peter M. Smith: Provides historical context on food systems and their ecological impact.
    • “The Omnivore’s Dilemma: A Natural History of Four Meals” by Michael Pollan: While it explores various food systems, it touches on sustainable and regenerative practices in agriculture.
  • Articles:
    • “The Future of Food: Regenerative Agriculture and Its Role in Sustainable Food Systems” from Food Policy: Examines the role of regenerative agriculture in creating sustainable food systems.
    • “Regenerative Agriculture and Food Security: An Integrative Approach” from Journal of Agricultural and Environmental Ethics: Looks at how regenerative practices contribute to food security and sustainability.

Gender Inequality and Property Rights

  • Books:
    • “Women, Work, and Property: Gender Inequality and the Economic Impact of Land Rights” by Elizabeth N. L. Allwood: Analyzes the intersection of gender, land ownership, and economic empowerment.
  • Articles:
    • “Gender and Land Rights: A Global Overview” from World Development: Examines gender disparities in land ownership and its implications for women’s economic status.

“Women in Half the World Still Denied Land, Property Rights Despite Laws.”

The Crunchy-to-Alt-Right Pipeline: from Wellness to Extremism

Over the last few weeks, we have been exploring the complex interplay between radicalization, conspiracies, and religion. During the pandemic, I was one of those new-age rebels who got pumped through the conspiracy-to-religious-conversion pipeline. I was one of those people seeking answers and meaning, drawn to radical ideologies and conspiratorial narratives that promised belonging, purpose, and empowerment.

A huge aspect of my deconstruction process has been realizing how susceptible I’ve been to cult-like dynamics for most of my adult life. I spent years entangled in an MLM (2016-2020), which only worsened the disordered eating behaviors I had carried since high school. Its products often promoted unrealistic body standards and fostered unhealthy relationships with food. Feeling lost without that community, I was drawn into pandemic conspiracies and eventually into high-control religion.

The “crunchy hippie to alt-right pipeline” is a phenomenon where individuals initially attracted to alternative wellness and New Age practices become increasingly exposed to far-right ideologies.

This shift is facilitated by social media algorithms and influential figures who blend wellness content with conspiracy theories and extremist views.

Key Points of the Pipeline:

  1. Algorithmic Influence:
    • Social media platforms like YouTube and Instagram use algorithms that can gradually expose users to more extreme content. For instance, someone watching videos on natural health remedies might eventually receive recommendations for videos that include far-right conspiracy theories or anti-establishment rhetoric (Virginia Review of Politics). A toy simulation of this drift is sketched after this list.
  2. Overlapping Values:
    • Certain aspects of New Age and wellness cultures, such as skepticism of mainstream medicine and government, overlap with the distrust and anti-establishment sentiments of far-right groups. This makes the transition smoother as the ideologies can appear to support each other​ (Cross Cultural Solidarity)​.
  3. Influential Figures:
    • Wellness influencers who propagate conspiracy theories (like QAnon) help bridge the gap between New Age communities and far-right ideologies. They often present themselves as offering alternative truths, which can be appealing to those already disillusioned with conventional systems​ (Cross Cultural Solidarity)​.
  4. Community Dynamics:
    • Online communities play a crucial role. Individuals often seek validation and a sense of belonging in these groups. Once part of a community that blends wellness with far-right views, it becomes easier to accept and internalize these extremist ideologies​ (Virginia Review of Politics)​​ (Cross Cultural Solidarity)​.

Implications:

  • Radicalization: This pipeline can lead to the radicalization of individuals who initially joined wellness communities for benign reasons but gradually adopt extremist views.
  • Polarization: The spread of far-right ideologies within wellness spaces contributes to societal polarization and the mainstreaming of conspiracy theories.
  • Public Health Concerns:
    • Misinformation and vaccine hesitancy: Social media platforms have been conduits for the dissemination of misinformation about vaccines and distrust of “Big Pharma,” leading to hesitancy. False claims about vaccine safety and conspiracy theories have undermined public health efforts.
    • Addressing these concerns requires a multi-faceted approach: combating misinformation, improving mental health services, addressing healthcare inequities, ensuring continuity of chronic disease management, strengthening public health infrastructure, and promoting evidence-based health practices. Public awareness and education, policy reforms, and community engagement are essential to tackling these challenges and improving overall public health outcomes.

Conclusion:

Understanding this pipeline is essential for recognizing how seemingly unrelated interests in wellness and spirituality can be co-opted by extremist ideologies. It highlights the need for vigilance and critical thinking in online spaces, as well as the importance of promoting credible information and fostering inclusive communities. For more detailed discussions on this topic, see the articles from the Virginia Review of Politics and Cross Cultural Solidarity cited above.

One of the major injustices in Christian theology is….

Monergism vs. Synergism…. the age-old debate over God’s relationship with the world. As I’ve been researching, I’ve been very disappointed to see how many continue to misrepresent Jacob Arminius & what he taught.


As with total depravity, many Arminians later rejected unconditional perseverance and taught that a person can lose salvation through neglect as well as through conscious rejection of grace. Many other Arminians came to believe in the eternal security of those truly regenerated and justified by grace.

Yet the smear campaigns can’t hold a candle to Prince Maurice’s treatment of the Arminian statesmen at the Synod of Dort (1618–19). John Bogerman, a Calvinist preacher, was in favor of punishing heresy by death… and so he committed murder in the name of conformity. It was much like the earlier persecution of the Anabaptists, a radical group of dissenters from the mainstream Protestant movement. They felt the Magisterial Reformers (Luther, Zwingli, Calvin, etc.) were all stuck in Constantinianism and Augustinianism, which they saw as the two main diseases of medieval Christianity, and these radical Reformers wished to eradicate both from the faith itself.

Magisterial leaders, such as the Lutherans in Germany, the Anglicans in England, Ulrich Zwingli in Switzerland, and John Calvin in Geneva, were instrumental in spreading falsehoods and lies about the Anabaptists and in their suppression and persecution.

As a result, thousands were drowned (including women), beheaded, or burned at the stake. Others fled across Europe, and eventually to the Americas, in search of the security to practice their faith. 💔

Tune in to this week’s podcast episode, Is Easter Christian or Pagan?, as we discuss a bit more of the history behind the Reformation!!

Drops tomorrow at 12 a.m. EST!

#theology #calvinism #arminianism #theologymatters #reformation #debates #controversial #biblical #doctrine #religion #holyweek #history

Did you know about this part of Reformation history?