The Christian Inheritance of the West

What Christianity Absorbed, Built, and Left Behind


People say this all the time.

That the West got its ideas about pluralism, tolerance, and liberty from Christianity. That without it, there would be no concept of human dignity, no rights, no freedom in the modern sense. And that if those things feel unstable now, the solution is simple: return to the source.

The claim that pluralism, tolerance, and liberty are direct inheritances of Christianity is not just oversimplified. It reverses the historical pattern.

In Part 1, I pushed back on the idea that Christianity “founded the West” in any clean or singular sense, or that returning to it offers an obvious path forward. In Part 2, I stepped back and looked at something more fundamental: the fragility of freedom itself. Not as an abstract ideal, but as a social order that depends on limits, restraint, and a population capable of sustaining it. More importantly, I looked at how quickly that order begins to break down when those conditions are no longer present.

Across the responses to both pieces, there was a shared sense that something is not working. Not just politically, not just culturally, but at a deeper level that is harder to name.

One way to make sense of that is to stop looking for a single cause and start looking at how the whole inheritance fits together.

Western civilization did not develop along one track. It emerged through multiple layers operating at the same time. At a minimum, those layers include institutions, culture, and psychology.

Institutions include law, political authority, and the distribution of power. Culture includes religion, tradition, identity, and shared meaning. Psychology includes the moral instincts people use to interpret the world: instincts tied to fairness, loyalty, authority, purity, harm, belonging, and threat.

For long stretches of time, those layers reinforced one another. Institutions reflected shared values. Cultural traditions gave meaning to authority. Moral instincts were channeled through forms of life that provided both order and legitimacy.

But that fit was never permanent.

When those layers begin to pull apart, the result is not merely disagreement. It is instability.

That is the backdrop for this final piece.

The goal here is not to argue that Christianity caused the West, or that it deserves credit for everything people now associate with Western civilization. It is also not to reduce Christianity to a purely destructive force. Both approaches distort the picture in different ways.

The same problem appears in the phrase “Judeo-Christian values.” This often creates the impression of a smooth and unified inheritance, when the actual history is far more fractured. Judaism and Christianity are related, but they are not interchangeable. Christianity did not simply preserve Jewish covenantal thought. It reinterpreted it, universalized it, and claimed fulfillment over it. A tradition rooted in a particular people, law, land, and covenant was recast as a universal message for all mankind.

This repositioning changed the role of religion entirely. It no longer sat alongside other domains. It began to judge them.

It loosened religion from peoplehood and place. It made belief itself the primary marker of belonging. And once belief becomes the primary boundary, disagreement takes on a different moral weight.

Today’s article will address the harder question:

What did Christianity reorganize, what did it scale, and what did it leave unstable?

Because Christianity’s real inheritance was not simply compassion, liberty, or dignity. It reshaped how belief, authority, identity, and moral obligation functioned at a civilizational level. It expanded moral language in ways that could operate across large populations, but it also introduced sharper boundaries between true and false belief, salvation and error, belonging and exclusion.

That combination, expansion on one side and constraint on the other, is where the inheritance becomes complicated.


McClees, Helen and Christine Alexander. 1933. The Daily Life of the Greeks and Romans: As Illustrated in the Classical Collections, 5th ed. pp. 131, 133, fig. 159, New York: The Metropolitan Museum of Art.

SECTION I: THE GOOD
What Christianity Absorbed and Reorganized

Before getting into what Christianity actually contributed, it’s worth being clear about what is usually attributed to it.

A moral framework. Stable family structures. The unification of fragmented tribal societies into something resembling a shared civilization. A sense of cohesion strong enough to hold large populations together.

Those developments did happen. The question is where they came from…

Because none of those things begin with Christianity. They depend on something older: stability across generations, shared practices, inherited obligations, and a way of life that binds people before it explains itself.

That is what tradition is.

The word itself comes from the Latin traditio: a handing over, a passing down, something delivered across generations. But that definition only gets you so far. Tradition is not just a set of ideas preserved in texts or doctrines. It is lived. It shows up in habits, rituals, inherited gestures, seasonal rhythms, family patterns, and the quiet repetition of things people do not always stop to explain but continue to do anyway.

It exists in the structure of daily life.

You see it most clearly in how societies deal with death.

Long before Christianity became dominant in Europe, burial practices already reflected a deep sense of connection between the living and the dead. In the Stone Age, communities used mass graves in caves or pits. Later, megalithic cultures constructed communal tombs that anchored memory to specific places. Indo-European groups developed barrows and cremation practices that changed over time while preserving the same underlying logic.

The dead were not discarded. They were placed, remembered, and integrated into the ongoing life of the community.

Tradition, in that sense, is not something invented at a particular moment. It is something carried forward, shaped and reshaped over time without losing its original intention.

Christianity enters into that world rather than creating it from scratch.

What changes is not the existence of tradition, but its scale and its organizing logic.

Earlier religious life was largely tied to local identity: tribe, land, household, ancestry, city, and people. Christianity expands beyond that. It speaks in universal terms and builds a shared symbolic order that can operate across regions and populations that do not share the same lineage, gods, rituals, or customs.

That increases the reach of the moral imagination.

Concern no longer stops at the boundary of immediate belonging. It extends outward, attaching value to individuals beyond their role within a specific family, tribe, or city. Over time, that broader vision feeds into developments people now associate with the Western inheritance: ideas about dignity, education, care for the poor, moral responsibility, and obligation toward those outside one’s immediate circle.

But this is typically where the story gets oversimplified.

Those impulses did not originate with Christianity. Traditions within the Greco-Roman world had already developed forms of civic responsibility, philanthropy, patronage, public works, and mutual obligation. Grain distributions, civic benefaction, philosophical ethics, and local forms of duty were not Christian inventions.

But even the Greco-Roman world was not self-contained. It had already absorbed influences from older and neighboring civilizations (Egyptian, Mesopotamian, Anatolian, and Phoenician) through trade, conquest, and cultural exchange. As scholars like Martin L. West and Walter Burkert have shown, Greek thought itself was shaped in part by these eastern traditions.

The ancient world was not morally empty before the church arrived. It was already layered, interconnected, and carrying forward inherited forms of order, obligation, and meaning.

You can see this clearly in Stoic thought. Christianity is often treated as if it introduced universal human concern into a cruel and indifferent ancient world. Stoicism already spoke in universal terms. It could describe human beings as participants in a shared moral order and extend concern beyond tribe, city, or immediate kinship.

But the structure was different.

The bronze Equestrian Statue of Marcus Aurelius, Capitoline Hill, Rome.

Runar Thorsteinsson’s comparison of Roman Christianity and Roman Stoicism helps clarify the distinction. Stoicism could speak of universal humanity without making moral belonging depend on conversion to a saving truth. Early Christianity, by contrast, carried a universal message while also drawing a sharper boundary around religious adherence. Its moral vision expanded outward, but it did so through a division between those inside and outside the saving order.

Christianity did not invent universal concern, but it did reorganize it.

It took older moral instincts, philosophical ideas, Jewish inheritance, Roman scale, and local traditions, then bound them into a universal religious narrative. It gave those instincts a broader scope, a more unified story, and a more durable institutional form.

But expansion alone does not explain why a civilization holds together.

A social order lasts when it fits the way people already experience the world.

People do not move through life as detached rational observers. They respond through instinct: loyalty and betrayal, fairness and injustice, authority and rebellion, purity and contamination, belonging and threat. These instincts do not operate on their own. They cluster.

In more traditional societies, moral intuitions tend to reinforce one another. Care, fairness, loyalty, authority, and a sense of the sacred operate together rather than pulling apart. Even when people disagree, they often draw from the same underlying moral vocabulary when interpreting what is happening around them.

That shared moral vocabulary gives a society stability.

Christianity operated at that level.

It did not simply present moral rules. It gave instinct narrative form and placed it inside a larger story about meaning, suffering, hierarchy, obligation, sin, redemption, and ultimate reality. It offered a way of interpreting the world itself.

For people living in unstable conditions, where political authority could be inconsistent and survival uncertain, that kind of story organized experience. It offered coherence in a world that might otherwise feel random. It placed individuals inside a larger order and gave meaning to suffering, duty, death, and loss.

Once that fit took hold between cultural meaning, institutional power, and moral instinct, it became difficult to dislodge.

At the same time, Christianity did not remain completely closed off to innovative thought. Even within a religious order that emphasized authority and inherited truth, there were moments where that inheritance was tested from within.

Peter Abelard represents one of those moments.

His importance lies less in the drama of his life and more in the method he applied to truth itself. The intellectual world he entered was structured around inherited authority. Figures like Augustine were treated as settled voices, and the role of the student was often to understand, organize, and transmit what had already been established.


Peter Abelard with book (giclée print)

Reasoning had a place, but it operated within limits. It was expected to clarify, not destabilize.

Abelard did not reject the tradition from the outside. He worked within it and exposed its internal tensions. In Sic et Non, he placed authoritative statements side by side in a way that made contradiction difficult to ignore.

If these sources were meant to provide certainty, why did they diverge so sharply?

If truth had already been handed down in a unified form, why did it fracture under comparison?

He treated those questions as a starting point rather than a threat to avoid.

“For it is from doubt that we arrive at questioning, and in questioning we arrive at truth.”

That quote represents the change in intellectual posture.

Instead of beginning with certainty and using reason to defend it, Abelard begins with tension and uses reason to work through it. Authority alone no longer settles the issue. Claims must be examined, language clarified, and assumptions tested.

Once questioning becomes legitimate, authority can no longer rely on transmission alone. It now also has to persuade.

Abelard pushed beyond accepted limits. He applied reason to doctrines often treated as beyond rational explanation and placed greater emphasis on intention in moral evaluation. In doing so, he opened space for a more nuanced understanding of ethics, one not entirely bound to inherited categories.

The response to him was what you would expect from institutional power.

He was condemned. His works were burned. He was brought before councils that were less interested in exploring his arguments than in containing their implications. The reaction showed what was at stake. A religious order grounded in authority does not easily absorb a method that legitimizes doubt.

And yet the method persisted.

Even when his specific conclusions were rejected, the habit of inquiry he modeled proved difficult to suppress. The practice of setting opposing views side by side and working through contradiction became central to scholasticism. The intellectual tradition that later shaped medieval universities carried forward elements of an approach once treated as dangerous.

Abelard does not stand alone as the cause of a broader intellectual reopening. The recovery of classical texts, the reintroduction of Aristotle, contact with Islamic and Jewish scholarship, and the growth of educational institutions all played a role.

What his story represents is the shift in attitude.

Inherited knowledge no longer functioned as a sealed inheritance. It became something that could be examined, refined, and, within limits, challenged.

Of course, those constraints never fully disappeared.

Abelard was allowed to question, but not indefinitely. He was permitted to reason, but not without consequence. The same religious culture that made his work possible also defined where it had to stop.

That tension between authority and inquiry did not remain confined to intellectual life. It also carried forward into the institutions that developed over time.

A university lecture (an illustration from the second half of the 14th century).

The medieval university is one of the clearest places to see this pattern at work. Often treated as a distinctly Christian achievement, it grew out of a much broader mix of influences.

In Muslim Spain, Baghdad, and Cairo, Islamic schools, libraries, and observatories held resources far beyond anything available in much of Europe at the time. Arab, Jewish, and Christian scholars shared intellectual interests through expanding trade networks and translation movements. After the Christian capture of Toledo in 1085, that city became one of the key places where these worlds met, allowing texts to move across languages, traditions, and religious boundaries.

The Western reopening of inquiry did not happen because Europe simply looked inward and rediscovered itself.

It happened because knowledge traveled.

Averroes’ commentaries on Aristotle, translated into Latin, became essential sources for thirteenth-century Christian intellectuals, including Thomas Aquinas. That alone should complicate any idea that Christian scholarship developed in isolation. The university absorbed, translated, debated, and reorganized knowledge that had already passed through Greek, Arabic, Jewish, and Latin traditions.

Islamic astronomers (photograph: Science Source)

Even the structure of medieval universities reflects that broader inheritance. They developed their own corporate identities, governed collectively by masters, with distinct curricula and examination systems. By the late thirteenth century, Masters of Arts could vastly outnumber Masters of Theology. Historian Charles Freeman notes one example where 120 teachers of the arts were listed against only 15 Masters of Theology. That imbalance tells you what mattered most. The curriculum leaned heavily on classical texts, not purely Christian foundations.

Christian Europe helped institutionalize learning, but the material being organized was older, broader, and more cosmopolitan than the church-centered story suggests. The university becomes another example of Christianity’s larger pattern: it absorbed existing goods, gave them institutional form, and placed them inside its own theological horizon.

But the results did not move in one direction.

The same religious vision that could support care and dignity could also justify hierarchy and control. Because the tradition depended on scriptural interpretation, and interpretation depended on authority, very different conclusions could emerge from the same source material.

That instability is not only a matter of later interpretation. It is already present in the texts themselves.

The Gospels do not present a single, unified account. They offer overlapping portraits that do not fully align.

In the Gospels of Matthew and Mark, Jesus cries out, “My God, my God, why have you forsaken me?” while in the Gospel of John, he concludes, “It is finished.” The tone shifts from abandonment to completion.

The timeline shifts as well, with the Synoptic Gospels placing the final meal at Passover, while John places the crucifixion before it begins.

Even the ethical posture is not entirely consistent: in Matthew, Jesus teaches “turn the other cheek,” while in Luke, he tells his followers, “Let the one who has no sword sell his cloak and buy one.”

Taken together, these are not minor discrepancies. They open space for fundamentally different readings of what the tradition demands.

Christianity persists not as a fixed form, but as a tradition capable of producing multiple, competing forms while still claiming continuity.

This becomes especially clear in debates over slavery.

Christians were involved in abolition movements, and that history is part of the record. The language of universal moral equality played a real role in mobilizing opposition to slavery and reshaping moral expectations.

But that is not the whole story.

The same texts were also used to defend slavery, reinforce it, and argue that existing social orders were divinely sanctioned.

That contradiction is not incidental. It reveals something important about the Christian inheritance itself.

A religious order that combines universal moral language with authoritative texts creates the conditions for both expansion and constraint. It can push moral concern outward, but it can also bind that concern within approved categories. The outcome depends on who interprets the texts, which authorities prevail, and what social pressures shape the reading.

Critics of abolitionist movements, including Thomas Carlyle, argued that what they saw as abstract humanitarian concern could override more immediate obligations or practical realities. A contemporary political cartoon captured this dynamic under the phrase “telescopic philanthropy”—a tendency to focus moral concern at a distance while neglecting what is closer at hand.

The point I’m trying to make here is not that concern beyond one’s own group is inherently false or wrong.

The point is that moral expansion creates distance.

The farther a concern stretches, the easier it becomes to neglect concrete obligations close at hand: family, neighbors, local order, inherited duties, and the people one is actually responsible for. Abstract compassion can become morally flattering precisely because it asks less of the person expressing it.

Whether one agrees with those criticisms or not, they point to something very real.

A moral order that expands obligation beyond local belonging gains reach, but it also risks losing proportion. It can elevate the stranger while forgetting the neighbor. It can speak beautifully about mankind while failing the people right in front of it.

Christianity extended moral concern beyond tribe and built institutions that carried that vision forward. But it also introduced pressures around authority, interpretation, exclusion, and the limits of acceptable thought.

The good is real, but…so is the tension inside it.

Christianity’s inheritance was not simply compassion, dignity, or education. It was a moral architecture: universal in scope, institutional in form, inward in psychology, and unstable once detached from the cultural world that had once held it together.

That brings us to our next inquiry.

Not just what Christianity gave the West, but what kind of order made those outcomes possible.


SECTION II: THE BAD

Truth, Authority, and the Limits of Inquiry

At this point, the issue is not simply what happened when Christianity moved from the margins to power. I’ve explored that elsewhere: the suppression of rival systems, the narrowing of acceptable thought, and the long habit of treating competing worldviews not as alternatives to debate, but as errors to contain.

The deeper question here is more structural.

What kind of religious order produces those outcomes in the first place?

Because the shift was a reorganization of how truth operated, how disagreement was handled, and how legitimacy was defined.

Earlier Greco-Roman religious and philosophical life was not tolerant in the modern sense, but it was more comfortable with multiplicity. Rival schools, local cults, household gods, civic rituals, and philosophical traditions could coexist without requiring one totalizing creed to absorb or eliminate the rest. That did not make the ancient world peaceful or morally pure. It did mean that truth was not always treated as one fragile object that had to be protected from every rival.

The Abrahamic worldview introduced something different, often called the “Mosaic distinction.”

God giving the Tablets of the Law to Moses, from a manuscript attributed to Chrétien Legouais, 1325 CE. Image source: gallica.bnf.fr / Bibliothèque municipale de Rouen

It drew a sharper line between true and false in a way that changed the stakes of disagreement. Belief was no longer simply one option among many. It became a dividing line. Once that line was drawn, alternative ways of seeing the world did not remain neutral. They became errors, and error began to carry drastic consequences beyond private belief.

If truth is singular and binding, then the religious order has to decide what to do with everything outside of it. Some ideas are absorbed. Some are tolerated for a time. Others are pushed out entirely. But none of them sit comfortably alongside it anymore. They exist in tension with the claim that one truth must govern above all others.

As we previously discussed, Christianity is often credited with preserving learning and building universities, and that claim is not false. Medieval universities became important institutions for intellectual training, debate, law, theology, medicine, and philosophy. They helped organize knowledge and gave scholastic inquiry a durable form.

But that achievement has to be kept in proportion.

The medieval university was an achievement, but it was not a recovery of classical freedom. It was classical inheritance under theological supervision.

Ancient philosophy could be studied, but it had to be reconciled with Christian doctrine. Aristotle could return, but not as Aristotle alone. He had to be interpreted through Christian categories, corrected where necessary, and placed beneath revealed truth. Reason was permitted, even sharpened, but it was not sovereign.

The medieval university did not represent inquiry on open ground. It represented inquiry inside boundaries. Reason could clarify doctrine, defend doctrine, organize doctrine, and reconcile contradictions within inherited authorities. But when reason pressed too far against the architecture of belief, the limits became quite visible.

That does not make medieval learning worthless. It makes it conditional.

And that conditionality is the point.

Christian Europe did not simply preserve the classical world. It received it, edited it, baptized it, and constrained it. What could be made useful to the Christian order survived more easily. What threatened that order did not.

This is the kind of intellectual narrowing later critics would recognize in Christianity’s relationship to philosophy. Heidegger’s critique of onto-theology is not aimed at Christianity alone, but it helps name the pattern: open-ended questioning becomes absorbed into a prior explanatory order. Instead of wonder remaining primary, inquiry is routed through established claims about creation, causality, divine order, sin, and salvation.

The question is no longer allowed to remain fully open.

It has to be answered inside the architecture of doctrine.

Once orthodoxy is established, inquiry operates within boundaries that have already been set, and stepping outside those boundaries starts to carry not just intellectual consequences, but social ones. Access to authority, education, and influence becomes tied, at least in part, to alignment.

At that point, belief is no longer just something people hold. It becomes something that moves outward, seeking to correct and expand.


SECTION III: THE UGLY

Universalism, Power, and the Moral Afterlife

By the time you reach the modern West, the question is no longer whether Christianity shaped it. That much is obvious. The deeper issue is what, exactly, it left behind, and what happens when the conditions that once sustained that inheritance begin to unravel.

Christianity did not simply introduce a set of beliefs and then fade as those beliefs weakened. It reorganized moral life at a level that persists long after doctrine loses its authority. It changed how individuals understood themselves, how they related to others, where moral responsibility resided, and how truth was expected to move through the world.

The ugly side of the Christian inheritance is not merely universalism. It is universalism with a missionary engine.

Christianity does not simply say, “This is true.” It says truth must be spread. Error must be corrected. The world must be brought into submission to the saving order. That structure changes the meaning of difference. A rival worldview is not merely foreign, local, or ancestral. It becomes spiritually demonic.

And once a difference becomes an error, correction can be justified as mercy.

The religious world Christianity emerges from was already in tension with the surrounding Greek and Roman order. Second Temple Judaism didn’t simply blend into Hellenistic life. Again and again, it resisted it—politically, culturally, religiously.

D. H. Lawrence saw this tendency clearly. In Apocalypse, he describes a fear-driven impulse within Christianity—a refusal to leave other ways of understanding the world intact. Not just disagreement, but the drive to overcome, absorb, or eliminate what stands outside the truth.

That instinct is already embedded in the apocalyptic world Christianity emerges from. Second Temple Judaism carries expectations of final judgment, cosmic conflict, and the ultimate victory of a single, rightful order: the coming of the Moshiach, the Messiah. Christianity inherits that framework and gives it a wider reach.

That is where Christianity’s relationship to Rome becomes essential. Christian universalism did not spread on its own. It moved through the late imperial Roman systems: roads, cities, law, administration, literacy, political centralization, and habits of governance already trained toward scale. The faith did not merely conquer Rome. It also inherited Rome’s machinery.

Rome gave Christianity infrastructure. Christianity gave Rome a sacred moral horizon. Together, they helped produce a form of power that could move across peoples, lands, languages, and customs while claiming to operate in the name of truth rather than mere domination.

This is also why Christianity receives too much credit for goods it did not invent.

One reason it’s treated as the source of Western morality is that it became dominant enough to absorb older goods and narrate them backward as Christian achievements. Care for the poor, philosophical inquiry, civic duty, moral discipline, education, and concern for the common good did not appear out of nowhere when Christianity entered history. Many of these were already present in Greek, Roman, Jewish, and local European worlds. Christianity reorganized them inside its own story.

That reorganization gave them reach.

But it also gave them a new master narrative.

Older traditions were often embedded in particular peoples, places, households, ancestors, cities, gods, calendars, and sacred landscapes. Religion was not just a private belief system. It was woven into the life of a people. Christianity altered that relationship by making belief portable. It could cross borders, override local cults, and create a community defined less by blood, land, or inherited custom than by shared confession.

That is one of the most consequential shifts in Western history.

Christianity weakened the older link between people, place, ancestors, and gods. It did not erase those attachments overnight, and in practice it often absorbed local festivals, sacred sites, and folk customs. But the deeper logic changed. The highest belonging was no longer rooted primarily in the local or ancestral. It was relocated into a universal religious identity.

Conversion, then, was not merely persuasion. It was the remaking of belonging.

A people could be separated from their gods, their rituals, their inherited calendar, their sacred places, and their ancestral memory, then folded into a new universal story that claimed to redeem them while also replacing the world that formed them.

Not every conversion was violent. That would be too simple. Some conversions were gradual, political, strategic, sincere, blended, or partial. But once that universal truth claim becomes tied to salvation, rival traditions do not remain equal neighbors. They become obstacles to be overcome, errors to be corrected, or remnants to be absorbed.


The crusades make this structure visible in its most explicit and militarized form.

They were not only political wars. They were religious wars shaped by sacred geography, penitential promise, and the belief that violence could be folded into a redemptive order.

The Crusades did not simply mobilize Europe—they redirected it toward Jerusalem, a sacred center that was not its own.

That does not mean every participant had the same motive, and it does not mean politics, land, wealth, status, and military ambition were irrelevant. Of course they mattered. But the crusading imagination reveals something specific: once warfare is placed inside a sacred story, conquest can be interpreted as obedience, purification, defense, or salvation.

That is the danger of missionary structure joined to power.

It sanctifies expansion.

And this is not confined to medieval history. The same basic pattern can reappear whenever politics inherits religious moral intensity. The opponent is no longer merely wrong about policy. He becomes a threat to truth, justice, salvation, progress, safety, democracy, equality, or whatever sacred term now carries the old theological weight.

At that point, disagreement becomes harder to contain.

The modern West inherited this moral intensity even as explicit Christian authority declined. Most people inherited a world in which Christianity had already begun to lose its grip, but nothing fully replaced it. The rituals became optional. The authority fractured. Yet many of the underlying assumptions remained intact.

What had once been explicitly theological was gradually translated into secular terms.

At the center of that structure is a form of universalism Christianity helped entrench: the idea that all people stand beneath one moral order, that identity is secondary to a broader human category, and that truth applies universally rather than locally. That assumption did not disappear with religious decline. It migrated.

Liberalism, in many of its modern forms, carries that template forward: the individual abstracted from place, lineage, inherited duty, and thick communal belonging, then positioned inside a universal framework of rights, equality, and moral expectation.

The language changes. The structure does not.

The West moved from Christian universalism to liberal universalism without seriously interrogating the universalism itself. It replaced theological justification with philosophical or political justification, but it retained the assumption that the highest moral order transcends particular identities rather than emerging from them.

And what carries forward is not only universal morality, but missionary mentality.

Salvation becomes progress. Sin becomes injustice. Heresy becomes hate. Evangelism becomes activism. The world must still be corrected. The morally backward must still be brought into line.

And the irony is hard to miss. The same people who pride themselves on rejecting religious dogma often reproduce its structure almost perfectly—moral certainty, heresy-hunting, and the impulse to correct and convert, just without calling it religion.

You can see this most clearly in the modern left, especially in its activist and radical edges. What presents itself as political theory often behaves like secularized salvation mythology. The infrastructure is unchanged: the world is broken and the masses need liberation. God is removed, but everything else remains. The heretics still need correction. Sin becomes hierarchy. Salvation becomes self-rule. The missionary doesn’t disappear—he just changes form.

It still sorts people into the righteous and the condemned. It still creates moral taboos. It still treats disagreement as contamination. It still imagines that the world can be redeemed if only the right moral order is imposed—with enough force, shame, education, policy, or institutional pressure.

That is not the absence of Christianity.

It is part of its afterlife.

Later European expansion, and even modern geopolitical projects, often operate within the same structure—intervention framed as liberation, reform, or progress.

Whenever universal moral claims are aligned with power and tied to the belief that truth must spread, action begins to feel necessary rather than optional.

To understand why it persists, and why it adapts so easily across different historical contexts, you have to look at what is happening at a deeper level. Not just in institutions or empires, but within the individual.

Because the most enduring change Christianity introduces is not only institutional.

It is psychological. It alters where morality is located.

In earlier classical traditions, especially in Aristotle, the moral life is oriented outward. The Greek conception of eudaimonia assumes that human beings can develop toward excellence. Flourishing is cultivated through practice, discipline, rational activity, and participation in the world. Character is formed through what one does, and the moral life is outward, embodied, and lived over time within a shared civic and social context.

Christianity, especially through Augustine of Hippo, redirects that focus inward.

The problem is no longer simply what a person does, but what a person is. Human nature itself becomes suspect, marked from the beginning. The doctrine of original sin reframes the individual not as someone developing toward excellence, but as someone starting already compromised. This is not just about isolated wrongdoing. It is about a baseline disorder built into human existence, transmitted across generations, shaping inclination before any conscious choice is made.

From that premise, morality reorganizes itself accordingly. If the problem lies within, then moral evaluation cannot remain limited to outward behavior. It extends inward, into thought, desire, intention, and impulse—the parts of life no one else sees but are still treated as morally significant.

Fra Angelico, The Conversion of St. Augustine (c. 1430–1435)

This becomes structured into daily practice. Monastic traditions classify internal states (temptation, pride, doubt, desire) as if they were items that could be named, tracked, and corrected. Authority expands beyond regulating behavior into defining what counts as acceptable thought, shaping not just action but the boundaries of the inner life itself.

Once it relocates inward, the primary site of regulation is no longer only the community. It is the individual mind, where conscience, guilt, confession, fear, and self-regulation operate continuously, often without any visible external enforcement.

You can see the implications of this in the conflict between Augustine and Pelagius. Pelagius emphasizes human capacity: the ability to choose, improve, and take responsibility for moral development. Augustine rejects that position, insisting on dependence—on God’s grace, on divine intervention, on something beyond human effort.

This is not only a theological disagreement.

It is also a question about agency.

If the individual cannot fully rely on their own capacity to move toward the good, then moral development becomes entangled with God’s authority. Responsibility does not disappear, but it no longer stands on its own. It becomes mediated, conditioned, and in some cases limited, as the individual is situated within a framework that places ultimate transformation outside of purely human reach.

Over time, that tension begins to shape intellectual life as well. Historians like Charles Freeman do not argue that inquiry simply disappeared, but that the conditions surrounding it changed. When belief becomes tied to salvation, and when error carries not only intellectual but spiritual consequences, curiosity itself begins to look different. Questions are no longer neutral exercises. They take on moral weight, and in certain contexts, they begin to carry risk.

Writers like Thomas Paine noticed this and pushed directly against the idea that truth can rest on inherited authority. In The Age of Reason, Paine argues that revelation, once it passes through human hands, can no longer function as unquestionable truth. What begins as divine claim becomes human interpretation, and therefore something that must be examined rather than simply accepted. That move cuts directly against the structure that treats questioning as risk. It reopens the possibility that belief itself should be subject to the same scrutiny as anything else.

Mark A. Noll describes a similar pattern in later Christian intellectual culture: a tendency to preserve belief rather than extend it. Questioning is not always welcomed as curiosity. It can be interpreted as disloyalty, a sign that alignment is weakening rather than deepening. The safest position, in that environment, becomes one of conformity rather than exploration.

The obedient mind is the secure mind.

This is not new. It is already visible earlier in the tradition. The same system that could produce figures like Abelard (where questioning began to reopen) also produces the conditions Noll is describing, where belief becomes something to preserve rather than extend.

The instinct to monitor thought, to moralize disagreement, to treat deviation as more than error—those habits do not emerge in a vacuum. They develop within specific historical conditions, and they persist even as the surrounding language changes.

This is why the internal reorganization matters.

It is not only about doctrine.

It is about how individuals learn to relate to themselves.

If Augustine relocates morality inward, Protestantism amplifies and personalizes that shift. The individual is placed in more direct relation to truth, expected to read, interpret, examine, and align himself without the same mediating structures that once guided that process. Authority does not vanish. It becomes more diffuse and more demanding.

The church hierarchy may weaken in some places, but new pressures emerge through scripture, sermon, household discipline, community surveillance, literacy, and conscience. The individual is made more responsible before God, but also more exposed.

The burden of interpretation moves further into the self.

Over time, that inward structure detaches from the communal and cultural worlds that once gave it shape. What remains is a society of individuals expected to interpret, justify, and regulate themselves inside a universal moral order, but without a shared culture capable of holding that process together.

That misalignment becomes visible in how people interpret conflict, identity, history, and political life.

In modern America, this can still be seen in forms of biblical literalism, dispensationalism, and end-times prophecy that shape how many Christians understand Israel, war, nationhood, and world events. These beliefs do not remain private. They influence political imagination. They affect how people interpret history, alliances, enemies, and what they believe is inevitable or divinely sanctioned.

In this context, belief stops being just belief. It starts shaping how everything else is seen.

That is the same mechanism operating in another key. The pattern that once defined orthodoxy and constrained variation does not disappear. It adapts as the cultural environment shifts. The language evolves, but the underlying habit remains… truth is singular, error is dangerous, and those outside the moral order must be corrected, converted, contained, or cast out.

What this reveals is not a simple story of progress or decline.

Christianity did not leave behind a stable moral foundation that the West either followed or abandoned. It left behind a set of interacting pressures: universalism and particular identity, internalized morality and external authority, individual responsibility and collective order, compassion and conquest, salvation and exclusion.

For a time, those pressures could be held in relative balance, but this fit no longer holds.

The institutions remain, but they no longer command the same trust. The moral instincts remain, but they are no longer guided by a shared tradition. The universal language remains, but it floats above increasingly fractured peoples, places, and loyalties.

Conflict becomes more moralized. Disagreement becomes harder to contain.

This is why the modern West feels both thin and volatile.

Thin, because inherited forms of continuity have weakened.

Volatile, because the moral pressure embedded in the inheritance remains, now operating without the older structures that once gave it proportion.

That is the condition the modern West has inherited.


CONCLUSION: Why the West Still Cannot Escape the Problem

The Christian inheritance of the West cannot be reduced to either gratitude or resentment.

It gave moral concern, meaning to suffering, durable institutions, and the preservation and transmission of knowledge, even as that knowledge was filtered through doctrine. It created a shared moral vocabulary capable of binding large populations together.

But it also changed the terms of belonging.

It loosened religion from peoplehood, place, ancestry, and local custom. It made belief portable. It turned truth into something singular and binding, making disagreement morally charged. Once rival traditions became errors rather than neighbors, the pressure to absorb, correct, or suppress them followed naturally.

The West did not abandon Christianity so much as carry its habits forward. The missionary impulse remained. The abstract individual remained. The suspicion of rooted identity remained. Social justice became the new end times.

That is why a return to Christianity does not solve the problem. It would not restore a stable foundation but reassert one layer of the inheritance while leaving its tensions unresolved.

Secular liberalism does not solve it either. It often preserves the universalism while stripping away the cultural limits that once gave it proportion, asking people to live as abstract individuals inside a moral framework detached from place, memory, and inherited obligation.

What remains is not a coherent worldview, but a contradictory one.

From the beginning, the inheritance carried competing impulses. Early Christianity emerged from an apocalyptic environment while also developing moral and institutional frameworks for life within the world. Over time, those tensions were not resolved but reworked and emphasized in different ways.

Within Protestantism alone, some strands treated the world as something to be ordered and reformed, energizing movements like abolition, while others emphasized its corruption and eventual end, orienting life toward endurance and escape. The divergence is not a break from the tradition, but a difference in emphasis within it.

The result is a system that can point in opposite directions while still claiming the same foundation.

This is not a foundation a civilization can stand on.

A civilization needs moral scale, but also proportion. Compassion, but not so abstract that it forgets its own people. Rights, but not detached from duty. Inquiry, but not subordinated to sacred certainty. Space for disagreement, but enough shared identity to keep it from becoming civilizational warfare.

Above all, it needs rooted obligations.

A civilization cannot survive on abstract principles alone. It needs loyalty, shared memory, boundaries, place, and a people capable of recognizing what is theirs to preserve.

Because removing structure does not remove power. It removes the forms that make power visible and accountable. And when that happens, power does not disappear. It shifts—into forms that are harder to see and harder to resist.

We are not standing outside this inheritance.

We are still working within it.

And the task is not to romanticize Christianity, completely demonize it, or pretend we have escaped it, but to understand what it absorbed, what it built, what it destabilized, and what it left behind clearly enough to stop repeating its most destructive patterns.


Sources

Abelard, Peter. Sic et Non.

Aristotle. Nicomachean Ethics.

Aristotle. Politics.

Arktos Journal and Laurent Guyénot. “The Crusading Civilisation: From the Middle Ages to the Middle East.” Substack, April 3, 2026.

Atkinson, Kenneth. “Judean Piracy, Judea and Parthia, and the Roman Annexation of Judea: The Evidence of Pompeius Trogus.” Electrum 29 (2022): 127–145. https://doi.org/10.4467/20800909EL.22.009.15779

Augustine. Confessions.

Augustine. The City of God.

Brown, Peter. Augustine of Hippo: A Biography. Berkeley: University of California Press, 2000.

Burkert, Walter. The Orientalizing Revolution: Near Eastern Influence on Greek Culture in the Early Archaic Age. Cambridge, MA: Harvard University Press, 1992.

Carlyle, Thomas. “Occasional Discourse on the Negro Question.”

Doner, Colonel V. “Cognitive Dissonance of Political Activists, Or Whatever Happened to the Religious Right?” Chalcedon, July 1, 1999.

Freeman, Charles. The Closing of the Western Mind: The Rise of Faith and the Fall of Reason. London: Heinemann, 2002.

Freeman, Charles. The Reopening of the Western Mind: The Resurgence of Intellectual Life from the End of Antiquity to the Dawn of the Enlightenment. London: Head of Zeus, 2023.

Lawrence, D. H. Apocalypse. 1931.

Locke, John. A Letter Concerning Toleration. 1689.

MacCulloch, Diarmaid. Reformation: Europe’s House Divided, 1490–1700. London: Allen Lane, 2003.

MacMhaolain, Aodhan. “The Transmission of Fire: How to Keep Tradition Burning.” The Enchiridion, April 9, 2026.

Montesquieu, Charles de Secondat. The Spirit of the Laws. 1748.

Noll, Mark A. The Scandal of the Evangelical Mind. Grand Rapids: Eerdmans, 1994.

Paine, Thomas. The Age of Reason. 1794–1807.

Paine, Thomas. Common Sense. 1776.

Thorsteinsson, Runar M. Roman Christianity and Roman Stoicism: A Comparative Study of Ancient Morality. Oxford: Oxford University Press, 2010.

West, Martin L. The East Face of Helicon: West Asiatic Elements in Greek Poetry and Myth. Oxford: Clarendon Press, 1997.

The Fragility of Freedom

What Liberty Actually Depends On

Hey hey, welcome back to Taste of Truth Tuesdays. Today’s episode is where we dig into philosophy, culture, history, and the ideas that have shaped the world we’re living in—everything from classical texts to the American founding documents that are still very much relevant to how we should think about freedom today.


There’s a growing sense that something isn’t working.

You see it in the fragmentation of identity, the erosion of shared norms, and the breakdown of trust across institutions.

You don’t have to look very hard to notice it.

People don’t trust elections, medicine, or the media—sometimes all at once, and often for completely different reasons.

Dating is “freer” than it’s ever been, and yet it feels more unstable, more transactional, and more confusing than most people expected.

Corporations speak like moral authorities, issuing statements about justice and truth, while operating through incentives that have nothing to do with either.

Everything is still functioning. But less of it feels legitimate.

In my last piece, I traced one part of this problem back to a common assumption: that Christianity built the foundations of the West. But when you actually follow the development of those ideas, you find that much of what we associate with Western thought—natural law, reason, and the structure of political life—has deeper roots in the Greco-Roman philosophical tradition.

That matters, because the frameworks we inherit shape what we think freedom is, and what we expect it to do.

This piece is a continuation of that question. Not only about where those ideas came from, but about what they require to hold together.

Because a free society doesn’t sustain itself on freedom alone. It depends on discipline, restraint, and a shared understanding of limits—conditions that the system itself cannot produce.

And when those begin to erode, the system doesn’t just break. It follows a pattern that’s been observed for a very long time.

Jefferson intentionally modeled the Virginia State Capitol in Richmond directly on the Roman temple Maison Carrée (c. 16 CE)

I. The Fear Beneath the Founding

This isn’t a new problem.

The relationship between freedom and instability shows up wherever societies try to govern themselves.

The American founding emerged out of that concern. The people designing the system weren’t just thinking about how to create liberty; they were trying to understand why it collapses.

The colonists weren’t casually referencing Rome. English translations of Vertot’s Revolutions that Happened in the Government of the Roman Republic (1720) were in almost every library, private or institutional, in British North America. They studied how free societies decay, how power shifts from shared trust into something self-serving, and how internal corruption (not just external threat) brings systems down.

They believed they were watching it happen in real time.

What they took from antiquity was not blind optimism about freedom, but caution.

And this wasn’t limited to classical history. As Bernard Bailyn observed, the colonists were immersed in dense and serious political literature, shaped by philosophy and by sustained reflection on the problem of power.

Part of what they were working with was an older line of thought running through Greek and Roman philosophy.

The idea that human life is not directionless. That there are patterns to how people live, and that some ways of living lead to stability and flourishing, while others lead to breakdown.

You can already see the foundation of this in Aristotle. He didn’t use the term “natural law,” but the structure is there. Human beings have a nature, and flourishing comes from living in alignment with it—not whatever we happen to want in the moment, but a way of life shaped by discipline, balance, and the cultivation of virtue over time.

The Stoics make this more explicit. They describe the world as ordered by reason—logos—and argue that human beings can come to understand that order.

From that perspective, moral truth isn’t something we invent. It’s something we discover. And law, at its best, should reflect that underlying structure rather than contradict it.

By the time you get to Rome, this idea is articulated more directly. Cicero describes a true law grounded in right reason and in agreement with nature—something universal, not dependent on custom or preference, but rooted in reality itself.

These ideas don’t disappear. They are carried forward and developed.

Christian thinkers later absorb and expand them, especially through Thomas Aquinas, who integrates Greek philosophy and Roman legal thought into a more explicit framework of natural law. And that influence is real. It’s part of the Western story whether we like it or not.

But that’s not the point of this piece.

What matters here is that by the time you reach the early modern period, this idea of a structured moral order—something that places limits on behavior and grounds freedom in discipline—is already well established.

You can see that continuity clearly in how the founders and colonists read earlier political thought. Returning to those earlier sources, Plato describes how political systems degrade over time, arguing that excessive and undisciplined freedom can produce disorder, which eventually leads people to accept tyranny in the search for stability. Aristotle traces how democracies collapse when law gives way to persuasion and personality. Polybius maps the recurring cycle through which governments rise and decay.

What he described was called anacyclosis, a recurring cycle of political systems. Governments begin in relatively stable forms, rule by one, by a few, or by many, but over time they degrade. Kingship becomes tyranny. Aristocracy becomes oligarchy. Democracy, when it loses discipline, collapses into what he called ochlocracy, rule by the mob.

This wasn’t abstract to the colonists. As I said, they believed they were watching this pattern unfold in real time. And it shows up just as clearly in the political language of the founding era itself.


As Bailyn explains, monarchy, aristocracy, and democracy were each seen as capable of producing human happiness. But left unchecked, each would inevitably collapse into its corrupt form: tyranny, oligarchy, or mob rule.


Writings like Cato’s Letters were widely read in the colonies and helped shape how ordinary people understood government, power, and liberty.

What’s striking when you read Cato more closely is how little confidence its authors placed in moral restraint alone. The letters don’t describe freedom as unlimited expression or personal autonomy. The idea that belief, fear of God, or good intentions would keep power in check is treated as dangerously naive. Power is not self-regulating, and it is not made safe by the character or beliefs of those who hold it. It has to be exposed, limited, and actively resisted—because even institutions and ideas meant to restrain it, including religion, can be repurposed to justify its expansion.

It describes government more as a trust—one that exists to protect the conditions that make ordinary life possible.

As Cato writes:

“Power is like fire; it warms, it burns, it destroys. It is a dangerous servant and a fearful master.”

And more directly:

“What is government, but a trust committed…that everyone may, with the more security, attend upon his own?”

The assumption is clear. Power must be restrained. Freedom depends on it.

But in Cato’s framing, that restraint doesn’t come from structure alone. It depends on constant exposure and resistance. Freedom of speech and a free press aren’t treated as abstract rights, but as active safeguards—tools for uncovering corruption and preventing power from consolidating unchecked. The logic is simple but demanding: power does not correct itself. It expands, protects its own interests, and, if left unchallenged, begins to operate beyond the limits it was given.

The point of understanding the political cycles of revolution wasn’t to say that any one system was uniquely flawed. It was that all systems are vulnerable to the same underlying problem:

Human nature.

Self-interest eventually creeps in. Restraint erodes. Power shifts from a trust into something personal and extractive.

And once that shift happens, the form of government matters less than the character of the people within it. That thread runs directly into the founding.

The American system wasn’t designed as a pure democracy. It was an attempt to stabilize a problem earlier thinkers had already identified.

Rather than choosing a single form of government, the founders built a mixed system, blending elements of rule by one, rule by a few, and rule by many. An executive to act with decisiveness. A Senate to provide deliberation and continuity. A House to represent the people more directly.

This wasn’t accidental.

It reflected an awareness that each form of government carries its own risks, and that concentrating power in any one place tends to accelerate its corruption.

By distributing power across different institutions, the goal was to create tension within the system itself. Ambition would check ambition. Competing interests would slow the consolidation of power.

From my understanding, they weren’t trying to escape the cycle Polybius described. They were trying to manage it.

They weren’t designing a perfect system. They were attempting to design one built to withstand imperfect people.

But even that depended on something it could not guarantee.

In Federalist No. 10, James Madison writes:

“The latent causes of faction are thus sown in the nature of man.”

He’s not describing a temporary problem. He’s describing a permanent one.

Differences in opinion, interests, wealth, and temperament don’t disappear. They organize. They form groups. And those groups will sometimes pursue aims that are at odds with the rights of others or the stability of the system itself.

Madison’s conclusion is straightforward:

“The causes of faction cannot be removed… relief is only to be sought in the means of controlling its effects.”

That distinction is crucial. He doesn’t try to eliminate conflict or force unity. He assumes conflict is inevitable and builds a system around that reality.

Instead of requiring perfect discipline from individuals, the structure disperses power, multiplies interests, and forces negotiation. Representation slows decision-making. Scale makes domination more difficult.

Freedom is preserved not by removing conflict, but by structuring it.


They looked ahead with anxiety, not confidence, because they believed liberty was collapsing everywhere. New tyrannies had spread like plagues. The world had become, in their words, “a slaughterhouse.” Across the globe, the rulers of the East were almost universally absolute tyrants. Africa was described as scenes of tyranny, barbarism, confusion, and violence. France was ruled by arbitrary authority, Prussia by absolute government. Sweden and Denmark had “sold their liberties.” Rome was burdened by civil and religious control. Germany was a hundred-headed hydra, and Poland was consumed by chaos. Only Britain and the colonies were believed to still hold onto liberty. And even there, barely. (From revolutionary-era political writings, as compiled by Bernard Bailyn.)


University of Virginia Rotunda, modeled after the Roman Pantheon

II. Ordered Liberty and the Kind of Person It Requires

The founders believed in liberty, but not as an unlimited good. They believed in ordered liberty. Freedom that exists within a framework of responsibility, discipline, and civic virtue. The system they designed assumed a certain kind of person, one capable of self-governance, restraint, and participation in a shared moral world.

That assumption was not optional. It was structural. It’s easy to miss how much is built into that.

And this is where the tension emerges: the modern understanding of freedom begins to diverge from its origins.

Classical liberalism, in its earlier form, was not, as Deneen claims in Why Liberalism Failed, about detaching individuals from all institutions, identities, or relationships. It was about protecting individuals from tyranny while preserving the conditions necessary for a functioning society. It assumed the continued existence of family, community, religious frameworks, and shared norms.

But Deneen is right about one thing: early liberal thought did introduce something new.

John Locke, for example, reframed institutions like marriage as voluntary associations rather than fixed, inherited structures. That didn’t mean early liberal political philosophy was designed to erode the family. But it did change how those institutions were understood. It placed individual choice alongside social stability in a way that could be expanded over time.

To understand where this expansion comes from, you have to look at what came before it.


“Without freedom of thought, there can be no such thing as wisdom; and no such thing as publick liberty, without freedom of speech: Which is the right of every man, as far as by it he does not hurt and control the right of another; and this is the only check which it ought to suffer, the only bounds which it ought to know.”

— Cato’s Letters, No. 15


III. The Moral Inheritance of the West

The Lia Fáil inauguration stone on the Hill of Tara, County Meath, Ireland

In many pre-Christian societies, moral life wasn’t organized primarily around abstract rules or universal doctrines, but around continuity. Identity was tied to lineage, family, and inherited roles. Authority came not from individual preference, but from what had been passed down—customs, obligations, and expectations shaped over generations. To live well wasn’t just a personal project. It meant upholding something larger than yourself: maintaining the reputation of your family, fulfilling your role within a community, and carrying forward a way of life that you didn’t create but were responsible for preserving.

You can see how this played out in places like Anglo-Saxon England, where social structure and legal life were more embedded in family and local custom than in centralized doctrine. Women, for example, could own property, inherit land, appear in legal proceedings, and in some cases exercise real economic and political influence. These weren’t modern equality frameworks, but they complicate the assumption that agency and rights only emerge through later “progress.”

That structure did more than organize society. It created cohesion. It gave people a shared reference point for what mattered, what was expected, and what should be restrained—even when no one was watching. Authority wasn’t something constantly renegotiated. It was inherited, lived, and reinforced through participation in a shared way of life.

Greek and Roman life was also structured around civic duty, hierarchy, and inherited roles.

Their moral frameworks reflected that structure. Thinkers like Aristotle emphasized virtue as balance, habits cultivated over time within a community, oriented toward harmony and the common good.

As Christianity spread, moral authority became less tied to lineage and local custom, and more anchored in universal doctrine—rules that applied across communities, not just within them. Obligation didn’t vanish, but it was increasingly reframed. Less about inherited roles within a specific people, more about the individual’s relationship to a broader moral order.

That shift didn’t happen all at once, and it’s not a simple story. The development of early Christianity, its integration into the Roman Empire, and the ways it reshaped intellectual life and authority are far more complex than a few paragraphs can capture here. I’ve gone into that in more detail elsewhere, particularly around the Constantinian period and the rise of revelation and the fall of reason.

This development intensifies further with the rise of Protestantism, where that reframing of obligation becomes even more explicit: the movement from the Seven Deadly Sins to the Ten Commandments as the dominant moral framework.

Avarice (Avaritia), from “The Seven Deadly Sins,” engraved by Pieter van der Heyden after Pieter Bruegel the Elder, published by Hieronymus Cock, 1558

The Seven Deadly Sins (pride, greed, lust, envy, gluttony, wrath, and sloth) are not rules in the strict sense. They describe internal dispositions, patterns of character that distort judgment and pull a person out of balance. They are concerned with formation, with who you are becoming.

The Ten Commandments, by contrast, are structured as prohibitions. You shall not. They define boundaries, obedience, and transgression in relation to divine authority.

Both frameworks aim at moral order. But they operate differently. One is oriented toward the cultivation of character within a shared moral world. The other emphasizes compliance, law, and accountability before God.

The Protestant Reformation further reduced the role of mediating institutions, emphasizing personal conscience, direct access to scripture, and an individual relationship to truth. Authority became less external and more internalized, but also more individualized and less uniformly shared.

The emphasis is unmistakable. Moral responsibility is no longer primarily inherited or communal, but individual and direct.

This did not dissolve the community. But it did begin to relocate the moral center of gravity, from the maintenance of balance within a community, to the accountability of the individual before God.

A political system built on individual rights and self-governance emerged from a cultural framework that had already begun to center moral responsibility at the level of the individual.

At the same time, Christianity reshaped how the natural world was understood. Earlier traditions often treated nature as infused with meaning, order, or even divinity. Christianity maintained that the world was ordered, but no longer sacred in itself. It was created, not divine.

That distinction introduced a kind of distance. A world that is no longer sacred in itself becomes, over time, easier to treat as something external, something to study, measure, and ultimately use.

None of these shifts were inherently destabilizing on their own. But they altered the underlying framework.

Over time, they contributed to a gradual reorientation, one that made it easier to conceive of the individual as separate, autonomous, and capable of standing apart from inherited structures.

That development would later be expanded and amplified through liberal thought.

But the point is not that Protestant Christianity caused modern individualism. It is that it helped make it thinkable.

By the time you reach the Enlightenment and the American founding, those earlier shifts had not disappeared. They had been carried forward and reworked into a new framework—one increasingly shaped by reason, not as a rejection of religion entirely, but as a refusal to let authority go unquestioned simply because it claims moral or divine legitimacy.


The state of nature has a law of nature to govern it, which obliges every one: and reason, which is that law, teaches all mankind, who will but consult it, that being all equal and independent, no one ought to harm another in his life, health, liberty, or possessions… (and) when his own preservation comes not in competition, ought he, as much as he can, to preserve the rest of mankind, and may not, unless it be to do justice on an offender, take away, or impair the life, or what tends to the preservation of the life, the liberty, health, limb, or goods of another.

— John Locke, Second Treatise of Government, on the rights to life, liberty, and property of ourselves and others


IV. When Freedom Loses Its Structure

Over the next two centuries, that framework continued to expand. Early expansions focused on political participation—who could vote, who counted as a citizen, and who could take part in public life.

By the mid-20th century, that expansion accelerated through civil rights movements, which pushed the language of equality and access further into law, culture, and institutions.

From the 1960s into the 1970s, the focus widened into personal life. Questions of family, marriage, sexuality, and individual identity were increasingly reframed in terms of autonomy and personal choice.

The sexual revolution, in particular, was widely understood as an expansion of personal freedom: loosening traditional constraints around sex, marriage, and family life. But over time, some of the assumptions underlying that shift have come under renewed scrutiny. The idea that women can navigate complete sexual and relational autonomy without significant cost appears increasingly fragile, especially in the absence of the social structures that once provided stability and direction.

Expanding rights changes the system, not just access to it.

What’s often assumed is that this expansion is self-justifying—that extending rights is always a net good, and that the system can absorb that expansion without consequence. But that assumption is rarely examined.

As the scope of participation widens, so does the demand placed on the system and on the people within it.

A political system built on equal participation assumes a level of judgment, responsibility, and long-term thinking that is not evenly distributed. It assumes that individuals, given more freedom, will be able to navigate it without undermining the conditions that make it possible in the first place.

What we also see in modern times is that the cultural and institutional structures that once shaped behavior (family expectations, community standards, shared moral frameworks) have become much weaker, more contested, or easier to reject.

For most of known human history, moral behavior wasn’t just a matter of personal conviction. It was embedded in small, stable, reputation-based communities where actions were visible, remembered, and judged over time. Behavior carried consequences because it was tied to relationships that endured.

That community system relied on three conditions: shared standards, stable enforcement, and long-term relationships. As those weaken, accountability becomes less consistent or nonexistent. Not because human nature has changed, but because the structures that made behavior visible and tied to consequences have broken down.

Part of that shift is tied to the broader move toward secularism. As religious frameworks lose authority, the shared narratives that once provided cohesion, meaning, and moral orientation begin to fragment. This doesn’t eliminate the human need for structure—it shifts where people look for it. It disperses into competing sources of identity, morality, and meaning.

In The Republic, Plato makes a similar observation about belief itself. What matters is not just what people claim to believe, but whether those beliefs hold under pressure. “We must test them… to see whether they will hold to their convictions when they are subjected to fear, pleasure, or pain.”

Without shared structures reinforcing those convictions, belief becomes more reactive, more situational, and more easily reshaped by external forces.

We are left with a society of multiple, incompatible systems of belief—each with its own values, demands, and claims to legitimacy, but no widely accepted structure holding them together. 

What was once a shared moral world becomes a contested one.

In Propaganda, Edward Bernays makes a blunt observation: the conscious and intelligent manipulation of the masses is not only possible, but essential to managing modern society. That insight becomes more relevant, not less, in the absence of a shared framework.

Because when a society loses the unifying structures that once held it together, the vacuum doesn’t stay empty. New ideologies rush in (secular, political, cultural) offering belonging, morality, and meaning, often with more intensity than the systems they replaced.

More autonomy. Less formation. More fragmentation. Less agreement on what freedom even demands.

This raises a harder question: whether removing earlier constraints produced the kind of freedom it promised, or simply replaced one set of pressures with another.

As that imbalance deepens, people don’t simply become more independent. They look for stability elsewhere.

This is where Deneen’s observation becomes useful, even if I don’t fully agree with his framing. As traditional institutions weaken, dependence doesn’t disappear—it shifts. From local, relational structures to larger, more abstract systems like the state and the market.

Another way to see this is that societies don’t just rely on formal institutions. They rely on something less visible—a kind of cultural immune system. Shared norms, expectations, and informal boundaries that regulate behavior without constant enforcement.

When those weaken, systems don’t become freer. They become easier to exploit.

One of the clearest examples of that vulnerability is the modern corporation.

The American system was designed in deep suspicion of concentrated power, yet over time it has extended expansive protections to corporate entities, allowing large institutions, backed by wealth, media, and legal abstraction, to shape public life in ways the founding framework was poorly equipped to restrain. 

The founders were wary of concentrated power, but they were not designing a system for multinational corporations with vast economic and informational reach. Over time, constitutional doctrine expanded in ways that made these entities increasingly difficult to limit, culminating in decisions like Citizens United, where the Court held that independent political spending by corporations and unions could not be restricted under the First Amendment.

This is part of the same pattern. A system built to preserve liberty becomes easier to exploit when power no longer appears as a king, a church, or a visible ruling class, but as diffuse institutions operating through law, markets, and media.

And as that has happened, trust has eroded, cooperation has broken down, and the very conditions that made freedom possible have begun to unravel.

But I don’t think that was the original aim of classical liberalism.

It’s not that it set out to dismantle the community. It’s that over time, through cultural, economic, and technological changes, the balance between freedom and structure eroded. And now we’re dealing with the consequences of that imbalance.

The more I read, the harder it is to ignore the tension at the heart of the American Revolution itself.

It spoke the language of liberty, but it often operated through pressure, surveillance, and social enforcement. Groups like the Sons of Liberty didn’t just resist authority. They replaced it with their own forms of coercion, loyalty tests, and public punishment.

The Sons of Liberty regularly tarred and feathered people who offended them or who served as officers of the British government.

I am not saying the ideals were wrong. The point is that liberty, on its own, doesn’t sustain itself. When formal authority is rejected, power doesn’t disappear. It simply relocates.

And without shared discipline or internal restraint, it often reappears in more fragmented, less accountable forms.

Liberty is not the absence of power.

It’s a question of how power is structured, restrained, and lived.

There’s another reaction to this tension that’s worth acknowledging, even if it goes too far.

Thinkers like Mencken argued that the real problem isn’t the system, but the people—that democracy inevitably lowers the standard because it reflects the average citizen. 

And I understand the sentiment, but that framing misses something important.

The issue isn’t that people are inherently incapable of self-government.

It’s that self-government requires habits, discipline, and formation that a system alone cannot produce.

What makes this moment particularly interesting is that the unease people feel doesn’t map neatly onto political categories.

Across both the left and the right, there’s a growing intuition that something isn’t functioning the way it should.

You see it in the rare points of agreement. Public frustration over the lack of transparency in the Epstein files cuts across political lines, with overwhelming majorities convinced that key information is still being withheld and justice is yet to be served. 

You see it in foreign policy as well. Even in a deeply divided country, there is broad skepticism toward escalating conflicts like the war involving Iran, with many of us questioning the purpose, cost, and direction of involvement. 

That concern isn’t new. It shows up clearly in Cato’s Letters, where distrust of power wasn’t abstract—it was grounded in history. The Roman Empire was a constant reference point, especially in how standing armies, once established, could be turned inward, gradually eroding liberty and consolidating control.

They weren’t against defense. But they were deeply wary of permanent military power and foreign entanglements that primarily served those in control, not the public. War wasn’t just protection. It was one of the fastest ways power could expand.

And it’s hard not to wonder how they would look at what we now call the military-industrial complex—how permanent it’s become, how embedded it is, and how easily it justifies its own expansion. 

Power attracts interests that seek to influence it through money, proximity, and favor, and over time those interests become embedded within the system itself, shaping decisions in ways that are no longer aligned with the public.

How this shows up today is that governmental power no longer feels like a trust. We the People, those of us who want to put America and her people’s needs first, are witnessing what feels like an occupied government like never before, with institutions that are no longer held accountable. They have become self-protective and disconnected from the very people they’re meant to serve.


“Power, in proportion to its extent, is ever prone to wantonness.” — Josiah Quincy Jr., Observations on the Boston Port-Bill (1774)

“The supreme power is ever possessed by those who have arms in their hands.” (colonial political writing, mid-18th century)

Standing armies, they warned, could become “the means, in the hands of a wicked and oppressive sovereign, of overturning the constitution… and establishing the most intolerable despotism.” — Simeon Howard, sermon (c. 1773–1775)

Which is why Jefferson insisted on keeping “the military… subject to the civil power,” not the other way around (1774).


There’s also empirical evidence from over a decade ago pointing in that direction. 

Sometimes known as “the oligarchy study,” a 2014 paper by Martin Gilens and Benjamin Page analyzed nearly 1,800 policy decisions in the United States and found that economic elites and organized business interests have a substantial independent influence on policy outcomes, while average citizens have little to no independent impact.

Policies favored by the majority tend to pass only when they align with the preferences of the wealthy. When they don’t, public opinion has almost no measurable effect.
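To make the shape of that finding concrete, here is a minimal sketch of the kind of model the study estimated: a logistic regression asking whether a policy was adopted, given support among average citizens and among economic elites. The data and coefficients below are invented for illustration; this is not the authors’ code or dataset.

```python
# Minimal sketch of a Gilens & Page-style analysis on invented data.
# A logistic regression asks: does policy adoption track the preferences
# of median-income citizens, economic elites, or both?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1800  # roughly the number of policy cases in the 2014 study

median_support = rng.uniform(0, 1, n)  # support at the 50th income percentile
elite_support = rng.uniform(0, 1, n)   # support at the 90th income percentile

# Hypothetical world matching the headline finding: outcomes are driven
# almost entirely by elite preferences.
log_odds = -2.0 + 0.2 * median_support + 3.0 * elite_support
passed = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

X = sm.add_constant(np.column_stack([median_support, elite_support]))
result = sm.Logit(passed, X).fit(disp=False)
print(result.params)  # the elite coefficient dominates; the median's is near zero
```

When the two preference measures are entered together, the weight on average citizens’ preferences comes out near zero, which is essentially the pattern the published estimates showed.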

This one study doesn’t prove that the system has fully collapsed into oligarchy.

But it does reinforce our intuition that something has shifted, that power is no longer functioning as it should and that representation is much more limited than we assume.

What I’ve learned from putting this together is that this concern is not new. It’s ancient.

It’s the same fear that appears in the Greek philosophers, carries through Rome, reemerges in the founding era, and is now unfolding again in modern society.

This is the same dynamic Madison was pointing to in Federalist No. 10. When legitimacy starts to weaken, people don’t simply disengage.

They form groups around competing explanations for what’s gone wrong—different interests, different priorities, different visions of what should replace it.

Within the modern left, those responses are not all the same.

Establishment Democrats still operate within existing systems. Liberals tend to push for reform through policy. Progressives begin to question the structure itself. And further out, democratic socialists and revolutionary groups are not aiming to fix the system, but to replace it entirely.

That distinction matters. Because once you move from reform to replacement, you’re no longer arguing about how to use a system.

You’re arguing about whether it should exist at all. At the far end of that spectrum, some movements push toward dismantling foundational structures entirely, treating them as irredeemably corrupt.

You can see this in specific, coordinated efforts.

Large-scale protest movements like the recent “No Kings” demonstrations are one example. On March 28th, 2026, they brought 8 million people into the streets across the United States, with more than 3,300 coordinated events spanning all 50 states, setting a record for the largest single day of protest in U.S. history.

They have planned actions like May Day strikes, where activists are calling for mass labor disruption and economic shutdown. And organized noncooperation campaigns designed to train people in how to resist, overwhelm, or halt existing systems altogether.

The logic is that capitalism is no longer something to work within, but something to resist, bypass, or bring to a stop.

Not reform. But disruption and replacement.

I’ve spent enough time around these spaces to understand the appeal. When institutions feel captured or unresponsive, the instinct is not to reform them but to burn them to the ground.

Freedom is not collapsing because people have rejected it. It’s becoming unstable because we can no longer agree on what it is, what it requires, or what its limits should be.

And as more of the burden falls on individuals while leadership fails to model it, people start to feel both responsible and powerless. And that’s where apathy begins to take hold: when it no longer feels like it matters, especially to the people at the top.

United States Capitol Rotunda — The Dome Painting “The Apotheosis of Washington” Painted by Constantino Brumidi in 1865

V. The Human Problem at the Center of Freedom

A republic doesn’t survive on laws alone.

It survives on citizens who can exercise restraint, who understand limits, who see freedom not just as permission, but as responsibility.

One way to understand this shift more clearly is through moral psychology. Human beings don’t arrive at morality purely through reasoning. We rely on a set of underlying intuitions (care, fairness, loyalty, authority, and a sense of the sacred) that shape how we judge right and wrong before we ever explain why.

In more conservative or traditional societies, these moral intuitions tend to operate together rather than in isolation. Care, fairness, loyalty, authority, and a sense of the sacred reinforce one another, creating a more unified moral framework. People may still disagree, but they are drawing from a shared moral language, with expectations around family, roles, restraint, and what should or should not be done.

But that kind of shared moral framework doesn’t hold evenly across modern society.

The second way to see this is by looking at how these moral intuitions cluster into distinct patterns across different groups. In the chart, you can see three broad orientations: progressives, conservatives, and libertarians. Progressives tend to cluster around care and fairness. Conservatives draw from a wider range, incorporating loyalty, authority, and a sense of the sacred alongside those concerns. Libertarians center heavily on liberty, placing less weight on the others. What looks like a disagreement about politics is often a difference in moral orientation: people emphasizing entirely different parts of the same moral landscape.

And the differences don’t just show up in orientation, but in intensity.

The bar graph illustrates this pattern more clearly, showing how different groups actually prioritize these moral intuitions.

Secular liberals and the religious left tend to emphasize care and fairness most strongly, focusing on reducing harm and promoting equality. By contrast, more traditional or socially conservative groups draw more evenly across a broader set of values, including loyalty, authority, and a sense of the sacred alongside care and fairness. Libertarians tend to narrow even further, prioritizing individual liberty while placing less emphasis on collective or traditional moral structures. 

The result isn’t just disagreement over morality—it’s a difference in what people are even measuring in the first place, which makes shared judgment harder to sustain.
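To make that difference concrete, here is a small sketch of what those group profiles look like side by side. The numbers are invented for the example, loosely patterned on Haidt’s published findings, not taken from the charts above.

```python
# Invented, illustrative endorsement scores (0-1) for six moral intuitions,
# loosely patterned on Haidt's moral foundations research.
foundations = ["care", "fairness", "loyalty", "authority", "sanctity", "liberty"]

profiles = {
    "progressive":  [0.90, 0.90, 0.30, 0.25, 0.20, 0.60],
    "conservative": [0.70, 0.70, 0.70, 0.70, 0.70, 0.60],
    "libertarian":  [0.50, 0.55, 0.30, 0.30, 0.20, 0.95],
}

# Each group's top priorities differ: progressives narrow to care/fairness,
# conservatives spread evenly, libertarians spike on liberty.
for group, scores in profiles.items():
    ranked = sorted(zip(foundations, scores), key=lambda pair: -pair[1])
    print(f"{group:>12}: " + ", ".join(f"{f}={s:.2f}" for f, s in ranked[:3]))
```

The exact values don’t matter. What matters is the shape: one profile narrows to two intuitions, one spreads across all of them, and one spikes on a single value.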

You can see the split in how people respond to the same breakdown in trust.

For many on the left, it reinforces the idea that freedom means removing constraint entirely, and that leads to a push to dismantle systems they see as corrupt or oppressive.

For those on the right, it produces deep suspicion: distrust of elections, media, public health authority, and government itself, along with a desire to restore order, stability, and clearer boundaries. In some cases, that turns into nostalgia for earlier structures: family roles, gender norms, and forms of religious authority that are seen as more stable, even if that restoration comes with its own trade-offs.

These aren’t just different political positions. 

They reflect different instincts about what matters most and different assumptions about what freedom is for.

And both risk missing the deeper question.

Not just: what system creates freedom?

But what kind of people can sustain it?

This is where Aristotle’s framework becomes difficult to ignore. It starts from the premise that people are not equal in their capacity for judgment or self-governance, and it builds from there rather than pretending those differences don’t matter. In that sense, it may be closer to the truth than many modern assumptions.

That inequality shows up in how people live, how they make decisions, and how they exercise restraint. This is where his framework of virtue comes in: not as an ideal, but as a way of describing what it actually takes to live well and participate in a functioning society.

He didn’t think virtue was about perfection. He thought of it as balance. Courage sits between cowardice and recklessness.

Self-control between indulgence and insensibility.

Generosity between stinginess and excess.

Virtue is not automatic. It is cultivated. And it can be lost.

He applied that same logic to political systems. A government can exist in a healthy form, oriented toward the common good, or in a corrupted form, serving only a faction. At that point, the difference isn’t just structural. It comes down to character.

One tension that keeps resurfacing in political thought is the gap between equality in principle and inequality in capacity.

You can see this play out in small, everyday ways. Give ten people the same freedom, the same opportunity, the same set of rules—and you don’t get the same outcomes. Some plan ahead. Some act impulsively. Some take responsibility. Others look for ways around it. The structure is equal, but the response isn’t. 

Because human beings are not identical in judgment, discipline, or temperament. Some are more capable of long-term thinking, self-restraint, and navigating complexity than others.

A free society doesn’t eliminate those differences. It has to operate in spite of them. And that creates the real challenge.

A system built on self-government depends on habits it cannot enforce, on restraint it cannot require, and on a shared understanding of limits it cannot guarantee.

Which raises a difficult question:

What happens when a system built on equal freedom depends on unequal capacities to sustain it?

Freedom is not self-sustaining. The more we treat it like it is, the more fragile it becomes. 

When those conditions weaken, the structure doesn’t collapse all at once. It loosens, then drifts, and eventually begins to follow the same pattern that earlier thinkers warned about. 

Not because the idea of freedom was flawed, but because it was always contingent on something more demanding than we like to admit.

And that’s what makes the older warnings so difficult to ignore. The concerns that show up in Greek philosophy, carry through Rome, and reappear in the founding era weren’t tied to one moment in history. They’re describing something recurring. Power doesn’t stay put. It accumulates. It protects itself. And without pressure against it, it shifts (often quietly) into something more self-serving than it was at the start.

The documents and letters from the founding era weren’t written for a stable world. They were written by people who assumed this drift was inevitable. That’s why they obsessed over things like faction, corruption, and the abuse of power. Not just as political problems, but as moral ones. Because once corruption sets in, it doesn’t just distort institutions. It reshapes the people within them. A corrupt government cannot be a just government. That’s why they treated free speech, a free press, and an informed public less like ideals and more like essential tools: ways of forcing power into the open before it had the chance to consolidate.

Cato’s Letters, in particular, was relentless on this point. Its authors knew that a society consumed with wealth, status, and self-interest doesn’t just become unequal. It becomes easier to manipulate, easier to divide, and eventually less capable of governing itself at all. Civic virtue wasn’t a side note. It was the condition that made freedom possible in the first place.

And when you look at it from that angle, it doesn’t feel like you’re reading writings from the 18th century. It feels familiar, much closer to home. 

Of course, the scale is different now. The mechanisms are different. But the tension is very much the same. Armed with modern technology, governments and corporations operate with a level of reach the founders never could have imagined. Information is filtered, behavior is shaped, and power often moves through systems that don’t look like power at all. You don’t always see it directly. But you feel its effects.

So the responsibility doesn’t go away. It never did.

If anything, it becomes less obvious and more necessary at the same time.

A system like this doesn’t hold because it was designed well. It holds, when it does, because enough people are still paying attention. Still pushing back. Still unwilling to let power define its own limits.

And once that slips…once that expectation fades, the structure doesn’t fail all at once. It just stops holding in the way it used to. And the pattern continues.


Resources: 

This piece pulls from a mix of ancient sources, founding-era writing, and modern critiques. Not because I agree with all of them, but because each one sharpens a different part of the problem. If you want to work through it yourself, these are the ones that shaped how I’m thinking about it:

Corporate Rights and the Most Absurd Legal Fiction: A Reactionary History and Analysis of Corporate Personhood

Bernard Bailyn — The Ideological Origins of the American Revolution
Less about what the founders built, more about what they were reacting to—especially the collapse of earlier republics.

Alexander Hamilton, James Madison, John Jay — The Federalist Papers
A direct look at how they thought about human nature, power, and why freedom needs structure to hold.

Patrick Deneen — Why Liberalism Failed
I don’t agree with all of it, but the critique of modern individualism and the erosion of shared norms is worth taking seriously.

Plato — The Republic
Still one of the clearest descriptions of how excessive freedom destabilizes a society.

Aristotle — Politics
Helpful for understanding how democracies drift when law loses authority and personality takes over.

Polybius — Histories
His framework for how governments rise and decay is hard to unsee once you see it.

Louise Perry — The Case Against the Sexual Revolution
A modern example of how expanded freedom doesn’t always produce the outcomes people expect.

Jonathan Haidt — The Righteous Mind
Useful for understanding why reason alone doesn’t hold societies together—and why people experience morality so differently.

Charles Freeman — The Closing of the Western Mind
Explores how early Christianity reshaped intellectual life in the West.
Also recommended: The Opening of the Western Mind

Roger E. Olson — The Story of Christian Theology
A clear overview of how Christian thought developed over time and how its internal tensions evolved.

Judith Bennett — Women in the Medieval English Countryside
Insight into everyday life, structure, and roles in pre-modern society.

Christine Fell — Women in Anglo-Saxon England
A look at social organization and cultural norms in early English society.

When Discipline Stops Working

What Women Were Never Told About Weight, Aging, and Control

The Science They Never Told Us

This is the first episode of 2026, and I wanted to start the year by slowing things down, getting a bit personal instead of chasing the latest talking points.

At the end of last year, I spent time reading a few books that genuinely stopped me in my tracks. Not because they offered a new diet or a new protocol, but because they challenged something much deeper: the story we’ve been told about discipline, control, and women’s bodies.

There is a reason women’s bodies change across the lifespan. And it has very little to do with willpower, discipline, or personal failure.

In Why Women Need Fat, evolutionary biologists William Lassek and Steven Gaulin make the case that most modern conversations about women’s weight are fundamentally misinformed. Not because women are doing something wrong, but because we’ve built our expectations on a misunderstanding of what female bodies are actually designed to do.

A major part of their argument focuses on how industrialization radically altered the balance of omega-6 to omega-3 fatty acids in the modern food supply, particularly through seed oils and ultra-processed foods. They make a compelling case that this shift plays a role in rising obesity and metabolic dysfunction at the population level.

I agree that this imbalance matters, and it’s a topic that deserves its own full episode. At the same time, it does not explain every woman’s story. Diet composition can influence metabolism, but it cannot override prolonged stress, illness, hormonal disruption, nervous system dysregulation, or years of restriction. In my own case, omega-6 intake outside of naturally occurring sources is relatively low and does not account for the changes I’ve experienced. That matters, because it reminds us that biology is layered. No single variable explains a complex adaptive system.

One of the most important ideas in the book is that fat distribution matters more than fat quantity.

Women do not store fat the same way men do. A significant portion of female body fat is stored in the hips and thighs, known as gluteofemoral fat. This fat is metabolically distinct from abdominal or visceral fat. It is more stable, less inflammatory, and relatively enriched in long-chain fatty acids, including DHA, which plays a key role in fetal brain development.

From an evolutionary standpoint, this makes sense. Human infants are born with unusually large, energy-hungry brains. Women evolved to carry nutritional reserves that could support pregnancy and lactation, even during times of scarcity. In that context, having fat on your lower body was not a flaw or a failure. It was insurance.

From this perspective, fat is not excess energy. It is deferred intelligence, stored in anticipation of future need. This is where waist-to-hip ratio enters the conversation.

Across cultures and historical periods, a lower waist-to-hip ratio in women has been associated with reproductive health, metabolic resilience, and successful pregnancies. This is not about thinness, aesthetics, or moral worth. It is about fat function, not fat fear: where fat is stored and how different tissues behave metabolically inside the body.

And in today’s modern culture we have lost that distinction.

Instead of asking what kind of fat a woman carries, we became obsessed with how much. Instead of understanding fat as tissue with purpose, we turned it into a moral scoreboard. Hips became a problem. Thighs became something to shrink. Curves became something to discipline.

Another central idea in Why Women Need Fat is biological set point.

The authors argue that women’s bodies tend to defend a natural weight range when adequately nourished and not under chronic stress. When women remain below that range through restriction, over-exercise, or prolonged under-fueling, the body does not interpret that as success. It interprets it as threat.

Over time, the body adapts, not out of defiance, but out of protection.

Metabolism slows. Hunger and fullness cues become unreliable. Hormonal systems compensate. When the pressure finally eases, weight often rebounds, sometimes beyond where it started, because the body is trying to restore safety.

From this perspective, midlife weight gain, post-illness weight gain, or weight gain after years of restriction is not mysterious. It is not rebellion. It is regulation.
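One way to see why this is regulation rather than failure is with a toy model. The sketch below is illustrative only, with made-up coefficients rather than clinical values: it shows how expenditure that adapts downward below a defended range causes weight loss to stall well above what simple calories-in, calories-out arithmetic predicts.

```python
# Toy set-point sketch (illustrative numbers, not a clinical model).
# Energy expenditure adapts downward as weight falls below a defended
# range, so a fixed intake buys less and less weight loss over time.
def simulate_weeks(weeks, intake_kcal, adapt_kcal_per_kg=15.0,
                   set_point_kg=65.0, start_kg=65.0):
    weight = start_kg
    for _ in range(weeks):
        maintenance = 22 * weight * 1.5  # rough kcal/day at this weight
        # Hypothetical adaptation: expenditure drops by adapt_kcal_per_kg
        # kcal/day for each kg the body sits below its defended set point.
        adaptation = max(0.0, set_point_kg - weight) * adapt_kcal_per_kg
        daily_deficit = (maintenance - adaptation) - intake_kcal
        weight -= 7 * daily_deficit / 7700  # ~7700 kcal per kg of tissue
    return weight

# Same intake, same year; the adapting body ends heavier than naive
# calories-in/calories-out arithmetic predicts.
print(simulate_weeks(52, 1400, adapt_kcal_per_kg=0.0))   # naive CICO: ~47 kg
print(simulate_weeks(52, 1400, adapt_kcal_per_kg=15.0))  # adaptive: ~51 kg
```

Run both lines and the adaptive body ends the year several kilograms heavier than the naive arithmetic expects, on the same intake. That gap is the set point defending itself.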

None of this is taught to women.

Instead, we are told that if our bodies change, we failed. That aging is optional. That discipline and botox should override biology. That the number on the scale tells the whole story.

So, before we talk about culture, family, trauma, or personal experience, this matters:

Women’s bodies are not designed to stay static.
They are designed to adapt.

Once you understand that, everything else in this conversation changes.


Why the Body Became the Battlefield

This is where historian Joan Jacobs Brumberg’s work in The Body Project: An Intimate History of American Girls provides essential context, but it requires some precision.

Girls have not always been free from shame. Shame itself is not new. What has changed is what women are taught to be ashamed of, and how that shame operates in daily life.

Brumberg asks a question that still feels unresolved today:
Why is the body still a girl’s nemesis? Shouldn’t sexually liberated girls feel better about themselves than their corseted counterparts a century ago?

Based on extensive historical research, including diaries written by American girls from the 1830s through the 1990s, Brumberg shows that although girls today enjoy more formal freedoms and opportunities, they are also under more pressure and at greater psychological risk. This is due to a unique convergence of biological vulnerability and cultural forces that turned the adolescent female body into a central site of social meaning during the twentieth century.

In the late nineteenth and early twentieth centuries, girls did not typically grow up fixated on thinness, calorie control, or constant appearance monitoring. Their diaries were not filled with measurements or food rules. Instead, they wrote primarily about character, self-restraint, moral development, relationships, and their roles within family and community.

One 1892 diary entry reads:

“Resolved, not to talk about myself or feelings. To think before speaking. To work seriously. To be self-restrained in conversation and in actions. Not to let my thoughts wander. To be dignified. Interest myself more in others.”

In earlier eras, female shame was more often tied to behavior, sexuality, obedience, and virtue. The body mattered, but primarily as a moral symbol rather than an aesthetic project requiring constant surveillance and correction.

That changed dramatically in the twentieth century.

Brumberg documents how the mother-daughter connection loosened, particularly around menstruation, sexuality, and bodily knowledge. Where female relatives and mentors once guided girls through these transitions, doctors, advertisers, popular media, and scientific authority increasingly stepped in to fill that role.

At the same time, mass media, advertising, film, and medicalized beauty standards created a new and increasingly exacting ideal of physical perfection. Changing norms around intimacy and sexuality also shifted the meaning of virginity, turning it from a central moral value into an outdated or irrelevant one. What replaced it was not freedom from scrutiny, but a different kind of pressure altogether.

By the late twentieth century, girls were increasingly taught that their bodies were not merely something they inhabited, but something they were responsible for perfecting.

A 1982 diary entry captures this shift starkly:

“I will try to make myself better in any way I possibly can with the help of my budget and baby-sitting money. I will lose weight, get new lenses, already got a new haircut, good makeup, new clothes and accessories.”

What changed was not the presence of shame, but its location. Shame moved inward.

Rather than being externally enforced through rules and prohibitions, it became self-policed. Girls were taught to monitor themselves constantly, to evaluate their bodies from the outside, and to treat appearance as the primary expression of identity and worth.

Brumberg is explicit on this point. The fact that American girls now make their bodies their central project is not an accident or a cultural curiosity. It is a symptom of historical changes that are only beginning to be fully understood.

This is where more recent work, such as Louise Perry’s The Case Against the Sexual Revolution, helps extend Brumberg’s analysis into the present moment. Perry argues that while sexual liberation promised autonomy and empowerment, it often left young women navigating powerful biological and emotional realities without the social structures that once offered protection, guidance, or meaning. In that vacuum, the body became one of the few remaining sites where control still seemed possible.

The result is a paradox. Girls are freer in theory, yet more burdened in practice. The body, once shaped by communal norms and shared female knowledge, becomes a solitary project, managed under intense cultural pressure and constant comparison.

For many girls, this self-surveillance does not begin with magazines or social media. It begins at home, absorbed through tone, comments, and modeling from the women closest to them.

Brumberg argues that body dissatisfaction is often transmitted from mother to daughter, not out of cruelty, but because those mothers inherited the same aesthetic anxieties. Over time, body shame becomes a family inheritance, passed down quietly and persistently.

Some mothers transmit it subtly.

Others do it bluntly.

This matters not because my experience is unique, but because it illustrates what happens when a body shaped by restriction, stress, and cultural pressure is asked to perform indefinitely. Personal stories are often dismissed as anecdotal, but they are where biological theory meets lived reality.

If you want to dive deeper into this topic:


Where It All Began: The Messages That Shape Us

I grew up in a household where my body was not simply noticed. It was scrutinized, compared, and commented on. Comments like that do not fade with time. They shape how you see yourself in mirrors and photographs. They teach you that your body must be managed and monitored. They plant the belief that staying small is the price of safety.

So, I grew up believing that if I could control my body well enough, I could avoid humiliation. I could avoid becoming the punchline. I could avoid being seen in the wrong way.

For a while, I turned that fear into discipline.


The Years Before the Collapse: A Lifetime of Restriction and Survival

Food never felt simple for me. Long before bodybuilding, chronic pain, or COVID, I carried a strained relationship with eating. Growing up in a near constant state of anxiety meant that hunger cues often felt unpredictable. Eating was something to plan around or push through. It rarely felt intuitive or easy.

Because of this, I experimented with diets that replaced real meals with cereal or shakes. I followed plans like the Special K diet. I relied on Carnation Instant Breakfast instead of full meals. My protein intake was low. My fear of gaining weight was high. Restriction became familiar.

Top left: when I started working out obsessively at age 16. Top right and bottom: middle school, when I was at my “heaviest,” the period that drove the disordered behaviors.

In college, I became a strict vegetarian out of compassion for animals, but I did not understand how to meet my nutritional needs. I was studying dietetics and earning personal training certifications while running frequently and using exercise as a way to maintain control. From the outside, I looked disciplined. Internally, my relationship with food and exercise remained tense and inconsistent.

Later, I became involved in a meal-replacement program through an MLM. I replaced two meals a day with shakes and practiced intermittent fasting framed as “cleanse days.” In hindsight, this was structured under-eating presented as wellness. It fit seamlessly into patterns I had lived in for years.

Eating often felt overwhelming. Cooking felt like a hurdle. Certain textures bothered me. My appetite felt fragile and unreliable. This sensory sensitivity existed long before the parosmia that would come years later. From early on, food was shaped by stress rather than nourishment.

During this entire period, I was also on hormonal birth control, first the NuvaRing and later the Mirena IUD, for nearly a decade. Long-term hormonal modulation can influence mood, inflammation, appetite, and weight distribution. It added another layer of complexity to a system already under strain.

Looking back, I can see that my teens and twenties were marked by near constant restriction. Restriction felt normal. Thriving did not.

The book Why Women Need Fat discusses the idea of a biological weight “set point,” the range a body tends to return to when conditions are stable and adequately nourished. I now understand that I remained below my natural set point for years through force rather than balance. My biology never experienced consistency or safety.

This was the landscape I carried into my thirties.


The Body I Built and the Body That Broke

By the time I entered the bodybuilding world in 2017 and 2018, I already had years of chronic under-eating, over-exercising, and nutrient gaps behind me. Bodybuilding did not create my issues. It amplified them.

I competed in four shows. People admired the discipline and the physique. Internally, my body was weakening. I was overtraining and undereating. By 2019, my immune system began to fail. I developed severe canker sores, sometimes twenty or more at once. I started noticing weight-loss resistance. Everything I had done in the past was no longer working. On my thirty-fifth birthday, I got shingles. My energy crashed. My emotional bandwidth narrowed. My body was asking for rest, but I did not know how to slow down.

Dive deeper into my bodybuilding journey here:

Around this time, I was also navigating eating disorder recovery. Learning how to eat without panic or rigid control was emotionally exhausting even under ideal circumstances… but little did I know things were about to take a massive turn for the worse.


COVID, Sensory Loss, and the Unraveling of Appetite

After getting sick with COVID in late 2020, everything shifted again. I developed parosmia, a smell and taste distortion that made many foods taste rotten or chemical. Protein and cooked foods often tasted spoiled. Herbs smelled like artificial chemicals. Eating became distressing and, at times, impossible.

My appetite dropped significantly. There were periods where my intake was very low, yet my weight continued to rise. This is not uncommon following illness or prolonged stress. The body often shifts into energy conservation, prioritizing survival over weight regulation.

Weight gain became another source of grief. Roughly thirty pounds over the next five years. I feel embarrassed and avoid photographs. I often worry about how others will perceive me.

If this experience resonates, it is important to say this clearly: your body is not betraying you. It is responding to stress, illness, and prolonged strain in the way bodies are designed to respond.


Why Women’s Bodies Adapt Instead of “Bounce Back”

When years of restriction, intense exercise, chronic stress, illness, hormonal shifts, and emotional trauma accumulate, the body often enters a protective state. Metabolism slows. Hormonal signaling shifts. Hunger cues become unreliable. Weight gain or resistance to weight loss can occur even during periods of low intake, because energy regulation is being driven by survival physiology rather than simple calorie balance.

This is not failure. It is physiology.

The calories-in, calories-out model does not account for thyroid suppression, nervous system activation, sleep disruption, pain, trauma, or metabolic adaptation. It reduces a complex biological system to arithmetic.

Women are not machines. We are adaptive systems built for survival. Sometimes resilience looks like holding onto energy when the body does not feel safe.


The Systems That Reinforce Shame

Despite this biological reality, we live in a culture that ties women’s value to discipline and appearance. When women gain weight, even under extreme circumstances, we blame ourselves before questioning the system.

Diet culture frames shrinking as virtue.

Toxic positivity encourages acceptance without context.

Industrial food environments differ radically from those our ancestors evolved in.

Medical systems often dismiss women’s pain and metabolic complexity.

Social media amplifies comparison and moralizes body size.

None of this is your fault. And all of it shapes your experience.

This is why understanding the science matters. This is why telling the truth matters. This is why sharing stories matters.


In the book More Than a Body, Lindsay and Lexie Kite describe how women are taught to relate to themselves through constant self-monitoring. Instead of living inside our bodies, we learn to watch ourselves from the outside. We assess how we look, how we are perceived, and whether our bodies are acceptable in a given moment.

This constant self-surveillance does real harm. It pulls attention away from hunger, pain, fatigue, and intuition. It trains women to override bodily signals in favor of appearance management. And over time, it creates a split where the body is treated as a project to control rather than a system to understand or care for.

When you layer this kind of self-objectification on top of chronic stress, restriction, illness, and trauma, the result is not empowerment. It is disconnection. And disconnection makes it even harder to hear what the body needs when something is wrong.

Weight gain is not just a biological response. It becomes a moral verdict. And that is how women end up fighting bodies that are already struggling to keep them alive.

The Inheritance Ends Here

For a long time, I believed that breaking generational cycles only applied to mothers and daughters. I do not have children, so I assumed what I inherited would simply end with me, unchanged.

Brumberg’s work helped me see this differently.

What we inherit is not passed down only through parenting. It moves through tone, silence, and self-talk. It appears in how women speak about their bodies in front of others. It lives in the way shame is normalized.

I inherited a legacy of body shame. Even on the days when I still feel its weight, I am choosing not to repeat it.

For me, the inheritance ends with telling the truth about this journey and refusing to speak to my body with the same cruelty I absorbed growing up. It ends here.


Closing the Circle: Your Body Is Not Broken

I wish I could end this with a simple story of resolution. I cannot. I am still in the middle of this. I still grieve. I still struggle with eating and movement. I am still learning how to inhabit a body that feels unfamiliar.

But I know this: my body is not my enemy. She is not malfunctioning. She is adapting to a lifetime of stress, illness, restriction, and emotional weight.

If you are in a similar place, I hope this offers permission to stop fighting yourself and start understanding the patterns your body is following. Not because everything will suddenly improve, but because clarity is often the first form of compassion.

Your body is not betraying you. She is trying to keep you here.

And sometimes the most honest thing we can do is admit that we are still finding our way.


References

  1. Brumberg, J. J. (1997). The Body Project: An Intimate History of American Girls. Random House.
  2. Lassek, W. D., & Gaulin, S. J. C. (2011). Why Women Need Fat: How “Healthy” Food Makes Us Gain Excess Weight and the Surprising Solution to Losing It Forever. Hudson Street Press.
  3. Kite, L., & Kite, L. (2020). More Than a Body: Your Body Is an Instrument, Not an Ornament. Houghton Mifflin Harcourt.

Scientific and academic sources

  1. Lassek, W. D., & Gaulin, S. J. C. (2006). Changes in body fat distribution in relation to parity in American women. Evolution and Human Behavior, 27(3), 173–185.
  2. Lassek, W. D., & Gaulin, S. J. C. (2008). Waist–hip ratio and cognitive ability. Proceedings of the Royal Society B, 275(1644), 193–199.
  3. Dulloo, A. G., Jacquet, J., & Montani, J. P. (2015). Adaptive thermogenesis in human body-weight regulation. Obesity Reviews, 16(S1), 33–43.
  4. Fothergill, E., et al. (2016). Persistent metabolic adaptation after weight loss. Obesity, 24(8), 1612–1619.
  5. Kyle, U. G., et al. (2004). Body composition interpretation. American Journal of Clinical Nutrition, 79(6), 955–962.
  6. Simopoulos, A. P. (2016). Omega-6/omega-3 balance and obesity risk. Nutrients, 8(3), 128.

Trauma, stress, and nervous system context

  1. Sapolsky, R. M. (2004). Why Zebras Don’t Get Ulcers. Henry Holt and Company.
  2. Walker, P. (2013). Complex PTSD: From Surviving to Thriving. Azure Coyote Books.

Projection, Power, and the Pagan Revival

When Belief Becomes Control

This episode isn’t about religion versus religion.
It’s about power, fear, and what happens inside belief systems when conformity becomes more important than honesty.

In this conversation, I’m joined by Sigrin, founder of Universal Pagan Temple.

She’s a practicing Pagan, a witch, a public educator, and someone who speaks openly about leaving Christianity after experiencing fear-based theology, spiritual control, and shame. I want to pause here, because even as an agnostic, when I hear the word witch, my brain still flashes to the cartoon villain version. Green. Ugly. Evil. That image didn’t come from nowhere. It was taught.

One of the things we get into in this conversation is how morality actually functions in Pagan traditions, and how different that framework is from what most people assume.

She describes leaving Christianity not as rebellion, but as self-preservation. And what pushed her out wasn’t God. It was other Christians.

For many people, Christianity isn’t learned from scripture.
It’s learned from other Christians.

The judgment.
The constant monitoring.
The fear of being seen as wrong, dangerous, or spiritually compromised.

In high-control Christian environments, conformity equals safety. Questioning creates anxiety. And the fear of social punishment often becomes stronger than belief itself.

When belonging is conditional, faith turns into survival.


What We Cover in This Conversation:

Paganism Beyond Aesthetics

A lot of people hear “Paganism” and immediately picture vibes, trends, or cosplay. We spend time breaking that assumption apart.

  • Sigrin explains that many beginners jump straight into ritual without actually invoking or dedicating to the divine.
  • She talks about the difference between aesthetic practice and intentional practice.
  • For people who don’t yet feel connected to a specific god or goddess, she offers grounded guidance on how to approach devotion without forcing it.
  • We talk about the transition she experienced moving from Christianity, to atheism, to polytheism.
  • We explore the role of myth, story, and symbolism in spiritual life.
  • She shares her experience of feeling an energy she couldn’t deny, even after rejecting belief entirely.
  • We touch on the wide range of ways Pagans relate to pantheons, including devotional, symbolic, ancestral, and experiential approaches.

The takeaway here isn’t “believe this.”
It’s that Paganism isn’t shallow, trendy, or uniform. It’s relational.


No Holy Book, No Central Authority

One of the most misunderstood aspects of Paganism is the absence of a single text or governing authority.

  • Sigrin references a line she often uses: “If you get 20 witches in a room, you’ll have 40 different beliefs.”
  • We talk about how Pagan traditions don’t operate under enforced doctrine or centralized belief.
  • She brings up the 42 Negative Confessions from ancient Egyptian tradition as an example of ethical self-statements rather than commandments.
  • These function more like reflections on character than laws imposed from above.
  • We compare this to moral storytelling across different myth traditions rather than rigid rule-following.
  • She emphasizes intuition and empathy as core tools for ethical decision-making.
  • I add the role of self-reflection and introspection in systems without external enforcement.

This points to something important: without a script, responsibility shifts inward.

Why This Can Be Hard After Christianity

We also talk honestly about why this freedom can be uncomfortable, especially for people leaving authoritarian religion.

  • Sigrin notes how difficult it can be to release belief in hell, even after leaving Christianity.
  • Fear doesn’t disappear just because belief changes.
  • When morality was once externally enforced, internal trust has to be rebuilt.
  • Pagan paths often require learning how to sit with uncertainty rather than replacing one authority with another.

This isn’t easier.
It’s quieter.
And it asks more of the individual.

That backdrop matters, because it shapes how Paganism gets misunderstood, misrepresented, and framed as dangerous.


The “Pagan Threat” Narrative

One of the reasons Pagan Threat has gained attention and sparked controversy is not just its content, but whose voice it carries and how it’s framed at the outset.

  • The book was written by Pastor Lucas Miles, a senior director with Turning Point USA Faith and author of other conservative religious critiques.
  • The project is positioned as a warning about what Miles sees as threats to the church and American society.
  • The foreword was written by Charlie Kirk, founder of Turning Point USA. His introduction positions the book as urgent reading for Christians.

From there, the book makes a striking claim:

  • It describes Christianity as a religion of freedom, while framing Paganism as operating under a hive mind or collective groupthink.

A key problem is which Paganism the book is actually engaging.

  • The examples Miles focuses on overwhelmingly reflect liberal, online, or activist-adjacent Pagan spaces, particularly those aligned with progressive identity politics.
  • That narrow focus gets treated as representative of Paganism as a whole.
  • Conservative Pagans, reconstructionist traditions, land-based practices, and sovereignty-focused communities are largely ignored.

As a result, “wokeness” becomes a kind of explanatory shortcut.

  • Modern political anxieties get mapped onto Paganism.
  • Gender ideology, progressive activism, and left-leaning culture get blamed on an ancient and diverse spiritual category.
  • Paganism becomes a convenient container for everything the author already opposes.

We also talk openly about political realignment, and why neither of us fits cleanly into the right/left binary anymore. I raise the importance of actually understanding Queer Theory, rather than using “queer” as a vague identity umbrella.

To help visualize this, I reference a chart breaking down five tiers of the far left, which I’ll include here for listeners who want context.

Next, in our conversation, Sigrin explains why the groupthink accusation feels completely inverted to anyone who has actually practiced Paganism.

  • Pagan traditions lack central authority, universal doctrine, or an enforcement mechanism.
  • Diversity of belief isn’t a flaw. It’s a defining feature.
  • Pagan communities often openly disagree, practice differently, and resist uniformity by design.

The “hive mind” label ignores that reality and instead relies on a caricature built from a narrow and selective sample.

“Trotter and Le Bon concluded that the group mind does not think in the restricted sense of the word. In place of thoughts, it has impulses, habits, and emotions. Lacking an independent mind, its first impulse is usually to follow the example of a trusted leader. This is one of the most firmly established principles of mass psychology.” (Edward L. Bernays, Propaganda)

We contrast this with Christian systems that rely on shared creeds, orthodoxy, and social enforcement to maintain cohesion.

Accusations of groupthink, in that context, often function as projection from environments where conformity is tied to spiritual safety.

In those systems, agreement is often equated with faithfulness and deviation with danger.

Globalism, Centralization, and Historical Irony

We end the conversation by stepping back and looking at the bigger historical picture.

  • The book positions Christianity as the antidote to globalism.
  • At the same time, it advocates coordinated religious unification, political mobilization, and cultural enforcement.
  • That contradiction becomes hard to ignore once you zoom out historically.

Sigrin points out that pre-Christian Pagan worlds were not monolithic.

  • Ancient polytheist societies were highly localized.
  • City-states and regions had their own gods, rituals, myths, and customs.
  • Religious life varied widely from place to place, even within the same broader culture.

I reference The Darkening Age by Catherine Nixey, which documents this diversity in detail.

  • Pagan societies weren’t unified under a single doctrine.
  • There was no universal creed to enforce across regions.
  • Difference wasn’t a problem to be solved. It was normal.

Christianity, by contrast, became one of the first truly globalizing religious systems.

  • A single truth claim.
  • A centralized authority structure.
  • A mandate to replace local traditions rather than coexist with them.

That history makes the book’s framing ironic.

  • Paganism gets labeled “globalist,” despite being inherently local and decentralized.
  • Christianity gets framed as anti-globalist, while proposing further consolidation of belief, power, and authority.

What This Is Actually About

This isn’t about attacking Christians as people.
And it’s not about defending Paganism as a brand.

It is a critique of how certain forms of Christianity function when belief hardens into certainty and certainty turns into control.

Fear-based religion and fear-based ideology share the same problem.
They promise safety.
They demand conformity.
And they struggle with humility.

That doesn’t describe every Christian.
But it does describe systems that rely on fear, surveillance, and moral enforcement to survive.

What I appreciate about this conversation is the reminder that spirituality doesn’t have to look like domination, hierarchy, or a battle plan.

It can be rooted. Local. Embodied.

It can ask something of you without erasing you.

And whether someone lands in Paganism, Christianity, or somewhere else entirely, the question isn’t “Which side are you on?”

It’s whether your beliefs make you more honest, more grounded, and more responsible for how you live.

That’s what I hope people sit with after listening.

Ways to Support Universal Pagan Temple 

Every bit of support helps keep the temple lights on, create more free content, and maintain our community altar. Thank you from the bottom of my heart! 🖤

☕ Buy me a coffee (one-time support)
https://www.buymeacoffee.com/UniversalPaganTemple

💝 Make a direct donation to the temple
https://www.paypal.com/donate?hosted_button_id=6TMJ4KYHXB36U

🌟 Become a Patreon/SubscribeStar member (monthly perks & exclusive content)
https://www.patreon.com/universalpagantemple
https://www.subscribestar.com/the-pagan-prepper

📜 Join our Substack community (articles, rituals & updates)
https://universalpagantemple.substack.com

🔮 Book a Rune or Tarot reading (Etsy)
https://www.etsy.com/shop/RunicGifts

📚 Grab our books on Amazon
  • Wicca & Magick: Complete Beginner’s Guide
https://www.amazon.com/Wicca-Magick-Complete-Beginners-Guide-ebook/dp/B019MZN8LQ
  • Runes: Healing and Diet by Sigrún and Freya Aswynn
https://www.amazon.com/dp/B08FP25KH4#averageCustomerReviewsAnchor
  • The Egyptian Gods and Goddesses for Beginners
https://www.amazon.com/Egyptian-Gods-Goddesses-Beginners-Worshiping/dp/1537100092

Even just watching, liking, commenting, and sharing is a huge help!  
Blessed be 

🌀

The Older Story Beneath Christmas

A History of Yule and Cultural Amnesia

Every December, the same argument erupts like clockwork.

“Christmas is pagan.”
“No it isn’t, stop lying.”
“Actually, it’s Saturnalia.”
“Actually, it’s Jesus’ birthday.”

A Christian calling others out 😮

And honestly, the argument itself is the least interesting part.

Because Christmas didn’t replace older solstice traditions.
It grew out of them.

Long before doctrine, people were already gathering at midwinter. Lighting fires. Sharing food. Hanging evergreens. Leaving offerings. Watching the sun closely. Trying to survive the longest night of the year.

Most of what we now call “Christmas spirit” (the lights, the feasting, the greenery, the warmth, even the winter gift-giver) is older than Christian theology by centuries.

And yet, when I converted to Christianity in 2022, none of that felt magical.

It felt dangerous.


My First Christian Christmas: Panic, Purging, and Fear

I was only a few months into my short-lived Christian phase when December arrived, and I suddenly found myself terrified that Christmas was pagan, demonic, or spiritually contaminated.

I burned books.
I threw away crystals.
I cleaned my home like I was preparing for divine inspection.
I interrogated every decoration like it might open a portal.

I’m not exaggerating. I recently found an old document I wrote during that time, and reading it now is unsettling. It reads like I took an entire bucket of fundamentalist talking points, sprinkled in some Wikipedia conspiracies, and shook it like a snow globe.

Here are real lines I wrote in 2022:

“Christmas is a religious holiday. But it’s not Christian.”
“Christmas is the birthday of the sun god Tammuz.”
“Mistletoe came from druids who used it for demonic occult powers.”
“Santa Claus is based on Odin and meant to deceive children.”
“Jesus does not want you to celebrate Christmas.”

I believed every word of it.

Because fear-based Christianity works by shrinking your imagination.
It makes symbols dangerous.
History suspicious.
The world a spiritual minefield.

That was my first clue this wasn’t JUST about theology. It was about fear.
And the inability to hold layered meaning.


Why Winter Was Sacred Long Before Religion

For pre-industrial people, winter wasn’t cozy.

It wasn’t aesthetic. It wasn’t symbolic. It was dangerous.

Food stores ran low. Animals died. Illness spread. Darkness swallowed the day.

When the sun disappeared, it wasn’t metaphorical. It was existential.

That’s why midwinter mattered everywhere, not because cultures shared gods, but because they shared bodies, seasons, and risk.

Homes were built from thick logs, stone, and earth. Materials with thermal mass that held heat long after the fire dimmed. Hearths weren’t decorative. They were survival technology. Families and animals gathered together because warmth meant life.

This wasn’t primitive living. It was skilled living. And it shaped belief.

Seasonal rites weren’t abstract spirituality.
They were instructions for how to endure.


This Isn’t Just Capitalism — It’s Cultural Amnesia

It’s tempting to blame modern capitalism for the way winter has been flattened into noise, urgency, and forced cheer. And capitalism absolutely accelerated the problem.

But that explanation skips a much older rupture.

Pre-Christian seasonal traditions already honored limits. Rest. Darkness. Slowness. Winter was understood as a time of contraction, not productivity. You didn’t push harder in December. You pulled inward. You conserved. You waited.

Those rhythms were disrupted long before department stores and advertising campaigns.

First came religious overwrite… seasonal intelligence reframed into theological narratives that demanded certainty and transcendence over embodiment. Then came industrialization, which severed daily life from land, daylight, and season entirely. Artificial light erased night. Clocks replaced the sun. Productivity became moral.

By the time capitalism arrived in its modern form, much of the damage was already done. Capitalism didn’t invent our disconnection from seasonal limits. It inherited it.

What we’re really dealing with isn’t just exploitation.

It’s amnesia.

We forgot how winter works. We forgot how rest works. We forgot how darkness functions as part of a healthy cycle. And once that memory was gone, it became easy to sell us endless brightness in the darkest part of the year.


What Yule Actually Was, Before Christianity Rewrote It

This is where the history gets interesting….

The earliest surviving written reference to Yule comes from the 8th century, recorded by the Venerable Bede, an English monk and historian.

That timing matters.

Like much of what we know about pre-Christian Europe, Yule was documented only after conversion had already begun. The earlier traditions were primarily oral, and many were actively suppressed or destroyed, which means the written record is fragmentary and filtered through Christian authors.

That does not mean the traditions were new.

It means Christianity arrived late to write them down.

Later sources, such as Snorri Sturluson in Heimskringla (13th century), describe Yule as a midwinter feast involving communal drinking, oath-making, ceremonial meals, ancestor honoring, and celebrations lasting multiple days, often twelve. By the time Snorri was writing, Christianity had already reshaped much of Nordic life, yet the seasonal patterns he records remain strikingly consistent.

The record is not pristine. But it is consistent enough to tell us this:
Yule was a land-based, seasonal response to winter, practiced long before Christianity and remembered imperfectly afterward.

So, when people talk about the “Twelve Days of Christmas,” they’re unintentionally echoing Yule, not the Gospels.


Yule Was Never One Thing — or One Date

There was never a single Yule and never a single calendar.

Some communities marked the solstice itself. Others observed the days before it.
Others celebrated after, once the sun’s return was perceptible.

Yule could last days or weeks, depending on latitude, climate, and local conditions. This diversity wasn’t confusion. It was responsiveness.

Seasonal traditions bent to land, not doctrine.
And that flexibility is one reason they survived so long.


Ancestors, Offerings, and the Household

Yule wasn’t only about gods. It was about the dead.

Midwinter was understood as a liminal time when ancestors drew near. The boundary between worlds thinned. Homes became places of hospitality not just for the living, but for those who came before.

Offerings were left. Food. Drink. Light. We still do this…. even if we pretend it’s just for children.

Milk and cookies for Santa didn’t come out of nowhere.
They echo something far older: leaving nourishment overnight, acknowledging unseen visitors, participating in reciprocity.

The modern story makes it cute.
The older story makes it sacred.


Before Santa, the Sky Was Crowded

Across Northern and Eastern Europe, winter solstice was associated with feminine figures of light, fertility, and renewal — many of whom traveled the sky.

In Baltic traditions, Saule carried the sun across the heavens. Among the Sámi, Beiwe rode through the winter sky in a sleigh pulled by reindeer, restoring fertility to the frozen land.

Darkness wasn’t evil. It was gestational.

The womb is dark. Seeds germinate underground.
Transformation happens unseen. That imagery didn’t disappear.

It migrated.


When Christmas Was Once Illegal

Here’s a part of the story that tends to surprise people.

Christmas was not always embraced by Christianity in America.
In fact, it was once illegal.

In the mid-1600s, Puritan leaders in New England viewed Christmas as pagan, Catholic, and morally corrupt. Everything associated with it raised suspicion.

Evergreens were considered pagan.
Feasting was considered pagan.
Dancing, games, and excess were condemned.
Even taking the day off work was seen as spiritually dangerous.

In 1659, the Massachusetts Bay Colony passed a law banning the celebration of Christmas outright. The statute read:

“Whosoever shall be found observing any such day as Christmas or the like, either by forbearing labour, feasting, or any other way… every such person so offending shall pay for every such offence five shillings.”

Celebrating Christmas was a finable offense.

The ban remained in effect until 1681. And even after it was repealed, many New England towns treated December 25th as an ordinary workday well into the 1700s.

Early American Christianity didn’t preserve Christmas.

It rejected it.

And yet, winter rituals have a way of surviving rejection.


How Christmas Quietly Returned

Christmas didn’t re-enter American life through theology or church decree.

It returned through households.

Throughout the 1700s and early 1800s, winter customs persisted in small, domestic ways. Evergreen branches were brought indoors. Candles were lit in windows. Food was shared. Stories of winter figures and gift-givers circulated quietly within families.

These practices weren’t organized or ideological. They were inherited.

Passed down the way people pass down recipes, songs, and seasonal habits, especially in communities tied to land, season, and home.

They survived because they worked.

They made winter bearable.
They gave rhythm to darkness.
They anchored people to memory and place.

Over time, these household customs accumulated. By the mid-1800s, Christmas re-emerged into public culture, not as a restored Christian holy day, but as a reassembled seasonal festival shaped by folklore, family practice, and winter necessity.

Only later was it fully absorbed, standardized, and commercialized.

That shift, from household memory to mass reproduction, changed everything.


Santa Claus, Commercialism, and My Mom’s Coca-Cola Bathroom

Santa is one of the clearest examples of what happens when household tradition gives way to mass culture. Early versions of Santa look nothing like the modern mascot. Long robes. Staffs. Hoods. Sometimes thin. Sometimes eerie. Often dressed in green, brown, or deep red.

These figures echo older winter travelers: Odin riding the sky, spirits roaming during Yule, ancestors moving close. This transformation accelerated in the 1800s, when American illustrators and writers began merging European folklore with newly invented holiday imagery.

Out of that merging, Santa began to take a new shape.

My husband and I recently found a reproduction Santa figure based on an 1897 illustration. He’s dressed in a long green robe with a staff in hand. This style was common in the 1800s, especially in Germanic and Scandinavian traditions where the winter gift-giver was closer to a folkloric spirit than a cozy grandfather. Seeing him in that deep forest green, with that hooded, old-world posture, makes it obvious how far the modern Santa has drifted from his roots.


By the 1930s, Coca-Cola standardized him. Red suit. White trim. Jolly. Brand-safe.

Growing up, this wasn’t abstract for me.

My mom worked for Coca-Cola in Richmond, Virginia, in the early 1980s. My first word was “Coke.” Coca-Cola wasn’t just a brand in our house; it was part of the atmosphere.

My mom loved Coca-Cola décor. We had Coca-Cola signs, collectibles, and even a full Coca-Cola bathroom. At the time, it just felt normal. Cozy, even. Americana. Tradition.

I didn’t realize until much later how completely my sense of “holiday spirit” had been shaped by corporate nostalgia rather than ancestral memory. What I thought of as timeless wasn’t old at all. It was manufactured, standardized, and sold back to us as heritage.

That doesn’t make it evil. But it does matter.

Because when branding replaces ritual, something gets flattened. The symbols remain, but the relationship is gone. What was once seasonal, local, and embodied becomes aesthetic. Consumable. Safe.

And for many of us, that’s the only version of winter we were ever given.

That’s not a judgment. It’s just reality. Most of us weren’t raised with ritual.
We were raised with branding.

What was lost in that transformation wasn’t belief. It was relationship — to land, to season, to memory.

And the people who held onto that relationship longest were already labeled for it.


Why “Heathen” Never Meant Godless

The word heathen never originally meant immoral or evil.

It meant rural.

Its earliest known form, haithno, is feminine and means “woman of the heath” — the open, uncultivated land beyond cities and roads. From there it spread through Germanic languages: Anglo-Saxon hǣþen, Old Norse heiðinn, Old High German heidan.

Clergy used heathen to describe those who kept ancestral customs while cities converted. The 8th-century monk Paulus Diaconus wrote of the heidenin commane (the rural people), calling them “the wild heathen.”

Offerings to trees, springs, and stones were condemned as sacrilege. Over time, heathen merged with Latin paganus, meaning “rural dweller,” and gentilis, meaning “of another tribe.”

What began as a description of people who would not leave the wild became a moral accusation.

Later, the same language was exported outward… applied to colonized lands as uncivilized or heathen.

The fear was never really about gods. It was about land that refused to be controlled.

What Actually Happened, and Why the Old Ways Are Calling Back

The same patterns repeat across centuries: suppression, survival, absorption, and forgetting.

But we need to be honest about what that suppression looked like.

This was not a gentle handoff.
It was not mutual exchange.
It was not respectful evolution.

Christianity did not simply reinterpret older traditions.
It destroyed them where it could.

This is not rhetoric. It is history.

Historian Catherine Nixey documents this process in The Darkening Age. Early Christianity treated pagan traditions not as ancestors, but as enemies. Temples were smashed. Statues were defaced. Sacred groves were cut down. Libraries were burned. Seasonal rites that had structured life for centuries were criminalized.

This destruction was not hidden or accidental. It was celebrated.

Christian writers praised the demolition of temples. They mocked the old gods as demons. Beauty, pleasure, ritual, and joy were reframed as moral danger. Festivals became obscene. Feasting became gluttony. The body itself became suspect.

What could not be eradicated outright was stripped, renamed, and absorbed, while its origins were denied.

The solstice became Christ’s birth.
The returning sun became metaphor.
Evergreens became safe symbols.
Ancestor offerings were reduced to children’s fantasy.

This was not borrowing. It was conquest, followed by selective inheritance.

When that conquest met resistance in rural places, in households, and in women’s hands, it adapted. It waited. It layered itself over what remained.

That is why the seams still show. That is why Christmas has always felt haunted.
Layered. Conflicted. Unstable.

What survived did so despite institutional Christianity, not because of it.

It survived in kitchens and hearths. In fields and forests.
In winter nights and quiet ritual.
In land-based people who refused to forget how the seasons worked.

Centuries later, capitalism finished what religion began. What remained was flattened into nostalgia, branding, and spectacle.

Not because the old ways were weak.
But because they were powerful.


Why the Call Feels Loud Again

The pull people feel now toward solstice, ancestors, darkness, rest, and land is not aesthetic.

It is memory.

It is the body remembering rhythms it was trained to forget.
It is the psyche rejecting constant light, constant productivity, constant cheer.
It is old intelligence resurfacing after centuries of suppression.

The old gods were never gone. They were buried. Winter has a way of thawing buried things.

If something in you responds to the fire, the darkness, the offering, or the pause, that does not mean you are rejecting modern life or indulging fantasy.

It means you are responding to a pattern older than doctrine.
Older than empire. Older than the fear that tried to erase it.

What was destroyed is stirring. What was taken is being remembered.

In a few days, I’ll be sitting down with Universal Pagan Temple for a conversation on pagan culture, ritual, history, and lived practice, with Sigrún Gregerson, Pagan priestess and educator. If this piece brought up questions for you, about Yule, Mother’s Night, ancestor work, or what reclaiming these traditions actually looks like, I’d love to carry them into that conversation. Feel free to leave your questions in the comments or send them my way.

This is how the old ways return.
Quietly. Carefully. Through memory, practice, and conversation.

My Mother’s Night Altar 12.20.25

The Historical Jesus: Fact or Fiction?

Nailed: Ten Christian Myths That Show Jesus Never Existed at All

Today’s episode is one I’ve been looking forward to for a long time. I sat down with author and researcher David Fitzgerald, whose book Nailed: Ten Christian Myths That Show Jesus Never Existed at All has stirred up fascination and controversy in historical and secular circles alike.

Before anyone clutches their pearls — or their study Bible — this conversation isn’t about bashing belief. It’s about asking how we know what we think we know, and whether our historical standards shift when faith enters the equation.

Fitzgerald has spent over fifteen years investigating the evidence — or lack of it — surrounding the historical Jesus. In this first part of our series, we cover Myth #1 (“The idea that Jesus was a myth is ridiculous”) and Myth #4 (“The Gospels were written by eyewitnesses”). We also start brushing up against Myth #5, which explores how the Gospels don’t even describe the same Jesus.

We didn’t make it to Myth #7 yet — the claim that archaeology confirms the Gospels…. so, stay tuned for Part Two.

And for my visual learners!! I’ve got you. Scroll below for infographics, side-by-side Gospel comparisons, biblical quotes, and primary source references that make this episode come alive.

🧩 The 10 Myths About Jesus — According to Nailed

Myth #1: “The idea that Jesus was a myth is ridiculous!”
→ Fitzgerald argues that the assumption of Jesus’ historicity persists more from cultural tradition than actual historical evidence, and that questioning it isn’t fringe. It’s legitimate historical inquiry.

Myth #2: “Jesus was wildly famous — but somehow no one noticed.”
→ Despite claims that Jesus’ miracles and teachings drew massive crowds, there’s an eerie silence about him in the records of contemporaneous historians and chroniclers who documented far lesser figures.

Myth #3: “Ancient historian Josephus wrote about Jesus.”
→ The so-called “Testimonium Flavianum” passages in Josephus’ work are widely considered later Christian insertions, not authentic first-century testimony.

Myth #4: “Eyewitnesses wrote the Gospels.”
→ The Gospels were written decades after the events they describe by unknown authors relying on oral traditions and earlier written sources, not firsthand experience.

Myth #5: “The Gospels give a consistent picture of Jesus.”
→ Each Gospel portrays a strikingly different version of Jesus — from Mark’s suffering human to John’s divine Logos — revealing theological agendas more than biographical consistency.

Myth #6: “History confirms the Gospels.”
→ When examined critically, historical records outside the Bible don’t corroborate the key events of Jesus’ life, death, or resurrection narrative.

Myth #7: “Archaeology confirms the Gospels.”
→ Archaeological evidence supports the general backdrop of Roman-era Judea but fails to verify specific Gospel claims or the existence of Jesus himself.

Myth #8: “Paul and the Epistles corroborate the Gospels.”
→ Paul’s letters — the earliest Christian writings — reveal no awareness of a recent historical Jesus, focusing instead on a celestial Christ figure revealed through visions and scripture.

Myth #9: “Christianity began with Jesus and his apostles.”
→ Fitzgerald argues that Christianity evolved from earlier Jewish sects and mystery religions, with “Jesus” emerging as a mythologized figure around whom older beliefs coalesced.

Myth #10: “Christianity was totally new and different.”
→ The moral teachings, rituals, and savior motifs of early Christianity closely mirror surrounding pagan traditions and Greco-Roman mystery cults.


📘 Myth #1: “The Idea That Jesus Was a Myth Is Ridiculous”

This one sets the tone for the entire book — because it’s not even about evidence at first. It’s about social pressure.

Fitzgerald opens Nailed by calling out how the mythicist position (the idea that Jesus might never have existed) gets dismissed out of hand…even by secular historians. As he points out, the problem isn’t that the evidence disproves mythicism. The problem is that we don’t apply the same historical standards we would to anyone else.

Case in point: Julius Caesar crossing the Rubicon.

Julius Caesar crossing the Rubicon at the head of his army, 49 BC. Illustration from Istoria Romana incisa all’acqua forte da Bartolomeo Pinelli Romano (Presso Giovanni Scudellari, Rome, 1818-1819).

When historians reconstruct that event, we have:

  • Multiple accounts from major Roman historians like Suetonius, Plutarch, Appian, and Cassius Dio, drawing on records from Caesar’s own lifetime.
  • Physical evidence — coins, inscriptions, and monuments produced during or shortly after Caesar’s lifetime.
  • Political and military documentation aligning with the timeline.

In contrast, for Jesus, we have:

  • No contemporary accounts.
  • No archaeological or physical evidence.
  • Gospels written decades later by anonymous authors who never met him.

That’s the difference between history and theology.

Even historian Bart Ehrman, who does believe Jesus existed, has called mythicists “the flat-earthers of the academic world.” Fitzgerald addresses that in the interview (not defensively, but critically) asking why questioning this one historical figure provokes so much emotional resistance.

As he puts it, if the same level of evidence existed for anyone else, no one would take it seriously.


✍️ Myth #4: “The Gospels Were Written by Eyewitnesses”

We dive into the authorship problem — who actually wrote the Gospels, when, and why it matters.


🔀 Myth #5: “The Gospels Give a Consistent Picture of Jesus”

⚖️ Contradictions Between the Gospels

1. Birthplace of Jesus — Bethlehem or Nazareth?

Matthew 2:1 – “Jesus was born in Bethlehem of Judea in the days of Herod the king.”
Luke 2:4–7 – Joseph travels from Nazareth to Bethlehem for the census, and Jesus is born there.
John 7:41–42, 52 – Locals say, “The Messiah does not come from Galilee, does he?” implying Jesus was known as a Galilean, not from Bethlehem.

🔍 Mythicist take:
Bethlehem was retrofitted into the story to fulfill the Messianic prophecy from Micah 5:2. In early Christian storytelling, theological necessity (“he must be born in David’s city”) trumps biographical accuracy.

2. Jesus’ Genealogy — Two Lineages, Zero Agreement

Matthew 1:1–16 – Jesus descends from David through Solomon.
Luke 3:23–38 – Jesus descends from David through Nathan.
Even Joseph’s father differs: Jacob (Matthew) vs. Heli (Luke).

🔍 Mythicist take:
Two contradictory genealogies suggest not historical memory but theological marketing. Each author tailors Jesus’ lineage to fit symbolic patterns — Matthew emphasizes kingship; Luke, universality.

3. The Timing of the Crucifixion — Passover Meal or Preparation Day?

Mark 14:12–17 – Jesus eats the Passover meal with his disciples before his arrest.
John 19:14 – Jesus is crucified on the day of Preparation — before Passover begins — at the same time lambs are being slaughtered in the Temple.

🔍 Mythicist take:
This isn’t a detail slip; it’s theology. John deliberately aligns Jesus with the Paschal lamb, turning him into the cosmic sacrifice — a theological metaphor, not an eyewitness timeline.

4. Jesus’ Last Words — Four Versions, Four Theologies

Mark 15:34 – “My God, my God, why have you forsaken me?” → human anguish.
Luke 23:46 – “Father, into your hands I commit my spirit.” → serene trust.
John 19:30 – “It is finished.” → divine completion.
Matthew 27:46 – Echoes Mark’s despair, but adds cosmic drama (earthquake, torn veil).

🔍 Mythicist take:
Each Gospel shapes Jesus’ death to reflect its theology — Mark’s suffering human, Luke’s faithful martyr, John’s omniscient divine being. This isn’t eyewitness diversity; it’s evolving mythmaking.

5. Who Found the Empty Tomb — and What Did They See?

Mark 16:1–8 – Three women find the tomb open, see a young man in white, flee in fear, and tell no one.
Matthew 28:1–10 – Two women see an angel descend, roll back the stone, and tell them to share the news.
Luke 24:1–10 – Several women find the stone already rolled away; two men in dazzling clothes appear.
John 20:1–18 – Mary Magdalene alone finds the tomb, then runs to get Peter; later she meets Jesus himself.

🔍 Mythicist take:
If this were a consistent historical event, we’d expect some harmony. Instead, we see mythic escalation: from a mysterious empty tomb (Mark) → to heavenly intervention (Matthew) → to divine encounter (John).


6. The Post-Resurrection Appearances — Where and to Whom?

Matthew 28:16–20 – Jesus appears in Galilee to the eleven.
Luke 24:33–51 – Jesus appears in Jerusalem and tells them to stay there.
Acts 1:4–9 – Same author as Luke, now extends appearances over forty days.
Mark 16 (longer ending) – A later addition summarizing appearances found in the other Gospels.

🔍 Mythicist take:
The resurrection narrative grows with time — geographically, dramatically, and theologically. Early silence (Mark) gives way to detailed appearances (Luke/John), mirroring the development of early Christian belief rather than eyewitness memory.


🌿 Final Thought

Whether you end up agreeing with Fitzgerald or not, the point isn’t certainty… it’s curiosity. The willingness to look at history without fear, even when it challenges what we’ve always been told.

And here’s the fun part! David actually wants to hear from you. If you’ve got questions, pushback, or something you want him to unpack next time, drop it in the comments or send it my way. I’ll collect your submissions and bring a few of them into Part Two when we dig into Myth #7 — “Archaeology Confirms the Gospels.”

And as always, maintain your curiosity, embrace skepticism, and keep tuning in. 🎙️

📖 Further Reading 📖 

Foundational Mythicist Works:

  • Richard Carrier – On the Historicity of Jesus
  • Robert M. Price – The Christ-Myth Theory and Its Problems, Judaizing Jesus
  • Earl Doherty – The Jesus Puzzle
  • Randel Helms – Gospel Fictions
  • Joseph Wheless – The Fable of Christ
  • Tom Harpur – The Pagan Christ
  • William Benjamin Smith – The Historical Jesus
  • Thomas L. Thompson – The Mythic Past: Biblical Archaeology and the Myth of Israel

Debate: Did Jesus Exist? Jacob Berman and Dr. Jack Bull vs. Dr. Aaron Adair and Neil Godfrey

Mainstream Scholarship & Context

  • Bart Ehrman – Did Jesus Exist?
  • Jonathan Haidt – The Righteous Mind: Why Good People Are Divided by Politics and Religion

Critiques of Bart Ehrman

Broader Philosophical & Cultural Context

  • John G. Jackson – Christianity Before Christ
  • Kersey Graves – The World’s Sixteen Crucified Saviors
  • Acharya S (D.M. Murdock) – The Christ Conspiracy


Sacred or Strategic? Rethinking the Christian Origin Story

The Bible Isn’t History and Trump Isn’t Your Savior

It’s Been a Minute… Let’s Get Real

Hey Hey, welcome back to Taste of Truth Tuesdays! It’s been over a month since my last episode, and wow—a lot has happened. Honestly, I’ve been doing some serious soul-searching and education, especially around some political events that shook me up.

I was firmly against Trump’s strikes on Iran. And the more I dug in, the more I realized how blind I’d been: completely uneducated and ignorant about the massive political power Zionism holds in this country. And it’s clear now: Trump is practically bent over the Oval Office for Netanyahu. The Epstein files cover-up only confirms that blackmail and shadow control are the real puppet strings pulling at the highest levels of power. Our nation has been quietly occupied since Lyndon B. Johnson’s presidency, and that’s a whole other episode I’ll get into later.

But what really cracked something in me was this:

In the 1990s, Trump sponsored Elite’s “Look of the Year” contest—a glitzy, global modeling search that lured teenage girls with promises of fame and fashion contracts. Behind the scenes, it was a trafficking operation. According to The Guardian’s Lucy Osborne and the BBC documentary Scouting For Girls: Fashion’s Darkest Secret, these girls weren’t being scouted—they were being sold to rich businessmen.

This wasn’t just proximity. Trump was part of it.

Once I saw that, the religious right’s worship of him stopped looking like misguided patriotism and started looking like mass delusion. Or complicity. Either way, I couldn’t unsee it.

And that’s when I started asking the bigger questions: What else have we mistaken for holy? What else have we accepted as truth without scrutiny?

For now, I want to cut to the heart of the matter, the major problem at the root of so much chaos: the fact that millions of Christians still believe the Bible is a literal historical document.

This belief doesn’t just distort faith; it fuels political agendas, end-times obsession, and yes, even foreign policy disasters. So, let’s dig into where this all began, how it’s evolved, and why it’s time we rethink everything we thought we knew about Scripture.


For most Christians, the Bible is more than a book; it’s the blueprint of reality, the inspired Word of God, infallible and untouchable. But what if that belief wasn’t original to Christianity? What if it was a reaction…. a strategic response to modern doubt, historical criticism, and the crumbling authority of the Church?

In this episode, we’re pulling back the veil on the doctrine of biblical inerrancy, the rise of dispensationalism, and the strange marriage of American politics and prophetic obsession. From the Scofield Bible to the belief that modern-day Israel is a fulfillment of God’s plan, we’re asking hard questions about the origins of these ideas.

As Dr. Mark Gregory Karris said when he joined us on a previous episode: “Can you imagine two different families? In one, the Bible is the absolute inerrant word of God. Every word, every jot and tittle, so to speak, is meant to be in there due to the inspiration of God. And so every story you read, you know, God killing Egyptian babies and God flooding the entire planet, and thinking, well yeah, there’s going to be babies gasping for air and drowning grandmothers and all these animals. And that is seen as absolute objective truth. But then in another family: oh, these are myths. These are sacred myths that people can learn from. No, that wasn’t literally God speaking and smiting them and burning them alive because they touched this particular ark. This is how they thought, given their minds at the time, given their understandings. And then, like you talked about, it becomes, ‘Oh, look at that aspect of humanity, interesting how they portrayed God.’ It becomes, ‘Wow, that’s cool,’ instead of, ‘Oh my gosh, I need three or four years of therapy because I was taught the Bible in a particular way.’”

Once you trace these doctrines back to their roots, it’s not divine revelation you find: it’s human agendas.

Let’s get uncomfortable. Was your faith formed by sacred truth… or centuries of strategic storytelling?

How Literalism Took Over

In the 19th century, biblical literalism became a kind of ideological panic room. As science, archaeology, and critical scholarship began to chip away at traditional interpretations, conservative Christians doubled down. Instead of exploring the Bible as a complex, layered anthology full of metaphor, moral instruction, and mythology, they started treating it like a divine press release. Every word had to be accurate. Every timeline had to match. Every contradiction had to be “harmonized” away.

The Myth of Inerrancy

One of the most destructive byproducts of this era was the invention of biblical inerrancy. Yes, invention. The idea that the Bible is “without error in all that it affirms” isn’t ancient…. it’s theological propaganda, most notably pushed by B.B. Warfield and his peers at Princeton. Rogers and McKim wrote extensively about how this doctrine was manufactured and not handed down from the apostles as many assume. We dive deeper into all that—here.

Inerrancy teaches that the Bible is flawless, even in its historical, scientific, and moral claims. But this belief falls apart under even basic scrutiny. Manuscripts don’t agree. Archaeological timelines conflict with biblical ones. The Gospels contradict each other. And yet this doctrine persists, warping believers’ understanding and demanding blind loyalty to texts written by fallible people in vastly different cultures.

That’s the danger of biblical inerrancy: it treats every verse as historical journalism rather than layered myth, metaphor, or moral instruction. But what happens when you apply that literalist lens to ancient origin stories?

📖 “Read as mythology, the various stories of the great deluge have considerable cultural value, but taken as history, they are asinine and absurd.” — John G. Jackson, Christianity Before Christ

And yet, this is the foundation of belief for millions who think Noah’s Ark was a literal boat and not a borrowed flood myth passed down and reshaped across Mesopotamian cultures. This flattening of myth into fact doesn’t just ruin the poetry; it fuels bad politics, end-times obsession, and yes… Zionism.

And just to be clear, early Christians didn’t read the Bible this way. That kind of rigid literalism didn’t emerge until centuries later…long after the apostles were gone. We’ll get to that.

When we cling to inerrancy, we’re not preserving truth. We’re missing it entirely.

Enter: Premillennial Dispensationalism

If biblical inerrancy was the fuel, C.I. Scofield’s 1909 annotated Bible was the match. His work made premillennial dispensationalism a household belief in evangelical churches. For those unfamiliar with the term, here’s a quick breakdown:

  • Premillennialism: Jesus will return before a literal thousand-year reign of peace.
  • Dispensationalism: History is divided into distinct eras (or “dispensations”) in which God interacts with humanity differently.

When merged, this theology suggests we’re living in the “Church Age,” which will end with the rapture. Then comes a seven-year tribulation, the rise of the Antichrist, and finally, Jesus returns for the ultimate battle after which He’ll rule Earth for a millennium. Sounds like the plot of a dystopian film, right? And yet, this became the dominant lens through which American evangelicals interpret reality.

The result? A strange alliance between American evangelicals and Zionist nationalism. You get politicians quoting Revelation like it’s foreign policy, pastors fundraising for military aid, and millions of Christians cheering on war in the Middle East because they think it’ll speed up Jesus’ return.

But here’s what I want you to take away from this episode today: none of this works unless you believe the Bible is literal, infallible, and historically airtight.

How This Shaped Evangelical Culture and Politics

The Scofield Bible didn’t just change theology. It changed culture. Dispensationalist doctrine seeped into seminaries like Dallas Theological Seminary and Moody Bible Institute, influencing generations of pastors. It also exploded into popular culture through Hal Lindsey’s The Late Great Planet Earth and the Left Behind series. Fiction, prophecy, and fear blurred into one big spiritual panic attack.

But perhaps the most alarming shift came in the political realm. Dispensationalist belief heavily influences evangelical support for the modern state of Israel. Why? Because many believe Israel’s 1948 founding was a prophetic event. Figures like Jerry Falwell turned theology into foreign policy. His organization, the Moral Majority, was built on an unwavering belief that supporting Israel was part of God’s plan. Falwell didn’t just preach this, he traveled to Israel, funded by its government, and made pro-Israel advocacy a cornerstone of evangelical identity.

This alignment between theology and geopolitics hasn’t faded. In the 2024 election cycle, evangelical leaders ranked support for Israel on par with anti-abortion stances. Ralph Reed, founder of the Faith and Freedom Coalition, explicitly said as much. Donald Trump even quipped that “Christians love Israel more than Jews.” Whether that’s true or not, it reveals just how deep this belief system runs.

And the propaganda doesn’t stop there… currently, Israel’s Foreign Ministry is funding a week-long visit for 16 prominent young influencers aligned with Donald Trump’s MAGA and America First movements, part of an ambitious campaign to reshape Israel’s image among American youth.

But Let’s Talk About the Red Flags

This isn’t just about belief; it’s about control. Dispensationalist theology offers a simple, cosmic narrative: you’re on God’s winning team, the world is evil, and the end is near. There’s no room for nuance, no time for doubt. Just stay loyal, and you’ll be saved.

This thinking pattern isn’t exclusive to Christianity. You’ll find it in MLMs and in some conspiracy theory communities. The recipe is the same: create an in-group with secret knowledge, dangle promises of salvation or success, and paint outsiders as corrupt or deceived. It’s classic manipulation: emotional coercion wrapped in spiritual language.

And let’s not forget the date-setting obsession. Hal Lindsey made a career out of it. People still point to blood moons, earthquakes, and global politics as “proof” that prophecy is unfolding. If you’ve ever been trapped in that mindset, you know how addictive and anxiety-inducing it can be.

BY THE WAY, it’s not just dispensationalism or the Scofield Bible that fuels modern Zionism. The deeper issue is this: if you believe the Bible is historically accurate and divinely orchestrated, you’re still feeding the ideological engine of Zionism. Because at its core, Christianity reveres Jewish texts, upholds Jewish chosenness, and worships a Jewish messiah. That’s not neutrality; it’s alignment.

If this idea intrigued you, you’re not alone. There’s a growing body of work unpacking how Christianity’s very framework serves Jewish supremacy, whether intentionally or not. For deeper dives, check out Adam Green’s work over at Know More News on Rumble, and consider reading The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years. You don’t have to agree with everything to realize: the story you were handed might not be sacred; it might be strategic.

Why This Matters for Deconstruction

For me, one of the most painful parts of deconstruction was realizing I’d been sold a false bill of goods. I was told the Bible was the infallible word of God. That it held all the answers. That doubt was dangerous. But when I began asking real questions, the entire system started to crack.

The doctrine of inerrancy didn’t deepen my faith… it limited it. It kept me from exploring the Bible’s human elements: its contradictions, its cultural baggage, and its genuine beauty. The truth is that these texts were written by people trying to make sense of their world and their experiences with the divine. They are not divine themselves.

Modern Scholarship Breaks the Spell

Modern biblical scholarship has long since moved away from the idea of inerrancy. When you put aside faith-based apologetics and look honestly at the evidence, the traditional claims unravel quickly:

  • Moses didn’t write the Torah. Instead, the Pentateuch was compiled over centuries by multiple authors, each with their own theological agendas (see the JEDP theory).
  • King David is likely a mythic figure. Outside of the Bible, there’s no solid evidence he actually existed, much less ruled a vast kingdom.
  • The Gospels weren’t written by Matthew, Mark, Luke, and John. Those names were added later. The original texts are anonymous and they often contradict each other.
  • John didn’t write Revelation. Not the Apostle John, anyway. The Greek and style are completely different from the Gospel of John. The real author was probably some unknown apocalyptic mystic on Patmos, writing during Roman persecution.

And yet millions still cling to these stories as literal fact, building entire belief systems and foreign policies on myths and fairy tales.


🧠 Intellectual Starvation in Evangelicalism

Here’s the deeper scandal: it’s not just that foundational Christian stories crumble under modern scrutiny. It’s that the church never really wanted you to think critically in the first place.

Mark Noll, a respected evangelical historian, didn’t mince words when he wrote:

“The scandal of the evangelical mind is that there is not much of an evangelical mind.”

In The Scandal of the Evangelical Mind, Noll traces how American evangelicalism lost its intellectual life. It wasn’t shaped by a pursuit of truth, but by populist revivalism, emotionalism, and a hyper-literal obsession with “the end times.” The same movements that embraced dispensationalism and biblical inerrancy also gutted their communities of academic rigor, curiosity, and serious theological reflection.

The result? A spiritually frantic but intellectually hollow faith—one that discourages questions, mistrusts scholarship, and fears nuance like it’s heresy.

Noll shows that instead of grappling with ambiguity or cultural complexity, evangelicals often default to reactionary postures. This isn’t just a relic of the past. It’s why so many modern Christians cling to false authorship claims, deny historical context, and accept prophecy as geopolitical fact. It’s why Revelation gets quoted to justify Zionist foreign policy without ever asking who actually wrote the book or when, or why.

This anti-intellectualism isn’t an accident. It was baked in from the start.

But Noll doesn’t leave us hopeless. He offers a call forward: for a faith that engages the world with both heart and mind. A faith that can live with tension, welcome complexity, and evolve beyond fear-driven literalism.

What Did the Early Church Actually Think About Scripture?

Here’s what gets lost in modern evangelical retellings: the earliest Christians didn’t treat Scripture the way today’s inerrantists do.

For the first few centuries, Christians didn’t even have a finalized Bible. There were letters passed around, oral traditions, a few widely recognized Gospels, and a whole lot of discussion about what counted as authoritative. It wasn’t until the fourth century that anything close to our current canon was even solidified. And even then, it wasn’t set in stone across all branches of Christianity.

Church fathers like Origen, Clement of Alexandria, and Irenaeus viewed Scripture as spiritually inspired but full of metaphor and mystery. They weren’t demanding literal accuracy; they were mining the texts for deeper meanings. Allegory was considered a legitimate, even necessary, interpretive method. Scripture was read devotionally and theologically, not scientifically or historically. In other words, it wasn’t inerrancy that defined early Christian engagement with Scripture, it was curiosity and contemplation.

For a deeper dive, check out The Gnostic Informant’s incredible documentary that uncovers the first hundred years of Christianity, a period that has been systematically lied about and rewritten. It reveals how much of what we take for granted was shaped by political and theological agendas far removed from the original followers of Jesus.

If you’re serious about understanding the roots of your faith or just curious about how history gets reshaped, this documentary is essential viewing. It’s a reminder that truth often hides in plain sight and that digging beneath the surface is how we reclaim our own understanding.

Protestantism: A Heretical Offshoot Disguised as Tradition

The Protestant Reformation shook things up in undeniable ways. Reformers like Martin Luther and John Calvin challenged the Catholic Church’s abuses and rightly demanded reform. But what’s often missed (or swept under the rug) is how deeply Protestantism broke with the ancient, historic Church.

By insisting on sola scriptura—Scripture alone—as the sole authority, the Reformers rejected centuries of Church tradition, councils, and lived community discernment that shaped orthodox belief. They didn’t invent biblical inerrancy as we know it today, but their elevation of the Bible above all else cracked the door wide open for literalism and fundamentalism to storm in.

What began as a corrective movement turned into a theological minefield. Today, Protestantism isn’t a single coherent tradition; it’s a sprawling forest of over 45,000 different denominations, all claiming exclusive access to “the truth.”

This fragmentation isn’t accidental… it’s the logical outcome of rejecting historic continuity and embracing personal interpretation as the final authority.

Far from preserving the faith of the ancient Church, Protestantism represents a fractured offshoot: one that often contradicts the early Church’s beliefs and teachings. It trades the richness of lived tradition and community wisdom for a rigid, literalistic, and competitive approach to Scripture.

The 20th century saw this rigid framework perfected into a polished doctrine demanding total conformity and punishing doubt. Protestant fundamentalism turned into an ideological fortress, where questioning is treated as betrayal, and theological nuance is replaced by black-and-white dogma.

If you want to understand where so much of modern evangelical rigidity and end-times obsession comes from, look no further than this fractured legacy. Protestantism’s break with the ancient Church set the stage for the spiritual and intellectual starvation that Mark Noll so powerfully exposes.

Rethinking the Bible

Seeing the Bible as a collection of human writings about God rather than the literal word from God opens up space for critical thinking and compassion. It allows us to:

  • Study historical context and cultural influences.
  • Embrace the diversity of perspectives in Scripture.
  • Let go of rigid interpretations and seek core messages like love, justice, and humility.
  • Move away from proof-texting and toward spiritual growth.
  • Reconcile faith with science, reason, and modern ethics.

When we stop demanding that the Bible be perfect, we can finally appreciate what it actually is: a complex, messy, beautiful attempt by humans to understand the sacred.

This shift doesn’t weaken faith… I believe it strengthens it.

It moves us away from dogma disguised as certainty and into something deeper… something alive. It opens the door for real relationship, not just with the divine, but with each other. It makes space for growth, for disagreement, for honesty.

And in a world tearing itself apart over whose version of truth gets to rule, that kind of open-hearted spirituality isn’t just refreshing; it’s essential.

Because if your faith can’t stand up to questions, history, or accountability… maybe it was never built on truth to begin with.

Let’s stop worshiping the paper and start seeking the presence.

🔎 Resources Worth Exploring:

  • “The Jesus Hoax: How St. Paul’s Cabal Fooled the World for Two Thousand Years” by David Skrbina
  • “Christianity Before Christ” by John G. Jackson
  • “The Scandal of the Evangelical Mind” by Mark Noll – A scathing but sincere critique from within the evangelical tradition itself. Noll exposes how anti-intellectualism, biblical literalism, and cultural isolationism have gutted American Christianity’s ability to engage the world honestly.
  • Check out Adam Green’s work at Know More News on Rumble for more on the political and mythological implications of Christian Zionism
  • And don’t miss my interview with Dr. Mark Gregory Karris, author of The Diabolical Trinity: Wrathful God, Sinful Self, and Eternal Hell, where we dive deep into the psychological damage caused by toxic theology

When “Helping the Homeless” Becomes a Trojan Horse

Why Trump’s new executive order deserves close scrutiny

President Trump signed an executive order on July 24, 2025, calling on states and cities to clear homeless encampments and expand involuntary psychiatric treatment, framing the move as a matter of public safety and compassion.

At first glance, it seems reasonable: address the homelessness crisis in many progressive cities, restore order, and help those with severe mental illness. But when I read it closely, the language gave me pause. Phrases like “untreated mental illness,” “public nuisance,” and “at risk of harm” are vague enough, and subjective enough, to be ripe for misuse 😳

This goes beyond homelessness. It marks a shift toward normalizing forced institutionalization, a trend with deep roots in American psychiatric history.

We explored this dark legacy in a recent episode, Beneath the White Coats 🥼. If you listened to that episode, you’ll know that compulsory commitment isn’t new.

Historically, psychiatric institutions in the U.S. served not just medical needs but social control. Early 20th-century asylums housed the poor, the racially marginalized, and anyone deemed “unfit.”

The International Congress of Eugenics’ logo, 1921

The eugenics movement wasn’t a fringe ideology… it was supported by mainstream medical groups, state law, and psychiatry. Forced sterilization, indefinite confinement, and ambiguous diagnoses like “moral defectiveness” were justified under the guise of public health.

Now, an executive order gives local governments incentives (and of course funding 💰 is always tied to compliance) to loosen involuntary commitment laws and to redirect funding toward agencies enforcing anti-camping and drug-use ordinances rather than harm reduction programs.

Once states rewrite their laws to align with the order’s push toward involuntary treatment, and if “public nuisance” or “mental instability” are interpreted broadly, you won’t have to be homeless to be at risk. A public disturbance, a call from a neighbor, even a refusal to comply with treatment may trigger involuntary confinement.

Is it just me, or does this feel like history is repeating?

We’ve seen where badly defined psychiatric authority leads: disproportionate targeting, loss of civil rights, and institutionalization justified as compassion. Today’s executive order could enable a similar expansion of psychiatric control.

So… what do you think? Is this just a homelessness policy, or is it another slippery slope?

Beneath the White Coats: Psychiatry, Eugenics, and the Forgotten Graves

Dogma in a Lab Coat

We like to believe science is self-correcting—that data drives discovery, that good ideas rise, and bad ones fall. But when it comes to mental health, modern society is still tethered to a deeply flawed framework—one that pathologizes human experience, medicalizes distress, and often does more harm than good.

Psychiatry has long promised progress, yet history tells a different story. From outdated treatments like bloodletting to today’s overprescription of SSRIs, we’ve traded one form of blind faith for another. These drugs—still experimental in many respects—carry serious risks, yet are handed out at staggering rates. And rather than healing root causes, they often reinforce a narrative of victimhood and chronic dysfunction.

The pharmaceutical industry now drives diagnosis rates, shaping public perception and clinical practice in ways that few understand. What’s marketed as care is often a system of control. In this episode, we revisit the dangers of consensus-driven science—how it silences dissent and rewards conformity.

Because science, like religion or politics, can become dogma. Paradigms harden. Institutions protect their power. And the costs are human lives.

But beneath this entire structure lies a deeper, more uncomfortable question—one we rarely ask:

What does it mean to be a person?

Are we just bodies and brains—repairable, programmable, replaceable? Or is there something more?

Is consciousness a glitch of chemistry, or is it a window into the soul?

Modern psychiatry doesn’t just treat symptoms—it defines the boundaries of personhood. It tells us who counts, who’s disordered, who can be trusted with autonomy—and who can’t.

But what if those definitions are wrong?

We’ve talked before about the risks of unquestioned paradigms—how ideas become dogma, and dogma becomes control. In a past episode, How Dogma Limits Progress in Fitness, Nutrition, and Spirituality, we explored Rupert Sheldrake’s challenge to the dominant scientific worldview—his argument that science itself had become a belief system, closing itself off to dissent. TED removed that talk, calling it “pseudoscience.” But many saw it as an attempt to protect the status quo—the high priests of data and empiricism silencing heresy in the name of progress. We will revisit his work later on in our conversation. 

We’ve also discussed how science, more than politics or religion, is often weaponized to control behavior, shape belief, and reinforce social hierarchies. And in a recent Taste Test Thursday episode, we dug into how the industrial food system was shaped not just by profit but by ideology—driven by a merger of science and faith.

This framework—that science is never truly neutral—becomes especially chilling when you look at the history of psychiatry.

To begin this conversation, we’re going back—not to Freud or Prozac, but further. To the roots of American psychiatry. To two early figures—John Galt and Benjamin Rush—whose ideas helped define the trajectory of an entire field. What we find there presents a choice: a path toward genuine hope, or a legacy of continued harm.

This story takes us into the forgotten corners of that history, to a place where “normal” and “abnormal” were declared not by discovery, but by decree.

Clinical psychiatrist Paul Minot put it plainly:

“Psychiatry is so ashamed of its history that it has deleted much of it.”

And for good reason.

Psychiatry’s early roots weren’t just tangled with bad science—they were soaked in ideology. What passed for “treatment” was often social control, justified through a veneer of medical language. Institutions were built not to heal, but to hide. Lives were labeled defective. 

We would like to think that medicine is objective, that the white coat stands for healing. But behind those coats was a mission to save society from the so-called “abnormal.”
But who defined normal?
And who paid the price?


The Forgotten Legacy of Dr. John Galt

Lithograph, “Virginia Lunatic Asylum at Williamsburg, Va.” by Thomas Charles Millington, ca.1845. Block & Building Files – Public Hospital, Block 04, Box 07. Image citation: D2018-COPY-1104-001. Special Collections.

Long before DSM codes and Big Pharma, the first freestanding mental hospital in America, the Eastern Lunatic Asylum, opened its doors in 1773—just down the road from where I live, in Williamsburg, Virginia. Though officially declared a hospital, it was commonly known as “The Madhouse.” For most who entered, institutionalization meant isolation, dehumanization, and often treatment worse than what was afforded to livestock. Mental illness was framed as a threat to the social order—those deemed “abnormal” were removed from society and punished in the name of care.

But one man dared to imagine something different.

Dr. John Galt II, appointed as the first medical superintendent of the hospital (later known as Eastern State), came from a family of alienists—an old-fashioned term for early psychiatrists. The word comes from the Latin alienus, meaning “other” or “stranger,” and referred to those considered mentally “alienated” from themselves or society. Today, of course, the word alien has taken on very different connotations—especially in the heated political debates over immigration. It’s worth clarifying: the historical use of alienist had nothing to do with immigration or nationality. It was a clinical label tied to 19th-century psychiatry, not race or citizenship. But like many terms, it’s often misunderstood or manipulated in modern discourse.

Galt, notably, broke with the harsh legacy of many alienists of his time. Inspired by French psychiatrist Philippe Pinel—often credited as the first true psychiatrist—Galt embraced a radically compassionate model known as moral therapy. Where others saw madness as a threat to be controlled, Galt saw suffering that could be soothed. He believed the mentally ill deserved dignity, freedom, and individualized care—not chains or punishment. He refused to segregate patients by race. He treated enslaved people alongside the free. And he opposed the rising belief—already popular among his fellow psychiatrists—that madness was simply inherited, and the mad were unworthy of full personhood.

Image credit: The Valentine, Cook Collection, late nineteenth to early twentieth century.

Rather than seeing madness as a biological defect to be subdued or “cured,” Galt and Pinel viewed it as a crisis of the soul. Their methods rejected medical manipulation and instead focused on restoring dignity. They believed that those struggling with mental affliction should be treated not as deviants but as ordinary people, worthy of love, freedom, and respect.

Dr. Marshall Ledger, founder and editor of Penn Medicine, once quoted historian Nancy Tomes to summarize this period:

“Medical science in this period contributed to the understanding of mental illness, but patient care improved less because of any medical advance than because of one simple factor: Christian charity and common sense.”

Galt’s asylum was one of the only institutions in the United States to treat enslaved people and free Black patients equally—and even to employ them as caregivers. He insisted that every person, regardless of race, had a soul of equal moral worth. His belief in equality and metaphysical healing put him at odds with nearly every other psychiatrist of his time.

And he paid the price.

The psychiatric establishment, closely allied with state power and emerging medical-industrial interests, rejected his human-centered model. Most psychiatrists of the era endorsed slavery and upheld racist pseudoscience. The prevailing consensus was rooted in hereditary determinism—that madness and criminality were genetically transmitted, particularly among the “unfit.”

This growing belief—that mental illness was a biological flaw to be medically managed—was not just a scientific view, but an ideological one. Had Galt’s model of moral therapy been embraced more broadly, it would have undermined the growing assumption that biology and state-run institutions offered the only path to sanity. It would have challenged the idea that human suffering could—and should—be controlled by external authorities.

Instead, psychiatry aligned with power.

Moral therapy was quietly abandoned. And the field moved steadily toward the medicalized, racialized, and state-controlled version of mental health that would pave the way for both eugenics and the modern pharmaceutical regime.

“The Father of American Psychiatry”

Long before Auschwitz. Long before the Eugenics Record Office. Long before sterilization laws and IQ tests, there was Dr. Benjamin Rush—signer of the Declaration of Independence, professor at America’s first medical school, and the man still honored as the “father of American psychiatry.” His portrait hangs today in the headquarters of the American Psychiatric Association.

Though many historians point to Francis Galton as the father of eugenics, it was Rush—nearly a century earlier—who laid much of the ideological groundwork. He argued that mental illness was biologically determined and hereditary. And he didn’t stop there.

Rush infamously diagnosed Blackness itself as a form of disease—what he called “negritude.” He theorized that Black people suffered from a kind of leprosy, and that their skin color and behavior could, in theory, be “cured.” He also tied criminality, alcoholism, and madness to inherited degeneracy, particularly among poor and non-white populations.

These ideas later found a troubling ally in Charles Darwin’s emerging theories of evolution and heredity. While Darwin’s work revolutionized biology, it was often misused to justify notions of racial hierarchy and biological determinism.

Rush’s medical theories were mainstream and deeply influential, shaping generations of physicians and psychiatrists. Together, these ideas reinforced the belief that social deviance and mental illness were rooted in faulty bloodlines—pseudoscientific reasoning that provided a veneer of legitimacy to racism and social control within medicine and psychiatry.

The tragic irony? While Rush advocated for the humane treatment of the mentally ill in certain respects, his racial theories helped pave the way for the pathologizing of entire populations—a mindset that would fuel both American and European eugenics movements in the next century.

American Eugenics: The Soil Psychiatry Grew From

Before Hitler, there was Cold Spring Harbor. Founded in 1910, the Eugenics Record Office (ERO) operated out of Cold Spring Harbor Laboratory in New York with major funding from the Carnegie Institution, later joined by Rockefeller Foundation money. It became the central hub for American eugenic research, gathering family pedigrees to trace so-called hereditary defects like “feeblemindedness,” “criminality,” and “pauperism.”

Between the early 1900s and 1970s, over 30 U.S. states passed forced sterilization laws targeting tens of thousands of people deemed unfit to reproduce. The justification? Traits like alcoholism, poverty, promiscuity, deafness, blindness, low IQ, and mental illness were cast as genetic liabilities that threatened the health of the nation.

The practice was upheld by the U.S. Supreme Court in 1927 in the infamous case of Buck v. Bell. In an 8–1 decision, Justice Oliver Wendell Holmes Jr. wrote, “Three generations of imbeciles are enough,” greenlighting the sterilization of Carrie Buck, a young woman institutionalized for being “feebleminded”—a label also applied to her mother and her daughter. The ruling led to an estimated 60,000+ sterilizations across the U.S.

And yes—those sterilizations disproportionately targeted African American, Native American, and Latina women, often without informed consent. In North Carolina alone, Black women made up nearly 65% of sterilizations by the 1960s, despite being a much smaller share of the population.

Eugenics wasn’t a fringe pseudoscience. It was mainstream policy—supported by elite universities, philanthropists, politicians, and the medical establishment.

And psychiatry was its institutional partner.

The American Journal of Psychiatry published favorable discussions of sterilization and even euthanasia for the mentally ill as early as the 1930s. American psychiatrists traveled to Nazi Germany to observe and advise, and German doctors openly cited U.S. laws and scholarship as inspiration for their own racial hygiene programs.

In some cases, the United States led—and Nazi Germany followed.

The International Congress of Eugenics’ logo, 1921

This isn’t conspiracy. It’s history. Documented, peer-reviewed, and disturbingly overlooked.


From Ideology to Institution

By the early 20th century, the groundwork had been laid. Psychiatry had evolved from a fringe field rooted in speculation and racial ideology into a powerful institutional force—backed by universities, governments, and the courts. But its foundation was still deeply compromised. What had begun with Benjamin Rush’s biologically deterministic theories and America’s eugenic policies now matured into a formalized doctrine—one that treated human suffering not as a relational or spiritual crisis, but as a defect to be categorized, corrected, or eliminated.

This is where the five core doctrines of modern psychiatry emerge.

The Five Doctrines That Shaped Modern Psychiatry

These five doctrines weren’t abandoned after World War II. They were rebranded, exported, and quietly absorbed into the foundations of American psychiatry.

1. The Elimination of Subjectivity

Patients were no longer seen as people with stories, pain, or meaning—they were seen as bundles of symptoms. Suffering was abstracted into clinical checklists. The Diagnostic and Statistical Manual of Mental Disorders (DSM) became the gold standard, not because it offered clear science, but because it offered utility: a standardized language that served pharmaceutical companies, insurance billing, and bureaucratic control. If you could name it, you could code it—and medicate it.

2. The Eradication of Spiritual and Moral Meaning

Struggles once understood through relational, existential, or moral frameworks were stripped of depth. Grief became depression. Anger became oppositional defiance. Existential despair was reduced to a neurotransmitter imbalance. The soul was erased from the conversation. As Daniel Berger notes, suffering was no longer something to be witnessed or explored—it became something to be treated, as quickly and quietly as possible.

3. Biological Determinism

Mental illness was redefined as the inevitable result of faulty genes or broken brain chemistry—even though no consistent biological markers have ever been found. The “chemical imbalance” theory, aggressively marketed throughout the late 20th century, was never scientifically validated. Yet it persists, in part because it sells. Selective serotonin reuptake inhibitors (SSRIs)—still widely prescribed—were promoted on this flawed premise, despite studies showing they often perform no better than placebo and come with serious side effects, including emotional blunting, dependence, and sexual dysfunction.

4. Population Control and Racial Hygiene

In Germany, this meant sterilizing and exterminating those labeled “life unworthy of life.” In the U.S., it meant forced sterilizations of African-American and Native American women, institutionalizing the poor, the disabled, and the nonconforming. These weren’t fringe policies—they were mainstream, upheld by law and supported by leading psychiatrists and journals. Even today, disproportionate diagnoses in communities of color, coercive treatments in prisons and state hospitals, and medicalization of poverty reflect these same logics of control.

5. The Use of Institutions for Social Order

Hospitals became tools for enforcing conformity. Psychiatry wasn’t just about healing—it was about managing the unmanageable, quieting the inconvenient, and keeping society orderly. From lobotomies to electroshock therapy to modern-day involuntary holds, psychiatry has long straddled the line between medicine and discipline. Coercive treatment continues under new names: community treatment orders, chemical restraints, and state-mandated compliance.

These doctrines weren’t discarded after the fall of Nazi Germany. They were imported. Adopted. Rebranded under the guise of “evidence-based medicine” and “public health.” But the same logic persists: reduce the person, erase the context, medicalize the soul, and reinforce the system.


Letchworth Village: The Human Cost

I didn’t simply read this in a textbook. I stood there—on the edge of those woods—next to rows of numbered graves.

In 2020, while waiting to close on our New York house, my husband and I were staying in an Airbnb in Rockland County. We were walking the dogs one morning near the end of Call Hollow Road, where a wide path divides thick woodland, when we came across a memorial stone:

“THOSE WHO SHALL NOT BE FORGOTTEN.”

We had stumbled upon the entrance to Old Letchworth Village Cemetery, and we instantly felt its somber history. Beyond it stood rows of T-shaped markers, each one a muted testament to the hundreds of nameless victims who perished at Letchworth. Situated just half a mile from the institution, these weathered grave markers reveal only the numbers that were once assigned to forgotten souls—a stark reminder that families once refused to let their names be known. This omission serves as a silent indictment of a system that institutionalized, dehumanized, and ultimately discarded these individuals.

When we researched the history, the truth was staggering.

Letchworth was supposed to be a progressive alternative to the horrors of 19th-century asylums. Instead, it became one of them. By the 1920s, reports described children and adults left unclothed, unbathed, overmedicated, and raped. Staff abused residents—and each other. The dormitories were overcrowded. Funding dried up. Buildings decayed.

Many residents lived in filth, unfed and unattended. Children were restrained for hours. Some were used in vaccine trials without consent. And when they died, they were buried behind the trees—nameless, marked only by small concrete stakes.

I stood among those graves. Over 900 of them. A long row of numbered markers, each representing a life once deemed unworthy of attention, of love, of dignity.

But the deeper horror is what Letchworth symbolized: the idea that certain people were better off warehoused than welcomed, that abnormality was a disease to be eradicated—not a difference to be understood.

This is the real history of psychiatric care in America.


The Problem of Purpose

But this history didn’t unfold in a vacuum. It was built on something deeper—an idea so foundational, it often goes unquestioned: that nature has no purpose. That life has no inherent meaning. That humans are complex machines—repairable, discardable, programmable.

This mechanistic worldview didn’t just shape medicine. It has shaped what we call reality itself.

As Dr. Rupert Sheldrake explains in Science Set Free, the denial of purpose in biology isn’t a scientific conclusion—it’s a philosophical assumption. Beginning in the 17th century, science removed soul and purpose from nature. Plants, animals, and human bodies were understood as nothing more than matter in motion, governed by fixed laws. No pull toward the good. No inner meaning.

By the time Darwin’s Origin of Species arrived in 1859, this mechanistic lens was fully established. Evolution wasn’t creative—it was random. Life wasn’t guided—it was accidental.

Psychiatry, emerging in this same cultural moment, absorbed this worldview. Suffering was pathologized, difference diagnosed, and the soul reduced to faulty genetics and broken wiring.

Today, that mindset is alive in the DSM’s ever-expanding labels, in the belief that trauma is a chemical imbalance, that identity issues must be solved with hormones and surgery, and in the reflex to medicate children who don’t conform.

But what if suffering isn’t a bug in the system?

What if it’s a signal?

What if these so-called “disorders” are cries for meaning in a world that pretends meaning doesn’t exist?

The graves at Letchworth aren’t just a warning about medical abuse. They are a mirror—reflecting what happens when we forget that people are not problems to be solved, but souls to be seen.

Sheldrake writes, “The materialist denial of purpose in evolution is not based on evidence, but is an assumption.” Modern science insists all change results from random mutations and blind forces—chance and necessity. But these claims are not just about biology. They influence how we see human beings: as broken machines to be repaired or discarded.

As we said, in the 17th century, the mechanistic revolution abolished soul and purpose from nature—except in humans. But as atheism and materialism rose in the 19th century, even divine and human purpose were dismissed, replaced by the ideal of scientific “progress.” Psychiatry emerged from this philosophical soup, fueled not by reverence for the human soul but by the desire to categorize, control, and “correct” behavior—by any mechanical means necessary.

What if that assumption is wrong? What if the people we label “disordered” are responding to something real? What if our suffering has meaning—and our biology is not destiny?

“Genetics” as the New Eugenics

Today, psychiatry no longer speaks in the language of race hygiene.

It speaks in the language of genes.

But the message is largely the same:

You are broken at the root.

Your biology is flawed.

And the only solution is lifelong medication—or medical intervention.

We now tell people their suffering is rooted in faulty wiring, inherited defects, or bad brain chemistry—despite decades of inconclusive or contradictory evidence.

We still medicalize behaviors that don’t conform.

We still pathologize pain that stems from trauma, poverty, or social disconnection.

We still market drugs for “chemical imbalances” that have never been biologically verified.

And we still pretend this is science—not ideology.

But as Dr. Rupert Sheldrake argues in Science Set Free, even the field of genetics rests on a fragile and often overstated foundation. In Chapter 6, he challenges one of modern biology’s core assumptions: that all heredity is purely material—that our traits, tendencies, and identities are completely locked in by our genes.

But this isn’t how people have understood inheritance for most of human history.

Long before Darwin or Mendel, breeders, farmers, and herders knew how to pass on traits. Proverbs like “like father, like son” weren’t based on lab results—they were based on generations of observation. Dogs were bred into dozens of varieties. Wild cabbage became broccoli, kale, and cauliflower. The principles of heredity weren’t discovered by science; they were named by science. They were already in practice across the world.

What Sheldrake points out is that modern biology took this folk knowledge, stripped it of its nuance, and then centralized it—until genes became the sole explanation for almost everything.

And that’s a problem.

Because genetics has been crowned the ultimate cause of everything from depression to addiction, from ADHD to schizophrenia. When the outcomes aren’t clear-cut, the answer is simply: “We haven’t mapped the genome enough yet.”

But what if the model is wrong?

What if suffering isn’t locked in our DNA?

What if genes are only part of the story—and not even the most important part?

By insisting that people are genetically flawed, psychiatry sidesteps the deeper questions:

  • What happened to you?
  • What story are you carrying?
  • What environments shaped your experience of the world?

It pathologizes people—and exonerates systems.

Instead of exploring trauma, we prescribe pills.

Instead of restoring dignity, we reduce people to diagnoses.

Instead of healing souls, we treat symptoms.

Modern genetics, like eugenics before it, promises answers. But too often, it delivers a verdict: you were born broken.

We can do better.

We must do better.

Because healing doesn’t come from blaming bloodlines or rebranding biology.

It comes from listening, loving, and refusing to reduce people to a diagnosis or a gene sequence.


The Hidden Truth About Trauma and Diagnosis

Pete Walker, citing Dr. John Briere’s poignant observation, notes that if Complex PTSD and the role of early trauma were fully acknowledged by psychiatry, the Diagnostic and Statistical Manual of Mental Disorders (DSM) could shrink from a massive textbook to something no larger than a simple pamphlet.

We’ve previously explored the crucial difference between PTSD and complex PTSD—topics like trauma, identity, neuroplasticity, stress, survival, and what it truly means to come home to yourself. This deeper understanding exposes a vast gap between real human experience and how mental health is often diagnosed and treated today.

Instead of addressing trauma with truth and compassion, the system expands diagnostic categories, medicalizes pain, and silences those who suffer.

The Cost of Our Silence

Many of us know someone who’s been diagnosed, hospitalized, or medicated into submission.

Some of us have been that person.

And we’re told this is progress. That this is compassion. That this is care.

But when I stood at the edge of those graves in Rockland County—row after row of anonymous markers—nothing about this history felt compassionate.

It felt buried. On purpose.

We must unearth it.

Not to deny mental suffering—but to reclaim the right to define it for ourselves.

To reimagine what healing could look like, if we dared to value dignity over diagnosis.

Because psychiatry hasn’t “saved” the abnormal.

It has often silenced, sterilized, and sacrificed them.

It has named pain as disorder.

Difference as defect.

Trauma as pathology.

The DSM is not a Bible.

The white coat is not a priesthood.

And genetics is not destiny.

We need better language, better questions, and better ways of relating to each other’s pain.

And that brings us full circle—to a man most people have never heard of: Dr. John Galt II.

Nearly 200 years ago, in Williamsburg, Virginia, Galt ran the first freestanding mental hospital in America. But unlike many of his peers, he rejected chains, cruelty, and coercion. He embraced what he called moral therapy—an approach rooted in truth, love, and human dignity. Galt didn’t see the “insane” as dangerous or defective. He saw them as souls.

He was influenced by Philippe Pinel, the French physician who famously removed shackles from asylum patients in Paris. Together, these early reformers dared to believe that healing began not with force, but with presence. With relationship. With care.

Galt refused to segregate patients by race, treated enslaved people alongside the free, and opposed the consensus among his peers that madness was simply inherited, and the mad unworthy of full personhood.

But what does it mean to recognize someone’s personhood?

Personhood is more than just being alive or having a body. It’s about being seen as a full human being with inherent dignity, moral worth, and rights—someone whose inner life, choices, and experiences matter. Recognizing personhood means acknowledging the whole person beyond any diagnosis, disability, or social status.

This question isn’t just philosophical—it’s deeply practical and contested. It’s at the heart of debates over mental health care, disability rights, euthanasia and even abortion. When does a baby become a person? When does someone with a mental illness or cognitive difference gain full moral consideration? These debates all circle back to how we define humanity itself.

In Losing Our Dignity: How Secularized Medicine Is Undermining Fundamental Human Equality, Charles C. Camosy warns that secular, mechanistic medicine can strip people down to biological parts—genes, symptoms, behaviors—rather than seeing them as full persons. This reduction risks denying people their dignity and the respect that comes with being more than the sum of their medical conditions.

Galt’s approach stood against this reduction. He saw patients as complex individuals with stories and struggles, deserving compassion and respect—not just as “cases” to be categorized or “disorders” to be fixed.

To truly recognize personhood is to honor that complexity and to affirm that every individual, regardless of race, mental health, or social status, has an equal claim to dignity and care.

But… Galt’s approach was pushed aside.

Why?

Because it didn’t serve the state.

Because it didn’t serve power.

Because it didn’t make money.

Today, we see a similar rejection of truth and compassion.

When a child in distress is told they were “born in the wrong body,” we call it gender-affirming care.

When a woman, desperate to be understood, is handed a borderline personality disorder label instead, we call it a diagnosis.

When medications with severe side effects are pushed as the only solution, we call it science.

But are we healing the person—or managing the symptoms?

Are we meeting the soul—or erasing it?

We’ve medicalized the human condition—and too often, we’ve called that progress.

We’ve spoken before about the damage done by Biblical counseling programs when therapy is replaced with doctrine—how evangelical frameworks often dismiss pain as rebellion, frame anger as sin, and pressure survivors into premature forgiveness.

But the secular system is often no better. A model that sees people as nothing more than biology and brain chemistry may wear a lab coat instead of a collar—but it still demands submission.

Both systems can bypass the human being in front of them.

Both can serve control over compassion.

Both can silence pain in the name of order.

What we truly need is something deeper.

To be seen.

To be heard.

To be honored in our complexity—not reduced to a diagnosis or a moral failing.

It’s time to stop.

It’s time to remember that human suffering is not a clinical flaw. It’s time to remember the metaphysical soul/psyche. 

That our emotional pain is not a chemical defect.

That being different, distressed, or deeply wounded is not a disease.

It’s time to recover the wisdom of Dr. John Galt II.

To treat those in pain—not as problems to be solved—but as people to be seen.

To offer truth and love, not labels, not sterilizing surgeries, not lifelong prescriptions.

Because if we don’t, the graves will keep multiplying—quietly, behind institutions, beneath a silence we dare not disturb.

But we must disturb it.

Because they mattered.

And truth matters.

And the most powerful medicine has never been compliance or chemistry.

It’s being met with real humanity.

Being listened to. Believed.

Not pathologized. Not preached at. Not controlled.

But loved—in the deepest, most grounded sense of the word.

The kind of love that doesn’t look away.

The kind that tells the truth, even when it’s costly.

The kind that says: you are not broken—you are worth staying with.

Because to love someone like that…

is to recognize their personhood.

And maybe that’s the most radical act of all.

SOURCES:

  • “Director of the Kaiser Wilhelm Institute for Anthropology, Human Heredity, and Eugenics from 1927 to 1942, [Eugen] Fischer authored a 1913 study of the Mischlinge (racially mixed) children of Dutch men and Hottentot women in German southwest Africa. Fischer opposed ‘racial mixing,’ arguing that ‘negro blood’ was of ‘lesser value’ and that mixing it with ‘white blood’ would bring about the demise of European culture” (United States Holocaust Memorial Museum, “Deadly Medicine: Creating the Master Race,” USHMM Online: https://www.ushmm.org/exhibition/deadly-medicine/profiles/). See also Richard C. Lewontin, Steven Rose, and Leon J. Kamin, Not in Our Genes: Biology, Ideology, and Human Nature, 2nd edition (Chicago: Haymarket Books, 2017), 207.
  • Gonaver, Wendy. The Peculiar Institution and the Making of Modern Psychiatry, 1840–1880 (2019)
  • Berger, Daniel R., II. Saving Abnormal: The Disorder of Psychiatric Genetics
  • Lost Architecture: Eastern State Hospital – Colonial Williamsburg
  • 📘 General History of American Eugenics
    Lombardo, Paul A.
    Three Generations, No Imbeciles: Eugenics, the Supreme Court, and Buck v. Bell (2008)
    This book is the definitive account of Buck v. Bell and American eugenics law. It documents how widespread sterilizations were and provides legal and historical context.
    Black, Edwin.
    War Against the Weak: Eugenics and America’s Campaign to Create a Master Race (2003)
    Covers the U.S. eugenics movement in depth, including funding by Carnegie and Rockefeller, Cold Spring Harbor, and connections to Nazi Germany.
    Kevles, Daniel J.
    In the Name of Eugenics: Genetics and the Uses of Human Heredity (1985)
    A foundational academic history detailing how early American psychiatry and genetics were interwoven with eugenic ideology.

    🧬 Institutions & Funding
    Cold Spring Harbor Laboratory Archives
    https://www.cshl.edu
    Documents the history of the Eugenics Record Office (1910–1939), its funding by the Carnegie Institution, and its influence on U.S. and international eugenics.
    The Rockefeller Foundation Archives
    https://rockarch.org
    Shows how the foundation funded eugenics research both in the U.S. and abroad, including programs that influenced German racial hygiene policies.

    ⚖️ Sterilization Policies & Buck v. Bell
    Supreme Court Decision: Buck v. Bell, 274 U.S. 200 (1927)
    https://supreme.justia.com/cases/federal/us/274/200/
    Includes Justice Holmes’ infamous quote and the legal justification for forced sterilization.
    North Carolina Justice for Sterilization Victims Foundation
    https://www.ncdhhs.gov
    Reports the disproportionate targeting of Black women in 20th-century sterilization programs.
    Stern, Alexandra Minna.
    Eugenic Nation: Faults and Frontiers of Better Breeding in Modern America (2005)
    Explores race, sterilization, and medical ethics in eugenics programs, with data from states like California and North Carolina.

    🧠 Psychiatry’s Role & Nazi Connections
    Lifton, Robert Jay.
    The Nazi Doctors: Medical Killing and the Psychology of Genocide (1986)
    Shows how American eugenics—including psychiatric writings—helped shape Nazi ideology and policies like Aktion T-4 (the euthanasia program).
    Wahl, Otto F.
    “Eugenics, Genetics, and the Minority Group Mentality” in American Journal of Psychiatry, 1985.
    Traces how psychiatric institutions were complicit in promoting eugenic ideas.
    American Journal of Psychiatry Archives
    1920s–1930s issues include articles in support of sterilization and early euthanasia rhetoric.
    Available via https://ajp.psychiatryonline.org

1984 and The Handmaid’s Tale: Misplaced Parallels and Liberal Delusion

Breaking Free: A Conversation with Yasmine Mohammed on Radical Islam, Empowerment, and the West’s Blind Spots

After finishing George Orwell’s 1984, I noticed its resurgence in popularity, especially after Trump’s election. Ironically, it’s not the conservative right but the progressive left that increasingly mirrors Orwellian themes. Similarly, Margaret Atwood’s The Handmaid’s Tale has become a rallying cry for liberals who claim to be on the brink of a dystopian theocracy. Yet, as Yasmine Mohammed pointed out in this week’s episode, this comparison is not only absurd but deeply insulting to women who live under regimes where Atwood’s fiction is a grim reality.

1984: Rewriting Language and History

The Democratic Party’s obsession with redefining language is straight out of Orwell’s playbook. They tell us biology is bigotry and that there are infinite genders, forcing people to adopt nonsensical pronouns or risk social ostracism. This is not progress—it’s the weaponization of language to control thought, eerily similar to Orwell’s Newspeak.

But it doesn’t stop there. They actively rewrite history by renaming monuments, military bases, and even schools, erasing cultural markers in the name of ideological purity. This is doublespeak in action: the manipulation of truth for political orthodoxy. Orwell’s warning that “orthodoxy is unconsciousness” feels disturbingly apt when observing the modern left.

The Handmaid’s Tale: An Insult to Women Who Actually Suffer

In our conversation, Yasmine highlighted the absurdity of liberal claims that America is The Handmaid’s Tale come to life. Yasmine, who grew up under Islamic theocracy, knows firsthand what it’s like to live in a world where women have no autonomy. These women cannot see a doctor without a male guardian, are forced to cover every inch of their bodies, and are denied basic freedoms like education or the right to drive.

Contrast this with the West, where women have more freedom than at any other point in history. Liberal women can run around naked at Pride parades, freely express their sexuality, and redefine what it means to be a woman altogether. And yet, they cry oppression because they are expected to pay for their own birth control or endure debates over abortion limits. This level of cognitive dissonance—claiming victimhood while living in unprecedented freedom—is a slap in the face to women who actually suffer under real patriarchal oppression.

Liberal Orthodoxy: Lost in the Sauce

What’s truly Orwellian is how the left uses its freedom to strip others of theirs. They shout about inclusivity but cancel anyone who disagrees. They claim to fight for justice while weaponizing institutions to enforce ideological conformity. Meanwhile, they are so consumed with their own victim complex that they fail to see how absurd their comparisons to dystopian fiction really are.

Orwell and Atwood warned against unchecked power and ideological extremism. If liberals actually read these books instead of using them as aesthetic props, they might realize they’re mirroring the very authoritarianism they claim to oppose. Instead, they’re lost in the sauce, preaching oppression in a society where they have more freedom than they can handle.

As Yasmine said, “You want to see The Handmaid’s Tale? Try being a woman in Saudi Arabia, Iran, or Afghanistan.” The left would do well to remember that before playing the victim in their cosplay dystopia.