I’m not an architect or a construction engineer, but I know enough to know that if you want to erect a tall building, you start with a sturdy foundation. The foundation must be in place to hold stable all that is built above it.
Assuming the physical possibility of doing so, what then happens if you remove the foundation from under the building you've put in place above it?
A tipping point will eventually be reached, and the building will collapse. This should be obvious.
A few weeks ago, we sketched — that’s all it was, a sketch — the rise of the West. Its foundation was Aristotelianism-Christendom. Aquinas’s effort to integrate the two in his Summa made him a pivotal thinker in the history of Western ideas.
In this world picture, which included a picture of ourselves, God the Creator fashioned the world and fashioned us — designed us, if you will — with minds that were finite reflections of His infinite mind. The world he fashioned was governed by Logos: rational causality graspable through logical reasoning. The human world was to be governed by Ethos: He had fashioned beings of free will and placed them in a world where rules to govern human conduct were necessary conditions for living and flourishing. Just as there were rights and wrongs about what to eat — eat the wrong thing and you’re poisoned and die — there are rights and wrongs about how to live with others in communities. The point was: embrace God’s principles and flourish; reject them, and whatever edifice you’ve built is corrupted and eventually collapses.
We humans have been very good at rejecting God’s principles, which in this way of looking at things, is why history is littered with the ashes of failed empires.
Technology exemplified our ability to use to our advantage the causal principles we either understood intuitively or consciously. Science relied on the a priori premise that God’s creation is intelligible to the human mind, the mind of beings created in His image. Intelligibility meant that explanations of the world’s phenomena, from falling objects to orbiting planets, were possible. Being finite, our explanations weren’t perfect or exact. But we were such that generations of inquirers could improve them and extend them through a combination of thought and experiment.
That was how things looked in the early 1700s, with Isaac Newton considered the greatest natural philosopher who ever lived (the word scientist wouldn’t be coined until the 1830s).
By the end of that century, the intellectual wing of humanity was basking in the pride of its increasing ingenuity. There was nothing we couldn’t explain.
And we didn’t need God to do it.
That “God did it” no longer seemed to explain anything.
So God was jettisoned from our ontological and explanatory bestiary.
After all, philosophical arguments to prove that He must exist had all been shown to have fatal weaknesses (especially after Kant's transcendental turn). For centuries (back to Aquinas, in fact, and long before that, all the way back to Aristotle, a pre-Christian philosopher), we'd placed more trust in the reasoning abilities God gave us when He created us than we placed in Him, in our confidence that He exists even if we can't see Him!
This bit of hubris arguably compromised Aristotelianism-Christendom before we got to modern times.
Embracing empiricism as an epistemology for science began the removal of that Aristotelian-Christian foundation apace.
And why not? Hadn't Copernicus laid the groundwork for removing the Earth from its privileged place at the center of the universe? Hadn't Newton shown the reasonableness of holding that the physical laws governing bodies in motion or at rest here on Earth are the same laws maintaining the moon and the planets in their orbits, laws that seemed to work universally? No one could test that hypothesis, but it seemed reasonable.
Nature was uniform. The present is the key to the past. The emergent rule: never postulate events in the past that aren’t reflected by processes we can observe in the present.
This became the foundation for the science of geology (Sir Charles Lyell, early 1800s), which set us up for biological evolution (Charles Darwin, mid-1800s). If Copernicus had decentered the Earth, Darwin decentered humanity itself. No longer was there any reason for believing our species held a special, privileged place in a "creation." We had arrived through natural selection, a process that had neither foreseen nor planned us.
Freud, finally, decentered our minds with his psychoanalysis. We did not really know ourselves, because we did not understand our deepest unconscious motivations or how they were shaped by, e.g., unremembered childhood trauma.
The cosmos itself was abjectly indifferent to human beings; not only that, we had become — literally! — strangers to ourselves!
Could the ethical view of the world we’d inherited from Christianity survive all this?
If Nietzsche could be trusted, the answer was No! Nietzsche, surely among the half-dozen or so most widely studied, written about, and debated philosophers of all time (not always understood, mind you), observed in essence that once you removed God from your world picture, you also removed everything His existence had given meaning to.
The foundation was being dismantled, and that meant the building was destined to begin tottering if and when that tipping point was reached.
Nietzsche had come down hard on Christian ethics, but also on those secular attempts to replace it, arguing that all had embedded Christian assumptions (moral praise for sacrifice, the essential goodness of servitude, etc.).
He’d argued that we faced an “advent of nihilism” unless we could construct an ethical system on a new foundation, one suited for life in an indifferent universe, in which at the end of life we were dropped in a hole and that was it.
He called for a “revaluation of all values.”
Across the English Channel, Bertrand Russell, far more scientific and analytic than Nietzsche, and less poetic and discursive, argued a parallel thesis in his essay "A Free Man's Worship," which brought us fully into the 1900s, the century when the building's tottering became self-evident if you knew where to look.
Paraphrasing: in the dead universe disclosed by modern science, our highest ideals of peace and justice must find a home.
This was very different, of course, from Nietzsche’s envisioning a replacement of such Christian-sounding and therefore outdated ideals with ideals based on independence instead of servitude, strength instead of sacrifice, and a defiant resilience in the face of the universe’s basic indifference.
Arguably, Russell’s ideals never found that home. Nietzsche’s “revaluation” came far closer to what actually ensued, at least in the centers of power. Just a decade or so after Russell penned “A Free Man’s Worship,” the world exploded into history’s most violent war up to that time, shattering, for the historical moment at least, whatever illusions we’d developed about the possibility of building a scientific/technological Utopia on the secular foundation thinkers like Comte, Russell, and others envisioned.
Our sense of justice persevered, but mostly because it developed severed from the philosophical foundations the positivists believed they'd eliminated.
And because it appealed to that “spark” in each of us that knows, objectively, that there’s a difference between right and wrong (wouldn’t beings created in God’s image have such a “spark”?). In Rawls’s theory of justice, morality and metaphysics are logically independent of one another: just arrangements are deduced not from given first principles but from behind his “veil of ignorance.”
It was a stopgap measure, a delay of the inevitable: the collapse of all foundational and systematic thinking in the face of that "advent of nihilism" Nietzsche had warned of, in which human lives became increasingly expendable. In mild forms, human beings could be thrown to the wolves of the indifferent economy when they couldn't work or when their work ceased to be profitable. In more extreme forms, entire groups could be depersonalized, removed conceptually from the moral community altogether, and exterminated. The Nazi Holocaust exemplifies this move, and it was just one such extermination, and not even the largest (Stalin's and Mao's minions mass murdered many times more people than Hitler did).
One could argue, of course, that human lives had been expendable all along, because of the “us versus them” dichotomy, that of “in-groups” versus “out-groups” which thousands of years of history have hardwired into us. (Evolutionary psychologists and anthropologists would say, of course, that our hardwiring had survival value for the species, as does whatever culturally-based morality various communities eventually developed.) Yet we had been making steady small improvements all across the board. Because of that “spark,” again.
The twentieth century was the scene of this conflict: between those who sought power and global reach as ends in themselves, because they answered to nothing higher than themselves, and those who, in one way or another, still pursued lives built around ideals of empathy, morality, justice, and all their trappings. Pursuing those ideals included reducing the vast and growing inequality neoliberal economics was bringing about, an inequality the mixed economy that preceded neoliberalism had at least moderated.
Not only that, we began to enter a “post truth” world. Truth was a social construct, the product of biases of various sorts. Or just a property of propositions, an artifact of certain ways of speaking, and therefore ultimately subjective. Objective states of affairs? We couldn’t get outside our historicity and our group-derived situatedness to see them as they (presumably?) were.
Objectivity didn’t really exist; rationality was a “straight white male construct” that wasn’t “inclusive” (I didn’t understand: were the academic lefties saying that the women and ethnic and sexual minorities whose interests they claimed to be representing were not capable of rationality, of objectivity, of logicality??? Plus: if truth was biased and nonobjective, then wasn’t the implied truth they arrogated for their own claims a product of bias and nonobjectivity?).
Now, in the twenty-first, civilization itself as we’ve understood it for the past few hundred years hangs in the balance. The power-mongers — who couldn’t care less about the peccadillos of the above — are operating practically in the open (they had to “conspire” in the past) because they know no one has the resources to oppose them effectively as they re-engineer the world through combinations of money flows, war, and fomented revolutions. The weapons we’ve built threaten to destroy us; we tell ourselves that doctrines like MAD (Mutually Assured Destruction) place a check on their use — but no one can be sure that something as minor as a computer error couldn’t unleash them (it almost happened in the early 1980s!).
In short, the building whose foundation we've all but removed is still standing, but every honest and forthright person realizes that it is living on a certain forward momentum and borrowed time.
I would suggest that in the absence of any better ideas, the wise thing to do would be to consider revisiting and reconstructing that original foundation on which the West was built before it is truly too late.
There, in two parts: the rise of the Grandest Narrative; and its decline and possible fall, a fall most of us might well live to see!
Doubtless superficial by academic-philosophical standards. I’ve not set out to write an analytic masterpiece. Just to hit the high spots. Given that the number of people who see it might not exceed what can fit comfortably into our kitchen, and also given how academic philosophy has dropped the ball and left everything up for grabs, who cares? This is where we are, and every intellectually honest person who pursues these topics long enough comes to realize it.
Written without AI!