Auguste Comte’s Law of Three Stages: What It Is, Why the Third Stage is Dying, What Comes Next. (A Major Statement.)

[Note: this may be the longest blog post I have ever made on this site. The result of several weeks of effort, it may be read as a progress report on what may turn into my life’s work: if I have anything final and definitive to say to the world, or to that remnant that cuts through the information clutter and pays attention to such things, it will be found in the culmination of the ideas found here, and in related tracts to come.] 

In this post we outline Auguste Comte’s Law of Three Stages, with commentary. I’ve perceived, whether in my own work or in the occasional comments that I find myself wanting to leave on comment-threads, that it would be useful to have a succinct statement of the idea in one place to refer back to. Why? Because even if we disagree about where he ended up, there are reasons to believe Comte was onto something, and that a stages view of civilization might be instructive today. The post is divided into sections to make the discussion easier to follow. The early sections are adapted from a core section of the second chapter of a book I am writing entitled What Should Philosophy Do? A Vision for the Discipline’s Future. The full expression of the ideas near the end will have to wait for The Fifth Stage of Civilization: Beyond Modernity, Postmodernity, and Scarcity.

Auguste Comte and the Law of Three Stages.

Auguste Comte (1798 – 1857) is best known for two things: founding sociology as a special science with its own identity, and establishing a new school of thought about the nature of proper inquiry, in philosophy or otherwise: positivism. The former applied the latter, which advocated applying methods of empirical science to the study of society or parts of society such as specific populations or institutions or problem situations. Data-driven studies soon came out of this, so it’s hard to dismiss the idea as irrelevant.

Comte developed a conceptual framework for understanding Western society’s intellectual development. He called it the Law of Three Stages (Course in Positive Philosophy, 1838). He didn’t invent the idea that we can isolate “states” or “conditions” or “stages” through which a civilization passes. Earlier versions can be found in Vico and Condorcet. But Comte gave the idea its most concise expression. It is important to note that Comte’s stages are not historical epochs: they both can and do exist side by side in the same civilization: uneasily at best, sometimes in open conflict. It will not be hard to see why.

The first stage or state — First Stage thinking, we will call it — Comte calls “theological or fictitious.” In his words:

the human mind, seeking the essential nature of beings, the first and final causes (the origin and purpose) of all effects — in short, absolute knowledge — supposes all phenomena to be produced by the immediate action of supernatural beings…. The theological system arrived at the highest perfection of which it is capable when it substituted the providential action of a single Being for the varied operations of the numerous divinities that had been before imagined.

First Stage thinking could be called the state of Primitive Faith. It looks to multiple inscrutable, supernatural agencies as causes of events ranging from diseases to storms to earthquakes. At its most advanced state of development, says Comte, it consolidates these in a single, specific Supreme Being (e.g., the Christian God; or for Muslims, Allah). This Supreme Being remains mostly inscrutable, but may have revealed Himself and His will through texts such as the Old and New Testaments, or the Quran. First Stage political thinking tends to be theocratic and authoritarian. In civilizations advanced enough to support strong central governance, a priestly class dominates, usually within a monarchy. There are enforcers with police powers. These all rule the public mind through fear of hellfire and damnation (or of execution by some spectacularly nasty and painful means).

First Stage thought does not, in the long run, survive the influence of intellectually curious souls who refuse to accept, on the word of a priesthood alone, that it has a monopoly on what God wants, or exact knowledge of His will. Historically, philosophers tended to throw cold water on such notions as the “divine right of kings,” a prevalent notion in cultures where First Stage thought dominates. Thus the beginnings of the next stage.

Second Stage thinking is “metaphysical and abstract”:

In the metaphysical stage, which is only a modification of the first, the mind supposes, instead of supernatural beings, abstract forces, veritable entities (that is, personified abstractions) inherent in all beings, and capable of producing all phenomena. What is called the explanation of phenomena is, in this stage, a mere reference of each to its proper entity….  In the same way, in the last stage of the metaphysical system, men substitute one great entity (Nature) as the cause of all phenomena, instead of the multitude of entities at first supposed (ibid.)

Simplifying: Second Stage philosophy, beginning with Thales of Miletus (“Water is the first principle of all things”) and seeing its first full expression (or at least the first to survive) in the sweeping, systematic philosophies of Plato and Aristotle, develops systematic and comprehensive accounts of reality, knowledge, morality, etc. These are based on some set of first principles deduced by the philosopher’s reason. St. Thomas Aquinas, with his attempt to merge Aristotelian philosophy into Christianity, also exemplifies Second Stage philosophy, which became, in his hands, a “handmaiden to theology.” Modern Second Stage philosophy could be called the stage of Pure Reason, reaching its highest development in the philosophies of Descartes, Locke, Kant, Hegel, and Whitehead.

Second Stage thinkers may conclude on the basis of very detailed reasoning that God exists, or that He doesn’t, or that the problem of His existence lies beyond proof or disproof. Descartes believed the first. Julien de la Mettrie and Baron D’Holbach concluded the second; Immanuel Kant, the third. In its moral and political expression, Second Stage thought, in the Anglo-American and Austrian worlds anyway, saw the individual human being as fundamental, and individual rights as grounded in the relationship human beings bear to Nature and to the conditions for human flourishing independent of legal structures: natural rights (God given or not). In this world Nature is the arbiter of the conditions of life, to which individuals and societies either conform or perish.

As should be clear just from the above, Second Stage thinkers are very different from one another. David Hume’s brand of British empiricism reached conclusions about the possibility of justifying our claims to knowledge far different from those of the rationalist system-builders; and he rooted morality in a combination of social sentiment and utility. In many respects, his rejection of metaphysical thinking, with his celebrated remark at the end of his Enquiry Concerning Human Understanding (orig. 1748), powerfully anticipated the next stage:

If we take in our hand any volume; of divinity or school metaphysics, for instance; let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning concerning matter of fact and existence? No. Commit it then to the flames: for it can contain nothing but sophistry and illusion.

Thus the misty beginnings of what we will call Third Stage thinking. Comte’s words again:

In the final, the positive state, the mind has given over the vain search after absolute notions, the origin and destination of the universe, and the causes of phenomena, and applies itself to the study of their laws — that is, their invariable relations of succession and resemblance. Reasoning and observation, duly combined, are the means of this knowledge. What is now understood when we speak of an explanation of facts is simply the establishment of a connection between single phenomena and some general facts, the number of which continually diminishes with the progress of science….  In the same way, again, the ultimate perfection of the positive system would be (if such perfection could be hoped for) to represent all phenomena as particular aspects of a single general fact — such as gravitation, for instance.

With this, we are on the way towards a view of philosophy to which Hume would probably have been sympathetic: as, at best, a “handmaiden” to natural science. Third Stage thought could be called the state of Empirical Science and Utility. And with intellectuals looking more and more to science for explanations, in a Third Stage intellectual ordering of disciplines, philosophy’s epistemic authority rapidly declines.

Comte’s Law of Three Stages encourages us to think of First Stage thought as possessing a civilization in its childhood, looking to a god for explanations and security, analogous to small children in a nuclear family who see their parents as godlike beings whose powers and motivations they cannot begin to comprehend.

Second Stage thought becomes the product of a civilization’s adolescence: its philosophers’ ambitions exceed their grasp, given their a priori methods. Like adolescents who, flush with new drives, sensations, and ambitions, take chances and break rules whose purposes they don’t understand and vaguely resent, Second Stage thinkers circumvent constraints where they can. The immature, adolescent mind impatiently “wants it all, and wants it right now.”

Third Stage thinking, in this case, signals that a civilization is outgrowing childish fantasies and adolescent extravagances. Entering adulthood, it embraces adult realities and responsibilities. The intellectual centers of Third Stage civilization relinquish supernaturalism in all forms, be they First Stage Primitive Faith or Second Stage apriorism and “divine watchmaker” reasoning. They repudiate philosophers’ “quest for certainty” as futile because it is unsound methodologically. They accept, based on the history of its rise, that empirical science is more likely to deliver the answers philosophers have sought, because its method of patient, empirical observation and hypothesis-testing is superior to all that has gone before.

Figures from Sir Charles Lyell, Charles Darwin, and Ernst Mach to Stephen Hawking and Richard Dawkins stand at the culmination of this trajectory. Among professional philosophers, the first exemplar is probably Bertrand Russell, although one may also look to John Stuart Mill, who was instrumental in introducing many of Comte’s ideas to the English-speaking world.

Third Stage Thought: Science and Morality.

Among the “adult realities” science had revealed long before Comte’s time is that Earth appears to occupy no special place in the universe, or even in the solar system. Copernicus had “decentered” our planet from the privileged place Aristotle had assigned it, and which Christianity had assumed, as the literal center of Creation. Galileo had produced empirical evidence that Venus orbited the sun, not the Earth, and that the heavens did not disclose Christian-Aristotelian perfection. Newton had shown, contra Aristotle, that physical reality could be understood without a division into terrestrial and celestial realms. Both are governed by universal gravitation, expressed mathematically. Newton placed physics and astronomy on a new foundation with his Principia (1687). Science seemed to advance by subsuming more and more of the world under fewer and fewer basic physical principles.

Thus came the unprecedented revolution that led to modern science, a revolution that proved unstoppable when it began to deliver technological and commercial fruits. Thus also the Western Enlightenment, as philosophers expressed what they saw as their achievement of Third Stage maturity which they sought to spread to the rest of Western civilization.

Darwin, whose ideas were circulating before On the Origin of Species (1859) appeared, seemed to “decenter” us from our privileged place at the center of biological Creation. The human race, according to the theory of evolution by natural selection, may be the most complex species in existence. But according to Darwinism we are just the most advanced of many species in the complicated tree of life. We emerged over a long period of time as a result of a continuous natural process that had no goals, much less the goal of producing beings like ourselves. One of the culminations of Third Stage thought is that there is no need to posit a god to explain human existence, or the existence of life or of the world generally. We are, at best, a fortuitous accident in a vast cosmos.

Arguably, the removal of morality from the province of the divine followed: morality, according to Third Stage thinking, was neither handed down by a supernatural agency, nor rooted in a transcendent Platonist realm, nor derived from Kantian rational agency. Hume, again looking across the bridge toward Third Stage thought, had opened the door to what ensued with his idea that morality was based on the sensitive rather than the cognitive side of our nature: grounded in our natural sentiments toward what is socially useful, i.e., what improves human lives materially or brings about greater happiness.

For Third Stage thinkers morality is just one of many distinctively human traits that had survival value. First, because it benefited social hunter-gatherer groups of various sizes. Later, because it brought benefits to developing societies such as stability and predictability through the building of trust among their members. Community is not sustainable if its members cannot trust one another at least most of the time, and if they have no rules for dealing with those who prove themselves untrustworthy. Hence truth-telling and promise-keeping became moral imperatives, and expected behaviors (habits) in most cultures.

Perhaps, for the Third Stage moral philosopher, ethical perspectives rest on little more than such down-to-earth realizations that our actions affect others, that happiness spread to others is more beneficial and productive than unhappiness (or happiness just for oneself), that alleviating suffering is better than allowing it, and that the future can be better (more pleasant, more efficient, more prosperous) than the past. This can happen if we work to improve ourselves through education and specific business and cultural activities, guided within a protective sphere of governance which will tend more and more to respond to the will of its people (the origin of liberal democracy as an ideal).

Third Stage thinking thus embraces meliorism, the idea that we can improve ourselves morally — actually becoming better people — through our own concerted efforts. This ran counter to what the West had inherited from Christianity: human nature is inherently sinful, and this will invariably hold us back. Against this, Comte and future positivists were optimistic — in a word, positive — about human potential. They were optimistic and positive about our capacity to discover more and greater truths. Years of careful study in subject domains (physics, biology, psychology, etc.) gave experts in each domain the final say in forming a consensus on what was true there. They were optimistic about our chances of building a better world independent of outworn beliefs about gods and divine commands.

According to positivism, empirical science both has, and ought to have, the final say in matters epistemic (dealing with knowledge): Settled Science, one might call it. It was not an issue that Settled Science shifted its opinions from time to time, since its methods were always turning up new findings. These rationally compelled scientists to revise their consensuses, and this was a good thing. Scientific inquiry was not about attaining epistemic perfection, or absolute certainty (First and Second Stage obsessions). It was about improving knowledge and the material conditions of life piecemeal. And it seemed to be succeeding brilliantly!

Over the course of the twentieth century, new inventions flooded the market and improved the lives of millions: the electric power grid, the telegraph, automobiles, refrigerators, freezers, washing machines, microwave ovens; vacuum-tube electronics, which brought radio and television; and the transistor, which led in turn to modern telecommunications and early computing machinery.

What Third Stage thinking offers is a world, and worldview, based on science, technology, commerce, public education, and responsible governance. This was not precisely what Comte had in mind. In terms of political economy, he, like his mentor Henri de Saint-Simon, was basically a socialist (of the utopian variety Marx ridiculed). Neither he nor anyone else properly estimated the resilience capitalism would have, including its capacity to embrace socialistic elements in order to achieve an adaptive balance between what its elites wanted and what its masses would accept. More and more, society judged itself moral to the extent it protected the interests of its weakest members and prevented people from falling through the economy’s cracks.

The question before us: does Third Stage civilization offer prospects for indefinite betterment of the human condition, moral as well as material? Its defenders have said, and still say, that its era of dominance has seen more improvement than all previous centuries put together.

Gathering Doubts about the Third Stage: A Prelude.

At first glance, this is hard to argue with. The most advanced and accomplished civilization in human history sent men to the moon and returned them safely to Earth! It came to span the globe, with no end in sight, especially with the end of perversions of its basic ideas such as Soviet Communism.

Evaluating Third Stage thinking and civilization obviously goes beyond a single blog post (which is why I am writing a book).

But we can safely make a few concise and occasionally pithy observations.

Reiterating, just to be clear: Third Stage civilization, especially its power centers and its intellectual centers, privileges science, technology, commerce, public education, and responsible government.

Its worldview is that of materialism, meaning by that both a theory of the universe (that no gods or other supernatural entities have real existence outside our imaginations) and a focus of the bulk of human energies on matters of this world, not some other.

Third Stage thinking sees progress as inevitable, provided we stay the course and recognize that we were bound to fall and skin our knees a few times. There may be no Utopias up ahead in the sense of someone like Plato, or Saint-Simon, or Marx, or any of those guys … but things will continue to get better and better!

The solutions to whatever problems are created by science, technology, and commerce are found in better science, better technology, and better commerce!

All these are open to challenge, and have been challenged. Some of the challenges have been obvious products of history themselves. Others are more subtle.

The two most violent and destructive wars in human history (World Wars I and II) challenged the idea that the Third Stage mindset was somehow serving up better humans in the moral sense.

Add to these the acts of genocide committed by totalitarian dictatorships.

These amount to something more than skinned knees, one might say.

One could argue, of course, that the latter weren’t truly Third Stage in their orientation: mature Third Stage societies do not do such things.

Except when they do.

Abortion, anyone?

But that gets us ahead of ourselves.

Third Stage Science and the Fate of the Enlightenment. 

More subtle challenges to Third Stage thinking arose well before we reach our present global era.

Enlightenment philosophers envisioned Universal Human Rights (UHR), whether developed along Kantian lines (suggesting something Second Stage about such notions) or along those of the British utilitarians such as Bentham and Mill. One might call it the age of secular moral theories.

But no single discovery in any science suggests a basis for any UHR. If anything, anthropologists such as Ruth Benedict (cf. her Patterns of Culture, 1934) leave us with the conclusion that morality is, at best, a cultural artifact. In this view, moral agency is limited to one’s own culture, and does not extend to the other who looks and acts differently, speaks a different language, and might prove a danger. Within one’s own culture, whatever rights one has, one has because common belief says so. Governing authorities may or may not back up common belief. What the government gives, of course, the government can take away, and sometimes does.

The epistemic problem for ethics in Third Stage thought: science has simply not found a basis for a universal morality the way it has uncovered unifying principles, or prospects for such, in physics. Among academic philosophers, this prompted bizarre theories such as ethical emotivism, the idea that moral judgments are expressions of emotion, and that is all. Such theories accepted the handmaiden-to-science view of Third Stage philosophy: never challenge the premises of Settled Science.

One of the goals of What Should Philosophy Do?, however quixotic this might be, is to do just this, when it needs to be done. Materialism is, after all, a worldview and set of premises, not the result of any specific set of scientific findings. According to materialism, reality is exhausted by the spatio-temporal world, and there is no transcendent God to prescribe eternal verities. Within these premises we find no basis for a universal morality. This is because there is no universal culture. There can only be (mostly futile) gestures of ungrounded stipulation: commands we may choose to give ourselves, or to approve behaviors we like while condemning those we dislike. The Libertarian injunction against initiating physical force is one of these. The social justice warriors’ demand for equality of all groups is a competing gesture. The majority of the cultures of the world find both unintelligible.

The slow and agonizing collapse of UHR in the face of the neo-tribalism of identity politics over the past three decades or so surely supports this thesis.

But are the fundamental metaphysical premises of Third Stage thinking even correct? Sooner or later, we have to ask this question.

In some respects, a number of academic philosophers of science have done us the courtesy of opening the door for us. For the Third Stage picture of science as a steady accumulation of objective truth has largely collapsed under their analyses. Much of their work is difficult and technical (to be discussed in part in the book to come). It involves such conclusions as that scientific observation never occurs independently of theory (Norwood Russell Hanson), that mature science is always paradigm-bound (Thomas S. Kuhn), or that major scientific advances do not conform to any philosophical theory of the rationality of scientific progress at all (Paul Feyerabend).

These came on the heels of mental adventures such as Nelson Goodman’s new riddle of induction: any observation that confirms a scientific generalization (e.g., All emeralds are green) also confirms a potentially infinite number of aberrant predicates, we might call them (e.g., All emeralds are grue, where grue applies to emeralds observed before time t and green, and to emeralds not so observed just in case they are blue, for any t we want to postulate). Unless we postulate an a priori principle of uniformity, or simplicity, there is no rational way to rule out such predicates even if they are never formulated, much less tested, much less confirmed, by actual science. As a logical empiricist, Goodman would not go there. But supposing we do, why should the non-designed, godless universe of Third Stage thought be uniform, or simple, or intelligible to the human mind?
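Goodman’s point can be made concrete in a short sketch (illustrative only: the cutoff year, the predicate definitions, and the data are invented for the example, not drawn from Goodman’s text):

```python
# A toy illustration of the "grue" problem: every observation gathered so far
# confirms "all emeralds are green" and "all emeralds are grue" equally well.

T = 2030  # arbitrary cutoff time t (an assumption chosen for illustration)

def is_green(emerald):
    return emerald["color"] == "green"

def is_grue(emerald, t=T):
    # "Grue": observed before t and green, or not observed before t and blue.
    if emerald["observed_year"] < t:
        return emerald["color"] == "green"
    return emerald["color"] == "blue"

# Every emerald observed to date (all before T, all green)...
observations = [{"color": "green", "observed_year": y} for y in (1900, 1955, 2020)]

# ...confirms BOTH generalizations; the evidence cannot decide between them.
assert all(is_green(e) for e in observations)
assert all(is_grue(e) for e in observations)

# Yet the two hypotheses diverge about emeralds first observed after T:
# "all green" predicts green; "all grue" predicts blue.
future_emerald = {"color": "blue", "observed_year": 2040}
assert is_grue(future_emerald) and not is_green(future_emerald)
```

The sketch shows why the problem is logical rather than empirical: no amount of additional pre-t data changes the tie, which is exactly why Goodman’s riddle forces an appeal to something beyond observation.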

The upshot of all this work has been the slow deconstruction of many Third Stage epistemological ideals about science bringing us ever closer to something called “the truth,” if by that we mean something akin to a complete, theory-neutral and culture-neutral account of reality. If indeed our fundamental premises about what exists, or of specific ways in which the world (or the part of it being explored by a particular science) is intelligible to us are wrong, then we are moving progressively deeper into error, whatever our findings seem to be.

To be clear, we do not deny those scientific findings we can validate because they are right under our noses: findings in physics, chemistry, medicine, and so on, that we make use of every day. This would not make sense.

But can anyone truthfully say the idea of life coming from nonlife has been validated? This is not something anyone has observed. Even the laboratory creation of something that could interact with its surroundings in such a way as to replicate itself would not show that this happened under uncontrolled conditions that we cannot know ever existed, e.g., a “primordial soup” of lifeless chemicals experiencing electrical discharges. The credibility of extrapolations about such states of affairs depends entirely on the credibility of the materialist premise. Since this premise is among the things at issue, that is circular reasoning.

Yet Third Stage Settled Science has no other options! If life was not the creation of a deity, it had to come about through some form of abiogenesis, the technical term for the chemical evolution of life. There is no third option!

The bottom line: once we look at the fine details of what must happen for abiogenesis to occur, scientists are pretty much clueless about the origins of life. Anyone who says otherwise is lying to you. (Cf. The Mystery of Life’s Origin: Reassessing Current Theories, by Charles B. Thaxton, Walter L. Bradley, and Roger L. Olsen, 1984.)

So is materialism true? Is it believable? What we know is that one of its most important necessary conditions disintegrates when we look at it up close, and in detail.

What of other Third Stage preoccupations, e.g., in political economy (it has been a long while since I have been comfortable with the artificial academic separation between political science and economics)? Do they fare any better?

The Unsustainability of Third Stage Political Economy.

No one can deny that we have creature comforts our ancestors could never have dreamt of in their wildest imaginings. To that extent, the principles behind technology are at least reliable in the here and now. But then again, not all technological changes have worked to our advantage. Some have come with a price tag. Consider the changes in food technology under the assumption that artificial is superior to natural (the former can be patented!), and in pharmaceuticals.

Consider factory farming, in which animals are force-fed grain with hormones to stimulate artificial growth and fattening. More meat means higher profits for food corporations, of course. But the growth hormones used have made their way into our food, and from there into our body systems, one of the milder results being children entering puberty at progressively younger ages, long before they are emotionally ready.

Are such practices damaging our health, not to mention what they are doing to the water table and the food chain?

A good resource to begin exploring these issues is Randall Fitzgerald’s The Hundred-Year Lie: How Food and Medicine Are Destroying Your Health (2006).

Much of the direction food technology and pharmaceuticals have taken over the past century or so has occurred because of the market-driven, profit-based system. One of the reasons we have crises in public health is that we now have modestly unhealthy populations beset with chronic conditions. These need not be cured, but rather are managed for profit. A healthy population does not, after all, need doctors, hospitals, prescription drugs, managed care, health insurance, etc. We find ourselves with a state of affairs in which “health care” is not really about public health but about how (increasingly expensive!) care is to be paid for.

Think Obamacare!

Many pharmaceuticals, moreover, leave those taking them worse off!

Such observations bring us to commerce within Third Stage civilization, and to a mare’s nest, as I don’t plan to get into a lengthy discussion of “capitalism” versus “socialism.”

Here is a CliffsNotes version: Third Stage political economy, which in practice aimed to address specific problems rather than build the holistic Utopias of philosophers and some economists, has become a mixture of the two abstractions, but with the capitalist side of the mix dominant because of who has the money (corporations).

The mixed economy, responding to a variety of pressures, appears to evolve naturally into a culture based around mass consumption and convenience, the latter a lure corporations use to entice desirable forms of behavior on the part of the masses. The state tries to regulate corporate behavior, but rarely succeeds as more than a blunt instrument.

So what? (some might ask). Is it not true that the masses’ standard of living has been greatly increased? Even the poor are better off, poverty always having been a relative concept.

Again, Third Stage thought never promised holistic solutions. It is pragmatic, always balancing improvements against costs.

Unfortunately, matters are not as simple as that. It is true enough that Third Stage political economy satisfies needs through advancing technology and improved access, leaving most of us better off than were royalty just a few short centuries ago.

But once the majority of basic needs are satisfied for the bulk of the population, what occurs next?

Third Stage political economy (whether we call it capitalism or a mixed economy) cannot stand still. Companies must continue to produce and sell, otherwise they fail and must lay people off. That means the masses must buy what the companies produce, or the marketplace is glutted, prices fall, again profits cannot be made, and the system falls into crisis. Thus the obsession with economic growth as a sign of economic health, despite finite space and finite resources, even if everyone must take on debt to make it work. The masses need jobs, after all. Entrepreneurs need buyers. Economic growth supplies both, within a system that is constantly changing and churning. Joseph Schumpeter referred to this last as creative destruction (see his Capitalism, Socialism and Democracy, 1947). Economists tend to approve. So do advertisers! Advertising schemes are devised to create artificial needs in advanced political-economies so that people will buy! (Individuals are often better off, of course, if they keep their money!)

The alienating features of Third Stage life — the literature of which fills bookshelves! — suggested reasons to Schumpeter, all those decades ago, why the system could not survive in the long run. When it falls into crisis, some kind of government intervention is inevitable as a pragmatic solution to get things moving again. The Keynesians figured out how to do this, even if their solutions were short-term (their downfall). Financialization eventually arose during the final third of the last century. Financial institutions had known all along that they could inject capital into the system to keep it growing and changing. The creation of money through fractional-reserve banking has systematically devalued it, while redistributing wealth upward, one might say. A few reaped windfalls from this, and continue to do so. That would be those often called the cosmopolitan or globalist (or bicoastal) elites in the largest investment banks (think: Goldman Sachs).

That subpopulation within the masses known as the middle class began to fall behind long ago as wages failed to keep up with inflation. Working class people have not had true representation in government in a long time now, at least not since corporatism took control of the Democratic Party (it had long controlled Republicans).

This is all well known, and I need not use more bandwidth to recount it here.

Suffice it to say: at present the entire global economy rides atop an ocean of red ink, the product of several decades of borrowing against the future to stave off, as long as possible, the inevitable crash that is the fate of all unsustainable systems. In the meantime, the world has experienced lesser crashes of increasing severity, the worst to date having happened in 2008. Many financial writers believe a far worse crash is right around the corner.

Third Stage Civilization and Covert Authoritarianism.

The real zinger, however, is that contrary to all the bluster about “liberal democracy,” Third Stage civilization is fundamentally authoritarian. Perhaps Sheldon Wolin’s concept of inverted totalitarianism applies, in which systemic demands replace dictatorial decrees. Third Stage authoritarianism manifests itself in carefully directed incentives (e.g., tax breaks for corporations, discounts for consumers) and economically grounded pressures of various sorts instead of overt police powers of the sort seen in full-fledged dictatorships. Third Stage civilization began to centralize around heavy industry in the late 1800s, backed by financial institutions. Eventually our central bank, the Federal Reserve, was created to control the money supply and further centralize the economy. Centralized systems cannot work without introducing authoritarianism at some level, because the bottom is not visible from the top; the systems are too large and expansive. Hence the necessity of command-and-control, however structured and implemented.

The ideal, therefore, became the social engineering of masses who would accept increasing encirclements without question or complaint. This is precisely what public education set out to produce, replacing real education (which emphasizes liberal arts learning) with socialization to encourage conformity and vocationalism to produce compliant workers (and taxpayers). Subjects like critical thinking were ratcheted down, and what was left was watered down. (Philosophy students were taught formal logic, but not told that the systems encircling them discouraged rational thinking.)

Again, that story is a long one, but abundant documentation exists. Start with John Taylor Gatto’s The Underground History of American Education (2001).

Or consider this, from the opening of Edward Louis Bernays’s 1928 tract Propaganda (recently republished):

The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.

We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this matter if they are to live together as a smoothly functioning society….

…. Whatever attitude one chooses toward this condition, it remains a fact that in almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons — a trifling fraction of our hundred and twenty million — who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind, who harness old social forces and contrive new ways to bind and guide the world.

Bernays, nephew of Sigmund Freud, was the founding father of public relations and advertising. He became a multimillionaire developing advertising campaigns for major corporations that sought him out. It was Bernays who hit on the idea of using celebrities in advertisements. The odds are very good that he knew exactly what he was talking about, and that as the technology of advertising systems improved, their capacity to exert subtle pressures on individual consumers via appeals to emotions ranging from vanity to the fear of missing out increased proportionally.

I trust no one who has read this far thinks all this is a “conspiracy theory” of recent history and society!

It was against the latent systemic authoritarianism of evolving Third Stage systems that significant fractions of a generation rebelled (late 1960s), even if what that generation served up in response was sometimes worse.

The point is, in our time, democracy appears to have been unmasked as the sham it has been for a very long time now. The reality is that the U.S. is a plutocratic oligarchy (Gilens & Page 2014), as is the case with all the other Western powers.

Eastern powers are somewhat different. China is openly oligarchic, the product of its having embraced many trappings of Western capitalism — for China’s Communist Party remains solidly in control of Chinese corporations (with the likely exception of the country’s central bank). Singapore has proven that one can have flourishing and wealth-generating (for its elites) capitalism without any trappings of democracy.

Arguably, the Eastern mindset is more honest!

Interlude: Crisis Yesterday Versus Crisis Today.

A sense of crisis is stirring throughout the advanced world. That admittedly sounds like a cliché. We’ve been in crises before, of course, many of them more painful than anything occurring at present (if we could ask those who lived through the Great Depression, I am sure we would find out). The U.S. was severely divided in the late 1960s and early 1970s, especially by a war favored by the Establishment but opposed by that significant fraction of a generation of youth who had learned how to use the platforms of their time to become mouthpieces of dissent.

Their dissent was the dissent of idealism, however. What is different now is that the idealism that prevailed in those years is largely gone, or at least greatly muted by the information explosion. There are exceptions (cf. Matt Ridley’s The Rational Optimist), but these appear to be exceptions to a general pessimism, leading more and more people to turn inward and tend their own gardens, as it were. Their focus is not on high ideals. It is frequently on earning enough devalued money to survive. This is the plight of the millennials, graduating from universities often with worthless degrees but with five and sometimes six figures of student loan debt.

More recent writers speak openly of the slow collapse of American society and how one can prepare to survive it, even while living in the U.S. (see, e.g., Dmitry Orlov, The Five Stages of Collapse). The arguments of such authors are compelling and not to be casually dismissed.

Upshot: Third Stage civilization, and its worldview, are dying. We see the reasons every day.

We find ourselves in what we might call a Comtean Fourth Stage, something Comte could not have envisioned.

Postmodernity: Our Present Fourth Stage Condition (with an Aside on Abortion).

Extending the metaphor used earlier, the Fourth Stage of a civilization is evidenced not by the supposed maturity of adult responsibility but rather advancing age and infirmity, perhaps even cognitive deterioration. In the West, Fourth Stage thought is filled with alienation, cultural pessimism, disillusionment, and sometimes rage. Think first of the existentialists, especially their fiction (but sometimes nonfiction works such as Albert Camus’s The Myth of Sisyphus). The most advanced Fourth Stage thinking could be identified with postmodernity in a broad sense: once aware of itself, it questions all the narratives and metanarratives that have gone before, including its own first principles. Its writers include French philosophers such as Michel Foucault (“knowledge / power”) and the Austrian-born iconoclastic philosopher of science Paul Feyerabend (Against Method: Outline of an Anarchistic Theory of Knowledge, orig. 1975).

Postmodern epistemology is clearly not an intellectual fad (although its spirit has been incorporated into numerous academic fads).

Fourth Stage thinkers, who invented the “sociology of science” with flourishes of irony, wonder how much of actual science is able to find truths in the old, Third Stage sense, independent of a money economy. Actual science, after all, is conducted by human communities dependent on university and peer support (including the same desire for job security as nonscientists), corporate sponsorship, grantsmanship, etc., and is therefore embedded in the market-based system and no less vulnerable to its vagaries. In that case, in the Fourth Stage world, is supposed truth anything more than what has been bought and paid for by those who can command the necessary resources, the same as any other mass commodity?

How did we get here? Although there were strong hints of a Fourth Stage in writers such as Kierkegaard and Dostoevsky, arguably it was Nietzsche who did the most to kick open the door to Fourth Stage thought. When he called for a revaluation of all values and warned against a future “advent of nihilism,” he was telling us that when God was erased from our philosophical and cultural map of reality, the conceptual support for everything God’s existence gave meaning to and empowered was also removed.

The ideas behind Enlightenment ideals of UHR were rooted in the Christian sensibility, after all, along with all the moral gestures that came in its wake. Belief that all persons were created in the image of God was UHR’s original basis for support. Once that support was gone, like a ladder kicked out from under a worker atop a high wall, leaving him hanging by his fingernails, UHR’s days were numbered.

Eventually the Enlightenment itself would be eclipsed. Today we read anguished articles by elite authors about how “democracy is dying” at the hands of so-called populists like Hungary’s Viktor Orbán and America’s Donald Trump.

There is, finally, the rage of writers such as Pankaj Mishra, author of Age of Anger: A History of the Present: rage against colonialism, against the fact that many of the world’s populations have not enjoyed the fruits of Third Stage modernity, but also against the tendency of those same populations to embrace the Orbáns and Trumps of the world out of their own misguided resentments.

This didn’t just happen.

The twentieth century, bit by bit, art school by art school (think: Dadaism), literary figure by literary figure (think: Hemingway and Camus), and philosopher by philosopher, worked out the consequences of life in a godless cosmos, especially if we faced our predicament honestly instead of evading it. What predicament are we talking about?

That our lives have been shorn of meaningful value other than exchange value, that they are reduced to an empty and absurd nothingness against the vastness of a dead cosmos: that our choices lie between suicide (pondered by Camus in Sisyphus; Hemingway did commit suicide), or an authentic existence (as the existentialist uses this phrase) which stares the absurdity of life in the face but elects to enjoy whatever momentary experiences and memories life can bring us.

Except that if you are not a flamboyant artist or a famed existentialist writer or a member of the political or corporate class or a celebrity, and if your life is one of cubicle-job drudgery or worse, as has proven to be the case for the majority even in advanced, Third Stage conditions, then if you remain solely within the realm of reality, you might not have much to enjoy. Camus did not do much to articulate the choices of this majority, which are to bury themselves in private activities (including sexual fetishes), to fly into some fantasy world (of which there are plenty), or to devote their entire existence to some political cause, usually a futile one.

The ephemeral and contingent value of human life can be seen in the abortion epidemic, in which over 60 million unborn babies have been killed in their mothers’ wombs, and sometimes on the abortionist’s table if by some chance they survive the procedure, since 1973. These, on any accurate scientific reading of the situation, are the most vulnerable human beings on the planet!

But is a fetus human?

When I once asked students this in a contemporary moral issues class, one quipped sarcastically, “Well, they aren’t goldfish.”

But are they persons and therefore members of the moral community? And don’t women have a right to control their bodies (the standard response)?

How does one produce nonarbitrary criteria for admission into the moral community in the Third Stage intellectual environment? We have only biology to go by: if X has a complete set of human DNA, then X is human; and if X is human, then X is a person and a member of the moral community.

There. I’ve done it. The roof hasn’t yet caved in, so we can drop the real bombshell.

If a fetus is human, then we can tell the feminist: it isn’t your body alone, but also the body of your unborn child, a human being whose vulnerability is as close to being absolute as you are going to find in this world.

What we can say is that if the morality of a society is indeed measured by its willingness to institute protections of its most vulnerable members, then ours fails as badly as any form of overt genocidal totalitarianism!

Materialism as a worldview provides no final social protections, much less intellectual arguments, against simply writing entire populations with human DNA out of the moral community! The Nazis did it with the Jews; the Soviets did it with even larger populations that resisted collective farming; American feminists and other left-liberals do it with the unborn!

Forward to a Fifth Stage of Civilization? The Case for a Technology of Abundance.

Although limits of space and time preclude a full development of the ideas here (many of which are not finished, anyway), I would close by asking whether the Fourth Stage thinking in which we have found ourselves is any more sustainable than Third Stage thinking.

I do not think it is. I will not state my full reasons here, as they are even longer than this has been (this is not a topic for short attention spans, obviously). They will be found in the two book-length manuscripts on which I am at work, and I can only hope and pray that my present and future resources will enable them to be finished, published, and that they find their audience.

I will only state that Third Stage thinking, if it takes itself seriously and faces its consequences for human life honestly, generated Fourth Stage thinking. Among the remnant of thinking people, it cannot do otherwise.

The telos of Fourth Stage thinking is suicide: if not personal, then cultural and cognitive. We probably have an explanation here for why fantasy of various sorts (not always labeled as such!) is so popular in today’s world, especially among the young. Not to mention why suicide is one of the leading causes of preventable death in the advanced world, especially among the young.

What I would argue: we cannot simply go back to an earlier stage, although we can identify features of those earlier stages which were fundamentally sound, and from which we can still learn.

We can only go forward. “We” refers here to that forward-thinking remnant.

I would therefore urge you to think about the possibilities of a Fifth Stage of civilization, one which recognizes, as did all its predecessors, the weaknesses of what came before, and resolves not to make those mistakes again. I hope to characterize this Fifth Stage more fully in future work. I will only say here what it will and will not do: it will reaffirm God, restoring Him to our philosophical and cultural map of what is real, but it will not evince the “blind” faith in Him characteristic of First Stage thought, which empowers theocrats. It will recover God through the realization of human folly (Biblical sin) and through realizing that purging Him from our world has been disastrous.

Fifth Stage thought will not be abstract and dichotomous as was Second Stage thought, but will preserve the commitment to systematic thought through ideas made available through systems theory and by recognition of the importance of process. It will note the organic nature of communities and the beliefs that animate them and give them meaning, including belief in a God. It will not be positivist and scientistic, and ultimately elitist, as was Third Stage thought. It will recognize where science and technology have given us genuine advances and insights, and under the right circumstances can continue to do so. What it will deny is that science and technology either are or should be thought capable of solving every human problem.

What technology might be capable of doing, given the right liberating circumstances, is creating abundance, rather than maintaining systems based on scarcity. The keys here are energy and its production, alongside the rise of robotics and artificial intelligence (understood here as the expert systems we already have, which are capable of replacing human workers en masse). I do not wish to spell out the totality of what I am thinking at this moment regarding the former. But I invite any readers who have followed me this far to investigate for themselves what Nikola Tesla might have been working on that caused J.P. Morgan to pull his funding, and the U.S. federal government to classify all his research papers following his death. The bulk of Tesla’s later work remains classified to this day. The question: are there forms of energy a few technologists already know about that would not just end our dependence on oil but put all existing energy corporations out of business, even as they generated sufficient abundance to make the basic necessities of life readily available to all?

This, of course, would end the threat of technological unemployment, which is really just the threat of homelessness and starvation. A technologically-produced abundance of food, clean water, housing, etc., would end the near-absolute need of people to work in order to obtain money to pay for those basic necessities.

Would this not be potentially the end of involuntary poverty, period?

And given that most wars and more limited conflicts are waged over presumably scarce resources, would not a world in which technology has created abundance have a better hope of eliminating war, and building the kinds of bridges a realistic conception of UHR requires, than anything we are doing at present?

And finally, would not an end to our dependence on oil and its products do more than a thousand treaties and UN-sponsored agendas to alleviate our fears about what Third Stage industrial civilization might have done to the climate?

Utopia? Or Oblivion?

With apologies to R. Buckminster Fuller, who came up with these ideas long before I did.

Such thinking may seem utopian in the present environment. I don’t believe it is, but given the prevailing cynicism, or just the prevalence of those locked into a favored ideology, whether of the so-called left or the so-called right, what is being proposed here probably looks utopian. But Fifth Stage thinking, if it comes to be, will not be negativist, cynical, neo-tribalist, and anger-driven, as so much Fourth Stage thought has turned out to be. Nor will it cling to “moral principles” that amount to no more than abstract stipulations, as do Libertarians whose ideals owe more to Second than Third Stage thought.

My view, for whatever it is worth, is that we have no choice. This is for reasons stated above: our present course and the assumptions behind it are unsustainable. If you think I am wrong, then go back and read those sections, and consult the works I reference. Then feel free to leave your critical observations in a comment below.

Nor am I alone in thinking these goals might be achievable, if we adopt and engage the proper mindset. Fuller, genius inventor and systems thinker, made the observation a half-century ago that we have the technological ability to feed everyone on Spaceship Earth.

That was 1970 or thereabouts. The question I asked, all those years ago, was Why aren’t we doing it?

Today, with new technologies like 3D printing, we are close to having the means to offer the world decent housing, and at a negligible cost! (Every economist will tell you that as you increase the supply of anything, you lower its price; abundance will bring that price to nothing or almost nothing. What, then, of the need to make money? The answer is that this must happen in a political economy in which people do not need to earn profits or money in order to live! I do not think of this as socialism, since socialists still operate within a conceptual framework based on a presumption of eternal scarcity; they just want scarce resources distributed equitably. I want to eliminate the presumption of scarcity, conceptually and technologically, thus taking us beyond capitalism-socialism disputes!)

Having said all this, I want to be clear: at present there is more than one possible outcome here. The actual outcome, obviously, will be based on decisions made now, or within the next decade. As a people we will either decide to pursue some variant of the kinds of goals discussed here, or, at the appointed time, our present debt-fueled economic bubble will run its course and collapse. When the dust settles, we will find ourselves returning, by necessity, to a fundamentally feudal type of society, advanced technology notwithstanding, in which a chronic lack of good-paying work will render the majority impoverished, struggling, and dependent in large measure on the good graces of their elders or on a shrinking population of fortunate haves.

Those who flocked to cities believing them to be havens of opportunity might find themselves in serious trouble. For our urbanized masses long ago forgot how to grow food; they believe it comes from grocery stores. Most cannot make simple home or car repairs. Without a car in a typical American urban or suburban environment, you are effectively stranded. In the case of an extended power outage as the result of a major emergency, most would have no idea how to heat their homes in cold weather without risking starting dangerous fires and taking their apartment-bound neighbors out with them.

In such a world, philosophy as a professional activity would probably have no future.

Conclusion.

We have surveyed Auguste Comte’s Law of Three Stages, which converged on Third Stage philosophy and civilization as its ideal. With a 20-20 hindsight Comte could never have mustered, we found this ideal wanting on numerous grounds.

A civilization with its eyes exclusively on this world, based around science, technology, commerce, public education, governance of whatever sort, and presuming the givenness of progress and meliorism (the idea that we can make ourselves morally better by our own efforts) has turned out to be unsustainable in practice.

The wars, genocides … declines in public health, with an abundance of evidence that the central priorities of both government and corporations lie solely with how managed care is paid for … worsening educational systems that turn out functional illiterates … the epidemics of suicide … the sense that in the face of rapid technological change driven by a handful of leviathan corporations we have lost control over information and over our lives … all further the sense of unsustainability.

We have this sense of impending crisis. While crises are nothing new, the cultural optimism that resolved them in the past is gone. We have a strong sense that with our personal, corporate, and national debt climbing, we are living on borrowed time no less than on borrowed money.

In that past, we used technology to send men to the moon and return them safely. Today we use it to follow celebrities, take selfies, and chat mindlessly. We are all wired into technology, but as persons we have never been lonelier. We kid ourselves into thinking our “Facebook friends” are really friends.

Are we wired into online “communities” because real communities, in which people who care deeply about one another and interact face-to-face, are dying if not already dead?

Postmodernity is a kind of Fourth Stage … in art, philosophy, literature, commerce, technology, and in numerous other arenas.

Unlike Third Stage thought, it questions whether objective truth is any more obtainable than objective morality, while not denying that an abundance of (often conflicting) truth claims (like moral claims) can be bought and sold like any other commodity.

The challenge for the future is to conceptualize, and move towards, a Fifth Stage of thought and civilization, and do it amidst the present information glut, in which the first challenge any thinker and writer faces is to have his/her work actually seen.

The Fifth Stage of civilization — if it can be made to happen — will have turned to first premises and foundations where we find them, and where we find ourselves.

Where we find ourselves is lost without any sense of a God. Perhaps we should consider (I am speaking figuratively, of course) restoring Him to His rightful place in the world, as the Creator and therefore the Center of all value, with all that this requires of us.

A Fifth Stage of civilization will then be the scene of positive boots-on-the-ground work to recover our health and rebuild our lives and communities. All lives will matter, because all lives were made in the image of God.

Science and technology will become our servants, not our masters.

Perhaps at the end of whatever path a Fifth Stage mind goes down, it will find genuine freedom within communities whose members have not been thrown into ruthless and divisive competition with one another for an increasingly limited number of jobs amidst an artificially maintained scarcity.

This may create conditions for a spreading peace of mind that will come from knowing what one knows (while recognizing and deeply appreciating mystery), doing what one can do (recognizing limitations), and simply being what one is (recognizing that, in the end, we are not God).

 

 

Posted in Culture, Philosophy, Political Economy, Where is Civilization Going?, Where Is Philosophy Going? | 11 Comments

A New American Philosophical Association Organization?

This past week, philosophy’s top blogger Brian Leiter posted a poll on a quite interesting topic: Would you leave the [American Philosophical Association] and join a new dues-charging professional philosophy association that does much of what the APA does, but without the current political agendas/projects? 

Results: just 13% said Definitely Not, another 10% said Probably Not, and 10% were Undecided, while 25% said Probably and a telling 42% said Definitely.

That is to say, a solid majority of Leiter’s readers, 67%, either would leave or would probably leave the APA and join a new professional organization were one to be formed. They would effectively abandon the APA to the machinations of the identity-politics activists who have, to a large degree, hijacked the humanities over the past two to three decades.

As a subscriber to the APA’s blog, I can confirm that the majority of posts on it are like this one, playing off current events to further their collective grievance cottage industries.

Purveyors of the politics of collective grievance, or identity politics, are nowhere near a majority in the discipline, of course. I do not know what percentage of those who are either professional philosophers working in academia, philosophers working in other occupations, or just observers who find philosophy interesting, actually support this kind of thing.

You can be left-leaning or center-left in your overall outlook, and still realize that identity politics is completely out of control. Leiter is an example of such a person. I do not always agree with him by any means, but he doesn’t go into attack-dog mode against those whose views he disagrees with or even disdains. He doesn’t try to censor them, or sabotage their careers.

The majority of college and university professors are center-left, after all. Most are not crazy. An aging demographic grew up during the civil rights era; others of us came of age in its aftermath. Today the majority of professors of philosophy probably prefer just to teach their classes, conduct whatever research they are conducting, and be left alone. If they once had visions of changing the world, they relinquished them years ago.

They certainly don’t want to be called out for one inadvertent slip of the tongue that, e.g., “triggers” some sexual minority they barely knew existed, and suddenly find their careers in jeopardy.

The idea of a new philosophical organization has been supported here by blogger Daniel Kaufman, who reviews some of the history on which I commented in an older post. I’ll concede: I went a little overboard with that title. But I was not wrong (and there is an additional object lesson there about how social media can bring out the worst in all of us, unfortunately).

I left the American Philosophical Association almost two decades ago over this sort of thing. I was weary of its Newsletters and other activities that used my dues money to promote this or that political agenda of an extreme minority in the profession as if that were (or should be) the profession’s overriding concern …

… while doing nothing to support a far larger group. This group, which cuts across genders and ethnicities, consists of those who are untenured, not on a tenure-track line, and as we weren’t getting any younger, were less and less likely to find tenure-track jobs every year.

To paraphrase how I put it on one occasion back in those days when I was far more Libertarian than I am now: how many Newsletters about Philosophy and Individual Liberty have you seen lately? (I’ve since realized that Libertarianism is no less Utopian than Marxism in how it misreads human nature and motivations, but that’s a post for another day.)

I later left the profession itself, of course. It had become clear that my words, the words of an adjunct instructor at an isolated branch campus in a Southern state well away from the centers of philosophical gravity, weren’t going to change anything … and I had little interest in sticking around to see what was likely to come eventually.

This would be administrators, and possibly students, complaining that my Introduction to Philosophy and Contemporary Moral Issues courses assigned too many readings by philosophers and others who were straight white males (and gasp! I even had a Christian or two in there!).

I didn’t know whether or not such a thing would ever happen in South Carolina (to my knowledge it hasn’t), but identity politics does not appear likely to run its course any time soon, and stranger things have happened. I had other reasons for not sticking around.

Getting back to my main subject here….

Were a new dues-paying organization of professional philosophers to form, and if its leadership was open to nonacademic philosophers, I would join in a heartbeat.

There are, after all, things about the APA I’ve always missed.

Membership in such an organization is one of the best ways to keep a finger on the pulse of any new developments or trends in the field that are worth watching. I haven’t seen much of anything since the late 1990s, but I am open to the possibility that by dropping my APA membership and then exiting the academic stage voluntarily, I’ve simply missed them.

 

 

Posted in Academia, Philosophy, Where Is Philosophy Going? | 2 Comments

Open Letter to Professor C. Christine Fair, Georgetown University

Re:

“Look at this chorus of entitled white men justifying a serial rapist’s arrogated entitlement. 
All of them deserve miserable deaths while feminists laugh as they take their last gasps. Bonus: we castrate their corpses and feed them to swine? Yes.”

— (((Christine Fair))) (@CChristineFair) September 29, 2018

Professor C. Christine Fair,

Saludos from Santiago, Chile.

I’ve not written anything quite like this, and I am not quite sure how to begin it.

I’ve been around the block a few times and seen some vile stuff, but nothing quite like that remark from your Twitter feed.

Yeah, I’m a white guy, and on top of that, I’m straight as an arrow. Sue me.

I’m not going to go into attack mode, though. Others have probably done that far better than I.

I just wonder, though: what do you really think you accomplished with that tweet? Do you think such remarks do anything to heal the divisions that are tearing American society apart, much less further the mission of your institution (whatever it is these days)?

Or maybe something as breathtakingly constructive as trying to heal divisions is not your aim.

Maybe your aim in writing that was just to piss people off, so you’d get a predictable reaction.

And from the top comment on your Facebook page, it looks like you got one.

Not here. You see, Christine, I’ve been following the decomposition of academia for over 25 years now.  And you know something? A lot of us, out here in the boonies, have written folks like you off.

But I just have to say what is clear: you can’t possibly be interested in what is true and factual, much less what is right, or fair, or just. If you were interested in truth, you wouldn’t have called Brett Kavanaugh a “serial rapist” when there isn’t the slightest scrap of evidence the accusation is true.

Just chalk it up to my perspective as a white guy who doesn’t live in your academic corner of the universe where all straight white Christian men with conservative ideas are history’s criminals, where we are guilty if accused, and deserve to die miserable deaths and be castrated afterwards and our nuts fed to swine while your ilk laughs.

So sorry about that.

Make the best of it.

The only thing you could accuse me of is being a masochist, and you would be right. You see, Christine, I left academia several years ago, having gotten the message that my small voice wasn’t going to change anything, and I saw stuff like this coming.

I moved overseas, married a chilena (women here really are women, not … whatever you’ve become, up there in the former Land of the Free).

I’ve noted that free speech seems to apply to you. This:

The views of faculty members expressed in their private capacities are their own and not the views of the University. Our policy does not prohibit speech based on the person presenting ideas or the content of those ideas, even when those ideas may be difficult, controversial or objectionable. While faculty members may exercise freedom of speech, we expect that their classrooms and interaction with students be free of bias and geared toward thoughtful, respectful dialogue.

How nice that your wise administration has your back. Not so much for this person:

“[Kavanaugh accuser Julie] Swetnick is 55 y/o,” [Dean William] Rainford wrote. “Kavanaugh is 52 y/o. Since when do senior girls hang with freshmen boys? If it happened when Kavanaugh was a senior, Swetnick was an adult drinking with&by her admission, having sex with underage boys. In another universe, he would be victim & she the perp!”

From Catholic University president John Garvey (obviously going boldly where no administrators have gone before):

Rainford’s tweets of the past week are unacceptable. We should expect any opinion he expresses about sexual assault to be thoughtful, constructive, and reflective of the values of Catholic University, particularly in communications from the account handle @NCSSSDean. While it was appropriate for him to apologize and to delete his Twitter and Facebook accounts, this does not excuse the serious lack of judgment and insensitivity of his comments.

Rainford has led the National Catholic School of Social Service since 2013. It is my desire that he continue to lead the school. But in light of these recent actions I have suspended him as dean for the remainder of this semester. Rainford understands and accepts this decision. Associate Dean Marie Raber has agreed to serve as Acting Dean during that time.

Double standard in academia? Who'da thunk it?

Well, Christine, I’d like to think I’ve made my point. Not holding my breath for confirmation of that, of course.

I presume you’ve heard, Brett Kavanaugh was confirmed to the U.S. Supreme Court earlier this afternoon (it’s 7:15 pm in my time zone as I write this).

Do have a very nice rest of the weekend, Christine.

But stay away from books like Heather Mac Donald's new one, The Diversity Delusion (St. Martin's Press). Your head will explode on contact. Friendly advice from your friendly neighborhood straight white Christian male. (Oh, I will also add you to my prayer list, that somehow your worldview might become a little less violent, bitter, and hateful.)

And don’t worry that I’ve started trolling you or something because I accessed your Facebook page to link to it. I assure you, I have better things to do with my time.

Sincerely yours,

Steven Yates, Ph.D., Philosophy

Writing from Santiago, Chile.

Posted in Academia, Media, Where is Civilization Going? | 2 Comments

The Fate of Civilizations

Should a philosopher be interested in the trajectory of civilizations: their rise to dominance in a region, and then the reasons why a civilization seems to lose its collective capacity and go into decline?

Most professional philosophers are not, of course, mostly because of the micro-specialization of academia generally. But suppose we can identify philosophically significant premises believed within populations as well as by leaders … premises that might empower the rise of a civilization. If these premises then start to disappear, or are removed, the civilization starts to falter.

Historically important philosophers such as Condorcet, Comte, and Marx all had theories of the stages civilizations pass through. Each believed that progress would lead to a final state of affairs, akin to what Hegel called the Absolute. For Comte, the ideal society was one based on the application of science to every aspect of human life. Bertrand Russell agreed. Since subsequent history has shown abundantly that science and technology are just as prone to abuse as any other human products, there is now grave doubt that a society based on science (and technology) would be the ideal.

Other writers saw civilizations as moving in cycles: as having life spans not unlike a person's, with all the stages of life a person goes through. Edward Gibbon wrote his classic Decline and Fall of the Roman Empire. Spengler penned The Decline of the West. Carroll Quigley, in The Evolution of Civilizations, wanted to answer Spengler, arguing that civilizations in trouble could turn themselves around and continue making progress. Writing in the 1950s and 1960s, Quigley believed we were in trouble even then.

A lesser-known writer we ought to investigate is Sir John Bagot Glubb (1897 – 1986). His major works began to appear around the same time as Quigley's, and for some time after. Glubb was British, the son of a Royal Engineers officer. His own military education and service (the Royal Military Academy, followed by his own stint in the Royal Engineers), along with service in the First World War, eventually took him to the Arab world, where he settled. In 1930 he signed a contract to serve the Transjordan government (later: Jordan). He came to command Jordan's Arab Legion. By this time he had assimilated into Arab culture, a culture he appreciated and even loved. He became a leading authority on the history of the Arabs, eventually writing some 20 books on the Middle East.

By the 1950s, however, Glubb noted with some dismay that his native Great Britain was in full retreat around the world. He’d learned that the Arabs had a vast empire a thousand years ago. He found himself studying other empires. He soon came to the conclusion that vast civilizations follow a pattern that transforms them into empires and then sees them destroyed. This led to his best known essay, “The Fate of Empires” (1978).

What pattern did he see?

He saw civilizations going through six stages, or phases. Here they are:

(1) The Outburst and Age of Pioneers.

(2) An Age of Conquest.

(3) An Age of Commerce.

(4) An Age of Affluence.

(5) An Age of Intellect.

(6) An Age of Decadence.

Glubb would agree, that is, with the idea that civilizations are usually not conquered but fall from within. Let’s consider each stage in a bit more detail.

(1) An Outburst may follow the appearance of some new ideal that captures the imagination of a population, or of a founding document that expresses this ideal; the U.S. Declaration of Independence and Constitution surely count as such documents. This, Glubb notes, has happened elsewhere. It happened with the emergence and rise of Islam. What follows is the start of a rapid expansion.

(2) This expansion is called the civilization's Age of Conquest. Those leading the Conquest become national heroes. Heaven help any other cultures unfortunate enough to be in the way. Ask the indigenous cultures that populated North America as the U.S. expanded westward during the early-to-mid 1800s. For that matter, ask those who got in the way of Roman expansion, or in Napoleon's way.

(3) With territory claimed, an Age of Commerce ensues. Farms and factories are built, trade routes are laid down, a single language is spoken, and a single administrative system falls into place across the region. It is during this period that the seeds of trouble get planted, however. For as the first native fortunes are accrued, those building them start to notice the power money gives them: power otherwise unavailable within the political and administrative system. This they find fascinating!

(4) An Age of Affluence begins (there will be considerable overlap between this and its predecessor). To all appearances, the High Noon of a civilization is its Age of Affluence. Because new technologies are appearing and the builders of fortunes create millions of jobs, the standard of living rises exponentially. With sufficient surplus wealth floating around, large universities can be created and endowed, research institutes formed, etc. The generations that follow experience the results of this overall rise in prosperity but not the effort that went into them, and this, too, presages trouble. Moreover, making money starts to become an end in itself and not a means to advancing the common good of communities.

(5) An Age of Intellect begins, also overlapping with its predecessors. Almost every major community will soon have its college or university. Some of these will be very good at this stage; others will be mediocre. With the basic necessities of life now assured for a sufficient fraction of the population, the acquisition of academic honors starts to replace honors achieved through military conquest and even by commercial achievement. At the same time, the Age of Intellect is marked by the appearance of disputations that, more and more, seem to lack seriousness in the sense that they don't address real problems. They may well be steeped in false premises, not recognized as such because they are not really tested against the world but protected within academia's safe groves. Inevitably, such disputations turn to the foundations of the civilization itself, be they religious or otherwise. A civilization begins to drift as its first premises are called into question by its ostensibly best minds. Intellectual and eventually political leadership is thus beset by quandaries and doubt that did not exist before. Questionable decisions will be made, some involving which intellectual groups to support with lavish funding. Some will have bad consequences, as bad books are written and absorbed within a growing media culture. A result is that the moral "fiber" that holds communities together starts to unravel. This is accompanied by rapid "creative-destructive" advances in technology, achievements of convenience, and so on, that often lead to massive differences between parents and children, adding to uncertainty.

(6) Ages of Decadence call for lengthier attention. An Age of Decadence is marked by all or most of the following.

(a) Monetary policy becomes less and less responsible, as when, for us, financialization replaced production as a means of wealth-generation, allowing production to be outsourced to third world nations for cheap labor, all in the name of enhanced profitability. Civilization is by now highly centralized, so that monetary policy affects everyone within its borders in one way or another, and for that matter, will affect other societies that are trading partners.

(b) Rapid cultural changes are urged; these are eagerly embraced by some populations but not others, leading to rising division and dissension.

(c) There is rising alienation, as institutions of all sorts cease to serve persons and become expansive, impersonal bureaucracies serving only themselves. (Incidentally, think of the replacement of personnel departments with human resources departments, the implication being that human persons are resources not different in kind from other resources.)

(d) Increased frivolity sets in, as celebrities and sports stars replace achievers of the past who made genuine contributions to the civilization.

(e) Women begin to move into professions previously dominated by men, sometimes for economic reasons as the currency is devalued, wages flatline, and families need two breadwinners instead of just one.

(f) Immigrants begin to flow into population centers, the difference being that immigrants of the past learned to speak the dominant language and assimilated into the dominant culture while those of this new period do not. The result is that subcommunities form, and the capacities of schools, hospitals, and other institutions are overwhelmed by a babble of foreign languages. Some of these subcommunities are actively hostile to the dominant culture, furthering already existing divisions. (We see this happening in Europe, a civilization clearly in its Age of Decadence.)

(g) There is rising dependency on the instruments of the state, sometimes for basic necessities. This may be because families have split up and communities have become divided, leaving elderly couples stranded and without other help; it may be because profit-driven outsourcing has resulted in a lack of jobs that match the skills of the population. It may be because of growing chronic health conditions resulting from consuming unhealthy food, products of other questionable (but profitable) decisions.

(h) Schools fail to educate. Documentation of this is ignored. Educators begin to leave the profession out of frustration. The sources of their frustration may range from the growing indifference and unruliness of students to bureaucratic interference with their teaching methods and content to pay so low that it fails to meet their basic expenses. Schools fill up with mediocrities and become less and less functional.

(i) Religious belief, healthy patriotism, a sense of duty to the common good, respect for matters of learning, and other commitments aggregated under the label tradition are replaced by materialist consumerism, a love of money, frivolity, and cynicism. These encircle the individual, who is increasingly isolated if he refuses to commit to them. A kind of pessimism suffuses the body politic however papered over with “eat, drink, and be merry.” Pessimism and anxiety will be reflected in literary, artistic, cinematic, musical, and other cultural products.

(j) An irrational fascination with sex of every variety comes to suffuse all cultural and commercial activity. We see its results all around us: distrust and hostility between the sexes, extramarital affairs, marriages breaking down or not happening at all as people choose to stay single (much easier to have multiple affairs that way), the appearance and mainstreaming of practices previously rejected sometimes as immoral but sometimes just on public health grounds. A general sense of the cheapness of human life manifests itself in the widespread acceptance of such practices as abortion. An added sense of the postmodern fluidity of truth is employed in their defense, which speaks euphemistically (e.g.) of a "woman's reproductive rights" without the added observation that the "right" under discussion is a right to kill another human being with impunity.

(k) Finally, and most dangerously, civilizations, having entered their Ages of Decadence, take on the full characteristics of empire: over-expansion whether politically, economically, militarily, or in some combination of all three. They become aggressive toward other nations, often seeing themselves entitled to those nations’ resources or to be able to profit from having gotten them hopelessly entangled in debt (see John Perkins, The New Confessions of an Economic Hit Man, 2016). These same governments and corporations become increasingly aggressive towards their own citizens. Decisions are made on the basis of expedience, not principle. Those who criticize this system are driven to the margins, though they can appeal to increasingly alienated populations and sometimes gain an audience for their ideas. Among those ideas, if they are to retain a following, is hope. Satisfying that hope, however, is predicated on a fundamental change in the collective consciousness. Such change may go against the will (and the profit margins) of the corporate-state, and therefore be resisted violently by those in power if it catches on.

Is it not clear that the U.S. (and indeed, much of the rest of Western civilization) is deeply mired in its Age of Decadence?

I shouldn’t have to argue the point!

Those who keep up with current events saw the pathetic spectacle of the hearings on Supreme Court nominee Brett Kavanaugh over Dr. Christine Blasey Ford's allegations of sexual misconduct at a party when he was 17 and she was 15. There were abundant reasons to believe the Democrats were holding onto the allegation just in case they could not take Kavanaugh down on his legal qualifications and experience. The sense that Dianne Feinstein had little intrinsic interest in Dr. Ford's complaint illustrates the cynicism of our times. As matters unfolded, we were treated by Senate Judiciary Committee members to a recounting of the exact words used by teenagers in Kavanaugh's high school yearbook as if they constituted evidence, something that did not occur even when Anita Hill accused Clarence Thomas of sexual harassment in 1991. What is clear, however, and this speaks to the sense of fluidity of truth mentioned above: if you're a liberal or progressive, you believed her; if you're a conservative, you believed him. Moreover, if you're a liberal or progressive, you saw his visible anger as the arrogant outburst of an entitled white male of privilege; if you're a conservative, you saw it as the moral outrage of someone falsely and very publicly accused. This all exemplifies a divided nation. There is no clear way of ascertaining the truth, as what little evidence there is, is testimonial, and doesn't support either story unequivocally (how could it?).

We get into messes like this because, first of all, in a culture saturated with sexuality and sexual innuendo, sexual misconduct is bound to occur. To that extent, her story becomes somewhat believable. In a culture of distrust between the sexes, moreover, in which allegations become weaponized for whatever reason, false accusations are bound to be thrown around. To that extent, his story becomes believable. Consider, moreover, social media technology, which research shows allows people to group themselves voluntarily into silos (echo chambers) where their premises and conclusions won't be challenged. This is human nature, if you think about it. Result: divides grow until they are all but unbridgeable, views on the other side of the aisle are seen as illegitimate, and public differences of opinion threaten to turn violent.

An Age of Decadence will be characterized by distrust. This distrust will manifest itself in countless ways, some very visible and others little more than nuisances. An example of the first is the highly intrusive vetting for positions such as a lifetime seat on the Supreme Court, a process bound to be very public in our age of total media saturation. Given that men are now guilty if accused in this environment, this may eventually ensure that no one, no matter how well qualified, will want such a position. We aren't to that point yet, but why would anyone in his right mind want to endure what either family, Kavanaugh's or Dr. Ford's, has had to endure? (As an example of the second above, the nuisance factor, the other day I was temporarily locked out of my PayPal account because I had a typo in my password when I tried to log in. The system threw me several security hoops I was compelled to jump through to prove "I'm me." The sad fact is, in this age of hackers, most such measures are justified.)

Returning to Sir John Bagot Glubb: he documents, from excursions into the histories of Greece, Rome, Persia, the Ottomans, and others, that we've never seen a civilization turn around from an Age of Decadence and regain its original foundation. The sexuality genie in particular is unlikely to go back into the bottle. Our monetary foibles are reaching a critical stage as debt of all sorts continues to mount. The U.S. national debt is unpayable and continues to grow by leaps and bounds. Many of the U.S. federal government's larger legal obligations will eventually be unpayable. A lot of student loan debt will not be paid, even as those struggling to pay it sacrifice major expenditures (e.g., housing) that would contribute to the economy. Add to this the fact that other nations, recognizing the dollar's loss of value and increasing fragility, are starting to "de-dollarize" (do business in their own currencies).

All of this, of course, leaves the future of the U.S. very uncertain, no matter who is president, no matter which party controls Congress, no matter which technologies promise to emerge tomorrow to increase our convenience and save us from ourselves.

What often happens as an Age of Decadence runs its course is that the old order simply collapses, whether slowly or rapidly. There is a vast loss of influence and sometimes territory. This happened with the British Empire. It happened with the Soviet Union. It is likely to happen to the U.S., after it happens to the European Union. At the culmination of an Age of Decadence, institutions lose their capacity to enforce the rules because those in them lose their will. The system itself loses legitimacy. Citizens will already have turned inward, either to "tending their own gardens" as it were, or acting with their fellows to actively separate, which is the start of the building of replacement institutions in a new culture. There are innumerable persons and communities, some within the borders of the U.S., some elsewhere, who have to all intents and purposes seceded from a political economy they see as dying.

What should a philosopher have to say about all of this? Er, plenty, it looks like, although very few philosophers are saying anything (most are, er, “tending their own gardens” in their safe comfort zones of academia).

As I stated at the outset, a philosopher should look to the first premises guiding any civilization, explicitly or tacitly, and get positioned to evaluate them. This will be the topic of my next few posts.


Posted in Culture, Media, Philosophy, Where is Civilization Going?, Where Is Philosophy Going? | 1 Comment

Why Is Philosophy Important? An Expanded Comment

Daily Nous, the philosophy blog, recently posted a query raising this question in response to an undergraduate who had fallen in love with the subject. Presumably she'd gotten some flak from friends or maybe family. The blog's editor, Justin Weinberg (South Carolina), solicited and received a number of responses. Most were interesting and worthwhile. One was from yours truly. Reviewing it, I decided to expand on it here because I think more can be said on the question, Why is philosophy important? Some of it I've said before, but it bears repeating.

First, and as my comment noted (perhaps a bit too brusquely for the delicate tastes of most career academics), very little academic philosophy is important. It provides a paycheck for those fortunate enough to have found jobs in the field and not to have eventually abandoned them out of frustration.

Let me envision two roles for philosophy that could secure its importance in civilization. I will call them philosophy as service and philosophy as thought-leadership.

Philosophy as service will center on critical thinking and the analysis of language, offering a kind of mental housecleaning. This is appropriate for the academic setting if the instructor approaches it in the right way, warning in advance that some people might believe their toes are being stepped on. A good course in the subject should provide a student with a sense of what it means to support a conclusion with reasons (premises) and why this might matter. The student should learn what makes reasoning cogent or fallacious. Ideally, students will become less prone to fallacious reasoning themselves and better able to spot it in others. A student should also come away from a philosophy course alert to the fact that not everything in our reasoning is stated openly. One's beliefs might (usually?) contain hidden premises. How we identify these, and what we do then, will be crucial.

Philosophy might also draw attention to what seem to be the limits of our reasoning. Reason alone cannot answer every possible question or settle every dispute. First premises are notoriously difficult to prove or disprove, after all. Otherwise they would not be first premises!

A more practical focus on language in philosophy ought to alert us all to the fact that there are plenty of people in this world who use language as a means of control or even domination, sometimes as the equivalent of a weapon. Words or phrases, carefully selected, will encourage some lines of thought while inhibiting others. The political and commentary spheres provide an abundance of examples. Any reasonably intelligent person should be able to go to any popular newsfeed and find a dozen examples in less than a half hour.

If anything will hobble this approach to philosophy as service, as mental housecleaning, it is that, as an academic subject, philosophy has been self-limiting and self-deprecating for well over a century now. Much of this was due to its deference to science in matters epistemic. From Auguste Comte on, positivists and their descendants saw themselves as, at best, handmaidens to science in the sense that Aquinas saw philosophy as a handmaiden to theology. For a long time, this was understandable. Unfortunately, philosophy as handmaiden to science tells us little about how to evaluate all manner of recent scientific developments ranging from nuclear weapons to genetic engineering to artificial intelligence and beyond. As Dr. Ian Malcolm (played by Jeff Goldblum) put it in the film Jurassic Park, "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

Positivism is therefore dead and buried, one of our worst modern wrong turns. But self-limitations on philosophy have remained. As I've noted previously, the analytic tradition, whether in its formal or natural language varieties, developed powerful techniques but never used them to their full potential. Used to their full potential, philosophical analyses of how words and phrases have crept into the general lexicon and what they are used to do might shed great light on how those seeking control over others' thought accomplish this. Did Wittgenstein not say near the end of the Tractatus that asking, What do we actually use this word or proposition for? repeatedly leads to valuable insights? It also matters who the speaker is, how he or she self-identifies, where he or she is, i.e., at what level of which hierarchy, etc.

If one needs examples, consider the phrase conspiracy theory. A simple search would turn up dozens of usages. What are these usages attempting to do? This example illustrates how any good analysis of a term or phrase should include its origin and history; the origin of this phrase with the Central Intelligence Agency back in 1967 is known. The CIA's aim, in introducing the usage, was to circumvent, a priori, all serious discussions of ideas or theories those in power did not want around.

Or consider the term homophobia, which for over 20 years now has come to be used reflexively in response to conservatives who criticize the homosexual lifestyle and its political and legal protections. What is a phobia? The term has a recognized use, as an irrational fear of something (think of a legitimate usage, e.g., agoraphobia). Use of the term therefore automatically suggests that critics of homosexual conduct and its promoters are by definition irrational. That which is irrational is not to be answered with logic but cured with therapy. Hence the power of the term to misdirect and confuse. Good philosophical analysis should unearth this, but typically does not for obvious reasons: it quickly runs up against powerful prevailing political / cultural currents. (Use of the phrases transphobia and Islamophobia indicates that the phobia suffix is spreading. Why not play a successful meme for all the mileage one can get from it?)

At present, if one is interested in these kinds of usages of language as instruments of control, one will glean far more from the writings of Aldous Huxley and George Orwell than from Wittgenstein or Quine or any of the other heroes of mainstream twentieth century Anglophone philosophy. The former, of course, did not have to worry about offending those signing their paychecks, or being blacklisted within the profession for having offended the wrong people with words or supposed conduct.

There is another, more ambitious role for philosophy, however, which rejects roles as handmaiden to something else, some other enterprise. This is the role of philosophy as thought-leadership.

The best role philosophy could play in present-day civilization as a repository of thought-leaders is in identifying, clarifying, and critically evaluating worldviews.

By worldviews we do not mean personal opinions. We mean usually tacit but still fairly comprehensive systems of thought that direct civilizations through their institutions (governing, mediating, etc.), manifesting themselves in culture.

These are not theories that philosophers simply spun out of their imaginations, although past philosophical theories influenced them. Those in leadership positions in society, or simply in dominant ones, state or imply worldviews with their choices of words and phrases, or with their influential judgments of what is important. Afterwards worldviews may operate as unstated premises in discussions of public issues.

Supporters of these premises may hold them so deeply that they do not see the need to state them openly. They may think anyone who rejects them (also implicitly) is pernicious, or evil. This may be one of the reasons why those on opposite sides of, e.g., the conservative vs. progressive divide increasingly tend to fly at each other’s throats instead of sitting down across a table and finding out what the other is thinking.

What they should do is explore their worldviews. Even if they still did not agree, they would have more clarity on what they were disagreeing about. They would surely not be any worse off. They might even find common ground and recognize a common foe.

Is there a dominant worldview in the West right now? Is there more than one, perhaps vying with each other for dominance? I have identified materialism as dominant even if it takes more than one form, and Christianity, once dominant, as still its chief competitor. It, too, takes more than one form. These qualify as worldviews whatever their other features, because they fully suffuse all significant aspects of the lives of those immersed in them. They define reality for that group.

A worldview will usually be expressed in some core text such as the Holy Scriptures or in key statements such as Darwin’s theory or Russell’s “A Free Man’s Worship.” It will find expression in a culture’s art, its music, what its leading voices see as of value or important, and sometimes in political ambitions. Why have some civilizations’ leaders taken it upon themselves to try to dominate the world, or as much of it as possible? Because their worldview defines not just empirical reality for them but all that is good and superior. They see universal allegiance to their worldview as the path to Utopia. Communists saw revolution against the bourgeoisie this way, in accordance with the historical laws of dialectical materialism (Marx’s phrase). Global corporate capitalists since the fall of the Soviet Union have seen the superiority of a consumption-oriented marketplace as key to general material prosperity, not just for Westerners but for everybody in the world. This, to them, is superior to all else.

This brings us to: is a prevailing worldview helping us or harming us … or, perhaps, helping some (perhaps empowering them) at the expense of others? Does identifying and examining worldviews help philosophers engage systems of power and propaganda, doing what Noam Chomsky once described as the responsibility of intellectuals: “to speak the truth and to expose lies” (see his essay “The Responsibility of Intellectuals”)?

The academic system doesn’t encourage any of this, of course. It doesn’t encourage my service role for philosophy in this form — not really. Which is why most critical thinking courses are just logic courses that leave out their most important potential applications. As that great comedian and social commentator George Carlin once wryly observed, the last thing the truly powerful, owners of the leviathan corporations, want is a population of critical thinkers. Much less do those in dominant institutions want publicly accessible critiques of their worldview.

But philosophy still has a job to do, if it is to be a force to be reckoned with; otherwise, why consider it important? The two roles for philosophy I outlined above are two sides of the same coin, for doctrinaire and controlling language is bound to be worldview-embedded. How to carry this kind of project forward is the question that those who see philosophy as important should be asking themselves and each other, along with nonphilosophers concerned about where Western civilization is going (if it is going anywhere).

If the self-identified professionals ever get out of their office cubicles, or break free of various ideologically-induced blinders, whether to look at their language or consider the role of worldviews in modern advanced civilization — if at least some can courageously rise above their present stations and engage these kinds of questions and see where they lead, then Yes, philosophy as a discipline will clearly be important. Some, I firmly believe, are up to this task. They will be the thought-leaders of tomorrow if the West is to survive.

Posted in Academia, Language, Philosophy, Where is Civilization Going?, Where Is Philosophy Going?

“Should I Pursue a Doctorate in Philosophy These Days?”

Should you even consider getting a doctorate and going into academic philosophy today? Even if you find the subject endlessly fascinating, and you have talent for it?

The question comes up occasionally on forums. Someone I am “friends” with on Facebook floated the idea. He posted that he was seriously considering it. (I’ve not met him, just read a few of his writings, thought many of them were both interesting and good whether I agreed or not, and he responded favorably to a friend request.)

I don’t often comment on his page, as I don’t really know him or his circle of real friends, but this time I felt moved. I advised against.

It has since struck me that others might find these reasons of interest, assuming those others happen to find their way to this humble, low-traffic philosophy blog — which includes not just philosophy but also the business of philosophy.

No, I would NOT recommend going into academic philosophy.

I speak as someone who did, obviously. In many respects I am still paying the price.

(1) The first and most obvious consideration is the job market for philosophy PhDs. It was horrible when I started (1980s), improved a little (not much) in the 1990s and early 2000s, and collapsed again with the financial crisis of 2008-09 during which a number of vulnerable people not in tenure track positions lost their jobs — in the state where I was then working (South Carolina) at least. Maybe things were better elsewhere, though I have no reason to think so.

The gradual replacement of tenure track jobs with part-time, adjunct positions has attracted some attention, moreover. Neoliberal administrators like hiring adjuncts because they save the institution money — so that they can spend it instead on that new building or pay for the latest campus beautification project while their corporate board (it might as well be) pays them six figures plus perks.

Media attention paid to adjuncts happened mainly because some were discovered to be, for all practical purposes, homeless, a handful had died from treatable conditions, and groups were forming attempting to unionize and bargain for better wages and working conditions (most do not have their own office space or computer terminals, which surely helps them build credibility with students who these days are going massively into debt to go to college).

Eons ago (back in saner times), adjunct faculty were usually retired professionals, willing to share their years of hands-on expertise by teaching a class, often just to keep busy. For this they might have received a small honorarium.

Most of today’s legion of adjuncts, many of them newly minted PhD’s and not retired professionals, will not find decent-paying academic employment. Many, if eventually saddled with family responsibilities, will be forced to leave academia in search of decent-paying work, as was the case after the job market initially collapsed in the 1970s.

(2) Long-time readers of this blog know my view that academic philosophy is basically a mess. Many of those in the profession would say otherwise. There are, after all, plenty of books published by academic presses, plenty of conferences held, an abundance of backslapping at national events, and every so often someone makes a splash in the waters of intellectual life with something that gets read and discussed. No one says there isn’t an abundance of activity. But when push comes to shove, these are not the days of truly first-rate minds like W.V. Quine and Ludwig Wittgenstein, or even Michel Foucault if you lean Continental. These are not even the days of Thomas S. Kuhn, Paul Feyerabend, or Richard Rorty. These philosophers did not merely make ripples with their works. Quine’s “Two Dogmas of Empiricism” caused a tsunami, as it were. As did Wittgenstein’s Philosophical Investigations and (much to the chagrin of their critics) Kuhn’s Structure of Scientific Revolutions and Feyerabend’s Against Method.

Those days were gone by the 1990s. “Feminisms of the week” ensued; and while the field had always been prone to fashions, the latest owed more to their conformity with rising political agendas than to the kind of intellectual prowess shown in the above-listed works. If you are in an officially designated “marginalized” group, now expanded to include sexual minorities, you’ll receive at least some added attention from search committees that could lead to a tenure-track job. This could cut the other way, however, since most committees do not want to hire someone they fear would be disruptive, or might turn their department into an ideological war zone (yes, it does happen, I was there and I saw it).

What you’ll also risk is being “branded.” That is, you’ll be expected to contribute to the “literature” of your tribe, as it were. Today this includes not just “feminisms of the week” but philosophy “from a queer perspective” or now from a “transgender” perspective. And if you step out of line, e.g., by “misappropriation,” not writing from within the unique perspective of the tribe you’re writing about — even inadvertently, having written something well-intentioned — you’ll be punished. If you don’t believe this, “google” Rebecca Tuvel’s name (or go here).

You’ll end up walking on eggshells around the politically protected — or just the administratively favored. Academia is not a place where you want to make enemies, something very easy to do. Conceivably, this is because so much of the work is of so little importance.

Let me qualify this. It is not as if pivotal historical figures like Aquinas, Hume, Kant, Nietzsche and Wittgenstein are being driven out by professionalized agitators seeking to “expand the canon.” That’s an exaggeration. But the writings of “dead European white guys” are clearly overshadowed these days, deemed less relevant in an age of “inclusion” or simply as “unexciting” (i.e., hard). While I’m in no position to take a survey and find out, I would love to know how many “feminist philosophers” of whatever stripe, or “gay philosophers” or “transgender philosophers,” or whatever next year’s favorite “marginalized” group will be, can outline and evaluate, from memory, Aquinas’s cosmological argument, or Hume’s criticism of miracles, or the second version of Kant’s categorical imperative, or offer coherent thoughts on what Nietzsche might have meant by “God is dead,” or offer some original, informed, and thoughtful commentary on the strengths and limitations of Wittgenstein’s later philosophy. These are things my generation needed to be able to do. Questions about such figures and their key contributions were on my prelims (a series of both written and sometimes oral exams doctoral students have to pass before advancing to actual candidacy).

Summation: academic philosophy has declined, and the decline may be irreversible. Those able to reverse it are either struggling to survive in multiple part-time jobs that leave them little or no time for sustained scholarly endeavors (these are the people I would deem marginalized in an accurate sense of that term), or they have left for greener pastures and taken their talents with them.

(3) These are not days when subjects like philosophy are taken seriously at the administrative level, or necessarily by average students. In this neoliberal age, there is no money in them. If anything, they use university “resources” and don’t give anything back. The department I was in when I lived in South Carolina was the most poorly funded on campus. I felt supremely lucky to have escaped the axe in 2008-09, because at the time I needed the job! Today, entire philosophy departments are actually being closed down at some institutions (Wisconsin-Stevens Point is an example; Western Illinois I think is another; more are likely to follow). Those with tenured positions at such places are losing them!

In my experience, the majority of students do not take the subject especially seriously. I had many students who appeared to expect good grades just for showing up. In a sense, I get it. As mentioned briefly in passing earlier, students are now paying through the nose to attend a university, or going massively into debt. Doubtless most are conscious of this, and want to make sure they graduate employable. They come to a philosophy class and wonder what studying Aquinas or Kant contributes to their future employability, and when they come up empty, they grow restless. They’ve been indoctrinated to think of themselves as consumers as well as students, future inhabitants of our mass-consumption paradise. Because they have grown up in a media-saturated, entertainment-saturated culture, the professor who is a cross between Socrates and Seinfeld has something of an advantage in class. Can you do that? is a question I would ask a prospective academic philosopher. Are you willing to do it?

Let’s take note of another obvious sea-change of the past 20 years: the rise of mobile devices. There is probably no one in any advanced nation who doesn’t own at least one. One widely reported study claims that social media has greatly shortened the average attention span (by that estimate, to less than a goldfish’s). Students are now addicted to instant gratification, and the addiction is borderline-physical. Other studies have suggested that checking Facebook on your phone and seeing the latest “likes” on your posts supplies a dopamine rush to your brain that reinforces the behavior. If so, millions of social media addicts find it genuinely difficult to go more than a few minutes without checking their phones. Tell students at the start of a class to turn their phones off, and by the end of a 50-minute class they may actually be feeling physically uncomfortable: the discomfort of the addict who needs his fix!

This is the landscape you’ll have to navigate if you decide to embark on a doctoral program in philosophy (or in many other subjects, for that matter). Incidentally, it will begin with not angering the wrong people while you’re still a graduate student. I knew a guy who did this simply by being an outspoken Republican, eventually rising to president of the campus Republican group. That was the 1980s. Reagan was president. The situation is orders of magnitude worse today, with Trump in the White House. Today, on some campuses, outspoken Republicans are called “Nazis” or “fascists.” They risk being physically assaulted. Cases are too numerous to link to individually, and there are likely many cases we don’t know about.

(4) Obviously, if you want to navigate this minefield, it’s your decision. It’s your life. In that case, choose a “ranked” doctoral program you’ll be hired out of. There are programs where you can learn a great deal, of course, and avoid most of the trendy rubbish. University of Pittsburgh seems to be such a place, or at least it still was a decade ago. University of California at Berkeley may seem zoolike because of all the adverse publicity surrounding conservative speakers there, but graduates of the school’s philosophy doctoral program tend to find good jobs. Numerous important recent philosophers have taught there: John Searle, Hubert Dreyfus, and the above-mentioned Paul Feyerabend, are just three examples. The department continues to have a top-flight reputation.

If you must embark on getting a PhD in philosophy, do your homework. Interview those in a prospective department, even as they are interviewing you. Presumably you are there because you have some idea of where you want to specialize, and because your intended AOS (Area of Specialization) is well represented there, and known by others to be well represented there. Ask for the ratio of those who eventually found tenure-track jobs out of the program to the total number who sought jobs, which would include part-timers and those forced to take nonacademic employment.

Keep in mind, too, that if you end up in this final category, you’ll have to get used to being told you’re “overqualified” for whatever bullshit job you might find yourself applying for, increasingly out of desperation. Entrepreneurship is a possibility, but getting a doctorate in philosophy will not give you entrepreneurial skills. It might even do the exact opposite, by encouraging you to write for the tiniest and most academic of audiences material that will be light years over the heads of average readers. Let’s note in passing that the number of people who read books has also fallen off dramatically during the social media era.

(5) Given the World Wide Web, there are plenty of venues for discussing philosophical problems — and for all I know, some of them might be monetizable (for those who have asked, this blog doesn’t get enough traffic to make it worth the effort).

You can lecture on a YouTube channel if you’re so inclined (again, I’m not).

You can do podcasts.

My point is, there are ways of involving yourself with philosophy, and with other philosophers, that do not subject you to the abuses of academia, and to a discipline that is arguably slowly and painfully killing itself.

As I said, though, it’s your decision. Good luck. And remember: you were warned.

Posted in Academia, Higher Education Generally, Philosophy, Where Is Philosophy Going?

“Anti-Intellectualism and How Fascism Works”: A Comment

I followed the link from here to IHE’s “Anti-Intellectualism and How Fascism Works,” an interview with Jason Stanley (Yale) who has authored a book entitled How Fascism Works. I’d been thinking of posting a comment, but discovered that the comments thread had been closed by the site administrator. This seems odd, since the interview is less than two days old.

Ninety comments appear. While comments sections do often degenerate into pointless slugfests, except for a very few posts this one did not strike me that way. While there was sustained and sometimes vigorous disagreement, some fundamental issues were being raised. I’ll leave it to readers to ponder why the comment section was closed so soon.

In any event …

Admittedly this was a short interview, and I’d been anticipating something longer. Conspicuous in its absence, however, was a clear, concise definition of what the author means by fascism. Why is this important? Not just because the term is in his book title, but because of the way it is thrown around and never defined. One does not have to be especially perceptive to see that fascist has become one of the big demonizing and weaponized words of the day.

The closest Stanley comes to a definition is this (it is, as we see, a definition not of fascism but of a variety of fascism he calls fascist anti-intellectualism).

Fascist anti-intellectualism sets the traditions of the chosen nation, its dominant group, above all other traditions. It represents more complex narratives as corrupting and dangerous. It prizes mythologizing about the nation’s past, and erasing any of its problematic features (as we see all too often in histories of the Confederacy and the Reconstruction period, or of the treatment in history books of our indigenous communities). It seeks to replace truth with myth, transforming education systems into methods of glorifying the ideologies and heritage of the members of the traditional ruling class. In fascist politics, universities, which present a more complex and accurate version of history and current reality, are attacked for being places where dominant traditions or practices are critiqued. Fascist ideology centers loyalty to power rather than truth. In fascist thinking, the university is simply another tool to legitimate various illiberal hierarchies connected to historically dominant traditions.

I don’t question that this was true of Hitler’s Germany and Mussolini’s Italy. But a version of this same thing is true of any totalitarian ideology, including those of the left such as Communism, the primary difference being that they “mythologize” about their futures instead of their pasts. They surely “replace truth with myth …” I hope no one seriously believes universities under Communism presented an “accurate version of history and current reality …” Surely we recall the Lysenko case.

This aside, in the absence of a definition for its key term, one has to suspect that the subtext is just another attack on President Trump and his supporters. The defense of elites here, however qualified (“Our suspicion of elites and what could be seen as anti-intellectualism can be healthy at times;…”), surely supports this interpretation, since it was anti-elitism that Trump successfully appealed to from the get-go.

Thus we see more of the same: possibly yet another lengthy ad hominem argument, with no real analysis. No analysis, that is, of why and how a guy with no previous experience in the political arena was able to trounce sixteen Republican competitors and then go on to defeat the Democrats’ and cosmopolitan elites’ anointed candidate. However small the margin in the Electoral College, and whoever won the popular vote, the point is: Trump won. How was that possible? Why did it happen?

Could it be because the mainstream of both political parties has collapsed, has simply lost credibility with the voting public? Could it be, too, that alternative sources of information readily available on increasingly sophisticated Internet platforms were successfully challenging dominant narratives? The latter would explain the cold war against “fake news,” the latest gambit in which is Alex Jones’s InfoWars being kicked off Facebook and numerous other social media sites. Whatever one thinks of Alex Jones, it is hard to see this as anything other than a move by those who see their current mission as establishing Ministries of Truth.

Returning to the closed comments thread, one comment leaped out at me. The author signs himself only as “Cultural Anthropologist”:

As a Professor Emeritus who has just completed fifty years of teaching at a Ph.D. granting university, I know for sure that the statement “universities…present a more complex and accurate version of history and current reality” is wantonly false. Also false is “Above all, the mission of the university is truth.” The mission of universities today is to advance “social justice,” “diversity,” and “inclusion” (but not of Asians). At least in the social “sciences,” humanities, education, and social work, the mission is to advance a far left wing ideology about society, to undermine the West and Western civilization, to negate liberal rights and protections in favour of statism and identity categories, and to push forward practical methods for implementing “social [in]justice.”

All true. It has been a long time since truth was central to the mission of academia, or education at any level.

But if inculcating herd behavior and obedience to authority are major prerogatives, departures from which are punished with ostracism at best and career destruction at worst, then higher education of the past 40 years or so has been a stunning success!

“Cultural Anthropologist” was, after all, immediately attacked by subsequent posters, the first of whom accused him (her?) of “spew[ing] … bile.”

Another demanded evidence, making me wonder what cave he (she?) has been living in for going on 30 years now.

Admittedly it’s just a comments thread, but this is the sort of thing I’ve been talking about for a long time, and it’s hardly limited to comments threads.

And perhaps Jason Stanley defines fascism in his book, which I’ve not obtained as I am outside the U.S.; obtaining hardbound books in English in a timely manner where I currently live is possible but extremely expensive, and truth be known, resolving such matters as this is not my highest priority just now.

I’ll conclude by noting (whether anyone reading this will believe me or not, or will care) that the Right is far from getting everything right. I feel no obligation to defend what the Koch Brothers do, and I’ve certainly no desire to defend the transformation of universities according to the “business model.” It seems to me, however, that this model wouldn’t have been so easy to implement over the years had academia truly had as its mission the discovery and communication of truth during the decades that preceded the current tendencies, much less in the present.


Posted in Academia, Books, Election 2016 and Aftermath, Higher Education Generally

Philosophers and Social Media: A Comment

Those who read last week’s note will probably say, “Wow, that was a short break!” This is a comment, though, not a stand-alone essay like many of its predecessors. This despite its getting longer than I intended.

Should philosophers “do” social media? Rebecca Kukla (Georgetown University) says yes, they have much to gain — especially younger philosophers (for a very short excerpt go here). She is not my favorite academic philosopher. I explain why here. She is among those I label pseudo-marginalized.*

Some of her observations on social media usage are interesting in their implications, however, as are Brian Leiter’s. I’ll confine my observations to Facebook, since I’ve used it more and know it better than any other social media platforms.

Some people, academics or not, despise Facebook and refuse to use it. Their reasons aren’t all that clear. Facebook has become a corporate empire, one of many in the tech world, but that’s not necessarily a reason to avoid it. Its being in bed with the CIA, the NSA, and probably a dozen other shadowy federal agencies may be more telling. Facebook stores your information, but it’s hardly alone in doing that. If you refuse to use Facebook out of fear for your privacy, you are naïve. Privacy went out the window with email.

Does Facebook censor? Of course it does. I’ve known people put in “Facebook jail” (their term for it) for politically “insensitive” posts, especially about Jews (surprise, surprise). I’ve not had a problem, maybe because I don’t post about the “Jewish problem.” This despite defending Donald Trump from what I consider myopic, incompetent criticism. I’ve penned countless exposés of academic political correctness and corporate media dishonesty.

What I suspect: the upper echelons of the Facebook world disdain political discussion generally. I’m not sure I blame them. The platform wasn’t designed for that. Moreover, the research is coming in: social media are among the dividers in American society. People have a tendency to congregate with those like themselves, who share their beliefs and opinions, especially in politics. Facebook unintentionally encourages this. Its system of friending, liking posts, commenting, etc., sets up feedback loops of positive reinforcement. Don’t like a friend’s posts? Ignore them, and eventually you won’t see them. Or unfriend him or her. Thus the formation of echo chambers, whether of the right or of the left or anywhere in between.

Some Internet users, moreover, had become “keyboard commandos” who found it easy to insult or bully those outside their echo chamber before Facebook was around. Now, it is as if the differences between online and offline worlds have begun to blur. Public incidents we would never have heard about 30 years ago are now filmed on mobile devices, uploaded to social media, and viewed almost instantly by millions of people. Victims of this sort of thing become involuntary celebrities. Or perhaps better, celebrities-in-reverse, since we aren’t celebrating them but shaming them. Online shaming has almost become a sport!

I think this is a reason we are living in a more hostile society generally. While pundits (Steven Pinker comes to mind) tell us how much violent crime has dropped during recent decades, such measures don’t reflect cyberbullying, personal attacks, shaming incidents, etc., none of which are illegal (some may enter what is, at best, a gray area).

Left and right, unaccustomed to opposition due to online lives in their echo chambers, are more and more willing to demonize and confront one another violently. To be clear: my boots-on-the-ground sources tell me it is usually the left that gets violent first. But those on the right are increasingly willing to get in their faces. The latter aren’t afraid of guns like the former. Were a situation like those we’ve seen in Portland, Ore., and yesterday as I write this in Berkeley, Calif., to get out of control, there’s no reason to think leftists would win even if they have superior numbers.

Facebook did not create our current divisions, of course. But it set the stage for accentuating and aggravating them.

All that said, Facebook has advantages. Through its networking possibilities I’ve formed a few strong friendships with people I would never have heard of otherwise, rediscovered folks I went through high school and college with, and maintained friendships that would have fallen by the wayside when I relocated geographically several years ago.

There are, moreover, hundreds of private groups on Facebook devoted to every conceivable subject, including philosophy. Many of these groups are closed, and don’t allow insulting other members, or bullying, or trolling. Their administrators post rules up front and do not hesitate to expel those who refuse to follow them. Such groups can be useful venues for conversation, advice on mundane problem-solving, support for those coming to them with more serious issues, and more.

Many who use Facebook just use it to announce family events (vacations or anniversaries) the way we used to do with photo albums in the pre-Internet days. I think Facebook’s algorithms are more attuned to such usages. At the start of the month I posted an anniversary photo taken of my wife and me four years before on the day we got married. It received over a hundred “likes” and dozens of congratulatory comments. I’ve seen this happen countless times.

On the other hand, my political posts rarely get more than five “likes,” unless I’ve shared a video. Somehow, that increases the number, probably because watching a video is less demanding than reading something. Absent a video, with just a link to an article or story and a paragraph or two of commentary, many posts seem not to be seen at all. (I’ve no means of knowing, of course, how many people “lurk,” i.e., read my material without doing anything to announce their presence.)

Enter Rebecca Kukla, who (speaking of social media generally) calls it “our main opportunity to craft our public persona and to forge connections with other philosophers.” She adds that staying off social media “can actively harm your career, while using it wisely can actively help you, and can enrich your professional and intellectual life.”

There are no a priori reasons it can’t do this. Her discussion converges on Facebook, where many of her observations parallel mine, in that it creates space for professional contacts that open doors, especially for younger scholars, by having “created a vast set of interlocking philosophical communities.” She continues:

Through Facebook (and to a lesser extent Twitter) I have been exposed to, had conversations with, and formed friendships with a dramatically wider range of philosophers than I otherwise would have. My philosophical community is no longer bounded by geography, by job status, by age or social identity, by type of institution, or even by subfield or methodological approach. I expect most of us tend to disproportionately make Facebook friends with ‘people like us’ to some extent, but there is just no doubt that social media has broadened many philosophers’ exposure to different kinds of scholars, issues, and conversations. Junior philosophers who give these communities and exposures a pass are missing out on something that could enrich their intellectual and social lives, and they are forgoing crucial networking opportunities.

This makes sense, but there are dangers she wants readers to be aware of. One of the features of the Facebook world (this was true of the online forums that preceded it) is that you never know who might stumble across it, or even seek it out when they want information about you. We all know this, but how easily we tend to forget it “in the heat of the moment” (my quotes, not hers). Hence the environment, she says, “is fraught with peril. An online fight with the wrong person or a post that rubs people the wrong way can do real damage.” The norms are still evolving, she adds. Posts intended for a particular audience that will read them favorably might be read quite differently, and negatively, by readers outside that loop.

All entirely correct. Kukla thus assembles a list of online best practices for younger philosophers, especially those struggling with a hostile job market or perhaps dealing with rejection from an academic journal. Don’t sound off about it online. It makes you sound bitter and uncollegial. She advises against posting trivial stuff — or material likely to be seen as trivial or juvenile by those a jobseeker may be trying to impress. She suggests creating a separate Facebook account for family, friends you went through high school with, and nonacademic friends generally.

But here's a thought: is it possible to "do philosophy," i.e., to do more than simply try out ideas or banter about philosophical issues, on Facebook or other social media platforms? Kukla has many valid points about the latter; she says little about the former. What she does say is to refrain from dismissing entire areas of philosophy, from dismissing philosophers who are well thought of, and from engaging a given philosopher's post without doing some basic research to find out who they are (an easy mistake I once committed).

One must ask whether this kind of platform is really suitable for philosophical research (as opposed to networking, testing out new ideas on colleagues, etc.). Why? Because most of us originally majored in philosophy in order to “do philosophy,” not merely banter about it. Facebook wasn’t invented with that in mind, though. Nor was any other social media.

“Doing philosophy” on an independent blog such as this is hard enough! I have not done as much as I intended. I did not plan a news site like Leiter’s (who can compete with him on that, and why would anyone want to?). The Internet is simultaneously liberating and limiting! It is liberating in the sense that I don’t have an editor or referee board making trivial criticisms that I’m using this or that term “unclearly” when the truth is that they dislike my main thesis or conclusion. On the other hand, the lack of oversight means taking full responsibility for what appears here, and seeing to it that what results is as good as I can make it! A couple of extra pairs of eyes would be helpful, but as an independent scholar with a different occupation, I don’t have that luxury! What limits me is a paranoia that what I have is not good enough! Hence a trove of things sitting in Word files!

Blog entries, moreover, no matter how thoroughly they argue a philosophical thesis or tackle a quandary, and no matter how well they play by the rules of citing relevant literature, are never cited in journals or in The Philosopher’s Index. Much of academic philosophy’s reporting system on the philosophical work out here is still stuck in the pre-Internet era. Having said that, yes, you will find philosophy on blogs that is simply lousy: unoriginal, poorly reasoned, etc.

But all this is an aside. Social media is here. We might as well use it if it solves certain problems, like networking. But do we need it to advance philosophical conversation?

Leiter observed that the areas of philosophy he is most familiar with (e.g., philosophy of law) don’t make much use of Facebook. Younger philosophers will say he’s dating himself, as am I, for I think he may be right. Most of us, of the generation now in its 50s and 60s — the first “lost generation,” some of us, anyway — grew up without computers. My generation had no social media when we were graduate students. I don’t believe my research for my dissertation, done the old-fashioned way — hours of library work, consultations limited to senior faculty in my department red-penciling my work — was significantly hurt by this. We may have been limited by technological doors not yet built, much less opened. Would Facebook have helped? I don’t know. Given that we were a rambunctious lot who rarely hesitated with our opinions, and in a “nonranked” department to boot, Facebook might have been disastrous for us.

What “Facebook philosophy” I’ve seen has been superficial, sometimes reinventing the wheel, sometimes taking positions that have been argued against effectively outside their preferred orbits, sometimes arguing theses so kooky and outlandish no one is going to take them seriously (e.g., about how we can know there are extraterrestrials among us). Much political-philosophical discussion, frankly, very much fits into the universe I described at the outset, in which bodies of like-minded folks have congregated because they work essentially from the same ideological premises. Academics are no less prone to the echo chamber effect than anyone else. Kukla — again: surprise, surprise! — eventually falls into this trap, advising readers:

…don’t trust people with fundamentally terrible values. The misogynists and the bigots and the Trump voters on your page are likely to harm you, because they are harmful people with no moral compass. Arguing across such large divides is emotionally exhausting and pointless anyhow. Just get rid of them and protect yourself.

Who gets to decide whose values are “terrible”? Is it obvious who has a legitimate grievance versus who is a mere “bigot”? How many “Trump voters,” I wonder, has she actually met and engaged, online or otherwise? Some of her neighbors may be “Trump voters,” after all. Is she saying that 63 million of her fellow Americans “have no moral compass” because they voted for Donald Trump?

This, of course, is the sort of arrogance that alienates career academics from their fellow Americans, even if their tenured status enables them not to have to care. It vitiates some of her earlier advice, while confirming what the research tells us about what social media might be doing to us.

What conclusions should be drawn from this? Kukla is right that academic philosophers — and those who aspire to be — have opportunities to use Facebook or other social media, exercising caution appropriate to their personal situations. They can bring their work to the attention of others. This can have positive results. They should be aware that what they say online can have negative repercussions, however.

Nothing in my experience suggests that social media is of any help in “doing” philosophy. It sounds pedantic, but those who built the Western tradition, and later the various schools (analytic, continental) did just fine — and probably much better — without it. The problems with producing quality philosophy today have little or nothing to do with social media, though, and everything to do with the structural problems of academia and of a prevailing political economy in both academia and the larger society that is hostile to values presupposed by philosophy.

Real philosophy is difficult to produce, which is why we see so little of it these days. Discussions of fundamental philosophical problems, developments of extended arguments and counter-arguments against one or more premises of someone’s attempt to tackle such a problem, are bound to be far more involved than is possible on Facebook — which does have a limit on the length of a post or comment (as I’ve discovered by running up against it a few times). Substantial contributions to philosophical debate cannot be composed in one sitting, like the majority of Facebook posts. They are not off-the-top-of-your-head events.

Again, and in sum, social media were not designed with long, involved, nuanced essays and followup conversations in mind. These call for concerted attention and effort on the part of both writer and readers. If anything, research is also showing that social media is actually shortening users’ attention spans. Blogs open some possibilities, but even they are limited as I’ve discovered. Are those of us who blog about the philosophical and larger academic community, events in the larger society that might impact on intelligent conversation, etc., really parts of an independent intellectual vanguard as we like to think of ourselves, or are we just borderline-narcissists venting in our private echo chambers?

Time will tell, but however many contacts I’ve made or maintained on Facebook, I don’t expect to see any major philosophical breakthroughs there, or on any other social media platforms.

 

*The pseudo-marginalized:

(1) invariably have tenure, typically at influential institutions almost guaranteeing visibility. Georgetown is not an insignificant university;

(2) strongly identify with identity politics, and hence can’t write without constant reminders to readers how prone to mistreatment they are, and how mistreated are those in their preferred group(s);

(3) are often bullies, without being aware that this is how they are seen by those not in their preferred group(s); Kukla’s blithe disdain for “Trump voters” is a case in point, as was her attack on philosopher of religion Richard Swinburne, someone whose work we theists find interesting and valuable;

(4) have no sense of the contradiction between their privileged status (tenure) often attained by their institution’s preferential policies, and their wearing the mantle of victimhood almost as a badge of honor; and finally,

(5) are mostly clueless about how power really operates in industrial and post-industrial civilization, and from where (what sorts of institutions) it emanates. As long as they are swinging broadsides at windmills of white-maleness (or straight white-maleness or straight white-Christian-maleness), we can expect their cluelessness on such matters to continue.

Posted in Academia, Media, Philosophy, Where Is Philosophy Going?

Taking a Short Break from LGP

Hello. Those who chance to browse around this site may have noticed the dearth of posts in July, not that I posted a great deal during previous months. I’ve a few items planned, nothing completed, but truth be told, for the past several months my focus has not been on philosophy. Focusing on philosophy when you are not teaching and not independently wealthy is simply not a live option.

Thus for these past several months, last month in particular and most likely for a few months to come, my focus has been on developing what is likely to be my occupation for the next 10 to 15 years (hopefully): copywriting. I am doing what I need to do in order to learn the job and do it effectively. This takes huge chunks out of my day, including my writing time, meaning that there is less time for projects like this that don’t make a contribution to it … but could eventually benefit from it.

I have taken note of which past posts seemed to generate the most traffic: the review of Stefan Molyneux’s book The Art of the Argument rose to the top far and away (doubtless because a number of sites with far more visibility than mine discovered it and linked to it); the posts (e.g., this, and this) on the follies and foibles of contemporary academic philosophy collectively came in second (interestingly, the more specific the post, the better the traffic); and the posts on important twentieth century philosophers, on Wittgenstein, and on “Consciousness and the Brain” also did well. Sadly, my observations on thinkers such as Leopold Kohr have done wretchedly, as have those on topics such as globalism. This is unfortunate, because both need a broader audience and wider discussion. The former definitely has something to say regarding the latter.

All of this is noted for future reference in any event, and comments and suggestions from readers are always welcome.

In the meantime, you (you’re still there, right?) can expect either a handful of much shorter posts (shorter than this one) or no posts at all for the remainder of summer and possibly for much of the fall. Rest assured, where philosophy is concerned, I am never that far away.

Posted in Uncategorized

Identity Politics Has Nearly Destroyed the Humanities; Now It Is Threatening the Hard Sciences

This article by Heather Mac Donald is a must-read! If you thought you could escape identity politics by going into the sciences, or possibly even into engineering, think again.

The article’s opening paragraphs spell out clearly what is happening in scientific organizations, academic and governmental alike:

Identity politics has engulfed the humanities and social sciences on American campuses; now it is taking over the hard sciences. The STEM fields — science, technology, engineering, and math — are under attack for being insufficiently “diverse.” The pressure to increase the representation of females, blacks, and Hispanics comes from the federal government, university administrators, and scientific societies themselves. That pressure is changing how science is taught and how scientific qualifications are evaluated. The results will be disastrous for scientific innovation and for American competitiveness.

A scientist at UCLA reports: “All across the country the big question now in STEM is: how can we promote more women and minorities by ‘changing’ (i.e., lowering) the requirements we had previously set for graduate level study?” Mathematical problem-solving is being deemphasized in favor of more qualitative group projects; the pace of undergraduate physics education is being slowed down so that no one gets left behind.

The National Science Foundation (NSF), a federal agency that funds university research, is consumed by diversity ideology. Progress in science, it argues, requires a “diverse STEM workforce.” Programs to boost diversity in STEM pour forth from its coffers in wild abundance. The NSF jump-started the implicit-bias industry in the 1990s by underwriting the development of the implicit association test (IAT). (The IAT purports to reveal a subject’s unconscious biases by measuring the speed with which he associates minority faces with positive or negative words; see “Are We All Unconscious Racists?,” Autumn 2017.) Since then, the NSF has continued to dump millions of dollars into implicit-bias activism. In July 2017, it awarded $1 million to the University of New Hampshire and two other institutions to develop a “bias-awareness intervention tool.” Another $2 million that same month went to the Department of Aerospace Engineering at Texas A&M University to “remediate microaggressions and implicit biases” in engineering classrooms.

Back in the early to mid-1990s, I argued to anyone who would listen that if the preferential-admissions, preferential-hiring bandwagon was not stopped, and if the collective academic grievance industry was not exposed for what it was (at the time it was limited to English and comparative literature departments, and had crept into law schools), these movements would eventually inundate everything in their path, like successive tidal waves.

It was clear, at least to me, what was coming. I even detoured from my original intended academic career path (emphasizing history and philosophy of science, epistemology, and metaphysics) to write a book on the subject. My effort was not perfect. There were things I neglected. It was, however, researched from scratch and written on my own time without the sort of administrative support (e.g., grant money) that would likely have been available to someone on the other side of the fence and would have enabled a more complete job.

I argued, among other things, that we were increasingly dealing with a flood of people into academic disciplines, many of them already with tenure, who could not be reasoned with, arguing (incoherently) that reason itself is a “white male social or cultural construct.” Many are the intellectual equivalent of bullies who don’t care about anything except furthering their group-focused political agendas. Neither intellectual curiosity nor a desire for a better world for all of humanity is what motivates them.

The juggernaut, needless to say, was not stopped. It was not even seriously opposed. By serious opposition I do not mean holding a few seminars here and there, or putting together organizations and holding conferences once a year and putting out a journal filled with articles on how terrible things are getting in academia, as did the National Association of Scholars and a few similar groups characterized only by their utter ineffectiveness.

By serious opposition I mean putting financial resources behind the opponents, so they can get in front of and effectively use both national and social media, as Donald Trump did. Trump could do this because he’s a billionaire. Most of us are not billionaires, or anything close. The plain truth is, this is not a part-time job, or a hobby, something to be done on our own time! 

Read the above quote again. Governmental entities are now throwing millions into promoting “diversity” ideology!

I once contracted to write a piece critical of a “diversity” program in a city school district in Charleston, South Carolina. I was paid $2,000 by the organization that published my piece (long since taken down). I later learned that the author of the pro-“diversity” proposals I was criticizing was paid … are you sitting down?… $2 million!

That is the sort of thing critics of identity politics and “diversity” social engineering are up against!

Identity politics has advanced to the point where if you are known to oppose it, you will not be hired for a teaching position in a college or university, period. 

You cannot criticize it openly without tenure, or you will not be rehired, period.

You criticize it with tenure at your own peril, given at least one known case (Evergreen State) where so-called social justice warriors were successful in their efforts to drive a tenured biology professor (Bret Weinstein) off campus with threats of violence, which included protesters wielding baseball bats!

Cowardly administrators openly told Weinstein that if he returned to campus they could not guarantee his safety!

Telling the truth, or merely expressing points of view other than officially-approved ones, has thus become not merely difficult but dangerous, and not dangerous merely to individual academic dissidents but to Western civilization itself.

What happens when the acceptability of scientific results starts to be dictated by whether or not they have the approval of what has become a national (and corporate) “diversity” police?

Do we want Western civilization to survive in any form our parents and grandparents would have recognized? If so, it will take organization, and it will take education: far more than it would have taken back in the 1990s!

What are we willing to do — if you are reading this, what are you willing to do — to help it survive?

I know of many authors and others who have given up, too many in fact to link to individually. Whether it is this issue or the looming global debt bomb, about which likewise nothing is being done, they figure the ship has sailed. They figure this is how empires gradually collapse.

Prove them wrong! This is a challenge! At present, no one I know of who stands on the side of more traditional forms of scholarship, in which truth and rationality in some sense of those terms meant something, is in a position to defend it in the way needed, with sufficient resources behind them. Having a blog and a website is clearly not enough. Writers with far more visibility than I have are unable to do it. Academic, governmental, and corporate entities are now almost entirely in the hands of the “diversity” committees and their thought-police forces.

What I would recommend doing, for educational background purposes, is returning to the original mindset that empowered the civil rights movement, a movement based on ideals of justice as basic fairness and encouraging of kindness (something today’s “diversity” crowd knows nothing about). I would also recommend abandoning materialism as a worldview, as I have argued elsewhere. One thing at a time, though. Read Dr. King’s letters and other documents of the era, from before the influence of cultural Marxists such as Herbert Marcuse, which was when things began to go off track; the Supreme Court’s catastrophic Griggs decision (1971) then threw everything into a tailspin from which it has never recovered. Read someone such as sociologist C. Wright Mills. Start with his The Power Elite (1956) to get a sense of how globalist forces looked to scholars at the time, compared with how far those forces have advanced since. It is always helpful to remember that the globe’s real ruling class does not care about minority groups; it does not care about women as such, about homosexuals, or about transgender people. It has no ideology other than money and power, although this does not prevent it from using ideologies, sometimes of the left and sometimes of the right, depending on what serves its purposes at a given time with a given audience.

In sum, there is no repairing this damage from the inside. The dominant organizations in the U.S. are gone. What needs to be done will require organization, education, and conversation directed from outside, including from outside of academia. It will require the eventual formation (and funding) of new institutions to carry whatever is left of Western ideas forward. If this is impossible, then we have indeed signed Western civilization’s death warrant. Given that identity politics is now affecting demographics on an ever-larger scale, all we will have to do is wait for whites to become a minority group in the U.S., the one minority group with no legal protections, and then for the U.S. to end up like a much larger version of South Africa.

If you approve of ideas like these, support them! Go to my Patreon site and sign up to make a pledge. This is your civilization, too! If you want to preserve it, then do something about it! Remember, the promoters of “diversity” are receiving millions from centralized governmental and corporate entities to further their goals! A few pledges of, e.g., $25 or $50 per month won’t do much against that, but it is a place to start and it is better than nothing at all!

 

Posted in Academia, Culture, Higher Education Generally, Where is Civilization Going?