E-Philosophy: A Brief Manifesto

The word philosophy comes to us from two Greek words meaning the love of wisdom. What is wisdom? Knowledge, both theoretical and practical, used in ways that both define and help bring about what is good and beneficial in life (i.e., in our lives as persons), based on respect for all life and its unique instantiations. Wisdom surely includes awareness of the limitations on our knowledge and the risks involved in action. It may seek to minimize those risks where possible, accommodating risk where minimizing it is neither possible nor necessarily desirable.

The Internet, moreover, has become an arena of mixed blessings. Its possibilities give its users a level of reach never before seen. When one uploads a post, one never knows who is reading, or where they are, or how (or if) they will respond. One of the problems is that the Internet has become so large and cluttered that getting one’s posts noticed is almost as difficult as it was in the days when we outsiders were printing leaflets and distributing them through snail mail. But because the Internet offers a potential reach leaflets never had, we would be crazy not to make the attempt. Hence E-Philosophy, which moves philosophy from the physical classroom and publication to cyberspace.

We stand at a unique juncture in time. In the near future, a decision will be made: continued elite domination over the nations and peoples of the world, or an end to such domination, however accomplished, and the beginning of a world based on principles of responsible freedom, peace (including peaceful and voluntary interactions at all levels), decentralization, and technology that creates abundance instead of perpetuating artificial scarcity. Whether we look at the current political unrest reflected in the rejection of elitism in its various manifestations by peoples around the world, the economic uncertainty motivating much restlessness, or just at the fact that technology is changing every facet of the world around us and taking our lives in directions our ancestors could never have begun to imagine, one thing becomes abundantly clear: either we rise to this occasion, or we pass into history as another failed civilization, as did Rome, the Ottomans, the British Empire, and many others. For continued elitism only presages global unrest, destabilization caused by resistance to corruption and perceived injustice, financial irrationality, and inevitable long-term collapse (which probably began in 2008).

Can E-Philosophy, the love of wisdom, contribute to this rising to the occasion? It can surely do no worse than its academic antecedent!

Academic philosophy, in its environment, has become (with rare exceptions) almost completely irrelevant. Even some of its more thoughtful practitioners such as Harry G. Frankfurt of Princeton (author of the celebrated On Bullshit) admit that academic philosophy is “in the doldrums”:

I believe that there is, at least in this country, a more or less general agreement among philosophers and other scholars that our subject is currently in the doldrums. Until not very long ago, there were powerful creative impulses moving energetically through the field. There was the work in England of G.E. Moore and Bertrand Russell and of Gilbert Ryle, Paul Grice, and Herbert Hart, as well as the work of various logical positivists. In the United States, even after interest in William James and John Dewey had receded, there was lively attention to contributions by Willard Quine and Donald Davidson, John Rawls and Saul Kripke. In addition, some philosophers were powerfully moved by the gigantic speculative edifice of Whitehead. Heidegger was having a massive impact on European philosophy, as well as on other disciplines – and not only in Europe, but here as well. And, of course, there was everywhere a vigorously appreciative and productive response to the work of Wittgenstein.

The lively impact of these impressive figures has faded. We are no longer busily preoccupied with responding to them. Except for a few contributors of somewhat less general scope, such as Habermas, no one has replaced the imposingly great figures of the recent past in providing us with contagiously inspiring direction. Nowadays, there are really no conspicuously fresh, bold, and intellectually exciting new challenges or innovations. For the most part, the field is quiet. We seem, more or less, to be marking time. (In Steven Cahn, ed., Portraits of American Philosophy [2013], pp. 125-126; italics mine).

Frankfurt probably exaggerates the contributions of the analysts if we understand philosophy as the love of wisdom as defined above, since the major analysts (1) embraced an essentially positivist view of the discipline and its place in academia, one that was insular and self-contained and that they saw as an advance over all past philosophy; and (2) therefore only rarely addressed the “big questions” in ways that challenged the institutional authority structures into which they were comfortably absorbed.

The powerful logical-linguistic techniques analytic philosophers developed are readily available to anyone who studies them, yet they are rarely employed in any effective way outside a rarefied and impotent philosophical literature gathering dust on the shelves of university libraries. Not used to their full potential, their influence was bound to fade in the face of the various rebellions we have seen since the 1960s, postmodernity being the most formidable. George Orwell was far more effective at showing the relevance of the analysis of language to the problems of civilization!

E-Philosophy proposes to revitalize philosophy as a discipline by finding not just a new arena in cyberspace but new ways to apply philosophy to today’s situation, to present-day institutions, and to life in the world as it is. What will E-Philosophy be? As already explained, online, rather than in a university office cubicle or classroom. Independent, rather than affiliated with an institution (although individuals in institutions may participate as they choose). Mobile and dynamic, rather than fixed and static. Pluralistic, rather than monistic, monolithic, and absolutist. Simultaneously global and local — having global reach as noted above, but local because it respects locality: local autonomy, local traditions and customs, local knowledge. Why localism? Because most people conduct their lives within a sense of place that defines them; genuine “citizens of the world” are few and far between, and (interestingly!) usually limited to those steeped in Western Enlightenment thought and scientism. I don’t know that this need be the case, as a “remnant” might exist that recognizes the limitations of this, can see beyond locality to what we all have in common as human beings (or, indeed, as conscious beings), but will nevertheless respect the fact of its status as an extreme minority. The transition that has probably already begun may allow this “remnant” to act, although this remains to be seen.

Be all this as it may, E-Philosophy will seek contact across cultural divides, community amidst diversity, and dialogue amidst disagreement and controversy. It will be future-oriented under the assumption that the future will come whether we plan for it or not, and it might be better if we tried to build a better future for ourselves and our posterity rather than leaving the matter to chance.

E-Philosophy’s “heroes” will be reflected in this kind of agenda. They range from Aristotle and his logical systematicity to Stoics such as Epictetus and Marcus Aurelius, with their view of “living in accordance with nature,” reflected further in Bacon’s “nature, to be commanded, must be obeyed.” Other E-Philosophical heroes and sources of inspiration will be drawn from a broad list of rebels, prophets, observers, critics, provocateurs, innovators, and creators: Jesus Christ, Michel de Montaigne, Bishop George Berkeley, Søren Kierkegaard, Friedrich Nietzsche, Charles Sanders Peirce, Charles Fort, Nikola Tesla, Albert Jay Nock, Ludwig Wittgenstein, Leopold Kohr, E.F. Schumacher, Ervin Laszlo, Robert Anton Wilson, Thomas S. Kuhn, Paul Feyerabend, Brian Eno; and fictional characters such as Dr. Ian Malcolm in Jurassic Park, Ambassador Delenn on Babylon 5 (the television series), Morpheus in The Matrix, and others. E-Philosophy will draw on the sciences where they are relevant, on technology where it is helpful, on popular culture where it can be illuminating, and on all those experiences — sometimes unique — that confront us with problems to solve as well as opportunities and resources for solving them, or which sometimes upset comforting and privileged dogmas.

E-Philosophy’s purpose is not to “found a new philosophical tradition”; its purpose is to challenge said dogmas and urge new conversations on the problems our civilization confronts. Rather than defend a privileged metaphysics (be it materialist or Christian), it calls for interfaith dialogue and a search for consensus among those articulating answers to common problems, with those affected by proposed answers having a say in whatever policies are embraced. Rather than defend a specific range of methods held as absolute and above critical examination, E-Philosophy proposes that all methods have limits — even seemingly obvious ones. Rather than outline a specific philosophy of life and insist it is “the good life” for all human beings, it suggests there is much to be gained by encouraging a plurality of approaches to living, and that in the last analysis each person who chooses to do so should be allowed to walk his or her own path in life instead of being compelled to walk down paths laid down by others.

 

Key texts:

“The Sermon on the Mount” (Matt. 5 – 7)

Epictetus, “Enchiridion.”

Berkeley, George. A Treatise Concerning the Principles of Human Knowledge.

Søren Kierkegaard, Philosophical Fragments.

Frédéric Bastiat, The Law.

Charles Sanders Peirce, “Some Consequences of the Four Incapacities.”

Charles Fort, The Book of the Damned.

Albert Jay Nock, “Isaiah’s Job.”

Ludwig Wittgenstein, Tractatus Logico-Philosophicus and Philosophical Investigations.

Aldous Huxley, Brave New World Revisited.

George Orwell, “Politics and the English Language.”

Ervin Laszlo, The Systems View of the World.

Thomas S. Kuhn, The Structure of Scientific Revolutions.

Paul Feyerabend, Against Method: Outline of an Anarchistic Theory of Knowledge.

Brian Eno, “The Long Now.”

Helena Norberg-Hodge, Ancient Futures: Lessons from Ladakh for a Globalizing World.

 

Key films / television series / etc.

Jurassic Park (1993)

Babylon 5 (1993 – 1998)

The Matrix (1999)


What Is It Like to Be a Lost Generation Philosopher (Part 3)

[Continued from Parts One and Two]

Given that you pursued a career in academic philosophy, any specific regrets?

One big one from my early days. Not turning my MA thesis on Paul Feyerabend into my first book. The idea was there, and it hadn’t been done. There were no book-length works devoted to Feyerabend’s ideas then like there are now, and most of the secondary literature on Feyerabend was awful. I don’t know if having a book like that come out about the same time as receipt of the Ph.D. would have helped or not, of course, but part of me would still like to think so. But I let myself get sidetracked. I can’t blame anyone else for that. I was writing for a local music magazine and trying to write a novel based on some of the stuff I’d seen and been through as a student in a college town where there were a lot of popular bands — and also a lot of dopers. One version of it got finished but was never published, which is probably just as well.

I also ended up with what was probably a publishable paper on incommensurability that I’d read at a meeting that first year out. All it needed was some tweaking, but the novel seemed more important! We live, we learn, we regret. Paraphrasing Kierkegaard, we reflect backwards after we’ve lived our lives forwards.

There’s a new book out with Paul Feyerabend’s name on it, you know.

Yes, Philosophy of Nature. I’ve started reading it, and I doubt his forays into ancient Greek art and the Homeric worldview will be understood. Feyerabend was always an interdisciplinarian, in a field riddled with microspecialists.

You wouldn’t describe yourself as a specialist, then, I take it?

Heck no! If I wanted to be a specialist I would have stayed where I was in geology!

Do you find any recent trends in academic philosophy disconcerting?

I never cared for so-called feminist philosophy. Nothing against women, of course; I just doubt the wisdom of pursuing philosophy from a group-grievance perspective. Most of what I’ve read of feminist philosophy just isn’t very good. It isn’t well reasoned; it isn’t conscious of its own unstated premises, which tend to be Hegelian-Marxist, or of what it has borrowed from others, sometimes without credit. It’s not even well written. And radical feminists don’t handle criticism particularly well. That was one of my first observations as a new Ph.D., knowing next to nothing about them at the time but observing them disrupt a meeting at a national conference.

What was the conference?

American Philosophical Association, Eastern Division, either 1987 or 1988; the exact year escapes me now. Christina Hoff Sommers was the speaker, doing a paper on feminism and the family, and she was basically shouted down – booed and hissed down, in fact. You might expect something like that from student revolutionaries, but do you expect it from professionals, many with tenured teaching jobs? I was astonished. Drawing attention to this in a couple of prominent venues probably cost me job interviews because of the special favors accorded these people. There was, and still is, an irrational push to get more women into philosophy.

But is it not the case that only 25% of philosophy professors are women?

I’ve seen that stat, and I’ve no reason to think it’s wrong. There’s also been a push, going back more than 30 years, to get more blacks into philosophy. The percentage of blacks in philosophy in the late 1960s was between 1 and 2%, and that hasn’t budged. These affirmative-action pushes just don’t work, unless we’re supposed to believe there are dozens of blacks and women out there who can’t find tenure-track teaching jobs in philosophy because of discrimination. Some say that women just don’t warm to philosophy’s argumentative nature, which doesn’t ring true to me, so I don’t have a specific explanation other than that the actual numbers reflect the actual choices women have made, or are making. Sometimes political hires backfire. There are departments that get stung because they hired someone who doesn’t give a damn about anything except her political agenda, and she turns the department into a war zone. I watched that happen once … from a safe distance … so I know it happens.

Wouldn’t feminists or critical race theorists or other voices that claim they’ve been marginalized say that traditional philosophy is all straight white male philosophy?

Strictly speaking, that’s not true. Plato, arguably the first pivotal figure in Western philosophy, was gay. Wittgenstein was gay. It doesn’t appear to have affected the way they did philosophy. I’m sure there are others. As for the rest…? Straight white males all. But so what? Look, whether anyone likes it or not, just about all the people who gave us the Western intellectual tradition were white males, many of them explicitly Christian, at least until we get to the Enlightenment. Not that the Western tradition is perfect. I’ve criticized it as a whole myself on certain specifics, like its tendency to dichotomize everything: Plato’s essential versus accidental properties, Aristotle’s terrestrial versus celestial realms, which the scientific revolution transcended, Cartesian mind versus body, which in some respects we’re still stuck in, Kant’s noumenal versus phenomenal worlds, analytic versus synthetic statements, determinism versus free will, and so on. There are other problems. There’s a tendency in most Western thought to treat everything abstractly. Abstract and universal versus concrete and local is probably another dichotomy. The postmoderns do get some things right in my view. We are situated. We all come from specific places and times. We are either male or female, and there are things (relationships being the obvious example) we don’t perceive in the same way. There is such a thing as class consciousness. This academic abstraction, the rational individual, thinking thing, or whatever you want to call it, does not exist. It’s a Platonist-Cartesian myth, and pernicious because it works against the local knowledge of common people the world over and justifies the creation of a mass-consumption monoculture, which turns out to be nothing more than Western scientistic-technocratic materialism based on abstract rules.
The fact is that common peoples in other cultures have followed their noses for centuries, solving problems rather than relying on a book of rules, or a money system, or anything like that. Feyerabend worked on this; so have anthropologists like Clifford Geertz. There’s plenty of grist there for anybody’s intellectual mill, if they’d but use it; there’s no need for this affirmative-action-based pseudo-scholarship to talk about group dynamics in cultural settings. A “feminist approach to physical science”? What sense does that make, anyway?

What bothers me the most about academic feminists and others of that ilk is that intellectual curiosity is not what motivates them. They’re part of a collective grievance industry driven by a desire to “get even” with us mean old white guys. Some are chronically angry, like the women who disrupted that meeting I talked about. I had an office right next door to one such person during my last job in the U.S. During a four-year period, I think she spoke three words to me. One of them was “Hey!” when one day I accidentally hit the hall light switch that also killed the lights in her corner office due to some screwy wiring. Guys like me just had no business existing in her version of reality.

Are you an angry white male?

The angry white male was an invention of the mass media. The first time I saw the phrase was attached to an article I wrote in 1994; I had not used it in the article. Nor, so far as I know, has any other white person used it.

Don’t you think racism is still a problem in America?

I’m sure it is; it’s a problem in a lot of places, not just America, and the devices liberals favor, offering minority groups benefits at the expense of whites, are making it worse, not better. Any time government offers one group favors at the expense of another, it invites resentment from the nonpreferred. That’s not racism, that’s human nature. That, plus factors like the outsourcing of jobs, is part of what is driving whites, the only race in America losing ground and numbers right now, as is well documented, to support Donald Trump. Also, there’s the implication that blacks can’t succeed without government-mandated freebies. Many aren’t succeeding with government freebies. They’ve been made dependent, and told they are entitled. Left-liberals have created a postmodern welfare-state plantation, and I fear it’s about to blow up in their faces.

There are more whites on welfare than blacks.

Blacks make up 14.4% of the population in the U.S. I don’t have statistics in front of me on what percentage of the population receiving government benefits of some sort or another is black, but I’m sure it’s larger than 14.4%. Everyone wants to think in terms of ratios: if the ratios don’t match, it’s systemic racism. If 14.4% of the population is black, then 14.4% of professors should be black, the reasoning goes, the same being true in other organizations. But there’s not a society of multiple ethnicities anywhere in the world where you see proportional ratios in all institutions. You always have a dominant group. Always. These Social Justice Warrior types demanding more “diversity” because they can’t have the politically correct ratios they want are in a fantasy world. You’d think more people would figure out that the disruptions we saw on campuses all over the country last year are dead giveaways that these left-wing policies don’t work. How long have these policies been around, anyway? At least since the early 1970s. That’s the trouble with the left in a nutshell: nothing it does ever works. It discredits the disciplines it takes over, and the universities get more corporate. And it turns race relations into a powder keg.

Do you believe political correctness is both real, and a problem in academia?

I think political correctness, along with the corporate model that’s eliminating the tenure system little by little … the two of them together have just about ruined higher education in America. I have been banging the first drum for a quarter century now, and everything I warned about in the early ’90s has come true. I have no idea how to turn these two tides back at this point, because they’re part of academic culture now. Students believe they’re entitled to “safe spaces” just as top administrators believe they’re entitled to six-figure salaries. These disasters aren’t going to be fixed from the inside. I’m not sure they’re going to be fixed at all. It’s past time to start new institutions that make the present-day ones obsolete. Some folks are already doing this, but the accreditation system is in their way, and so is the employment system. If that ever changes, watch out!

Do you think sexual harassment is a problem in academic philosophy?

Well, I’m not there anymore, and I never saw anything when I was, but I have followed to some extent the accounts of the last few years, one at Northwestern University, another at the University of Miami. Nothing I’ve read really surprises me that much. I met arrogant men with tenure when I was in academia. They may or may not be married, I have no idea, but it’s clear: professional ethics took a nosedive long ago. I wouldn’t put it past some of them to solicit favors from vulnerable female graduate students in exchange for positive recommendations or contacts later to help them in this horrid job market. I think any guy who does anything that stupid in this day and age deserves whatever he gets when he gets caught.

How do you see the future of philosophy?

Dismal. It’s suffering from all the problems of academic culture generally that we just talked about, as well as the neglect that is the natural outcome of a corporate environment of money über alles, which fields like philosophy don’t contribute to. Most of those finding academic work are ending up as permanent part-timers — adjuncts — working for pay that’s a joke. The “hyper-educated poor,” one online column called them. One concern I have is the disappearance of figures who promise to be historically important. Where are today’s Quines, Wittgensteins, or even Kuhns, Feyerabends, and Rortys? There are very few if any such voices, and the remaining major figures are almost all in their 70s and 80s. The figures who made philosophy worth studying in the last century are dying off, one at a time, and not being replaced. This is particularly true in the U.S. A major figure in the philosophy of mind is David Chalmers, 50 I think, but he’s Australian. There are a handful of good people in Great Britain, like Timothy Williamson and Luciano Floridi, but in the U.S. no one under 70 is breaking new ground. One reason is adjunctification, affecting 70% or so of those with teaching jobs, and you can’t do scholarship on the kinds of schedules adjuncts have to keep if they want a roof over their heads. As for the other 30%, no one is writing anything especially new or impressive. There is publishing going on, of course, a lot of it, in fact, but it’s the same microspecialization that’s guaranteed to gather the proverbial dust on library shelves. Some folks are retreading the same ground over and over again — especially libertarians and anarcho-whatevers. Their books have been written three dozen times now.

I think another reason the prospects for academic philosophy are so dismal is that the reasoning skills that make you good at it transfer to other disciplines and occupations, and so there’s been a brain drain. There’s no way to document this, of course, because there’s no way to determine how many people might have gone into philosophy had the environment been more hospitable, but my nose tells me the number is far from negligible: a lot of intelligent people who might have been good philosophers saw better opportunities and took them, sometimes before ever starting graduate programs.

Meanwhile, hostility to liberal arts education in our neoliberal environment has never been greater! Guys like Florida governor Rick Scott openly suggest what amounts to eliminating the liberal arts in favor of STEM education. A book I’ve been reading on the culture wars by Andrew Hartman [A War For the Soul of America: A History of the Culture Wars] says it all at one point: the real issue, now that universities have been changed not just by PC but by the corporate model, is not whether we teach John Locke or Frantz Fanon, but whether we bother with the liberal arts at all. The liberal arts don’t help corporations make money! In a way it’s depressing. But in a way it’s not, because in a few years the field may be wide open for contributions from those of us working outside academia.

Like yourself?

Absolutely!

You moved to Chile in 2012?

Yes. I had an inheritance, and was fed up with teaching jobs which by themselves would never have paid the bills.

You had outside work?

In a manner of speaking I had outside self-employment. I was selling off my vinyl collection on eBay all those years — well over 1,000 vinyl records, some dating from those days I told you about [Part One] when I was growing up. That wasn’t sustainable indefinitely, of course, because a vinyl collection is finite and soon the records that sell really well are gone. I’d gone to my department chair and asked for a raise, but the university I was then at — USC Upstate in Spartanburg — wouldn’t even reimburse my travel expenses to a conference in North Carolina where I’d read a paper a few weeks earlier. This was a university that had just spent something like $100 million on a plush new facility for the business school, as well as random millions on other new buildings and beautification projects around campus. So at the end of spring semester 2012, I resigned. It was a de facto resignation: adjuncts don’t really “resign,” they just aren’t rehired. All I did was clean out my office and disappear. I don’t think the department was happy to be left in the lurch, but for a change the shoe was on the other foot; it wasn’t my problem. I had friends in Chile and a couple of job prospects there, so off I went. Plus, the political situation in the U.S. was already bad and about to get worse. The GOP screwed up again, got Obama reelected, and Americans got Obamacare shoved down their throats. Not to mention continuing the Bush foreign policy that’s just about wrecked the Middle East. Chile, to its credit, is not at war with anybody and has no designs on countries on other continents.

Anything you miss about the States?

A few things. Solid research libraries, reliable mail that doesn’t cost a fortune, good customer service, dealing with ordinary people in my own language, greater efficiency generally. Although I imagine those are living on borrowed time in the U.S.

You’re more than pessimistic about the future of the U.S., I gather.

Look at the present presidential choices, the deteriorating infrastructure, the culture generally. I’ve long considered the Clintons to be two of the most loathsome human beings on the planet, and there’s at least a 50-50 chance voters are going to put another Clinton in the White House — probably the worse of the two — because the Republicans screwed up yet again! As bad as left-liberal Democrats are, Republicans also screw up everything they touch.

Are you for or against Donald Trump, then?

I’m not endorsing anyone, but I don’t think he’s worse than Hillary. Some of what I’ve read about his being a psychotic sociopath who shouldn’t be allowed anywhere near the nuclear codes is utterly ridiculous! He’s not going to blow up the world! I do think his Make America Great Again slogan, as well as the response it has generated, speaks volumes about America’s decline from greatness. But because he’s a prima donna, lacks patience with complexity, will sow a lot of confusion by insisting on doing things his way, and above all will be resisted by a firmly entrenched Establishment, his getting in will hasten the empire’s demise. That’s if he is who he says he is. If he turns out to be another Establishment tool who’s been leading a huge swath of voters by their noses, then all bets are off. There’s a massive financial debt bubble that is going to burst in the near future anyway, and that’s going to happen no matter who gets elected.

Your opinion of Hillary Clinton?

A loathsome creature. A lot of progressive leftists despise her, because they’ve figured out she’s not one of them. She’s a consummate opportunist who serves the interests of power and greed. And a total hypocrite. She claims to support the LGBTQ community while the Clinton Foundation takes money from foreign governments that brutally murder homosexuals. Her volatile temper is well known, according to those who worked around her in the 1990s. She’s the one who shouldn’t be anywhere near the nuclear codes. She’ll get you into World War III faster than Donald Trump, who, assuming he’s allowed to govern if he gets elected, may try to work with Vladimir Putin to rid the world of ISIS, instead of against him to try to extend neocon / neoliberal world domination.

Wow! Now tell the world how you really feel.

I think I just did.

Have any of your fellow U.S. citizens criticized your decision to leave?

I’ve been accused of cutting and running instead of standing and fighting, so to speak. My short answer is: you’re right, I cut and ran. Only because I think standing and fighting is a losing game at this point. My long answer: however we look at it, the U.S. is an empire in decline: overextended militarily and fighting wars we shouldn’t be involved in, its “main street” economy hollowed out by globalism, its national debt rising to unsustainable levels, its political system bankrupt and dysfunctional, its media corrupted and filled with shills who wouldn’t keep their jobs otherwise, and its culture hopelessly divided, with the races at each other’s throats, and dumbed down by the entertainment industry. And very few people have much of a clue what is really going on. If things go completely to pieces, I’ll be watching on my computer screen instead of through my front window.

You mentioned job prospects down there. Have you done any philosophy teaching in Chile?

Yes, at two institutions. Both were disasters, mainly because the schools are so bureaucratic and inefficient. One of the places neglected to pay me for almost four months, and I had to threaten them with legal action. Liberal arts education is even less of a priority here than in the States. Chile is not a tropical paradise. It has its own set of problems. Education is a disaster here, too, though not for the same reasons as in the States. It’s more expensive than the average Chilean can afford, which is one of the things motivating a strong “free education” movement here. President Michelle Bachelet promised “free education” but hasn’t been able to deliver. Long story, but that’s been a cause of unrest here, especially among students. There have been protests that have shut down major campuses. The Social Justice Warriors haven’t done that in the U.S., at least not yet.

Any interesting projects on the horizon?

I am sitting on two manuscripts which I revise periodically, one of them on the breakdown of academia in the U.S. The other is on Descartes and the epistemological turn; it draws attention to a mistake in Descartes’s reasoning that, in my opinion, threw the Western tradition off track. It started as a paper that got turned down by four journals, none of which pointed to fundamental flaws in my arguments or otherwise gave a credible reason for the rejection (one said the paper was too long, but nothing more). After that I stopped sending it out. I was losing faith in peer review, which seemed just another overrated procedure that protects certain ideas and methods at the expense of others. I once had a 30-page paper of very detailed arguments against so-called feminist epistemology rejected by a major journal that provided two lines of referee comments. That paper disappeared during one of my many moves while I was trying to stay employed. This was still the 1990s, so it never got saved to a flash drive. I’ve continued expanding the one on Descartes, and it could be published as a slim book now, if I ever get motivated to send it out. I’m also sitting on a manuscript on the original small-is-beautiful political philosopher Leopold Kohr, who was E.F. Schumacher’s teacher, and I’m working on a longer manuscript on where civilization needs to go from here.

That last sounds like an ambitious undertaking.

Yes, but as I note in its introduction, we’ve nothing to lose at this point.

Where does civilization need to go, in your opinion?

Away from scientistic-technocratic materialism. Away from financialization, centralization, and globalism, the actual drivers of the worsening inequality blamed on “capital in the 21st century.” These are causing unrest all over the world, and seeding the ground for more populist revolts by peoples who, if they get things right, will see their enemy as a firmly entrenched economic elite seated in central banks, centers of high finance, and the corporations that have grown up around all that – not just governments. We need to get away from the need to monetize everything and everyone, which means, away from the neoliberal ethos (leftists using that word do get some things right). We need to move past postmodernism, which is really just a gesture of despair in the face of the hardships of seeking truth inside the institutional cages we’ve created for ourselves.

And towards?

I call it civilization’s Fifth Stage.

Fifth Stage? What were the other four?

Auguste Comte, the founder of sociology, formulated the first three — the religious or fictitious, the metaphysical or abstract, and the scientific or positive. His Law of Three Stages. He thought we could stop at three, because the third would give us a scientific-secular paradise. It hasn’t, obviously. Postmodernism became the fourth, but postmodernism is an academic and artistic curiosity, not a basis for continuing civilization. Most of it is unintelligible to nonspecialists. We can’t go back to the first three in their original forms, although we can look for things they got right. The third still commands a lot of authority but, I argue, has left us at sea ethically and is one of the reasons for so many of the problems we are having. But we can draw on the earlier stages to build a fifth one. I don’t know that we’ll do it, or that anyone will care, but if we do, we will have figured out how to build a global civilization that isn’t centralized and authoritarian, one that uses technology to create abundance instead of to maintain artificial scarcity, one that doesn’t confuse liberty with money and power or free trade with corporate-controlled trade, but rather takes seriously what people want at a local level, acknowledging that most trade between common people is local and has to be. There is absolutely no reason, for example, to transport food from thousands of miles away, with unhealthy preservatives to keep it from deteriorating, when it can be grown locally. People have a right to know what’s in their food, and to have control over their food and health care. A Fifth Stage of civilization might find a cheaper and more efficient way of powering our homes and vehicles that puts the present energy leviathans out of business.
A Fifth Stage would be global, but post-globalist, in the sense of rejecting the encircling economic coercion we have now: cooperate or starve, because peoples have had their local economies destroyed and been robbed of their autonomy. (To see how this happens, read John Perkins, Confessions of an Economic Hit Man.) A Fifth Stage would welcome cultural exchanges and learning, provided the interactions aren’t forced on peoples like today’s open immigration policies which really serve corporations, not the peoples directly involved including the immigrants. It might even be able to abolish the money economy, with a technological state of affairs no longer requiring rent and mortgages for shelter and money for electricity. Think of oxygen, our most basic biological need. Without it, you die in a matter of minutes. We don’t pay for oxygen, though, because oxygen is abundant. Energy is presently a scarce resource, produced from scarce resources. Can we use technology to make energy abundant and therefore free? I don’t know, but the question seems worth asking.

Sounds almost like a Utopia of your own.

It may be, but the alternative is that the West passes into history, another failed civilization. And that’s even if we avoid a major war involving a nuclear exchange.

You are married now, yes?

Yes, to a Chilean woman I met over two years ago. Love her to death. There are a lot of things she says and does that I can’t get enough of, even after well over two years. Last thing I expected, as I was an unmarried dateless wonder in the U.S. But in Chile, women are women. They don’t resent men, and aren’t trying to be men. PC imperatives are trying to make inroads here, promoting such things as legal abortion and gay marriage, but they’re up against some fairly high cultural walls.

Might these walls come down in the future?

It’s possible. If they do, it will be for the same reasons they came down in the U.S.: an educational system inattentive to those kinds of threats because it neglected subjects like philosophy and critical thinking and used the emphasis on vocational STEM education to create employees instead of free citizens, and a culture that emphasizes the same mass consumption we see in the U.S.

Congratulations on the marriage. Here’s a question I’ve asked several philosophers. Suppose you’re king of the world. What’s your first move?

Resign. Or abolish the position and then resign. Come to think of it, if I abolished the position first there’d be nothing to resign from. But if I resigned without abolishing the position, then I’d relinquish the authority to abolish the position, and someone could replace me. Sort of a paradox. So I guess I’d abolish the position. That would make resigning redundant. Although I’d still sign a resignation document. Just to be clear.

Any non-philosophical interests these days?

I’ve been working on my cooking, often just experimenting. It’s surprisingly relaxing, and my wife seems to like the results. It gives her a break since she does the bulk of the cooking. I think we’ll be doing some gardening when spring rolls around, too. With the crap that’s in the food you get in grocery stores now, even in Chile, we get as much as we can at outdoor farmer’s markets, and the next step is growing some of our own.

You’re in favor of “prepping,” of people growing and storing their own food?

Yes, even absent the dangers of economic decline, social conflict, and possible world war, I think food corporations have done a lot of damage to our health, and growing and distributing our own food is the only way we’re ever going to get it back. That’s another interview, though.

True. Is there anything else you have to say, any other author we haven’t mentioned in this interview you’d recommend reading?

Leopold Kohr, although I did mention him briefly. He wrote a book back in the 1950s predicting a lot of what we’ve seen: the breakdown of the U.S. amidst greed, political corruption, and wars of choice. It was called The Breakdown of Nations, and it deserves a much wider audience than it’s ever gotten. He basically laid out a theory of the trajectory of empires, something like Spengler without Spengler’s obscurities: when they get too large, empires become violent, abusive, and self-destructive. My Fifth Stage thinking would try to break this trajectory, which has empires rising, hitting a plateau, slowly being corrupted by complacency and greed, then falling from within. But a lot of my private thinking these days is about this, so nothing I’d say would be short, and I know this has gone on too long and we’re out of time.

One more question, then. Are you writing about Fifth Stage thinking, as you call it?

Yes. Assuming it ever gets finished, it’ll be called The Fifth Stage of Civilization.

How far along is the manuscript?

As we speak, I’m almost two thirds of the way done. With the hardest part still in the thinking stages!

Do you have a publisher?

Haven’t sought one yet. I’ll be sending out feelers well before the end of the year.

Good luck, in that case. I’ve enjoyed doing this, and I hope readers, if we have a few, will benefit in some way. Thank you.

I hope so, too, and thank you.


What Is It Like to Be a Lost Generation Philosopher? (Part 2)

[Continued from here.]

Getting back to personal stuff again if you don’t mind: what did your parents make of your decision to go into philosophy?

My mom had always encouraged me to find out and pursue what I was really interested in, but my dad wasn’t happy at all! He would have preferred I finish my geology degree, get an MBA, and follow in his footsteps into business. But I remembered those evenings he fell asleep on the couch from exhaustion, and how he’d often complained about the pettiness of some of the people he worked with. I very much didn’t want to go that route, especially since I didn’t think it would work. I don’t have a good head for business. I tried a couple of sales jobs and struck out miserably.

You mentioned Thomas S. Kuhn earlier. What other philosophers did you read when you were first starting out?

David Hume was the first major figure I read closely, in an undergraduate seminar. I took a philosophy of science course to study Kuhn formally instead of reading him on my own, and the professor introduced us to Paul Feyerabend, whose Against Method: Outline of an Anarchistic Theory of Knowledge electrified me more than Kuhn’s book had. My thoughts about Kuhn had become: ho-hum, what’s so controversial here? Isn’t what he’s saying obvious? Feyerabend was a lot more challenging to intellectual authoritarianism than Kuhn: he was debunking the logical abstractions philosophers were trying to impose on science and were calling scientific method. He claimed these abstractions would have prevented the most important episodes in the history of science from ever happening, and made a compelling case. His claim is that no single method covers everything we call science; that’s the “anarchy” here. It’s kind of a joke, actually, as he says himself; the rationalist predicament is that if you want a rule that always holds, it will have to be something as empty as “anything goes.” Feyerabend complained more bitterly than Kuhn that critics didn’t understand him. He went so far as to call them illiterates, which seemed a bit over the top at the time, but most professional philosophers have little in the way of a sense of humor, or of irony.

Anyone else?

A British philosopher of science, Nicholas Maxwell, came to my attention when I was working on my MA. He stressed the presupposition made by science that nature is intelligible, and sought to promote a methodology in which scientists presuppose, a priori, that the world is intelligible in the sense of structural simplicity, and seek to identify and refine the specific ways in which the subject matter in their discipline is intelligible and structurally simple so that Occam’s Razor really does work. Since this seemed closer to how real science made progress, I thought Maxwell might have had the best answer to Feyerabend and be the next step in reconstructing the philosophy of science in the wake of epistemological anarchism. I wrote my MA thesis based on this idea, but later realized that nearly every endeavor, not just science, works under the assumption of a minimum of stability, order, intelligibility, simplicity, and so on, so this wasn’t a satisfactory answer to Feyerabend’s challenge to the idea that science has a unique method for finding truth.

You wrote your dissertation on the Kuhn-Feyerabend incommensurability thesis, and you used the term earlier to describe theism versus atheism. What is incommensurability?

Not something talked about much today, unfortunately, since the problem never really was addressed in my opinion. The idea comes from mathematics. Rational and irrational numbers are mathematically incommensurable, because the former can be expressed in the form a/b, with a and b integers, and the latter can’t. Comparisons between them are therefore approximate, and can be made to whatever degree of precision you want, but are never exact. Example: π and 22/7, or the far more exact 3.14159…, an expansion that is non-ending and non-repeating. Incommensurability can similarly hold between conceptual systems, vocabularies, cultural systems as systems of habits, and worldviews. We get the same inexactitude. Incommensurable systems can’t be reduced to one another or shoehorned inside a larger vocabulary or method with the conceptual machinery of both intact. I tried to argue that the phenomenon was not the threat to either scientific realism or rationality that it had been made out to be, if we don’t define these in formal terms. Kuhn’s eventual version of the thesis, which he had articulated by the time I was writing about this in the mid-80s, was that it was restricted to a few core postulates or propositions designating concepts not shared, so that when one theory replaces another, these postulates or propositions are not explained, or subsumed, they simply drop out of the vocabulary, like phlogiston did from chemistry, or élan vital from biology. The terms, viewed as referring in the old paradigm, are no longer seen as such, and so are no longer used. The pragmatics are relatively straightforward. But there is no “logical” means of convincing someone to drop a term or idea from across an incommensurable divide, since the very standards one needs to convince them are bound to the new system. As Kuhn says somewhere in Structure, there is no logical method of convincing the unconvinced to step inside the circle.
But those who insist on defending an “old” paradigm end up written out of the discipline as it moves on, as was Joseph Priestley who defended the phlogiston theory of combustion for the rest of his life.
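The π example above can be made concrete with a few lines of Python. This is a sketch added for illustration, not part of the interview’s own argument: the standard-library `fractions` module produces the best rational approximation of π under a given denominator bound, and the gap shrinks as the bound grows but never reaches zero, which is the mathematical sense of incommensurability being described.

```python
from fractions import Fraction
import math

# Best rational approximations a/b of pi with a bounded denominator b.
# Each is closer than the last, but no fraction is ever exact:
# the gap shrinks yet stays nonzero, mirroring "incommensurability."
for max_den in (10, 1000, 100000):
    approx = Fraction(math.pi).limit_denominator(max_den)
    gap = abs(float(approx) - math.pi)
    print(f"pi ~ {approx}  (denominator <= {max_den}, gap = {gap:.2e})")
```

With a denominator cap of 10 this recovers the familiar 22/7; with a cap of 1000 it yields the classic 355/113, accurate to about seven decimal places.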

You’d worked this out in your dissertation?

Most of it.

Should have been a good launching pad for an academic career. Let’s talk about that, or what there was of it. How did you land your first teaching job?

It wasn’t on my own. If anybody actually does that, how they do it is a mystery to me. A professor in my graduate program knew a professor going on sabbatical for a semester at a university up the road, as it were, and had me send that department my CV. I interviewed for and got a job no one else knew about.

How long were you there?

A year. My first two jobs lasted one year each. Then I was out of work for a semester, part-time for a semester, then full-time for the next five years but still not on a path to tenure.

You never landed a tenure-track job?

No, and not through lack of trying. All those early years, I sent out over a hundred applications per year.

Did you get any interviews?

One, the first year out. Again, through a personal contact. Three, I think, the next year; and five the year after that. After that the interviews were just sporadic, and again it was usually because somebody who knew me knew somebody. It’s definitely true that it’s not what you know but who you know, and whether you come from the right graduate program, somewhere highly ranked, which I didn’t. There was no Philosophical Gourmet Report in those days, either, if that matters. Academia, I like to say, is not a meritocracy or anything close. The more years you’re on the job market without finding a tenure-track job, the more it hurts you even if you’re building up teaching experience and even if you’re publishing in refereed journals. I think search committees conclude there must be something wrong with you, that you couldn’t keep a job. If they have over a hundred applications to go through, they’re looking for excuses to put yours in the slush pile where it never gets looked at again. You can publish and still perish, therefore.

You were writing articles in graduate school.

That’s right. My first published article was based on a chapter of my MA thesis, on one of Feyerabend’s ideas. It came out in ‘84 in a European journal called Inquiry, three years before I had my Ph.D. It proved important enough that someone wrote a discussion article of it. I replied, and that was my second academic refereed journal publication. Then I got a third, based on a conference presentation. A fourth came out right after I got the Ph.D. and when I had that first job. I had something like eight after three years out on the job market, and nearly always had something out to a journal. Naively, I thought this sort of thing would help my career.

And you don’t think it did.

Search committees have to read your CV first — not just three letters of recommendation from people they never heard of before.

What was the high point of your career?

The three years I spent at Auburn University teaching mainly logic — occasionally ethics, and once, philosophy of mind which I steered towards AI. Auburn had serious degree programs in schools like engineering, and for them logic was a required subject. Those students were damned smart! A lot of them were a joy to teach! I was very motivated during that period. During my four years there total — one of my one year jobs before had been in that department — I published a total of seven articles and review essays in refereed journals I can think of off the top of my head, and had the manuscript of my first book accepted by a publisher.

Was there an effort to get you on a tenure line?

Yes, but it failed. According to university bylaws I had to leave after four years of full-time work: after five years I would have de facto tenure, they called it, which took the tenure decision out of the hands of the department. They didn’t want that. I never saw any such system elsewhere, and I suppose whoever came up with that thought they were doing us some kind of favor, not putting us out of work. That’s what happens when bureaucrats make rules up on high. I didn’t have the unanimous support it would have taken to get on a tenure track and be put up for tenure before the end of that fourth year. So after that fourth year I was history.

You found another job, though.

Yes, right away, but I went from a place with those strong technical programs to a “flagship” state university with all but open admissions, tons of students on academic probation, a lot of them in remedial courses. The University of South Carolina at Columbia. It was culture shock. I went from classes of around 30 students, around seven of whom made A’s on average, A’s they’d earned, to classes of around 80 students in which only four or five did well enough to make an A without rampant grade inflation, and where a lot of students thought they were entitled to good grades just for showing up. There were some who had attitudes that said, “Teach me something, I dare you.” I’d say about a third shouldn’t have been in college at all. A lot of what went on in classrooms there amounted to little more than crowd control. There were no remedial courses in my subject, and I cut large pieces out of it anyway since they weren’t teachable to those students. One formal logic course I’d taught at Auburn went all the way through proofs in predicate logic using three-place relations. I’d never have been able to do that at USC. As it was, I had many bad experiences there with defiant students arguing over grades; I didn’t always handle them that well, and that hurt me with what really counts if you don’t have tenure: student evaluations. Plus I published a book that skewered one of academia’s sacred cows. I wasn’t rehired for a third year, and found myself wishing I’d simply resigned.

You’re referring to your book on affirmative action?

Yeah, that one. Civil Wrongs was its name, and it came out in ‘94. It gave me my Warholian 15 minutes, as I was on talk radio quite a few times talking about it, and had several requests for spin-off articles based on it. I also had invitations to speak to off-campus groups and made some friendships I still have.

But you got no job offers?

No. That was when the interviews for academic jobs dried up. Imagine that. After that book came out I had two interviews for tenure-track teaching jobs. Two.

Low point of your career?

Learning the circumstances of my departure from Auburn, that one person had opposed my bid for tenurability — one person who had not identified himself and had gotten away with it. I was unemployed for six months, and then got a temp job in a state government office, doing typing, filing, stuff like that. Ridiculous underemployment. For a while I wasn’t even sending out applications, though. It probably sounds like whining, but there is a lot of professional jealousy in academic philosophy. I’d heard about, and sometimes met, other victims of that sort of thing. As much as I’d wanted to avoid the kind of pettiness my dad had seen, philosophy professors worried about someone encroaching on their territory are orders of magnitude worse — probably because so many of them with tenure do very little of value, and every one of them, in their heart of hearts, damned well knows it. This guy saw a chance to eliminate a threat, and he took it.

If you could go back in time and give yourself advice then, what would it be?

Some of what I said at the beginning [see Part 1]. I wouldn’t go anywhere near the humanities today. Never mind the anomalies, stick with the sciences, although I’ve heard that realistic employment prospects are no better in the hard sciences. If you just have to study philosophy, either get into an Ivy League doctoral program or keep it as a hobby. And then don’t get sidetracked.

END OF PART TWO


What Is It Like To Be a Lost Generation Philosopher? (Part 1)

This is an “imagined” interview. It is based on a proposal I made to the What Is It Like To Be a Philosopher? website created by Clifford Sosis (Coastal Carolina University); the proposal was never responded to, for whatever reason, but this follows that model. This is what I came up with as I thought through how such an interview might have gone at its best. It is also a potentially good vehicle for reviving this moribund blog. Take it seriously or just as self-indulgence; the thoughts here are intended to be serious:

INTERVIEW BEGINS

In this interview, Steven Yates, Ph.D., talks about his background and interests; how he got interested in philosophy; how he gained Christian faith, lost it, then regained it; how he got into and why he left academia; why he is so disdainful of academic feminism and other movements he associates with political correctness, a phrase he uses openly; why he left the U.S. and settled in a foreign country, Chile; and why he remains devoted to completing a few works of philosophy whether they gain him any personal reward or recognition or not.

What do you mean, a Lost Generation philosopher? What’s the Lost Generation?

There’s more than one “lost generation” now. The first was the generation to graduate with Ph.D.s after the infamous collapse of the academic job market in the early 1970s. For a while there were almost no jobs at all, or so I was told, as that was a little before my time. What I was told: those who completed doctorates during those years had no choice but to go into other fields like computer programming or get a job with the government or maybe … well, I encountered a couple of articles about “cab-driving Ph.D.s.” I did my graduate work in the 1980s. By that time, the market was opening up, but not by much. Every tenure-track job opening still fetched hundreds of applications, and a lot of people who wanted to teach settled for anything they could find even if it was just one course at a community college. The market was slightly better in the late 1990s and for part of the 2000s but collapsed again when the Great Recession hit and has not recovered. With the direction the universities are going, with tenure being phased out little by little and people hired for starvation wages, I am not sure things are going to get better. There’s little point in taking jobs that don’t pay your basic expenses.

Then you wouldn’t encourage a really good student to go into philosophy?

The academic discipline? Right now, absolutely not! Those good at formal logic, I’d tell them to study computer science and keep philosophy as a hobby. If they like politics, I’d say study political science. I’d avoid fields like history and law. Law is also ridiculously overloaded with graduates who can’t find jobs actually using their JDs. Almost all the “thinking disciplines” are hurting because of poor employment prospects. The U.S. economy is terrible outside of Wall Street and Silicon Valley tech guru territory, and despite what the economics “experts” say about the drop in the unemployment rate, there’s no getting around that.

We’ll get into your thoughts about the future of academic philosophy, but first some basics. Where were you born? What did your parents do?

I was born in Bartlesville, Oklahoma, and moved to Atlanta with my parents when I was a child. My dad had bachelor’s degrees in both zoology and chemistry and an MS in chemistry, and my mom was an RN. So I grew up around books on science, encyclopedias, stuff like that. My dad was the first person in my family to go to college. As a World War II submarine veteran, he attended on the GI Bill. I was aware early on that there were people both well above us on the economic pyramid and well below us. We were somewhere in the middle of the middle class. We weren’t hurting, but we had no special privileges. My dad did chemical marketing research and wrote up his findings for an oil company, traveling a lot to get information you could probably pull up online today. He worked his butt off and sometimes came home so tired that he’d fall asleep on the sofa in the evenings while the rest of us watched television. He was crotchety sometimes, and something of an authoritarian, but he took care of us.

Siblings?

One sister, adopted when she was two and a half. Her name was Leigh, and she was very different from me. I loved books and education, but they just bored her. I think it was a given that I’d go to college and she wouldn’t. She barely graduated from high school, but went on to a satisfactory life with two kids of her own, working for Cobb County [Ga.] where she lived until chronic health problems forced her to go on disability. She kept doing volunteer work, though, as much as she could. We lost her just a few weeks ago as I do this interview. Complications following surgery.

Sorry to hear that.

Thanks. All I can do now is wish we’d been closer. I’m just grateful my wife and I traveled up to see her and her husband last summer.

What was on your mind as a kid?

Science. The first thing I remember wanting to be was either an astronaut or an astronomer. One of the first books I read from cover to cover was a book on the planets, back before a lot was known about them. My mom checked the book out of the public library. Then it was dinosaurs and paleontology. Then I got into collecting rocks and minerals — I became a “rockhound” in other words, could have told you the chemistry of every one of them. I ended up as a geology major for a while in college.

What did you do just for fun?  

Watched science fiction TV programs and movies, and read sci-fi books. Really got into a program called The Outer Limits even though it sometimes scared the bejesus out of me. I was only in the third grade. I’m sometimes surprised my parents let me watch that stuff.

Favorite books from back then?

Hmmm, when I was a kid there was a series called Tom Swift, which was juvenile sci-fi. There were maybe 30 of those books, a new one coming out every few months, and I tried to collect as many as I could find. It was a mixture of sci-fi and political intrigue, the U.S. against a country called “Brungaria” obviously modeled on the Soviet Union. Later, in high school, I discovered 2001: A Space Odyssey, Isaac Asimov’s Foundation books, Frank Herbert’s Dune, more Arthur C. Clarke books especially Childhood’s End and later his Rendezvous With Rama — I was a senior in high school when that came out. Robert A. Heinlein, especially his short stories that involved paradoxes or excursions into other dimensions like “He Built a Crooked House …”  Roger Zelazny’s fantasy novels about Amber, “the one true world,” and his other things.

You were a teenager in the 1970s … what music were you listening to?

Progressive rock, mostly, some classical. Bands like Emerson, Lake, & Palmer; Yes; Focus; Pink Floyd; Moody Blues; Genesis; those were my favorites. Brian Eno became one of my musical idols after I got to college. I still collect his releases, just got The Ship through international mail a few weeks ago. Also got into a Hungarian band no one else I knew had ever heard of, Omega. Very big in their own country. They could have been famous in the English-speaking world if they’d gotten any airplay, which they didn’t. I also listened to German electronic stuff like Kraftwerk and Tangerine Dream. I enjoyed anything dominated by keyboards, be it pianos and organs, or synthesizers and mellotrons, anything that stretched the limits. Sometimes I couldn’t get enough! I think I had a lot of Maslovian “peak experiences” just from listening to what I thought was incredible music!

A lot of those guys are getting older now.

Brian Eno is 68. David Bowie, who’d collaborated with him, just died at age 69. He had cancer. No one knew. He kept it a secret till the last minute. Chris Squire, of Yes, died of leukemia about a year ago. Rick Wright of Pink Floyd died of cancer several years ago. Keith Emerson, of ELP, committed suicide a few months back. Sad. He was 71, and had nerve problems in one of his hands. He was probably the best rock keyboardist ever, except that he could also play jazz, classical, honky-tonk, and probably more. Not just an incredible musician but an absolute perfectionist who couldn’t stand the thought of something like that hurting his live performances. Edgar Froese, who founded Tangerine Dream, died early in 2015 from a pulmonary embolism. He was 70. I mentioned losing my sister and I’ve lost several other people I was close to over the past couple of years. Makes you think about your mortality. You know something Froese was quoted as saying? “There is no death, there is just a change in our cosmic address.” I don’t know what his religious beliefs were, but I love that!

Are you religious? What role has religion played in your life?

We generally went to church when I was growing up. I don’t remember thinking about it much when I was a kid, but I became a Christian at a youth retreat between 9th and 10th grades, and then saw the tensions between science and religion in a new light. That was about the time I became conscious of the reality of worldviews, of the fact that different people and different communities bring entirely different basic beliefs, frames of reference, conceptual systems, whatever you want to call them, to their experience. I also became conscious of things that didn’t fit into the dominant theories in the sciences and wondered what they meant.

What about in college, and more recently?

I thought of myself as an agnostic most of the way through college, graduate school, and for quite a while afterward. Just as well. I doubt I would have been able to complete a graduate program in philosophy as an outspoken Christian. It just doesn’t happen. Later, slowly, I came back to belief. No eureka experience I can put my finger on, but I realized that, given the way some ideas in academia are protected at the expense of others (the materialist view of the universe being one of them, as I’d experienced first-hand), many of the reasons for nonbelief no longer seemed to hold up to scrutiny.

Are you a churchgoer now?

Yes.

A fundamentalist, or an evangelical?

I don’t like to characterize myself in those terms. I leave exact Biblical interpretations to others. I read a lot of “end times” stuff when I was in high school, and none of it happened. I don’t agree with premillennial dispensationalism even if I think Western civilization is in all kinds of trouble.

What is premillennial dispensationalism?

Basically, a fifty-dollar term for the idea that Jesus is going to “rapture” all believers away from this world any day now, and then Jesus’s thousand-year (millennial) reign will begin. If someone asks me, I tell them nobody knows God’s timetable. Come to think of it, I tend to think both Pascal and Kierkegaard were right about God’s basic incomprehensibility beyond what He’s specifically revealed about Himself. I guess I would call myself a Kantian-Kierkegaardian Christian. Kant showed that our minds are designed to work in three-dimensional space plus time, and what transcends the categories of the human mind is simply a mystery. Kierkegaard basically destroyed the teleological argument that moves from the apparent design in nature to the idea that nature’s designer is necessarily the Christian God. Hume got there first, I know, but Hume wasn’t a believer and Kierkegaard was.

The leap of faith?

He never uses that phrase, he just calls it a “leap.” I prefer to think of it as trust — that we are better off trusting in the existence of God than not. Trust is not proof, of course. I can’t satisfy those who say, “There’s no proof,” and I usually don’t try. I think a more recent philosophical theologian, Cornelius Van Til of the Reformed School, had the basic idea when he described theism and atheism as incommensurable. I think he actually used that term.

Sounds like a long intellectual journey.

It was. Needless to say, I didn’t get much help from my atheist colleagues, who in retrospect seemed arrogant and authoritarian even when they tried not to be. Part of my argument is that materialism, by which I mean the worldview that has dominated the intellectual and economic landscape for the past hundred years in one form or another (communism yesterday, neoliberal capitalism today), has been a political, moral, and cultural disaster. We’ve set up our own massive empire based on money and power, ruined our culture with sexual hedonism and the worship of celebrities, the entertainment culture generally. Ruined our health with junk food. Wrecked the family unit. Our political system is close to dysfunction and our financial system is verging on bankruptcy. All the while we’ve been laying waste to other nations in the name of extracting resources we believed we were entitled to, making enemies of peoples with more traditional worldviews, and possibly threatening the ecosystem itself. Modernity may have given us all manner of technological advances and creature comforts, but they’ve come with a pretty steep price we’re probably just beginning to pay. It certainly hasn’t delivered on its Enlightenment promises. It’s true that a lot of wars have been fought over religion, but irreligious secularism looks to be just as hopeless at delivering world peace, and possibly a lot more dangerous, since religion never served up weapons capable of reducing entire cities to ashes in a matter of seconds.

Do you think man-made climate change is real?

I think we have no choice but to take the idea seriously. There seems to be a pretty solid consensus that’s coming out of all the hard sciences, and if you study what’s being said on its own terms, it’s the same kind of consensus you find for nearly every other major theory in any science, like the Big Bang, or evolution, or continental drift.

Would you say you got disillusioned with the scientific outlook?

I’d say I learned that there were a lot of problems the scientific outlook couldn’t solve, and a lot of areas of human life where it didn’t seem to apply.

And this got you interested in philosophy?

As I noted a few minutes ago, I got fascinated by things that didn’t fit into anybody’s favorite theories. I wanted to know what they meant, and what to do with them.

Give us an example.

I could give you several dozen because I kept a journal on them years ago, but one will do. It’s fairly representative. Back in 2000 in London there was a display of anomalous artifacts — a rare event, by the way — and one of them was a hammer, clearly of human design, found buried in solid rock geologists had dated to the Cretaceous period. It was part wood and part petrified; radiocarbon tests on the wood were inconclusive. I read one attempt by a geologist to explain this, saying something along the lines of: older minerals can dissolve and then harden around a recent object that was dropped on the ground and fell into a crevice in the rock. I could be wrong, I never finished that geology degree, but that sounds very strained to me. If that were the only such object found in solid rock or in a coal vein supposedly millions of years old, I’d be satisfied with the idea that we’re fooling ourselves somehow, or that it’s a hoax some joker dreamed up. But there are dozens of such objects. There are obviously human footprints found in rock of equivalent age. We know these aren’t hoaxes because the sediment around them is compressed in ways compatible with the weight a human foot would put on sand or mud. If they were carvings made to look like human footprints, the sediment wouldn’t be compressed.

What about the supposed Paluxy River findings? Those were shown to be fakes.

Those might be. I honestly don’t know. That case got publicity, but I didn’t pay it any more attention than these others. I’d have to say that if the dozens of cases that have been catalogued are all hoaxes, then the hoaxers must have been a really, really busy bunch, and very clever to be able to get these things inside solid rock or coal beds well underground — not knowing they’d even be found! What these things seem to me to mean is that either our ideas about the age of the human race are wrong, or our ideas about the age of the Earth and when these layers of rock and coal were formed are wrong. Declaring that something “isn’t real” or “they’re all hoaxes” because they don’t fit the dominant theories is too easy. Besides, why would religious believers fake these things when all they really accomplish is setting themselves up for public ridicule?

Such cases are part of what made me shift my emphasis from science to its metaphysical assumptions and epistemological foundations — and there I was, in philosophy.

Is this why you wanted to study the philosophy of science?

I had a world history class where one day the professor started in on Thomas S. Kuhn’s book The Structure of Scientific Revolutions, and I realized I wasn’t the first person to wonder about this kind of thing, or about the status of scientific theories: knowledge? belief? or something else? I suppose I was a goner from that day forward — especially as the professor was clearly hostile to Kuhn’s ideas. That made me all the more curious. Why the hostility? So I picked up the book and read as much of it as I could on my own. It struck me as the most intelligent and rational depiction of science I’d ever run across. I started to suspect that academics were hostile to claims or ideas that threatened beliefs of their own that gave them a sense of security in the world — things they couldn’t incorporate into what they were absolutely certain the truth must be. I’m not all that happy with psychologistic explanations, but I don’t think there’s a better one for those who can’t believe in God but make Science their religion. A lot of the secondary literature on Kuhn is dreadful. Most of his critics not only didn’t understand him; I don’t think they wanted to understand him. What they seemed to find offensive was his idea that paradigm change can’t be shoehorned into the view of science as a purely rational, i.e., logical, enterprise in the sense that formal logic can reconstruct it, and that scientists don’t always deal logically with findings that conflict with favored theories. Science clearly has a sociological and psychological dimension, possibly even an economic one. I could have become a postmodernist back then, but it never occurred to me to doubt that sometimes we do reach bona fide truth about the world. I assumed that authority and authority-driven institutions got in the way of truth-seeking, not that they in some sense defined what truth is for the populations they have power over, which is the way I read philosophers like Michel Foucault.

When did you start thinking about power in society?

Somewhere around the mid-1990s it struck me, almost out of the blue one day: the fundamental unsolved problem of social organization, and therefore of political philosophy, is how to contain power. Every population has a minority in its midst that is fascinated with power, however we characterize them, whatever label we pin on them, whatever their worldview is. A few people, many of them philosophers, gave us comprehensive visions of the Perfect Society which, if you really study them, are almost invariably totalitarian when power-seekers use them, as Plato’s and Jean-Jacques Rousseau’s almost certainly were. As with the scientific anomalies, they can’t figure out what to do with what doesn’t fit the plan. Then there’s another minority: those of us fascinated with freedom, who want to be free, can’t stand to be constrained, and think we’re all better off when we have total freedom — whether we are or not. Most of the rest of the population, those we call the masses, are interested only in what affects them directly. They tend to seek security rather than freedom or power — perhaps just enough freedom or just enough power to have security. If they have to choose between freedom and security, they’ll choose security every time. I remember reaching that conclusion as an advanced doctoral student, thinking that the masses, including the masses of scientists, should be scientific or “naïve” realists, although we philosophers knew better.

Kind of an elitism of your own.

You could look at it that way, I suppose. I never accepted the Wittgensteinian and positivist view that there was no knowledge unique to philosophy, that philosophy was a method only, of clarifying language say, although obviously it involves a lot of that.

You used to think of yourself as a Libertarian, right?

I’m probably more conservative these days. The masses need tradition, convention, structure. That’s just to say that Hobbes and Hume were right: in the last analysis we’re not creatures of pure reason. We’re far more creatures of habit than we like to admit. Rationalists hate this, but none of their basic ideas have ever caught on with the general, nonintellectual population. Given what the political mainstream keeps coming up with, the Libertarian Party should have been able to clean house by now, or at least win a state-level election here and there. Creating conditions for freedom is still a problem, for the reasons just stated: the masses probably aren’t reachable on the rationalist terms Libertarians want. The contest is between whoever leads the masses: those who can create benevolent conditions for liberty, or those who want power.

Who’s winning these days?

I don’t think it’s the Libertarians, although they’re still out there.

END OF PART ONE. PART TWO IN THREE DAYS.


Donald J. Trump: Reasons He Appeals, Reasons for Hesitating

Back in January I wrote a piece about Donald Trump explaining why I was not endorsing him. The piece was rejected by the publication where I post the bulk of my commentary, which is pro-Trump with all four claws. I am caught between the two extremes: those who see Trump as the political equivalent of the Second Coming and those who hate everything he stands for. There is a lot of angst-ridden commentary out there, so much I couldn’t begin to link to it all at this point, most of it written by pundits (or pseudo-pundits) falling into the latter category.

The idea of nuanced efforts to get the Trump movement understood on its own terms, by investigating the appeal Trump has with, e.g., the white working class, seems to be verboten. My essay finally did appear, but on a site receiving little web traffic, where it was probably seen by no one (I did not receive a single email, hostile or otherwise).

Its thrust: on the one hand, Trump scares the crap out of the global financial elites, who have something on their hands they can’t control. Trump doesn’t need the money of the “donor class.” He appeals to those who are tired of struggling with economic forces they can’t control, and for which they blame the elites (with much justification). He appeals to those who are fed up with political correctness, and even more fed up with being in the one group that can be discriminated against, abused, and even physically attacked; and if they retaliate, they are “racists.” Trump appeals because he is an outsider, as they are outsiders. Control by the global elites has just about run its course. The national debt has reached $19 trillion on Barack Obama’s watch and could hit $20 trillion before he leaves office. The tepid “economic recovery” clearly has not benefited the ordinary person whose job is part-time because he or she cannot find full-time work, and because the vast majority of full-time jobs barely pay enough for a family to survive on. Is it any wonder that the masses are getting behind someone who promises to “make America great again,” even if he offers no explanation of what this means or how he is going to do it, any more than Obama explained what he meant by “change you can believe in”?

The rise of a political outsider was inevitable. If it hadn’t been Trump today, it would have been someone else tomorrow. This is to be expected in our present historical moment.

On the other hand, Trump is clearly an authoritarian. The pseudo-pundits have that much right. He would do nothing about the steady rise of police violence over the past decade or so, for example. He not only hasn’t mentioned it, but talks as if the police were somehow an oppressed group operating with their hands tied. Trump’s authoritarianism is steeped in an “America-Firstism” whose roots predate the rise of the global elites and the financialization of the economy that began in the 1970s. It rejects open borders, for example, just as it rejects the offshoring of a country’s manufacturing base. Libertarian bloggers have demonized Trump as a fascist. True enough, he doesn’t worship at the altar of the absolutely free market. It is unfortunate that libertarians confuse free markets in the abstract (which have never existed in pure form, any more than Communism has ever existed in its pure form) with the concrete reality of corporate leviathans being able to do as they please without legal restrictions.

As I put it in my nuanced logic-speak (a language almost no one understands anymore, it seems): at our present historical juncture, the rise of an outsider may be a necessary condition for what is needed, but it is not a sufficient condition. The fact that most of the public won’t have a clue what I just said signifies a major part of the problem. Education at all levels has fallen off the cliff in the U.S. Ridiculously overpriced neoliberal universities now teach vocationalism and political correctness. Students keep enrolling. Public education has always been more about teaching dependence and obedience than critical thinking skills and self-reliance. Parents keep sending their kids to them. What else can one expect besides a major fall-off in aggregate public thinking ability? Did those involved in this system really believe they could do this forever without a Donald Trump eventually appearing to take up the cudgel for those who really are being ostracized by the system (as opposed to the official, politically designated “victims”)? What were they thinking? Were they thinking?

It is now being said that the Election of 2016 will be a turning point for the country. I submit that the Election of 2016 is more likely to be a train wreck of historical proportions if Trump is the GOP nominee and Hillary Clinton is the Democratic Party nominee.

The turning point was the Election of 2012. The GOP had an opportunity to nominate an abiding Constitutionalist: Dr. Ron Paul. They threw it away. As my marginally-published essay puts it, welcome to the post-Ron Paul era. I see what now seems likely to happen as the inevitable trajectory of an empire (the U.S.) whose fortunes have already begun to decline, and will decline further over the next couple of decades. A President Donald Trump might well hasten the process! Look on the bright side: there wasn’t a single other person from either party running who could have turned that process around.


A comment on, “A philosophical interpretation of recent campus protests” by Huenemanniac

What follows is a lengthy comment on this blog entry.

I also came over here from Brian Leiter’s blog, and may be something of a johnny-come-lately because, due to work obligations, I only saw this essay last night. Disclaimer: despite my Ph.D. and publications, including books, I have been outside academic philosophy for several years now, so some might say I no longer have “skin in the game,” which is true insofar as it goes, but — trust me! — not having to worry about being laid off from one’s adjunct-instructor position liberates a person to say things he would NEVER say if he had that worry: for example [TRIGGER WARNING!!!!!] that Black Lives Matter revealed themselves (some of them anyway) to be a group of narcissistic thugs with a single hashtag (#F***Paris). And yes, having walked away from a lowly branch campus in the South where philosophy was the most poorly funded subject on campus, I can only be bemused at people in Ivy League colleges, students or faculty, who see themselves as put-upon or repressed because of a politically incorrect email. But anywise …

Professor Huenemann, suppose for the sake of argument that your assessment of the Enlightenment and privilege is correct. In that case, yes, the Enlightenment-derived, technologically advanced civilization and creature comforts we all enjoy were built on the backs of subjugated peoples, and have more blood on their hands than any of their expositors or defenders care to admit, or even think about. Although I used the phrase “for the sake of argument,” this is not merely hypothetical; last year the work of renegade economist Michael Perelman came to my attention, especially excerpts from his book The Invention of Capitalism: Classical Political Economy and the Secret History of Primitive Accumulation (2000), which demolishes the idea championed by Adam Smith and David Hume, and later by David Ricardo, that British (later British-American) capitalism developed voluntarily when multitudes of peasants flooded into the cities looking for new opportunities in the factories. Perelman asks a surprisingly simple question: why would these people abandon lives of relative independence and stability in farming or related agrarian work for lives of dependence in dark, dirty, and often dangerous factories? His answer: they wouldn’t, and they didn’t. Relying on little-known correspondence, diaries, and other documents dating from the period, Perelman shows how capitalists quietly advocated government policies that forced peasants off their land and left them with no alternative except moving into burgeoning cities, where they either worked in factories or starved. He argues that Smith, Hume, and others rationalized this with their philosophies of capitalism as representing progress, self-correcting and therefore in no need of direct regulation apart from the creation of the larger legal infrastructure.
The actual goal was to deprive peasants of their land and impose on them by force the need for “primitive accumulation” (Perelman’s term) as an instrument of control over labor. This argument emphasizes class as it existed in the rising British empire, before we even get to the slave trade!

It is sobering to reflect that much of the ideology of modern capitalism, however cashed out, is built upon a false premise as well as upon exercises of force its leading defenders today, libertarians, repudiate with their “non-aggression principle.”

But given this — I may have just strengthened the arguments for a “second” look at the Enlightenment and its leading expositors — and supposing no errors of historical fact or logic here, the troubling question remains: what can we do about all this NOW? We cannot exactly turn back time. We can acknowledge that some have always been sacrificed so that others may prosper, and it happens today in a more modest form every time an occupation performed by humans is replaced by automation. But this hardly seems any kind of basis for the moral view of the world philosophers invariably seek (unless they decide that this view of the world is an illusion!).

We may be chained by our histories, metaphorically speaking. Rising Westerners hardly invented the above practices. Human beings have been enslaving and brutalizing one another for as long as there has been a human race. If anything, we Westerners have taken more and larger steps toward eliminating the various forms the chains of slavery (chattel or otherwise) can take than any previous civilization. We have done much to reduce the level and acceptability of cruelty, at least in our own culture (although, again, IMHO we’ve regressed badly since 9/11!). We have our Enlightenment ideals to thank for what progress we’ve made, even if we’re a long way from being out of the woods: I would say that we are not out of the woods until no one can be fired from a job for saying something politically unpopular which may be factually true. The employment system that eventually evolved from the factory system has remained a form of control, after all.

If we are to have any kind of future, we must learn to break those chains and end these controls, at least within the parameters human nature permits. That will mean making peace with a past we cannot change, resolving to learn from it what we can, and — yes — moving on, into a future that cannot be better than the past any other way. It was the macrohistorian Carroll Quigley (1910–1977) who once said that the future can be better than the past, and that we have an obligation to try to make it so.


Higher Education and Race: Approaching the Nearest Cliff

Over the past month, we’ve been regaled by confrontations at an Ivy League school, Yale, and at the University of Missouri, involving allegations of racism on campuses. The former involved supposedly “insensitive” Halloween costumes and a faculty member who defended them as acts of free speech and opportunities for dialogue. At the latter, football players were, for all practical purposes, instrumental in forcing the president of the university to step down. And a video caught a person who turned out to be a professor shouting, “Who wants to help me get this reporter out of here? I need some muscle over here!” The woman, Melissa Click, had what one follow-up article described as a “courtesy post,” from which she’d agreed to step down.

What on earth is a “courtesy post”? In my 15 or so years of wandering from campus to campus to campus, I never encountered such a job category. The same article stated that she was still on the faculty of — are you sitting down? — media communications.

What hath 50 years of affirmative action and 25 years of political correctness on college and university campuses wrought?

What it’s wrought is racial near-chaos, a situation in which a single allegation is its own evidence and proof, in which students complain of “microaggressions” and demand “safe spaces”; professors must announce “trigger warnings” before discussing material that might prove “offensive” to the fragile flowers who have wandered into their classrooms.

This mess has spread from campus to campus to campus. We have now seen disruptions at Amherst College, Ithaca College, Columbia University, Claremont McKenna College (which also saw a resignation), Occidental College, the University of Kansas, Brown University, Princeton University (where students who probably can’t get Woodrow Wilson in the right half-century want his name removed from campus facilities), and Dartmouth College where black students went on a rampage through the library early this week, confronting white students trying to study and hurling obscenities in their faces in ways which could have triggered violence had any white students grown a pair and raised some loud objections. To the best of my knowledge, the administration has remained silent! At Smith College protesters went from Melissa Click’s embarrassing threat of physical violence against a journalist to a demand for “solidarity,” i.e., what amounts to a loyalty oath!

What should be clear: these kids are all on Facebook and Twitter, among other social media. Their smartphones and iPads go everywhere they go. They are networked together at a level the generation I came up with can barely imagine, much less keep up with. And they are demanding “diversity” at a level not even the generation of leftists that began to take over the universities in the 1960s was capable of. In fact, even left-liberal professors have indicated their fear of these kids. Many, in this day and age, are adjuncts or contingent faculty, off the tenure track, and therefore vulnerable to student evaluations. Enough bad ones, and they’re out of the profession. This is one of the things adjunctification has done to academia: put it in a position where a generation of crybabies and bullies – crybullies, some writers are calling them – can dictate not just policy but hiring and firing decisions. Students at Yale wanted two faculty members fired: one who stood up for free speech, and her husband, who supported her. Another video captured a black female student shrieking obscenities at one of them.

These youth have fallen hook, line, and sinker for what has been the official politically correct narrative for the past 30 years: all their problems can be blamed on present-day white racism piled on top of earlier white racism. Nothing is to be blamed on a black youth culture that is fundamentally anti-intellectual and violent. For a white person to declare himself not a racist is, by definition, proof that he is a racist; for a white student to be made uncomfortable by a black student yelling “Racist!” in his face, amidst obscenities, while he is trying to study is further proof of the white student’s racism. Few observers seem able to stand consciously outside the official narrative, where they can perceive the closed, circular “logic” here. How does one counteract it? One doesn’t. The only peaceful thing one can do is walk away — assuming one is not physically assaulted in the attempt (and it might come to that before this plays out).

Alan Dershowitz, who can hardly be accused of racism or “right wing extremism,” took the student protesters to task in a major article published earlier this week. His observations are obvious to those of us who have been at this a while, first in the trenches of adjunct teaching and then as outsider-observers. The black students do not want genuine diversity, a diversity of ideas. They want just the opposite: an academic culture of complete uniformity and coerced conformity to a hard-left ideology which views the U.S. as built on racism and the systemic repression of minorities by white males (although they are hardly being repressed now!). White males, according to this hard-left narrative, are civilization’s biggest villains — never mind that they invented science, the study of culture, and most of the technology these kids now take for granted when they tweet one another.

If they wanted real diversity, they would demand the hiring of more professors with conservative ideas. They would speak out on behalf of conservative and libertarian student organizations. They would demand not uniformity of ideas amidst a boutique diversity of faces, but a proliferation of competing ideas throughout every area of academic life: the sciences, economics and the social sciences, as well as the liberal arts and humanities. They would welcome, not condemn, statements such as the following from one of the few conservative faculty members who has survived in academia today, Mike Adams of the University of North Carolina-Wilmington (who had to fight a lawsuit over his institution’s bias against conservative ideas, which was blocking a promotion he had earned).

Here is the ideal first-day-of-class lecture, according to Mike Adams:

Welcome back to class, students! I am Mike Adams your criminology professor here at UNC-Wilmington. Before we get started with the course I need to address an issue that is causing problems here at UNCW and in higher education all across the country. I am talking about the growing minority of students who believe they have a right to be free from being offended. If we don’t reverse this dangerous trend in our society there will soon be a majority of young people who will need to walk around in plastic bubble suits to protect them in the event that they come into contact with a dissenting viewpoint. That mentality is unworthy of an American. It’s hardly worthy of a Frenchman.

Let’s get something straight right now. You have no right to be unoffended. You have a right to be offended with regularity. It is the price you pay for living in a free society. If you don’t understand that you are confused and dangerously so. In part, I blame your high school teachers for failing to teach you basic civics before you got your diploma. Most of you went to the public high schools, which are a disaster. Don’t tell me that offended you. I went to a public high school.

Of course, your high school might not be the problem. It is entirely possible that the main reason why so many of you are confused about free speech is that piece of paper hanging on the wall right over there. Please turn your attention to that ridiculous document that is framed and hanging by the door. In fact, take a few minutes to read it before you leave class today. It is our campus speech code. It specifically says that there is a requirement that everyone must only engage in discourse that is “respectful.” That assertion is as ludicrous as it is illegal. I plan to have that thing ripped down from every classroom on campus before I retire.
One of my grandfathers served in World War I. My step-grandfather served in World War II. My sixth great grandfather enlisted in the American Revolution when he was only thirteen. These great men did not fight so we could simply relinquish our rights to the enemy within our borders. That enemy is the Marxists who run our public universities. If you are a Marxist and I just offended you, well, that’s tough. I guess they don’t make communists like they used to.

Unbelievably, a student once complained to the Department chairwoman that my mention of God and a Creator was a violation of Separation of Church and State. Let me be as clear as I possibly can: If any of you actually think that my decision to paraphrase the Declaration of Independence in the course syllabus is unconstitutional then you suffer from severe intellectual hernia.

Indeed, it takes hard work to become stupid enough to think the Declaration of Independence is unconstitutional. If you agree with the student who made that complaint then you are probably just an anti-religious zealot. Therefore, I am going to ask you to do exactly three things and do them in the exact order that I specify.

First, get out of my class. You can fill out the drop slip over at James Hall. Just tell them you don’t believe in true diversity and you want to be surrounded by people who agree with your twisted interpretation of the Constitution simply because they are the kind of people who will protect you from having your beliefs challenged or your feelings hurt.
Second, withdraw from the university. If you find that you are actually relieved because you will no longer be in a class where your beliefs might be challenged then you aren’t ready for college. Go get a job building houses so you can work with some illegal aliens who will help you gain a better appreciation of what this country has to offer.

Finally, if this doesn’t work then I would simply ask you to get the hell out of the country. The ever-growing thin-skinned minority you have joined is simply ruining life in this once-great nation. Please move to some place like Cuba where you can enjoy the company of communists and get excellent health care. Just hop on a leaky boat and start paddling your way towards utopia. You will not be missed.

I have no idea how to improve on that, except to note that the U.S. is hardly a “free society”; I will assume he is referring to our professed ideals, ideals such as Constitutionally protected free speech, which these students are now openly challenging. I will again marvel that this man has survived in contemporary academia without being fired by unscrupulous seniors in his department or in the administration, and apparently without having his classes disrupted, or having to suffer the nuisance harassment of garbage strewn on his front lawn (which happened to a conservative sociologist I once knew at Bowling Green State University) or obscene phone calls at 2 a.m. (which happened to me a few times when I was living in Greenville, South Carolina). Bowling Green State actually has an informal “index” of banned books. My first one, Civil Wrongs (ICS Press, 1994), is on it.

Imagine that: an “index” of banned books at a major university in this day and age.

Welcome to higher education of the twenty-first century, as it approaches the nearest cliff!

Posted in Culture, Higher Education Generally

There Is No Such Thing As “Settled Science” (Here’s Why)

When deciding what to write and post on this blog, I am generally torn between two conflicting impulses. The first is to take note of and comment on those things that eventually drove many of us Lost Generation Philosophers from U.S. academia: the rampant dishonesty of many academic “searches,” the adjunctification of the profession, the structural misallocation of resources in many if not most universities, the sense of having to be an entertainer in the classroom to keep students’ attention, and more besides. There is certainly room for such discussion, and several good blogs cover at least the second item on the above list. The other impulse is just to comment on philosophical problems and their various applications from an outsider’s perspective: that is, the perspective of someone with no allegiance to the fashions that get people hired and tenured nowadays, and who questions the dominant assumptions of what philosophical conversation exists today (materialist naturalism, for example).

The next problem is actually getting the material read. The Internet, after all, is not what it used to be. Blog entries posted ten years ago stood a much better chance of being read by people who weren’t experts in driving traffic to websites; I know, because I was writing a blog then, which I mistakenly stopped because it was drawing too much of the wrong sort of attention (trolls). Today, on the other hand, the World Wide Web is so saturated with information that it’s a crapshoot. If your piece has a dramatic human-interest angle and appeals to people’s emotions, it might be picked up, tweeted all over creation, and “go viral.” On the other hand, it might sit unnoticed (the fate of more than 99.99% of what gets posted on the Web now). One way of getting noticed on the Web is to say something easily perceived as threatening. But that, too, has the potential to garner the wrong sort of attention, and fairly quickly!

Many of us, therefore, would prefer to publish the bulk of our material on established online publishing platforms, which of course now means confronting many of the same problems afflicting mainstream hard copy publishing, starting with staying in the good graces of the editor or editorial board who rightly has the final say on what goes on his site. And of course, such editors and editorial boards invariably gravitate towards an inner circle of familiar writers. Thus sometimes publication happens, sometimes it doesn’t. This piece was sent to three different platforms and has not appeared on any of them (it might appear on one next month), but it still strikes me as important enough to appear somewhere. This is as good a place as any. It would be nice if someone sees it. Feel free to leave a comment; feel free to redistribute, but please do me the courtesy of linking back here.

There Is No Such Thing As “Settled Science” (Here’s Why)

by Steven Yates, Ph.D.

Recently, Thomas DiLorenzo published a piece on the infantilization of university students. To those of us who did time in mainstream American academia and eventually escaped, DiLorenzo’s remarks are obvious. But elaborating this point isn’t my purpose here. In one paragraph he comments on that wonderful phrase settled science, a phrase often used to defend man-made climate change as established fact and stop all further debate. I wish to expand on this, as the basic problem may not be obvious to readers, especially if they trust academic science.

The influence of science on the modern world is a given. One would be foolish to think we haven’t come a long way in the centuries since 1543, the year Copernicus published his landmark De Revolutionibus Orbium Coelestium, which challenged the Aristotelian-Ptolemaic theory of the solar system. That same year also saw Andreas Vesalius publish his On the Fabric of the Human Body, which revolutionized the study of human anatomy. Isaac Newton’s famed Philosophiae Naturalis Principia Mathematica appeared, in three books, in 1687, and the rest, as they say, is history.

Most leading modern philosophers have been interested in science. How does it achieve its results? How does it justify them? What are its assumptions, and how do they differ from those of, say, religion? Are they better? Does science have a special method, the scientific method, guaranteed to yield truth? Is the invention of new scientific ideas a rational process, or more a matter of fortuitous circumstances? What separates science from pseudoscience? And so on. There have been vigorous debates on such questions for over a century now, and yes, philosophers of science have gotten results worth noting, often by paying attention to the history of science instead of listening to what scientists themselves say.

To make a long story short, the upshot of these debates is that there is no such thing as settled science. How we get to this conclusion is a very interesting story.

Let’s begin in the 1830s, right after William Whewell (1794 – 1866) coined the term scientist, which replaced Newton’s natural philosopher. Auguste Comte (1798 – 1857), best known as the founder of sociology but also of the philosophical ideology of positivism, offered his Law of Three Stages. An advancing civilization, Comte argued, passes through three stages. The first is characterized by various forms of supernaturalism and faith: the theological or fictitious stage, he called it, indicating what he thought of it. He did see the single God of Christianity as superior to the squabbling deities of ancient Greece and Rome. The second stage is characterized by grand system-building philosophy: edifices of thought from Plato and Aristotle down through Kant and Hegel. He called this the metaphysical or abstract stage. He thought better of it, but saw it as having substituted abstractions (e.g., “natural law” and “pure reason”) for supernatural agencies.

The third stage is the scientific or positive: we give up childish abstractions and air-castle building, and look at what reason actually does, which is to begin with experience and test, step by step, the general statements we can make about the world. We arrive at hypotheses subject to the tribunal of further empirical test and utility. With scholars having given up the quixotic effort to find absolute certainty, theories that stand up to the best tests scientists can throw at them deserve to be called knowledge. Comte’s thinking established philosophy of science as a discipline. Its working assumption was that science was tied to experience and disciplined inquiry at every point. It’s a nice idea, but it isn’t true and never was.

The fall of Newton’s physics in the face of Einstein’s relativity suggested to many thinkers that there is more to science than testing theories against experience — that science, no less than philosophy, works with grand systems of thought of a more restricted sort, and that scientific disciplines have undergone wholesale revolutions, as new sets of concepts replace old ones from the ground floor up. This idea sat undeveloped (by philosophers, anyway) for a long, long time.

In 1962, the physicist turned historian of science Thomas S. Kuhn (1922 – 1996) published his landmark The Structure of Scientific Revolutions, which outlined an account of what happens when an old theory is replaced by a presumably better one. It was the most controversial book about science of that decade. The basic ideas are well known. A mature science is dominated by its paradigm — a presumed body of achievement consisting of solved problems, the basic concepts and methods used to solve them, along with any number of remaining problems awaiting solution by “normal science,” the implication being that the methods used to solve the grand problems of the past will continue to work on lesser ones in the future. A scientific paradigm is often embodied in a major work, such as Newton’s Principia. Kuhn thought the advantage of paradigms is that scientists could explore parts of nature in detail without having to dwell on fundamentals, which need not be questioned. This may work for a time, but only up to a point. A few problems will resist solution; these anomalies will accumulate over time. Scientists will try to fit them into the familiar mental boxes of the paradigm.

A few (usually younger) will sense this is impossible. Gradually, allegiance to the dominant paradigm will break down, discussion will turn back to fundamentals, and “revolutionary science” will begin. It will end with the formulation of a new candidate for paradigm and what Kuhn calls the ensuing battle over its acceptability. If it creatively solves the problems that got the old paradigm into trouble while suggesting new lines of research, it may win the allegiance of the scientific community’s leaders, the journals will fill with accounts of their triumphs, and a scientific revolution will have taken place. A new period of “normal science” will begin, but the discipline will have changed its understanding of itself. The examples Kuhn developed in some detail included Newton’s physics replacing Aristotle’s cosmology, Antoine Lavoisier’s oxygen-based chemistry replacing the phlogiston-based chemistry of his predecessors, and Einsteinian relativity replacing Newtonian mechanics. Kuhn’s point was that a successor theory may use much of the same vocabulary but does not bring the same concepts to its understanding of its subject matter, so that for Newton mass is fixed whereas for Einstein it increases as its velocity approaches that of light. Other concepts just drop out. To Lavoisier’s predecessors, phlogiston was a principle released into the air by combustion. In his chemistry, in which combustible substances combine with oxygen, there is no such thing.

Paul Feyerabend (1924 – 1994), in Against Method: Outline of an Anarchistic Theory of Knowledge, first published in 1975, went further than Kuhn. According to Feyerabend, deep changes in science meant that there was no permanent, enduring method involved in science at all. Neutral experience, moreover, was also a myth; we always approach nature from within a theoretical perspective or worldview (my term, not Feyerabend’s). Many ideas — Platonistic, Christian, astrological, numerological, etc. — had contributed to the scientific revolution. Kuhn’s “normal science,” moreover, was a bad idea to the extent Kuhn was right about it: it turned science into rigid dogma more suited for a church. Healthy science embraced a proliferation of competing and often mutually inconsistent ideas; it had always been improved by such rivalry. Dominance of a single paradigm was more the product of the politics of scientific communities than of intellectual argument, positivistic mythology notwithstanding.

These ideas may seem complicated. In a lot of respects, they are. I’ve only scratched the surface, without providing the details of the extensive debates they provoked within the philosophical and philosophy of science communities. They are worth thinking about if we want to understand both the impulses that lead many people, not all of them scientists, to embrace the idea of settled science and the reasons there is no such animal. Science has always changed its assumptions. Narratives about reality once dominant have invariably fallen. There is no reason to think ours have any special standing.

Both Kuhn and Feyerabend were accused of introducing irrational elements into science. It was a source of frustration to both that their critics lumped them into a single “school” called “historicism,” since many of their ideas were as different as day and night. Kuhn denied the allegation; what he’d said was that scientific change could not be shoehorned into the logical formulations then in vogue in analytic philosophy. Feyerabend made fun of it, responding with satire and ridicule. His “anarchism,” he insisted, had never been more than an ironic expression of the rationalist’s predicament if the rationalist honestly compared his abstract theories of science to the real thing in all its messiness. Science is not the product of thinking machines, but of human beings in specialized communities. These communities always have a hierarchy and therefore both political and economic dimensions; normally, someone has to fund them, and this introduces nonscientific overseers into the mix. We thus have one of the issues with academic and corporate science — and one of the primary reasons for skepticism about its objectivity. This comes before we hear from scientists who will claim to have been ostracized for rejecting the official climate change (or some other) narrative, and some who have quit major organizations because they believed those organizations’ embrace of the narrative was political instead of evidence-based.

I trust it is clear how all this applies. First, what does the climate change narrative assert? It can be divided into at least two parts. There is the idea that the world, on average, is warming — that the climate as a whole is changing, which includes increasingly hostile weather in some locales. Climate change advocates, by the way, are right to distinguish weather from climate. Weather is what it is doing outside right now (sunshine, overcast, rain, sleet, snow). Climate is what happens around the world over a long period of time. This is why arguments that the U.S. and Europe are experiencing hostile winters don’t refute climate change proposals. Let’s avoid that mistake.

In an ideal world, whether the planet is in fact warming could indeed be determined by collecting data — large amounts of data, from various places, over a long period of time, all over the globe. Whether we have been collecting data for a long enough period is a question some ask. Another matter worth considering is whether ice fields and masses in the Arctic and Antarctic regions are growing or shrinking over time.

The next question is conditional: if the world is warming, then is human activity its cause? Here is where you’ll get the “settled science” claims; we’re all familiar with the 97%. This figure is disputed, however. Muddying the waters further is the realization that both sides have their own funding sources with deep pockets. It is clear that globalists such as George Soros are backing the official climate change narrative; it is also clear that the Koch brothers and corporations such as Exxon-Mobil have funneled millions of dollars into the coffers of climate change skeptics. The honest layman might have a hard time deciding whom to believe. Worse still is the strong sense that our generation is obligated to get this right; future generations won’t look back on us kindly if we don’t. The claim is that carbon dioxide (CO2) and other greenhouse gases, products of the burning of fossil fuels entering the atmosphere for decade after decade, have begun to disrupt natural climatic patterns, and that the climate’s adjustments explain the hostile weather, violent storms, and extreme cold as well as extreme heat.

From a systems point of view, the idea isn’t crazy. Systems theory tells us that systems — sets of interacting and interdependent parts or processes — always operate in an environment, always face potential sources of disruption, and are always parrying or adjusting as they try to maintain equilibrium. This sort of claim might be necessary as a justification for climate change advocacy, but it isn’t sufficient. By itself, it won’t tell us what is causing any systemic adjustments that are happening.
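The necessary-versus-sufficient point can be made concrete with a toy simulation (my own illustrative sketch; the function names, the forcing functions, and all numbers are invented for illustration and have nothing to do with actual climate models). A simple negative-feedback system parries any disturbance as it seeks equilibrium, so observing the adjustment by itself does not tell us which disturbance caused it:

```python
def simulate(setpoint, disturbance, steps=50, gain=0.3):
    """Minimal negative-feedback loop: each step, an external disturbance
    pushes the state away from the setpoint, and the system partially
    corrects back toward it."""
    state = setpoint
    history = []
    for t in range(steps):
        state += disturbance(t)             # external forcing (cause hidden from the observer)
        state += gain * (setpoint - state)  # the system's corrective adjustment
        history.append(state)
    return history

# Two quite different hypothetical causes of disturbance...
natural = lambda t: 0.1          # a steady "natural" forcing
manmade = lambda t: 0.002 * t    # a slowly growing "man-made" forcing

a = simulate(0.0, natural)
b = simulate(0.0, manmade)
# ...each yields a trajectory of adjustments; the fact that the system is
# visibly adjusting underdetermines which forcing is responsible.
```

The point of the sketch is only that systemic adjustment is compatible with many causal stories: identifying the cause requires evidence about the forcing itself, not merely the system's response.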

To opt for the view of man-made climate change, the view that CO2 and other greenhouse gases produced by industrial civilization are the causes of actual long-term warming, will indeed call for major changes in how we generate and use energy. If that view is correct, then those contending that burning fossil fuels will need to be phased out on a global scale if we wish to avoid disaster are also correct (how many major cities are located on coastlines and would be inundated?). That is what is at stake here!

The fact that there is little reason to trust academic science is working against us. There may, in fact, be no rational resolution to this dilemma. Keeping science free of politics might have sounded, to idealists, like a good idea; but if Kuhn and Feyerabend were right, it is impossible, because nearly all science is done by human beings in organizations that do not finance themselves, least of all in a culture that values entertainment more than science (entertainers become multimillionaires while non-celebrity scientists beg for grant money).

It is conceivable, however, that other forces at work today will solve this problem for us. Kuhn’s ideas about the rise and fall of paradigms in science are echoed, somewhat, by the trajectory empires take: they rise and flourish for a time, but invariably find themselves unable to solve mounting problems they face and go into decline. Empires fail for some combination of the following reasons: aggressive behaviors towards others that breed blowback, moral confusion and debauchery bred by materialism, financial corruption and destruction of the dominant currency, an inability to transmit their traditions to future generations via education, invasion and colonization by unassimilable immigrants, or official policies by governments that sabotage the very thing that enables a civilization to stay prosperous: productive activity. When a civilization’s masses consume more than they produce, as short-term pleasures displace long-term thinking and planning; when the majority grows soft and dependent, and an ethos of entitlement is built into official policy, populations become parasitical. The more resourceful members of the civilization’s intelligent and productive minority begin to flee. They take their entrepreneurial or other skills elsewhere. This usually means the decline is irreversible. History books tell multiple variations on these themes.

Western civilization has probably done the most to curb air pollution. Many Latin American cities still suffer from pollution, however, and Chinese cities are still terrible polluters. The Chinese are starting to follow the West into mass consumption, and will end up in the same place, as consumerism leads to entitlement. As I have argued at length elsewhere, we are not far from a major global reset. Our civilizational paradigm has been one of unlimited progress amidst centralization, war, debt, open borders, moral subjectivism, and dependency. This paradigm is beginning to fail massively. Those who have made long-term thinking and planning a priority in their lives will be ready for the coming reset, and will not worry overly about the when; I grow weary of people either trying to pin a date on it or asking me for one. The point is, it will happen. History and mathematics, unlike mainstream politicians and economists, do not lie.

When the reset occurs, it will become possible to devolve power from the center and begin progress towards a world of small states or other autonomous ventures (some of which will embark on projects of statelessness), along lines advocated by Leopold Kohr. As small, self-ruling polises proliferate, Feyerabend-style, one will see wide differences in philosophy and economic arrangement. It may be, in fact, that millennials flocking to Bernie Sanders will get a chance to try democratic socialism — on an actual Denmark-sized scale or smaller! If they aren’t bothering the Galt’s Gulches of the world, there will be no reason to stand in their way. We will find out which systems generate better lives for their participants.

None of these small entities, I submit, will tend to pollute their surroundings. Having attracted the best and the brightest, one or more of the new polises may well develop technologies for energy generation and use that do not rely on fossil fuels. This idea is no crazier than were those of heavier-than-air commercial flight or wireless communications or space travel. I see nothing wrong with pursuit of such technologies by those who choose and have voluntary support, regardless of the truth or falsity of the climate change narrative. Such efforts will encourage self-reliance. If by some chance the narrative contains some truth, we will heal the planet by having healed ourselves, at least in part. If it is false, by abandoning empire we will still have better and healthier communities than we have now. Along the way, those who have chosen to do so will have eschewed materialism and reconnected with God. These are the folks who will have achieved true happiness.

Posted in Higher Education Generally, Philosophy of Science, Political Economy, Where is Civilization Going?

Why Do Highly Intelligent People Do Extremely Stupid Things?

Academia, those who have spent any time there may have noticed, is a very strange place. Among other things, there is a certain amount of truth to those colorful anecdotes about absent-minded professors who, back in the days when chalkboards existed, would fill their chalkboards with equations, but walk around unkempt with shoes untied, and be likely to have left their watch or wallet somewhere. And many have been prone to alleviate their social awkwardness towards the fairer sex with a “hi cutie” or two, usually privately as they entered class.

Unfortunately, not all such events are mere anecdotes, and not all are funny. Some are career destroying. They raise questions of why obviously intelligent people do the things they do, especially when some of those things are just plain stupid in this day and age.

Several months ago, not long after I first started this blog, I wrote about the now-discredited gang rape allegation at the University of Virginia. It was never my position that unwanted sexual advances and rapes do not take place on campuses. Obviously they do. Not as frequently as feminists say, but my point here is, they do happen, and we critics of political correctness never said otherwise. Cases easily describable as sexual harassment also happen.

Consider the case that seems to have begun back in fall of 2011 between the reasonably well-known British philosopher Colin McGinn, then at the University of Miami in Coral Gables, Fla., and a graduate assistant named Monica Morrison. McGinn’s credentials are well known. A prolific writer, he has numerous books in both the philosophy of mind and the philosophy of language, as well as dozens of journal articles. He recently pushed a few disciplinary boundaries with his latest book, Prehension: The Hand and the Emergence of Humanity (MIT Press, 2015). He is being sued by Ms. Morrison, along with a colleague and his former university.

If we can rely on this account, here is what appears to have happened. During academic year 2011-12, Ms. Morrison was assigned to work with McGinn. Right off the bat we have the basis for a major power asymmetry. McGinn is well known, and so working as his grad assistant would be a plum assignment. Successful work might mean a very positive letter of recommendation later. Such letters are almost essential for finding any decent academic employment at all these days, even for women. Hence a strong motive for subservience.

McGinn began expressing an interest in Ms. Morrison which looks to have gone well beyond the appropriate and professional. As the months passed, they began exchanging emails, with McGinn expressing most of the interest and Ms. Morrison growing increasingly uncomfortable, as the emails had gone from complimenting her legs to comments about erections, hand jobs, and references to her as his “beloved pet.” McGinn, incidentally, was 62 and she was just 26. Huffington Post claims to have reviewed hundreds of pages of emails exchanged between the two.

Ms. Morrison reluctantly expressed increasing discomfort with the situation. McGinn did not back off, and so she complained to university authorities, whom she now alleges did not act seriously on the complaint. McGinn contended — still contends — that the relationship was consensual, and this was the university’s judgment, although McGinn was called on the carpet for not reporting the relationship. I have to wonder how a reasonable person can claim that a relationship is consensual when one of the partners is relatively famous and has written to the other that she would be “much better off with my support than without it.” That’s what’s known as a power play.

Both left the university. Morrison resigned as McGinn’s graduate assistant in early September 2012, filing her complaint a few days later. McGinn announced shortly before the end of the year that he would resign instead of fight the allegation; he says he was not forced out. You can read the specifics in the article I linked to above, if you are so inclined. That article also quotes significant passages from the emails — passages which, one should note, are pretty damning. It would be hard to say they were “taken out of context” or something along those lines.

Which brings me to the main point of this post. What on Earth is any university professor doing putting references to his having an erection or wanting a hand job into an email sent to any female student at his university, much less to his own graduate assistant?

One explanation is humorous bad judgment of the sort I noted at the outset, which got a bit out of control in this case. Intelligent people are prone to it. All of us who had the dubious fortune of being born with strong inclinations towards intellectual work know, if we are honest with ourselves, that we have serious failings in other areas of our lives. I am a much better writer than conversationalist, for example. Many of us are socially awkward, borderline Asperger’s cases who can’t handle small talk, or have poor aptitude with mechanical things. I can do predicate logic with three-place relations or with definite descriptions while explaining each step to students — but the last time I had a flat tire, I had to track down someone who could help me remove the tire and put on the spare (at least I had a spare).

The other possible explanation is a bit darker. McGinn, who is British, received his BPhil in 1974 and began his teaching career immediately. It is probably fortunate he isn’t an American, since that was right around the time the job market in the U.S. was starting to collapse. So if McGinn isn’t Lost Generation, he missed being such by the skin of his teeth — by not being an American and having been born in 1950. At 65, he may be the youngest person to have contributed both original and worthwhile material to the field with his new mysterianism: the idea that our minds just aren’t equipped to solve the problem of consciousness. The idea is intellectually serious, and can be expanded on: I am not sure our minds are equipped to understand either God’s existence or such matters as how (and when) time began, or how the universe began. Our ignorance is vastly, vastly greater than our knowledge.

But never mind that here. My other possible explanation draws from the fact that McGinn was the last of the lucky generation, one might call them — those who came up before job prospects for those with advanced degrees in subjects like philosophy fell apart. Everyone in today’s academic one percent falls into this category. I know of no exceptions. I’ve often commented that, in U.S. philosophy anyway, there is almost no one younger than 70 doing serious philosophical work actually advancing the field, whose contributions cannot be dismissed as political (books on radical feminism, for example).

Perhaps, as an academic one percenter, McGinn thought he could get away with a lack of professionalism with a graduate student. He found out otherwise, in this case (we have no idea how many instances of this sort go unreported, of course). This will taint his future in the discipline no matter how many books he writes. One of McGinn’s former colleagues, a minor-leaguer I never heard of named Edward Erwin, came to his defense in a personal confrontation with Ms. Morrison, alleging that she had ruined McGinn’s career. This earned him inclusion in the list of defendants in Ms. Morrison’s lawsuit, which also charges defamation in the face of generalized threats of retaliation. The fallout from this case may prevent her from having any future in the field at all.

What such cases do is give the radical feminists more ammunition. (All one need do to see this is read the comments section under the HuffPo article.)

Sadly, academic men now inhabit an environment in which a male professor is out of his mind if he approaches a female student — any male professor, and any female student. He is even stupider if he approaches someone enrolled in his class, and to make amorous advances toward someone working directly under him is taking stupid to the nth power! (Just think: if an adjunct, or even someone full-time but without tenure, were to do it, that person would be gone faster than you can say sexual harassment.) This Lost Generation Philosopher’s recommendation, which is not limited to academia: never get involved with, or make advances toward, someone you work with.

Why do I say sadly? Because I have known older faculty members who met their future wives this way, as former students or former grad assistants. That, of course, was a long time ago, when there was more sexual sanity in our culture, and more maturity generally.

Posted in Culture, Philosophy | 1 Comment

The Age of Atheism and the Eight Veils.

Over the past month or so I’ve been reading Peter Watson’s The Age of Atheists: How We Have Sought to Live Since the Death of God (2014); I’ve almost finished it. If you are a believer in any sense of the term and you pick up this book, expect to find it somewhat depressing, but Watson is mostly reliable in his account of how mainstream thinking has developed over the past couple of centuries in many areas, from the sciences to the arts and the humanities, including professional philosophy. Not that I believe his is the final word on his subject matter. I’ve read two of his previous books: The Modern Mind: An Intellectual History of the Twentieth Century (2000) and Ideas: A History of Thought and Invention from Fire to Freud (2005). The former gives a rousing account of the rise of energy technology without once mentioning Nikola Tesla (not even in an endnote). The latter treads close to the idea that Jesus Christ never really existed, much less died on a cross and was supernaturally resurrected. This will give you an idea of what to expect. This is not a review of Watson’s books, but even if you find such ideas repellent, his books should be read if a reliable account of mainstream thought matters to you. If you reject mainstream thought, it is important to have an account of what you are rejecting, and why.

Two paragraphs early in the final chapter of The Age of Atheists leaped out at me when I read them. Watson writes: “We need to remind ourselves one last time that many people—and perhaps the quieter souls among us—see no problem in God being dead. For them his death is no source of anxiety or perplexity. Such individuals may call into question Robert Musil’s claim that even people who scoff at metaphysics feel a strange cosmic presence, or Thomas Nagel’s comment that we all have a sense of looking down on ourselves as if from a great height. But such individuals are not ‘metaphysical types’ and seek no ‘deep’ meaning in existence. They just get on with their lives, making ends meet, living from day to day and season to season, enjoying themselves where they can, untroubled by matters that so perplex their neighbors. They have no great expectations that ‘big’ questions will ever be settled, so devote no time to their elucidation. In some ways, they are the most secular people of all and perhaps the most content.

“Countless others live in circumstances so meager, so minimal, so fraught with everyday material difficulties that there is no time for reflection, circumstances where such an activity is beyond their means. By such people’s standards a concern with meaning, a preoccupation with the difference between how to live a good life and how to live well, is something of a luxury, itself the achievement of a certain kind of civilization. We must accept that the search for meaning is, by this account, a privilege” (pp. 532-33).

About this, Watson is entirely correct. There are people so busy struggling to make ends meet that they don’t have either the time or the energy to pursue philosophical problems or think about their relationship, if any, to a “higher power.” But even among those who have a bit more leisure, those of us fortunate enough to have been born into relatively advanced civilizations, it is clear that the majority, even though they have the time, do not have the interest or the inclination. They may be believers, and if so, they believe because their parents and peers believed, and it never occurred to them to believe otherwise. These are the people who will look out over the Pacific Ocean from hotel balconies at a beautiful sunset and see God, or wonder how such things could be if there were no God. It is not that they would react against a scientific explanation of the sight or the colors. Such an explanation would go in one ear and out the other. They have no interest in it. Or such folks may be unbelievers. Perhaps they were never exposed to church when they were children. Or they just don’t see the relevance of belief in a God to anything in their lives, or anything in this world. A few such people may have a sense of evil, via suffering they have experienced personally or seen others experience, or perhaps they’ve read about the Holocaust or Stalin’s purges, and respond with the idea that anyone who believes there’s a God in charge of this mess has rocks in his head. Then they don’t give the matter further thought.

I am reminded of a curious essay I encountered online a number of years ago: “Slavery and the Eight Veils” by someone named Don Harkins, who edited a publication called The Idaho Observer before his death. I know nothing of Harkins or his credentials and little of his publication, nor whether the ideas in his essay are original with him or borrowed from an earlier, unreferenced source. But the essay stood out in my mind enough that I eventually penned a piece of my own, “Piercing the Veils,” which looked at Harkins’s ideas and made some changes to them. The basic theme still seems sound to me, and would explain Watson’s account of how so many people simply accept the “death of God” in secular civilization, if indeed they give the matter more than passing thought — or why other people believe, also without giving the matter much thought despite having the leisure to do so.

Consider this scenario for a civilization at any level of advancement: 90% of its people live out their lives with no serious or concerted interest in anything beyond what it takes to make ends meet and (as Harkins puts it) keep their lives together. In less advanced civilizations, of course, the percentage may be higher. In ours, it may be slightly lower. They live their lives behind the First Veil, and we may call them First Veilers.

The other 10% penetrate the First Veil. They take an interest in matters affecting their community or nation that may not affect them directly and personally, such as politics. They have a position on the issues of their time, and may be able to defend it reasonably. They will trust and support their leaders, up to the point of going to war with a foreign nation if their leaders say war is justified. But 90% of this group never go further, because they live their lives behind what we will call the Second Veil; we may call them Second Veilers. Ten percent of this group penetrate the Second Veil and discover the cycles of history, the rise of civilizations, and the influence of documents important in history such as the Magna Carta, the English Bill of Rights, the U.S. Constitution, and others. They will use such documents as a means of evaluating their leaders rather than just following them. But these Third Veilers remain trapped behind the Third Veil.

What happens to the 10% who penetrate the Third Veil? They may have suspected earlier that, e.g., the drift of the U.S. away from Constitutional government wasn’t a mere accident, and wasn’t simply a necessary adjustment to the supposed failures and excesses of laissez faire capitalism. This 10% postulates, in different ways, the following idea: behind a number of semi-secret organizations (semi-secret here means that they neither seek nor dwell in the limelight, doing work that manages to be massively influential without publicity of any kind) stands a number of very wealthy and very powerful extended families who, via ownership or co-ownership of banks and other corporations able to influence politicians and trends with money, shape economies and world events, in effect taking history in specific directions. Fourth Veilers who perceive this, often through laborious studies conducted on their own or with like-minded others, are simply dismissed by Second and Third Veilers as “conspiracy theorists,” a phrase with a specific origin (I won’t get into that here). But Fourth Veilers will often note how those behind the Second and Third Veils see First Veilers as little better than cannon fodder, to be sent (along with their children) off to fight in the wars their leaders have caused. Ninety percent of those who penetrate the Third Veil to find themselves in this “brave new world” will remain trapped behind the Fourth Veil. For the other 10%, things begin to get very interesting — and possibly relevant to our remarks on Watson’s paragraphs. The people he describes who are untroubled by unbelief, and those I described who believe without mental struggle or perplexity (and don’t much care whether you agree with them or not), are First Veilers. Many of the intellectuals, artists, and others Watson describes are Second or Third Veilers. Philosophers such as John Dewey, Jürgen Habermas, and Richard Rorty are unquestionably Third Veilers, whose take on the human condition is circumscribed by language and its limitations instead of politics, as well as by contemporary academic culture.

Many Fourth Veilers look at the present scene and conclude that the situation is hopeless. They place the global elites — the wealthiest, most powerful, and most influential extended families — on pedestals. What can we do?

For the 10% who penetrate the Fourth Veil, I hypothesize in my essay, the materialism subscribed to by Third Veilers and by many Fourth Veilers is shortsighted, for this world really is the scene of a titanic struggle between godly forces and satanic forces. For this 10%, the supernatural is real! God is in charge! He has revealed to us what He wants us to know! How do they know this? They would not say they are simply reading it out of the Bible. They would say God is speaking to them through His Word, and in a variety of ways.

Crazy religionists, right? One of the features of those who have penetrated the higher veils is the difficulty they have explaining themselves to those behind lower veils, assuming they are so inclined. First Veilers who have the time do not have the interest. To Second and Third Veilers, Fourth Veilers are simply nuts, and many won’t hesitate to say so. To many Fourth Veilers — a group which, incidentally, would include the global elites themselves — Fifth Veilers are either nuts or simply irrelevant. As Harkins explained it, those behind lower veils can no more see what is behind the higher veils than you or I can see what is behind an opaque curtain.

Assume, just for the moment, that there is something to this “veils scenario.” It places belief (in God, in the supernatural, etc.) on an entirely different footing. It makes knowledge of the Divine Order (if you will) esoteric in the original sense of that term.

But wait a minute!

Harkins said there were eight veils. This means that those who see things in terms of godly versus satanic forces are Fifth Veilers, still behind the Fifth Veil. It implies — extending our numbers again — that 10% of those who reach Fifth Veiler status will penetrate the Fifth Veil and become Sixth Veilers, that 10% of those will see what is behind the Sixth Veil, and so on. Seeing what?

I don’t know; and in a culture still quite influenced by Christianity, whether supporting it or specifically rejecting it (as Nietzsche did when he first proclaimed the “death of God”), those who penetrate the Fifth Veil will have a much harder time explaining what it is they claim to see. Are God and Satan members of some extremely advanced alien race? Are they composite beings? Are they entities we have no means of describing, as we utterly lack the vocabulary and the concepts? Did God create the world as some kind of science project? Or art project? Do such questions even make sense?

Some will say that the entire “veils scenario” collapses into unintelligibility at this point — especially if we insist that there are Fifth, Sixth, Seventh and Eighth Veils, and something specific behind them. Idle speculation, and nothing more.

I have no idea why Harkins postulated eight veils, and I don’t suppose I ever will. I do not claim to have the final word on any of this. I note it only as interesting, because it is clear that different people do have different levels of perception and cognition — sometimes sufficiently different that the metaphor of their inhabiting different worlds is useful. This explains why many people have no special interest in science, much less philosophical theology. They have no interest in understanding, e.g., the principles behind electricity so long as their light switches work. It takes a certain level of consciousness to understand why a Søren Kierkegaard struggled with his faith, as opposed to just accepting his salvation on Church authority. Perhaps, at a higher level of consciousness, faith both is and has to be a struggle. It takes an equivalent level of consciousness to understand why some people take it for granted that they have free will, while many philosophers find the idea very perplexing at best and unintelligible at worst.

Might such ideas also explain why some people find the materialist metanarrative lurking behind accounts of the “death of God” unsatisfying? Our culture, especially our academic culture, does not encourage this kind of speculation, despite its pretenses to the contrary. For my part, even if we have decided in advance that our pontifications on being behind or penetrating “veils” are only another instance of somewhat clever wordplay, I have to wonder about the potential gains of refusing to be limited by a specific academic culture, in a specific civilization, at a specific point in its history. I am willing to put forth these and other speculations in full light of the fact that, beyond a certain point, this is all they are — since none of us has any way of knowing in advance whose speculations will eventually bear fruit, or what that fruit may look like.

Posted in Culture, Philosophy | 1 Comment