Open Letter to Professor C. Christine Fair, Georgetown University


“Look at this chorus of entitled white men justifying a serial rapist’s arrogated entitlement. 
All of them deserve miserable deaths while feminists laugh as they take their last gasps. Bonus: we castrate their corpses and feed them to swine? Yes.”

— (((Christine Fair))) (@CChristineFair) September 29, 2018

Professor C. Christine Fair,

Saludos from Santiago, Chile.

I’ve not written anything quite like this, and I am not quite sure how to begin it.

I’ve been around the block a few times and seen some vile stuff, but nothing quite like that remark from your Twitter feed.

Yeah, I’m a white guy, and on top of that, I’m straight as an arrow. Sue me.

I’m not going to go into attack mode, though. Others have probably done that far better than I.

I just wonder, though: what do you really think you accomplished with that tweet? Do you think such remarks do anything to heal the divisions that are tearing American society apart, much less further the mission of your institution (whatever it is these days)?

Or maybe something as breathtakingly constructive as trying to heal divisions is not your aim.

Maybe your aim in writing that was just to piss people off, so you’d get a predictable reaction.

And from the top comment on your Facebook page, it looks like you got one.

Not here. You see, Christine, I’ve been following the decomposition of academia for over 25 years now. And you know something? A lot of us, out here in the boonies, have written folks like you off.

But I just have to say what is clear: you can’t possibly be interested in what is true and factual, much less what is right, or fair, or just. If you were interested in truth, you wouldn’t have called Brett Kavanaugh a “serial rapist” when there isn’t the slightest scrap of evidence the accusation is true.

Just chalk it up to my perspective as a white guy who doesn’t live in your academic corner of the universe where all straight white Christian men with conservative ideas are history’s criminals, where we are guilty if accused, and deserve to die miserable deaths and be castrated afterwards and our nuts fed to swine while your ilk laughs.

So sorry about that.

Make the best of it.

The only thing you could accuse me of is being a masochist, and you would be right. You see, Christine, I left academia several years ago, having gotten the message that my small voice wasn’t going to change anything, and I saw stuff like this coming.

I moved overseas, married a chilena (women here really are women, not … whatever you’ve become, up there in the former Land of the Free).

I’ve noted that free-speech protections seem to apply to you. Witness this, from your administration:

The views of faculty members expressed in their private capacities are their own and not the views of the University. Our policy does not prohibit speech based on the person presenting ideas or the content of those ideas, even when those ideas may be difficult, controversial or objectionable. While faculty members may exercise freedom of speech, we expect that their classrooms and interaction with students be free of bias and geared toward thoughtful, respectful dialogue.

How nice that your wise administration has your back. Not so much for this person:

“[Kavanaugh accuser Julie] Swetnick is 55 y/o,” [Dean William] Rainford wrote. “Kavanaugh is 52 y/o. Since when do senior girls hang with freshmen boys? If it happened when Kavanaugh was a senior, Swetnick was an adult drinking with & by her admission, having sex with underage boys. In another universe, he would be victim & she the perp!”

From Catholic University’s president, John Garvey (obviously going boldly where no administrators have gone before):

Rainford’s tweets of the past week are unacceptable. We should expect any opinion he expresses about sexual assault to be thoughtful, constructive, and reflective of the values of Catholic University, particularly in communications from the account handle @NCSSSDean. While it was appropriate for him to apologize and to delete his Twitter and Facebook accounts, this does not excuse the serious lack of judgment and insensitivity of his comments.

Rainford has led the National Catholic School of Social Service since 2013. It is my desire that he continue to lead the school. But in light of these recent actions I have suspended him as dean for the remainder of this semester. Rainford understands and accepts this decision. Associate Dean Marie Raber has agreed to serve as Acting Dean during that time.

Double standard in academia? Who’da thunk it?

Well, Christine, I’d like to think I’ve made my point. Not holding my breath for confirmation of that, of course.

I presume you’ve heard: Brett Kavanaugh was confirmed to the U.S. Supreme Court earlier this afternoon (it’s 7:15 pm in my time zone as I write this).

Do have a very nice rest of the weekend, Christine.

But stay away from books like Heather Mac Donald’s new one, The Diversity Delusion (St. Martin’s Press). Your head will explode on contact. Friendly advice from your friendly neighborhood straight white Christian male. (Oh, I will also add you to my prayer list, that somehow your worldview might become a little less violent, bitter, and hateful.)

And don’t worry that I’ve started trolling you or something because I accessed your Facebook page to link to it. I assure you, I have better things to do with my time.

Sincerely yours,

Steven Yates, Ph.D., Philosophy

Writing from Santiago, Chile.


The Fate of Civilizations

Should a philosopher be interested in the trajectory of civilizations, from their rise to dominance in a region, and then the reasons why a civilization seems to lose its collective capacity and go into decline?

Most professional philosophers are not, of course, mostly because of the micro-specialization of academia generally. But suppose we can identify philosophically significant premises believed within populations as well as by leaders … premises that might empower the rise of a civilization. If these premises then start to disappear, or are removed, the civilization starts to falter.

Historically important philosophers such as Condorcet, Comte, and Marx all had theories of the stages civilizations pass through. Each believed that progress would lead to a final state of affairs, that which Hegel called the Absolute. For Comte, the ideal society was one based on the application of science to every aspect of human life. Bertrand Russell agreed. Since subsequent history has shown abundantly that science and technology are just as prone to abuse as any other human products, there is now grave doubt that a society based on science (and technology) would be the ideal.

Other writers saw civilizations as moving in cycles: as having life spans not unlike a person’s, with all the stages of life a person goes through. Edward Gibbon wrote his classic Decline and Fall of the Roman Empire. Spengler penned The Decline of the West. Carroll Quigley, in The Evolution of Civilizations, wanted to answer Spengler, as he believed civilizations in trouble could turn themselves around and continue making progress. Writing in the 1950s and 1960s, Quigley believed we were in trouble even then.

A lesser-known writer we ought to investigate is Sir John Bagot Glubb (1897–1986). His major works began to appear around the same time as Quigley’s, and for some time after. Glubb was British, the son of a Royal Engineers officer; his own military education and service (the Royal Military Academy, followed by a stint in the Royal Engineers and service in the First World War) eventually took him to the Arab world, where he settled. In 1930 he signed a contract to serve the Transjordan Government (later: Jordan). He came to command the Arab Legion. By this time he had assimilated into Arab culture, a culture he appreciated and even loved. He became a leading authority on the history of the Arabs, eventually writing some 20 books on the Middle East.

By the 1950s, however, Glubb noted with some dismay that his native Great Britain was in full retreat around the world. He’d learned that the Arabs had a vast empire a thousand years ago. He found himself studying other empires. He soon came to the conclusion that vast civilizations follow a pattern that transforms them into empires and then sees them destroyed. This led to his best known essay, “The Fate of Empires” (1978).

What pattern did he see?

He saw civilizations going through six stages, or phases. Here they are:

(1) The Outburst and Age of Pioneers.

(2) An Age of Conquest.

(3) An Age of Commerce.

(4) An Age of Affluence.

(5) An Age of Intellect.

(6) An Age of Decadence.

Glubb would agree, that is, with the idea that civilizations are usually not conquered but fall from within. Let’s consider each stage in a bit more detail.

(1) An Outburst may follow the appearance of some new ideal that captures the imagination of a population, often expressed in a founding document; the U.S. Declaration of Independence and Constitution surely count as such documents. This, Glubb notes, has happened elsewhere. It happened with the emergence and rise of Islam. What follows is the start of a rapid expansion.

(2) This expansion is called the civilization’s Age of Conquest. Those leading the Conquest become national heroes. Heaven help any other cultures unfortunate enough to be in the way. Ask the indigenous cultures that populated North America as the U.S. expanded westward during the early to mid-1800s. For that matter, ask those who got in the way of Roman expansion, or in Napoleon’s way.

(3) With territory claimed, an Age of Commerce ensues. Farms and factories are built, trade routes are laid down, a single language is spoken, and a single administrative system falls into place across the region. It is during this period that the seeds of trouble get planted, however. For as the first native fortunes are accrued, those building them start to notice the power money gives them. Power otherwise unavailable within the political and administrative system. This they find fascinating!

(4) An Age of Affluence begins (there will be considerable overlap between this and its predecessor). To all appearances, the High Noon of a civilization is its Age of Affluence. Because new technologies are appearing and the builders of fortunes create millions of jobs, the standard of living rises exponentially. With sufficient surplus wealth floating around, large universities can be created and endowed, research institutes formed, etc. The generations that follow experience the results of this overall rise in prosperity but not the effort that went into them, and this, too, presages trouble. Moreover, making money starts to become an end in itself and not a means to advancing the common good of communities.

(5) An Age of Intellect begins, also overlapping with its predecessors. Almost every major community will soon have its college or university. Some of these will be very good at this stage; others will be mediocre. With the basic necessities of life now assured for a sufficient fraction of the population, the acquisition of academic honors starts to replace honors achieved through military conquest and even commercial achievement. At the same time, the Age of Intellect is marked by the appearance of disputations that, more and more, seem to lack seriousness in the sense that they don’t address real problems. They may well be steeped in false premises, not recognized as such because they are never really tested against the world but protected within academia’s safe groves. Inevitably, such disputations turn to the foundations of the civilization itself, be they religious or otherwise. A civilization begins to drift as its first premises are called into question by its ostensibly best minds. Intellectual and eventually political leadership is thus beset by quandaries and doubts that did not exist before. Questionable decisions will be made, some involving which intellectual groups to support with lavish funding. Some will have bad consequences, as bad books are written and absorbed into a growing media culture. A result is that the moral “fiber” that holds communities together starts to unravel. This is accompanied by rapid “creative-destructive” advances in technology, achievements of convenience, and so on, which often lead to massive differences between parents and children, adding to the uncertainty.

(6) Ages of Decadence call for lengthier attention. An Age of Decadence is marked by all or most of the following.

(a) Monetary policy is less and less responsible; as when, for us, financialization replaced production as a means of wealth-generation, allowing production to be outsourced to third world nations for cheap labor, all in the name of enhanced profitability. Civilization is by now highly centralized, so that monetary policy affects everyone within its borders in one way or another, and for that matter, will affect other societies that are trading partners.

(b) Rapid cultural changes are urged; these are eagerly embraced by some populations but not others, leading to rising division and dissension.

(c) There is rising alienation, as institutions of all sorts cease to serve persons and become expansive, impersonal bureaucracies serving only themselves. (Incidentally, think of the replacement of personnel departments with human resources departments, the implication being that human persons are resources not different in kind from other resources.)

(d) Increased frivolity sets in, as celebrities and sports stars replace achievers of the past who made genuine contributions to the civilization.

(e) Women begin to move into professions previously dominated by men, sometimes for economic reasons as the currency is devalued, wages flatline, and families need two breadwinners instead of just one.

(f) Immigrants begin to flow into population centers, the difference being that immigrants of the past learned to speak the dominant language and assimilated into the dominant culture while those of this new period do not. The result is that subcommunities form, and the capacities of schools, hospitals, and other institutions are overwhelmed by a babble of foreign languages. Some of these subcommunities are actively hostile to the dominant culture, furthering already existing divisions. (We see this happening in Europe, a civilization clearly in its Age of Decadence.)

(g) There is rising dependency on the instruments of the state sometimes for basic necessities. This may be because families have split up and communities have become divided, leaving elderly couples stranded and without other help; it may be because profit-driven outsourcing has resulted in a lack of jobs that match the skills of the population. It may be because of growing chronic health conditions resulting from imbibing unhealthy food, products of other questionable (but profitable) decisions.

(h) Schools fail to educate. Documentation of this is ignored. Educators begin to leave the profession out of frustration. The sources of their frustration may range from the growing indifference and unruliness of students, from bureaucratic interference with their teaching methods and content, or from pay so low that it fails to meet their basic expenses. Schools fill up with mediocrities and become less and less functional.

(i) Religious belief, healthy patriotism, a sense of duty to the common good, respect for matters of learning, and other commitments aggregated under the label tradition are replaced by materialist consumerism, a love of money, frivolity, and cynicism. These encircle the individual, who is increasingly isolated if he refuses to commit to them. A kind of pessimism suffuses the body politic however papered over with “eat, drink, and be merry.” Pessimism and anxiety will be reflected in literary, artistic, cinematic, musical, and other cultural products.

(j) An irrational fascination with sex of every variety comes to suffuse all cultural and commercial activity. We see its results all around us: distrust and hostility between the sexes, extramarital affairs, marriages breaking down or not happening at all as people choose to stay single (much easier to have multiple affairs that way), and the appearance and mainstreaming of practices previously rejected, sometimes as immoral but sometimes just on public health grounds. A general sense of the cheapness of human life manifests itself in the widespread acceptance of such practices as abortion. An added sense of the postmodern fluidity of truth is employed in their defense, which speaks euphemistically (e.g.) of a “woman’s reproductive rights” without the added observation that the “right” under discussion is a right to kill another human being with impunity.

(k) Finally, and most dangerously, civilizations, having entered their Ages of Decadence, take on the full characteristics of empire: over-expansion, whether politically, economically, militarily, or in some combination of all three. They become aggressive toward other nations, often seeing themselves as entitled to those nations’ resources, or to profit from having gotten those nations hopelessly entangled in debt (see John Perkins, The New Confessions of an Economic Hit Man, 2016). These same governments and corporations become increasingly aggressive toward their own citizens. Decisions are made on the basis of expedience, not principle. Those who criticize this system are driven to the margins, though they can appeal to increasingly alienated populations and sometimes gain an audience for their ideas. Among those ideas, if they are to retain a following, is hope. Satisfying that hope, however, is predicated on a fundamental change in the collective consciousness. Such change may go against the will (and the profit margins) of the corporate-state, and therefore be resisted violently by those in power if it catches on.

Is it not clear that the U.S. (and indeed, much of the rest of Western civilization) is deeply mired in its Age of Decadence?

I shouldn’t have to argue the point!

Those who keep up with current events saw the pathetic spectacle of the hearings on Supreme Court nominee Brett Kavanaugh and Dr. Christine Blasey Ford’s allegation of sexual misconduct at a party when he was 17 and she was 15. There were abundant reasons to believe the Democrats were holding onto the allegation just in case they could not take Kavanaugh down on his legal qualifications and experience. The sense that Dianne Feinstein had little intrinsic interest in Dr. Ford’s complaint illustrates the cynicism of our times. As matters ensued, we were treated by Senate Judiciary Committee members to a recounting of the exact words used by teenagers in Kavanaugh’s high school yearbook as if they constituted evidence, something that did not occur even 25 years ago when Anita Hill accused Clarence Thomas of sexual harassment. What is clear, however, and this speaks to the sense of the fluidity of truth mentioned above: if you’re a liberal or progressive, you believed her. If you’re a conservative, you believed him. Moreover, if you’re a liberal or progressive, you saw his visible anger as the arrogant outburst of an entitled white male of privilege. If you’re a conservative, you saw it as the moral outrage of someone falsely and very publicly accused. This all exemplifies a divided nation. There is no clear way of ascertaining the truth, as what little evidence there is, is testimonial, and doesn’t support either story unequivocally (how could it?).

We get into messes like this because, first of all, in a culture saturated with sexuality and sexual innuendo, sexual misconduct is bound to occur. To that extent, her story becomes somewhat believable. In a culture of distrust between the sexes, moreover, in which allegations become weaponized for whatever reason, false accusations are bound to be thrown around. To that extent, his story becomes believable. Consider, moreover, social media technology, which research shows allows people to group themselves voluntarily into silos, echo chambers where their premises and conclusions won’t be challenged. This is human nature, if you think about it. Result: divides grow until they are all but unbridgeable, views on the other side of the aisle are seen as illegitimate, and public differences of opinion threaten to turn violent.

An Age of Decadence will be characterized by distrust. This distrust will manifest itself in countless ways, some very visible and others little more than nuisances. An example of the first is the highly intrusive vetting for positions such as a lifetime seat on the Supreme Court, a process bound to be very public in our age of total media saturation. Given that men are now guilty if accused in this environment, this may eventually ensure that no one, no matter how well qualified, will want such a position. We aren’t to that point yet, but why would anyone in his right mind want to endure what either family, Kavanaugh’s or Dr. Ford’s, has had to endure? (As an example of the second, the nuisance factor: the other day I was temporarily locked out of my PayPal account because of a typo in my password when I tried to log in. The system put me through several security hoops to prove “I’m me.” The sad fact is, in this age of hackers, most such measures are justified.)

Returning to Sir John Bagot Glubb: he documents, from excursions into the histories of Greece, Rome, Persia, the Ottomans, and others, that we’ve never seen a civilization turn around from an Age of Decadence and regain its original foundation. The sexuality genie in particular is unlikely to go back into the bottle. Our monetary foibles are reaching a critical stage as debt of all sorts continues to mount. The U.S. national debt is unpayable and continues to grow by leaps and bounds. Many of the U.S. federal government’s larger legal obligations will eventually be unpayable. A lot of student loan debt will not be paid, even as those struggling to pay it sacrifice major expenditures (e.g., housing) that would contribute to the economy. Add to this the fact that other nations, recognizing the dollar’s loss of value and increasing fragility, are starting to “de-dollarize” (do business in their own currencies).

All of this, of course, leaves the future of the U.S. very uncertain, no matter who is president, no matter which party controls Congress, no matter which technologies promise to emerge tomorrow to increase our convenience and save us from ourselves.

What often happens as an Age of Decadence runs its course is that the old order simply collapses, whether slowly or rapidly. There is a vast loss of influence and sometimes territory. This happened with the British Empire. It happened with the Soviet Union. It is likely to happen to the U.S., after it happens to the European Union. At the culmination of an Age of Decadence, institutions lose their capacity to enforce the rules because those in them lose their will. The system itself loses legitimacy. Citizens will already have turned inward, either to “tending their own gardens” as it were, or acting with their fellows to actively separate, which is the start of the building of replacement institutions in a new culture. There are innumerable persons and communities, some within the borders of the U.S., some elsewhere, who have to all intents and purposes seceded from a political economy they see as dying.

What should a philosopher have to say about all of this? Er, plenty, it looks like, although very few philosophers are saying anything (most are, er, “tending their own gardens” in their safe comfort zones of academia).

As I stated at the outset, a philosopher should look to the first premises guiding any civilization, explicitly or tacitly, and get positioned to evaluate them. This will be the topic of my next few posts.



Why Is Philosophy Important? An Expanded Comment

Daily Nous, the philosophy blog, recently posted a query raising this question in response to an undergraduate who had fallen in love with the subject. Presumably she’d gotten some flak from friends or maybe family. The blog’s editor, Justin Weinberg (South Carolina), solicited and received a number of responses. Most were interesting and worthwhile. One was from yours truly. Reviewing it, I decided to expand on it here because I think more can be said about why philosophy is important. Some of it I’ve said before, but it bears repeating.

First, and as my comment noted (perhaps a bit too brusquely for the delicate tastes of most career academics), very little academic philosophy is important. It provides a paycheck for those fortunate enough to have found jobs in the field, or who didn’t eventually abandon them out of frustration.

Let me envision two roles for philosophy that could secure its importance in civilization. I will call them philosophy as service and philosophy as thought-leadership.

Philosophy as service will center on critical thinking and the analysis of language, offering a kind of mental housecleaning. This is appropriate for the academic setting if the instructor approaches it in the right way, warning in advance that some people might feel their toes are being stepped on. A good course in the subject should provide a student with a sense of what it means to support a conclusion with reasons (premises) and why this might matter. The student should learn what makes reasoning cogent or fallacious. Ideally, students will become less prone to fallacious reasoning themselves and quicker to spot it in others. A student should also come away from a philosophy course alert to the fact that not everything in our reasoning is stated openly. One’s beliefs might (usually?) contain hidden premises. How we identify these, and what we do then, will be crucial.

Philosophy might also draw attention to what seem to be the limits of our reasoning. Reason alone cannot answer every possible question or settle every dispute. First premises are notoriously difficult to prove or disprove, after all. Otherwise they would not be first premises!

A more practical focus on language in philosophy ought to alert us all to the fact that there are plenty of people in this world who use language as a means of control or even domination, sometimes as the equivalent of a weapon. Words or phrases, carefully selected, will encourage some lines of thought while inhibiting others. The political and commentary spheres provide an abundance of examples. Any reasonably intelligent person should be able to go to any popular newsfeed and find a dozen examples in less than a half hour.

If anything will hobble this approach to philosophy as service, as mental housecleaning, it is that, as an academic subject, philosophy has been self-limiting and self-deprecating for well over a century now. Much of this was due to its deference to science in matters epistemic. From Auguste Comte on, positivists and their descendants saw themselves as, at best, handmaidens to science in the sense that Aquinas saw philosophy as a handmaiden to theology. For a long time, this was understandable. Unfortunately, philosophy as handmaiden to science tells us little about how to evaluate all manner of recent scientific developments ranging from nuclear weapons to genetic engineering to artificial intelligence and beyond. As Dr. Ian Malcolm (played by Jeff Goldblum) quipped in the film Jurassic Park, “Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”

Positivism is therefore dead and buried, one of our worst modern wrong turns. But self-limitations on philosophy have remained. As I’ve noted previously, the analytic tradition, whether in its formal or natural language varieties, developed powerful techniques but never used them to their full potential. Used to their full potential, philosophical analyses of how words and phrases have crept into the general lexicon, and what they are used to do, might shed great light on how those seeking control over others’ thought accomplish this. Did Wittgenstein not say near the end of the Tractatus that asking, What do we actually use this word or proposition for? repeatedly leads to valuable insights? It also matters who the speaker is, how he or she self-identifies, where he or she is, i.e., at what level of which hierarchy, etc.

If one needs examples, consider the phrase conspiracy theory. A simple search would turn up dozens of usages. What are these usages attempting to do? This example illustrates how any good analysis of a term or phrase should include its origin and history; the origin of this phrase, with the Central Intelligence Agency back in 1967, is known. The CIA’s aim, in introducing the usage, was to circumvent, a priori, all serious discussion of ideas or theories those in power did not want around.

Or consider the term homophobia, which for over 20 years now has been used reflexively in response to conservatives who criticize the homosexual lifestyle and its political and legal protections. What is a phobia? The term has a recognized use: an irrational fear of something (think of a legitimate usage, e.g., agoraphobia). Use of the term therefore automatically suggests that critics of homosexual conduct and its promoters are by definition irrational. That which is irrational is not to be answered with logic but cured with therapy. Hence the power of the term to misdirect and confuse. Good philosophical analysis should unearth this, but typically does not, for obvious reasons: it quickly runs up against powerful prevailing political and cultural currents. (Use of the terms transphobia and Islamophobia indicates that the -phobia suffix is spreading. Why not play a successful meme for all the mileage one can get from it?)

At present, if one is interested in these kinds of usages of language as instruments of control, one will glean far more from the writings of Aldous Huxley and George Orwell than from Wittgenstein or Quine or any of the other heroes of mainstream twentieth-century Anglophone philosophy. The former, of course, did not have to worry about offending those signing their paychecks, or being blacklisted within the profession for having offended the wrong people with their words or supposed conduct.

There is another, more ambitious role for philosophy, however, which rejects roles as handmaiden to something else, some other enterprise. This is the role of philosophy as thought-leadership.

The best role philosophy could play in present-day civilization as a repository of thought-leaders is in identifying, clarifying, and critically evaluating worldviews.

By worldviews we do not mean personal opinions. We mean usually tacit but still fairly comprehensive systems of thought that direct civilizations through their institutions (governing, mediating, etc.), manifesting themselves in culture.

These are not theories that philosophers simply spun out of their imaginations, although past philosophical theories influenced them. Those in leadership positions in society, or simply in dominant ones, state or imply worldviews through their choices of words and phrases, or through influential choices of what they see as important. Afterwards, worldviews may operate as unstated premises in discussions of public issues.

Supporters of these premises may hold them so deeply that they do not see the need to state them openly. They may think anyone who rejects them (also implicitly) is pernicious, or evil. This may be one of the reasons why those on opposite sides of, e.g., the conservative vs. progressive divide increasingly tend to fly at each other’s throats instead of sitting down across a table and finding out what the other is thinking.

What they should do is explore their worldviews. Even if they still did not agree, they would have more clarity on what they were disagreeing about. They would surely not be any worse off. They might even find common ground and recognize a common foe.

Is there a dominant worldview in the West right now? Is there more than one, perhaps vying with each other for dominance? I have identified materialism as dominant even if it takes more than one form, and Christianity, once dominant, as still its chief competitor. It, too, takes more than one form. These qualify as worldviews whatever their other features, because they fully suffuse all significant aspects of the lives of those immersed in them. They define reality for that group.

A worldview will usually be expressed in some core text such as the Holy Scriptures or in key statements such as Darwin’s theory or Russell’s “A Free Man’s Worship.” It will find expression in a culture’s art, its music, what its leading voices see as of value or important, and sometimes in political ambitions. Why have some civilizations’ leaders taken it upon themselves to try to dominate the world, or as much of it as possible? Because their worldview defines not just empirical reality for them but all that is good and superior. They see universal allegiance to their worldview as the path to Utopia. Communists saw revolution against the bourgeoisie this way, in accordance with the historical laws of dialectical materialism (the Marxist phrase). Global corporate capitalists since the fall of the Soviet Union have seen the superiority of a consumption-oriented marketplace as key to general material prosperity, not just for Westerners but for everybody in the world. This, to them, is superior to all else.

This brings us to: is a prevailing worldview helping us or harming us … or, perhaps, helping some (perhaps empowering them) at the expense of others? Does identifying and examining worldviews help philosophers engage systems of power and propaganda, doing what Noam Chomsky once described as the responsibility of intellectuals: “to speak the truth and to expose lies” (see his essay “The Responsibility of Intellectuals”)?

The academic system doesn’t encourage any of this, of course. It doesn’t encourage my service role for philosophy in this form — not really. Which is why most critical thinking courses are just logic courses that leave out their most important potential applications. As that great comedian and social commentator George Carlin once wryly observed, the last thing the truly powerful, owners of the leviathan corporations, want is a population of critical thinkers. Much less do those in dominant institutions want publicly accessible critiques of their worldview.

But philosophy still has a job to do, if it is to be a force to be reckoned with — and otherwise, why consider it important? The two roles for philosophy I outlined above are fundamentally flip sides of the same coin, for doctrinaire and controlling language is bound to be worldview-embedded. How to carry forward this kind of project is the question that those who see philosophy as important should be asking themselves and each other, as should nonphilosophers concerned about where Western civilization is going (if it is going anywhere).

If the self-identified professionals ever get out of their office cubicles, or break free of various ideologically induced blinders, whether to look at their language or to consider the role of worldviews in modern advanced civilization — if at least some can courageously rise above their present stations, engage these kinds of questions, and see where they lead — then yes, philosophy as a discipline will clearly be important. Some, I firmly believe, are up to this task. They will be the thought-leaders of tomorrow if the West is to survive.

Posted in Academia, Language, Philosophy, Where is Civilization Going?, Where Is Philosophy Going?

“Should I Pursue a Doctorate in Philosophy These Days?”

Should you even consider getting a doctorate and going into academic philosophy today? Even if you find the subject endlessly fascinating, and you have talent for it?

The question comes up occasionally on forums. Someone I am “friends” with on Facebook floated the idea. He posted that he was seriously considering it. (I’ve not met him, just read a few of his writings, thought many of them were both interesting and good whether I agreed or not, and he responded favorably to a friend request.)

I don’t often comment on his page, as I don’t really know him or his circle of real friends, but this time I felt moved. I advised against.

It has since struck me that others might find these reasons of interest, assuming those others happen to find their way to this humble, low-traffic philosophy blog — which includes not just philosophy but also the business of philosophy.

No, I would NOT recommend going into academic philosophy.

I speak as someone who did, obviously. In many respects I am still paying the price.

(1) The first and most obvious consideration is the job market for philosophy PhDs. It was horrible when I started (1980s), improved a little (not much) in the 1990s and early 2000s, and collapsed again with the financial crisis of 2008–09, during which a number of vulnerable people not in tenure-track positions lost their jobs — at least in the state where I was then working (South Carolina). Maybe things were better elsewhere, though I have no reason to think so.

The gradual replacement of tenure track jobs with part-time, adjunct positions has attracted some attention, moreover. Neoliberal administrators like hiring adjuncts because they save the institution money — so that they can spend it instead on that new building or pay for the latest campus beautification project while their corporate board (it might as well be) pays them six figures plus perks.

Media attention paid to adjuncts happened mainly because some were discovered to be, for all practical purposes, homeless, a handful had died from treatable conditions, and groups were forming attempting to unionize and bargain for better wages and working conditions (most do not have their own office space or computer terminals, which surely helps them build credibility with students who these days are going massively into debt to go to college).

Eons ago (back in saner times), adjunct faculty were usually retired professionals willing to share their years of hands-on expertise in a subject by teaching a class. For this they may have received a small honorarium. Some retired professionals did such things just to keep busy.

Most of today’s legion of adjuncts, many of them newly minted PhD’s and not retired professionals, will not find decent-paying academic employment. Many, if eventually saddled with family responsibilities, will be forced to leave academia in search of decent-paying work, as was the case after the job market initially collapsed in the 1970s.

(2) Long-time readers of this blog know my view that academic philosophy is basically a mess. Many of those in the profession would say otherwise. There are, after all, plenty of books published by academic presses, plenty of conferences held, an abundance of backslapping at national events, and every so often someone makes a splash in the waters of intellectual life with something that gets read and discussed. No one says there isn’t an abundance of activity. But when push comes to shove, these are not the days of truly first-rate minds like W.V. Quine and Ludwig Wittgenstein, or even Michel Foucault if you lean Continental. These are not even the days of Thomas S. Kuhn, Paul Feyerabend, or Richard Rorty. These philosophers did not merely make ripples with their works. Quine’s “Two Dogmas of Empiricism” caused a tsunami, as it were. As did Wittgenstein’s Philosophical Investigations and (much to the chagrin of their critics) Kuhn’s Structure of Scientific Revolutions and Feyerabend’s Against Method.

Those days were gone by the 1990s. “Feminisms of the week” had ensued; and while the field had always been prone to fashions, the latest owed more to their conformity with rising political agendas than to the kind of intellectual prowess of the above-listed works. If you are in an officially designated “marginalized” group, now expanded to include sexual minorities, you’ll receive at least some added attention from search committees that could lead to a tenure-track job. This could cut the other way, however, since most committees do not want to hire someone they fear would be disruptive, or might turn their department into an ideological war zone (yes, it does happen; I was there and I saw it).

What you’ll also risk is being “branded.” That is, you’ll be expected to contribute to the “literature” of your tribe, as it were. Today this includes not just “feminisms of the week” but philosophy “from a queer perspective” or now from a “transgender” perspective. And if you step out of line, e.g., by “misappropriation,” not writing from within the unique perspective of the tribe you’re writing about — even inadvertently, having written something well-intentioned — you’ll be punished. If you don’t believe this, “google” Rebecca Tuvel’s name (or go here).

You’ll end up walking on eggshells around the politically protected — or just the administratively favored. Academia is not a place where you want to make enemies, something very easy to do. Conceivably, this is because so much of the work is of so little importance.

Let me qualify this. It is not as if pivotal historical figures like Aquinas, Hume, Kant, Nietzsche and Wittgenstein are being driven out by professionalized agitators seeking to “expand the canon.” That’s an exaggeration. But the writings of “dead European white guys” are clearly overshadowed these days, deemed less relevant in an age of “inclusion” or simply as “unexciting” (i.e., hard). While I’m in no position to take a survey and find out, I would love to know how many “feminist philosophers” of whatever stripe, or “gay philosophers” or “transgender philosophers,” or whatever next year’s favorite “marginalized” group will be, can outline and evaluate, from memory, Aquinas’s cosmological argument, or Hume’s criticism of miracles, or the second version of Kant’s categorical imperative, or offer coherent thoughts on what Nietzsche might have meant by “God is dead,” or offer some original, informed, and thoughtful commentary on the strengths and limitations of Wittgenstein’s later philosophy. These are things my generation needed to be able to do. Questions about such figures and their key contributions were on my prelims (a series of both written and sometimes oral exams doctoral students have to pass before advancing to actual candidacy).

Summation: academic philosophy has declined. The decline may be irreversible. Those able to reverse it are either struggling to survive in multiple part-time jobs that leave them little or no time for sustained scholarly endeavors (this is what I would deem marginalized in an accurate sense of that term), or they have left for greener pastures and taken their talents with them.

(3) These are not days when subjects like philosophy are taken seriously at the administrative level, or necessarily by average students. In this neoliberal age, there is no money in them. If anything, they use university “resources” and don’t give anything back. The department I was in when I lived in South Carolina was the most poorly funded on campus. I felt supremely lucky to have escaped the axe in 2008–09, because at the time I needed the job! Today, entire philosophy departments are actually being closed down at some institutions (Wisconsin–Stevens Point is an example; Western Illinois I think is another; more are likely to follow). Those with tenured positions at such places are losing them!

In my experience, the majority of students do not take the subject especially seriously. I had many students who appeared to expect good grades just for showing up. In a sense, I get it. As mentioned briefly in passing, students are now paying through the nose to attend a university, or going massively into debt. Doubtless most are conscious of this, and want to make sure they graduate employable. They come to a philosophy class and wonder what studying Aquinas or Kant contributes to their future employability, and when they come up empty, they grow restless. They’ve been indoctrinated to think of themselves as consumers as well as students, future inhabitants of our mass-consumption paradise. Because they have grown up in a media-saturated, entertainment-saturated culture, the professor who is a cross between Socrates and Seinfeld has something of an advantage in class. Can you do that? is a question I would ask a prospective academic philosopher. Are you willing to do it?

Let’s take note of another obvious sea-change of the past 20 years: the rise of mobile devices. There is probably no one in any advanced nation in the world who doesn’t own at least one. A recent study shows that social media has greatly shortened the average attention span (it has been measured as less than that of a goldfish). Students are now addicted to instant gratification, and the addiction is borderline-physical. Other studies have documented that checking Facebook on your phone and seeing the latest “likes” on your posts actually supplies a dopamine rush to your brain that reinforces the behavior. This means that millions of social media addicts are literally unable to go more than a few minutes without checking their phones. Tell students at the start of a class to turn their phones off, and by the end of a 50-minute class they may actually be feeling physically uncomfortable — the discomfort of the addict who needs his fix!

This is the landscape you’ll have to navigate if you decide to embark on a doctoral program in philosophy (or in many other subjects, for that matter). Incidentally, it will begin with not pissing off the wrong people while you’re still a graduate student. I knew a guy who did this simply by being an outspoken Republican, and he eventually became president of the campus Republican group. That was the 1980s. Reagan was president. The situation is orders of magnitude worse today, with Trump in the White House. Today, on some campuses, outspoken Republicans are called “Nazis” or “fascists.” They risk being physically assaulted. Cases are too numerous to link to individually, and there are likely many cases we don’t know about.

(4) Obviously, if you want to navigate this minefield, it’s your decision. It’s your life. In that case, choose a “ranked” doctoral program you’ll be hired out of. There are programs where you can learn a great deal, of course, and avoid most of the trendy rubbish. University of Pittsburgh seems to be such a place, or at least it still was a decade ago. University of California at Berkeley may seem zoolike because of all the adverse publicity surrounding conservative speakers there, but graduates of the school’s philosophy doctoral program tend to find good jobs. Numerous important recent philosophers have taught there: John Searle, Hubert Dreyfus, and the above-mentioned Paul Feyerabend, are just three examples. The department continues to have a top-flight reputation.

If you must embark on getting a PhD in philosophy, do your homework. Interview those in a prospective department, even as they are interviewing you. Presumably you are there because you have some idea of where you want to specialize, and because your prospective AOS (Area of Specialization) is well represented there, and known by others to be well represented there. Ask for the ratio of those who eventually found tenure-track jobs out of the program to the total number who sought jobs — the latter including part-timers and those forced to take nonacademic employment.

Keep in mind, too, that if you end up in this final category, you’ll have to get used to being told you’re “overqualified” for whatever bullshit job you might find yourself applying for, increasingly out of desperation. Entrepreneurship is a possibility, but getting a doctorate in philosophy will not give you entrepreneurial skills. It might even do the exact opposite, by encouraging you to write for the tiniest and most academic of audiences material that will be light years over the heads of average readers. Let’s note in passing that the number of people who read books has also fallen off dramatically during the social media era.

(5) Given the World Wide Web, there are plenty of venues for discussing philosophical problems — and for all I know, some of them might be monetizable (for those who have asked, this blog doesn’t get enough traffic to make it worth the effort).

You can lecture on a YouTube channel if you’re so inclined (again, I’m not).

You can do podcasts.

My point is, there are ways of involving yourself with philosophy, and with other philosophers, that do not subject you to the abuses of academia, and to a discipline that is arguably slowly and painfully killing itself.

As I said, though, it’s your decision. Good luck. And remember: you were warned.

Posted in Academia, Higher Education Generally, Philosophy, Where Is Philosophy Going?

“Anti-Intellectualism and How Fascism Works”: A Comment

I followed the link from here to IHE’s “Anti-Intellectualism and How Fascism Works,” an interview with Jason Stanley (Yale) who has authored a book entitled How Fascism Works. I’d been thinking of posting a comment, but discovered that the comments thread had been closed by the site administrator. This seems odd, since the interview is less than two days old.

Ninety comments appear. While comments sections do often degenerate into pointless slugfests, except for a very few posts this one did not strike me that way. While there was sustained and sometimes vigorous disagreement, some fundamental issues were being raised. I’ll leave it to readers to ponder why the comment section was closed so soon.

In any event …

Admittedly this was a short interview, and I’d been anticipating something longer; still, conspicuous in its absence was a clear, concise definition of what the author means by fascism. Why is this important? Not just because the word is in his book title, but because of the way the term is thrown around and never defined. How unobservant do you have to be not to notice that fascist has become one of the big demonizing, weaponized words of the day?

The closest Stanley comes to a definition is this (it is, as we see, a definition not of fascism but of a variety of fascism he calls fascist anti-intellectualism).

Fascist anti-intellectualism sets the traditions of the chosen nation, its dominant group, above all other traditions. It represents more complex narratives as corrupting and dangerous. It prizes mythologizing about the nation’s past, and erasing any of its problematic features (as we see all too often in histories of the Confederacy and the Reconstruction period, or of the treatment in history books of our indigenous communities). It seeks to replace truth with myth, transforming education systems into methods of glorifying the ideologies and heritage of the members of the traditional ruling class. In fascist politics, universities, which present a more complex and accurate version of history and current reality, are attacked for being places where dominant traditions or practices are critiqued. Fascist ideology centers loyalty to power rather than truth. In fascist thinking, the university is simply another tool to legitimate various illiberal hierarchies connected to historically dominant traditions.

I don’t question that this was true of Hitler’s Germany and Mussolini’s Italy. But a version of this same thing is true of any totalitarian ideology, including those of the left such as Communism, the primary difference being that they “mythologize” about their futures instead of their pasts. They surely “replace truth with myth …”  I hope no one seriously believes universities under Communism presented an “accurate version of history and current reality …”  Surely we recall the Lysenko case.

This aside, in the absence of a definition for its key term, one has to suspect that the subtext is just another attack on President Trump and his supporters. The defense of elites here, however qualified (“Our suspicion of elites and what could be seen as anti-intellectualism can be healthy at times;…”), surely supports this interpretation, since it was anti-elitism that Trump successfully appealed to from the get-go.

Thus we see more of the same: possibly yet another lengthy ad hominem argument, with no real analysis. No analysis, that is, of why and how a guy with no previous experience in the political arena was able to trounce sixteen Republican competitors and then go on to defeat the Democrats’ and cosmopolitan elites’ anointed candidate. However small the margin in the Electoral College, and whoever won the popular vote, the point is: Trump won. How was that possible? Why did it happen?

Could it be because the mainstream of both political parties has collapsed, has simply lost credibility with the voting public? Could it be, too, that alternative sources of information readily available on increasingly sophisticated Internet platforms were successfully challenging dominant narratives? The latter would explain the cold war against “fake news”; the latest gambit in this war is Alex Jones’s InfoWars being kicked off Facebook and numerous other social media sites. Whatever one thinks of Alex Jones, it is hard to see this as anything other than a move by those who see their current mission as establishing Ministries of Truth.

Returning to the closed comments thread, one comment leaped out at me. The author signs himself only as “Cultural Anthropologist”:

As a Professor Emeritus who has just completed fifty years of teaching at a Ph.D. granting university, I know for sure that the statement “universities…present a more complex and accurate version of history and current reality” is wantonly false. Also false is “Above all, the mission of the university is truth.” The mission of universities today is to advance “social justice,” “diversity,” and “inclusion” (but not of Asians). At least in the social “sciences,” humanities, education, and social work, the mission is to advance a far left wing ideology about society, to undermine the West and Western civilization, to negate liberal rights and protections in favour of statism and identity categories, and to push forward practical methods for implementing “social [in]justice.”

All true. It has been a long time since truth was central to the mission of academia, or education at any level.

But if inculcating herd behavior and obedience to authority are the major aims, with departures punished by ostracism at best and career destruction at worst, then higher education of the past 40 years or so has been a stunning success!

“Cultural Anthropologist” was, after all, immediately attacked by subsequent posters, the first of whom accused him (her?) of “spew[ing] … bile.”

Another demanded evidence, making me wonder what cave he (she?) has been living in for going on 30 years now.

Admittedly it’s just a comments thread, but this is the sort of thing I’ve been talking about for a long time, and it’s hardly limited to comments threads.

And perhaps Jason Stanley defines fascism in his book, which I’ve not obtained as I am outside the U.S.; obtaining hardbound books in English in a timely manner where I currently live is possible but extremely expensive, and truth be known, resolving such matters as this is not my highest priority just now.

I’ll conclude by noting … I’ve no idea whether anyone reading this will believe me or not (or will care): the Right is far from getting everything right. I’ve no wish to defend what the Koch Brothers do, and I’ve certainly no desire to defend the transformation of universities according to the “business model.” It seems to me, however, that this model wouldn’t have been so easy to implement over the years had academia truly had as its mission the discovery and communication of truth during the decades that preceded the current tendencies, much less in the present.



Posted in Academia, Books, Election 2016 and Aftermath, Higher Education Generally

Philosophers and Social Media: A Comment

Those who read last week’s note will probably say, “Wow, that was a short break!” This is a comment, though, not a stand-alone essay like many of its predecessors — despite its having grown longer than I intended.

Should philosophers “do” social media? Rebecca Kukla (Georgetown University) says yes, they have much to gain — especially younger philosophers (for a very short excerpt go here). She is not my favorite academic philosopher. I explain why here. She is among those I label pseudo-marginalized.*

Some of her observations on social media usage are interesting in their implications, however, as are Brian Leiter’s. I’ll confine my observations to Facebook, since I’ve used it more and know it better than any other social media platforms.

Some people, academics or not, despise Facebook and refuse to use it. Their reasons aren’t all that clear. Facebook has become a corporate empire, one of many in the tech world, but that’s not necessarily a reason to avoid it. Its being in bed with the CIA, the NSA, and probably a dozen other shadowy federal agencies may be more telling. Facebook stores your information, but it’s hardly alone in doing that. If you refuse to use Facebook out of fear for your privacy, you are naïve. Privacy went out the window with email.

Does Facebook censor? Of course it does. I’ve known people put in “Facebook jail” (their term for it) for politically “insensitive” posts, especially about Jews (surprise, surprise). I’ve not had a problem, maybe because I don’t post about the “Jewish problem.” This despite defending Donald Trump from what I consider myopic, incompetent criticism. I’ve penned countless exposés of academic political correctness and corporate media dishonesty.

What I suspect: the upper echelons of the Facebook world disdain political discussion generally. I’m not sure I blame them. The platform wasn’t designed for that. Moreover, the research is coming in: social media are among the dividers in American society. People have a tendency to congregate with those like themselves, who share their beliefs and opinions, especially in politics. Facebook unintentionally encourages this. Its system of friending, liking posts, commenting, etc., sets up feedback loops of positive reinforcement. Don’t like a friend’s posts? Ignore them, and eventually you won’t see them. Or unfriend him or her. Thus the formation of echo chambers, whether of the right or of the left or anywhere in between.

Some Internet users, moreover, had become “keyboard commandos” who found it easy to insult or bully those outside their echo chamber before Facebook was around. Now, it is as if the differences between online and offline worlds have begun to blur. Public incidents we would never have heard about 30 years ago are now filmed on mobile devices, uploaded to social media, and viewed almost instantly by millions of people. Victims of this sort of thing become involuntary celebrities. Or perhaps better, celebrities-in-reverse, since we aren’t celebrating them but shaming them. Online shaming has almost become a sport!

I think this is a reason we are living in a more hostile society generally. While pundits (Steven Pinker comes to mind) tell us how much violent crime has dropped during recent decades, such measures don’t reflect cyberbullying, personal attacks, shaming incidents, etc., none of which are illegal (some may enter what is, at best, a gray area).

Left and right, unaccustomed to opposition because of online lives in their echo chambers, are more and more willing to demonize and confront one another violently. To be clear: my boots-on-the-ground sources tell me it is usually the left that gets violent first. But those on the right are increasingly willing to get in their faces. The latter aren’t afraid of guns like the former. Were a situation like those we’ve seen in Portland, Ore., and — yesterday, as I write this — in Berkeley, Calif., to get out of control, there’s no reason to think leftists would win even if they have superior numbers.

Facebook did not create our current divisions, of course. But it set the stage for accentuating and aggravating them.

All that said, Facebook has advantages. Through its networking possibilities I’ve formed a few strong friendships with people I would never have heard of otherwise, rediscovered folks I went through high school and college with, and maintained friendships that would have fallen by the wayside when I relocated geographically several years ago.

There are, moreover, hundreds of private groups on Facebook devoted to every conceivable subject, including philosophy. Many of these groups are closed, and don’t allow insulting other members, or bullying, or trolling. Their administrators post rules up front and do not hesitate to expel those who refuse to follow them. Such groups can be useful venues for conversation, advice on mundane problem-solving, support for those coming to them with more serious issues, and more.

Many who use Facebook just use it to announce family events (vacations or anniversaries), the way we used to do with photo albums in pre-Internet days. I think Facebook’s algorithms are more attuned to such usages. At the start of the month I posted an anniversary photo taken of my wife and me four years before, on the day we got married. It received over a hundred “likes” and dozens of congratulatory comments. I’ve seen this happen countless times.

On the other hand, my political posts rarely get more than five “likes,” unless I’ve shared a video. Somehow, that increases the number, probably because watching a video is less demanding than reading something. Absent a video, with just a link to an article or story and a paragraph or two of commentary, many posts don’t seem to be seen at all. (I’ve no means of knowing, of course, how many people “lurk,” i.e., read my material without doing anything to announce their presence.)

Enter Rebecca Kukla, who (speaking of social media generally) calls it “our main opportunity to craft our public persona and to forge connections with other philosophers.” She adds that staying off social media “can actively harm your career, while using it wisely can actively help you, and can enrich your professional and intellectual life.”

There are no a priori reasons it can’t do this. Her discussion converges on Facebook, where many of her observations parallel mine, in that it creates space for professional contacts that open doors, especially for younger scholars, by having “created a vast set of interlocking philosophical communities.” She continues:

Through Facebook (and to a lesser extent Twitter) I have been exposed to, had conversations with, and formed friendships with a dramatically wider range of philosophers than I otherwise would have. My philosophical community is no longer bounded by geography, by job status, by age or social identity, by type of institution, or even by subfield or methodological approach. I expect most of us tend to disproportionately make Facebook friends with ‘people like us’ to some extent, but there is just no doubt that social media has broadened many philosophers’ exposure to different kinds of scholars, issues, and conversations. Junior philosophers who give these communities and exposures a pass are missing out on something that could enrich their intellectual and social lives, and they are forgoing crucial networking opportunities.

This makes sense, but there are dangers she wants readers to be aware of. One of the features of the Facebook world (this was true of the online forums that preceded it) is that you never know who might stumble across it, or even seek it out when they want information about you. We all know this, but how easily we tend to forget it “in the heat of the moment” (my quotes, not hers). Hence the environment, she says, “is fraught with peril. An online fight with the wrong person or a post that rubs people the wrong way can do real damage.” The norms are still evolving, she adds. Posts intended for a particular audience that will read them favorably might be read quite differently, and negatively, by readers outside that loop.

All entirely correct. Kukla thus assembles a list of online best practices for younger philosophers, especially those struggling with a hostile job market or perhaps dealing with rejection from an academic journal. Don’t sound off about it online. It makes you sound bitter and uncollegial. She advises against posting trivial stuff — or material likely to be seen as trivial or juvenile by those a jobseeker may be trying to impress. She suggests creating a separate Facebook account for family, friends you went through high school with, and nonacademic friends generally.

But here’s a thought: is it possible to “do philosophy,” i.e., to do more than simply try out ideas or banter about philosophical issues, on Facebook or other social media platforms? Kukla again has many valid points about the latter of these; she says little about the former. What she does say is to refrain from dismissing entire areas of philosophy, from dismissing philosophers who are well thought of, and from engaging a given philosopher’s post without doing some basic research to find out who they are (an easy mistake I once committed).

One must ask whether this kind of platform is really suitable for philosophical research (as opposed to networking, testing out new ideas on colleagues, etc.). Why? Because most of us originally majored in philosophy in order to “do philosophy,” not merely banter about it. Facebook wasn’t invented with that in mind, though. Nor was any other social media.

“Doing philosophy” on an independent blog such as this is hard enough! I have not done as much as I intended. I did not plan a news site like Leiter’s (who can compete with him on that, and why would anyone want to?). The Internet is simultaneously liberating and limiting! It is liberating in the sense that I don’t have an editor or referee board making trivial criticisms that I’m using this or that term “unclearly” when the truth is, he dislikes my main thesis or conclusion. On the other hand, the lack of oversight means taking full responsibility for what appears here, and seeing to it that what results is as good as I can make it! A couple of extra pairs of eyes would be helpful, but as an independent scholar with a different occupation, I don’t have that luxury! What limits me is a paranoia that what I have is not good enough! Hence a trove of things sitting in Word files!

Blog entries, moreover, no matter how thoroughly they argue a philosophical thesis or tackle a quandary, and no matter how well they play by the rules of citing relevant literature, etc., are never cited in journals or in The Philosopher’s Index. Much of academic philosophy’s reporting system on the philosophical work out here is still stuck in the pre-Internet era. Having said that, yes, you will find philosophy on blogs that is simply lousy: unoriginal, poorly reasoned, etc.

But all this is an aside. Social media is here. We might as well use it if we can, if it solves certain problems like networking. But do we need to use it to advance philosophical conversation?

Leiter observed that the areas of philosophy he is most familiar with (e.g., philosophy of law) don’t make much use of Facebook. Younger philosophers will say he’s dating himself, and I suppose I’m dating myself too, for I think he may be right. Most of us, of the generation now in its 50s and 60s — the first “lost generation,” some of us, anyway — grew up without computers. My generation had no social media when we were graduate students. I don’t believe researching the material that went into my dissertation the old-fashioned way — hours of library work, consultations limited to senior faculty in my department red-penciling my work — was significantly hurt by this. We may have been limited by technological doors not yet built, much less opened. Would Facebook have helped? I don’t know. Given that we were a rambunctious lot who rarely hesitated with our opinions, and in a “nonranked” department to boot, Facebook might have been disastrous for us.

What “Facebook philosophy” I’ve seen has been superficial, sometimes reinventing the wheel, sometimes taking positions that have been argued against effectively outside their preferred orbits, sometimes arguing theses so kooky and outlandish no one is going to take them seriously (e.g., about how we can know there are extraterrestrials among us). Much political-philosophical discussion, frankly, very much fits into the universe I described at the outset, in which bodies of like-minded folks have congregated because they work essentially from the same ideological premises. Academics are no less prone to the echo chamber effect than anyone else. Kukla — again: surprise, surprise! — eventually falls into this trap, advising readers:

…don’t trust people with fundamentally terrible values. The misogynists and the bigots and the Trump voters on your page are likely to harm you, because they are harmful people with no moral compass. Arguing across such large divides is emotionally exhausting and pointless anyhow. Just get rid of them and protect yourself.

Who gets to decide whose values are “terrible”? Is it obvious who has a legitimate grievance versus who is a mere “bigot”? How many “Trump voters,” I wonder, has she actually met and engaged, online or otherwise? Some of her neighbors may be “Trump voters,” after all. Is she saying that 63 million of her fellow Americans “have no moral compass” because they voted for Donald Trump?

This, of course, is the sort of arrogance that alienates career academics from their fellow Americans, even if their tenured status enables them not to have to care. It vitiates some of her earlier advice, while confirming what the research tells us about what social media might be doing to us.

What conclusions should be drawn from this? Kukla is right that academic philosophers — and those who aspire to be — have opportunities to use Facebook or other social media while exercising caution appropriate to their personal situations. They can bring their work to the attention of others. This can have positive results. They should be aware that what they say online can have negative repercussions, however.

Nothing in my experience suggests that social media is of any help in “doing” philosophy. It sounds pedantic, but those who built the Western tradition, and later the various schools (analytic, continental) did just fine — and probably much better — without it. The problems with producing quality philosophy today have little or nothing to do with social media, though, and everything to do with the structural problems of academia and of a prevailing political economy in both academia and the larger society that is hostile to values presupposed by philosophy.

Real philosophy is difficult to produce, which is why we see so little of it these days. Discussions of fundamental philosophical problems, developments of extended arguments and counter-arguments against one or more premises of someone’s attempt to tackle such a problem, are bound to be far more involved than is possible on Facebook — which does have a limit on the length of a post or comment (as I’ve discovered by running up against it a few times). Substantial contributions to philosophical debate cannot be composed in one sitting, like the majority of Facebook posts. They are not off-the-top-of-your-head events.

Again, and in sum, social media were not designed with long, involved, nuanced essays and followup conversations in mind. These call for concerted attention and effort on the part of both writer and readers. If anything, research is showing that social media is actually shortening users’ attention spans. Blogs open some possibilities, but even they are limited, as I’ve discovered. Are those of us who blog about the philosophical and larger academic community, events in the larger society that might bear on intelligent conversation, etc., really parts of an independent intellectual vanguard, as we like to think of ourselves, or are we just borderline-narcissists venting in our private echo chambers?

Time will tell, but however many contacts I’ve made or maintained on Facebook, I don’t expect to see any major philosophical breakthroughs there, or on any other social media platforms.


*The pseudo-marginalized:

(1) invariably have tenure, typically at influential institutions almost guaranteeing visibility. Georgetown is not an insignificant university;

(2) strongly identify with identity politics, and hence can’t write without constant reminders to readers how prone to mistreatment they are, and how mistreated are those in their preferred group(s);

(3) are often bullies, without being aware that this is how they are seen by those not in their preferred group(s); Kukla’s blithe disdain for “Trump voters” is a case in point, as was her attack on philosopher of religion Richard Swinburne, someone whose work we theists find interesting and valuable;

(4) have no sense of the contradiction between their privileged status (tenure) often attained by their institution’s preferential policies, and their wearing the mantle of victimhood almost as a badge of honor; and finally,

(5) are mostly clueless about how power really operates in industrial and post-industrial civilization, and from where (what sorts of institutions) it emanates. As long as they are swinging broadsides at windmills of white-maleness (or straight white-maleness or straight white-Christian-maleness), we can expect their cluelessness on such matters to continue.


Taking a Short Break from LGP

Hello. Those who chance to browse around this site may have noticed the dearth of posts in July, not that I posted a great deal during previous months. I’ve a few items planned, nothing completed, but truth be told, for the past several months my focus has not been on philosophy. Focusing on philosophy when you are not teaching and not independently wealthy is simply not a live option.

Thus for these past several months, last month in particular and most likely for a few months to come, my focus has been on developing what is likely to be my occupation for the next 10 to 15 years (hopefully): copywriting. I am doing what I need to do in order to learn the job and do it effectively. This takes huge chunks out of my day including my writing time, meaning that there is less time for projects like this that don’t make a contribution to it … but could eventually benefit from it.

I have taken note of which past posts seemed to generate the most traffic: the review of Stefan Molyneux’s book The Art of the Argument rose to the top far and away (doubtless because a number of sites with far more visibility than mine discovered it and linked to it); the posts (e.g., this, and this) on the follies and foibles of contemporary academic philosophy collectively came in second, and interestingly, the more specific the post, the better the traffic; the posts on important twentieth century philosophers, on Wittgenstein, and on “Consciousness and the Brain” also did well. Sadly, my observations on thinkers such as Leopold Kohr have done wretchedly, as have those on topics such as globalism. This is unfortunate, because both need a broader audience and wider discussion. The former definitely has something to say regarding the latter.

All of this is noted for future reference in any event, and comments and suggestions from readers are always welcome.

In the meantime, you (you’re still there, right?) can expect either a handful of much shorter posts (shorter than this one) or no posts at all for the remainder of summer and possibly for much of the fall. Rest assured, where philosophy is concerned, I am never that far away.
