How Higher Education in the U.S. Has Slowly Self-Destructed

There can be little doubt that at one time, the U.S. had the best higher education system in the world, rivaled only, perhaps, by institutions in Great Britain such as Oxford and Cambridge. It still lives on that reputation, as people still come from all over the world to study at major U.S. universities. For roughly 65 years now, however, American colleges and universities have been going downhill. The process has been long and gradual, and was far from obvious for a long time, but over the past couple of decades the problems have become manifest and have been accelerating. What has happened is a long story, obviously; many lengthy books have been written about the decay of higher education from numerous perspectives. I can only share a small part of mine.

If one looks at higher education in the 1950s, the era in which my generation was born, one doesn't get the sense of an enterprise that would one day head for a cliff. While things were not perfect, there is a sense in which colleges and universities enjoyed a golden age that began around 1950. Universities opened their doors to World War II veterans attending on the GI Bill, for one thing, and the student population skyrocketed. My father, a World War II veteran, went to an Illinois university on the GI Bill and earned two degrees in chemistry (a B.S. and an M.S.). Many veterans were the first in their families to go to college, which at the time was very affordable!

Academic disciplines like mine also enjoyed their golden ages during this period, especially if you were an analytic philosopher (those who took their cues from, e.g., Heidegger might feel differently). There was a general sense of accomplishment and enthusiasm. W.V. Quine, of Harvard, published his "Two Dogmas of Empiricism" paper in 1951, and it was immediately clear that an event of the first importance had just taken place. Two years later, Ludwig Wittgenstein's posthumous Philosophical Investigations appeared, bringing Wittgenstein's later philosophy together in one important volume. Dozens of dissertations and hundreds of journal articles discussed the implications of Quine's criticism of the analytic-synthetic distinction, his shift towards pragmatism, and his naturalized epistemology, which blurred the boundaries between analytic philosophy and natural science. Wittgenstein's natural language approach, too, rocked the discipline. Wittgensteinian expressions like "language game" crept into its official lexicon. The influence of the later Wittgenstein was soon felt in philosophy of science, manifesting itself in the major works of Stephen Toulmin, Norwood Russell Hanson, Thomas S. Kuhn, Paul Feyerabend, and others, many of whom had studied the later Wittgenstein's ideas closely enough to see how they applied to the special languages of the sciences, and how those languages changed over time with discovery and scientific revolution.

Its intellectual life aside, there were jobs in academia! Universities were expanding to satisfy student demand, and this meant hiring more faculty in every field, philosophy included. It mattered little that many of the new students were more vocationally oriented than moved by intellectual curiosity, for there was plenty of state money being lavished on these institutions, and it only increased when the Soviet Union launched Sputnik in 1957 and set off a panic that we Americans were falling behind! Whatever conflicts of educational philosophy arose could be minimized.

American society in the late 1950s had its dark side. On the one hand, the economy was roaring; we were in the middle of the longest and largest genuine economic expansion in history. New technologies had been appearing for decades, culminating in the newest of the new: television. The largest middle class in history was rising in stature. On the other hand, this was the era of the conformist "organization man." It was also the era of the "status seekers," the "people shapers," the incipient "power elite," the Beat Generation, and so on: indications that all was not well. The civil rights revolution was on the horizon, as decades of America's shabby treatment of its minorities caught up with her.

A cultural myth had arisen surrounding higher education: Everyone Should Go To College. It was a foolish myth, of course, and still is. There were then, and still are, many good vocations one can pursue that do not require a four-year university degree. At most, they might require an apprenticeship of two years or less. One does not need a university degree to sell real estate or insurance, or to repair televisions or other equipment. These were jobs that needed doing.

Sadly, however, employers bought the myth, and they held the purse strings. So people went to college who weren't comfortable there, and who would rather have been out earning their livings than sitting in classrooms memorizing material to spit back on tests. They continued to attend due to parental and social pressures as well as employer expectations. One of the consequences was that the value of a college degree started to drop over time. Supply and demand is real, after all: the greater the supply of anything, the less the value of any single unit. The same would eventually be true of university faculty.

While I tend towards the view that my field experienced a few golden years during the 1950s and 1960s, it happens to be the case that a lot of mediocre people made their way into tenured positions during this period of relative abundance. Many of these people might have published a chapter from their dissertations, if that much, or a book review or two, and then did nothing productive for the rest of their careers. Some didn't do even that much. Many who were somewhat more productive produced what came to be known as "secondary literature," which ran the gamut from good to mediocre to, occasionally, downright abysmal. In fairness, many of those hired during this period excelled in the classroom; that was their vocation. Some, however, did not. I encountered my share of the latter in my journey through American academia, first as a student and then as an aspiring academic.

We all know what happened in the 1960s. There is no way to summarize that complex era (which really began with the Kennedy assassination) in one blog post; again, it's been done elsewhere. Like most such upheavals, it had its pluses and its minuses. The aftermath, however, was a changed view of the universities among America's ruling elites. Higher education, having been ground zero of the late-1960s disruptions, was no longer trusted. The ruling class realized that an independent middle class, pampering its children and allowing them to turn into intellectual idealists who would criticize the system (especially its wars), was in fact dangerous to its privileges as well as its goals. A subtle attack on universities began in 1971 with the Powell Memorandum, which was widely circulated through the elite business community via the U.S. Chamber of Commerce and recommended that business take a stronger hand in shaping universities from the top. Some of its readers made their way onto university boards of trustees, and sometimes into administration. Such moves dovetailed nicely with the mid-1970s job market collapse that ended the golden years.

The Powell Memorandum named names, among them the neo-Marxist philosopher Herbert Marcuse, then at the University of California at San Diego. Marcuse's thinking derived from the combination of Marx and Freud developed at the Frankfurt School, brought from Frankfurt, Germany, to the New School for Social Research in New York City, a hotbed of the New Left. Marcuse, who had become the New Left's most respected philosopher, had a substantial following. The Marcusans, one might call them, would play a definite role in the decline of the humanities and social sciences, a decline that actually became useful to the ruling elites later, although the Marcusans get madder than wet hornets if you point this out to them.
Marcuse's most influential tract for these purposes was not one of his books, Eros and Civilization (1955) or One-Dimensional Man (1964), but rather the short essay "Repressive Tolerance" (1965). This essay promoted the idea that equal opportunity was not enough, that mere anti-discrimination laws were not enough, because in the absence of actual repressive measures against the white majority they would only preserve the latter's advantages throughout society. The gradual institution of the new repression, intended to allow blacks and soon women to thrive, began to influence the humanities, where it took such forms as identity politics, critical race theory, and so on. It would help demonize the white male as modern history's biggest villain through a process of highly selective pseudo-scholarship. It would push special studies aimed at women and minorities ("women's studies," "gender studies") in increasingly extreme directions when the results did not magically materialize across the board, and we began to hear about systemic as opposed to systematic discrimination. The radicals wanted to place white men (eventually straight white men) at a disadvantage, although no one was supposed to say that! That was offensive, the sign of a closet Klansman! That was how the new "repressive tolerance" was to work. It was selective tolerance. It tolerated some at the expense of others, as those with political agendas always do.

In the universities, the extreme lack of jobs made conformism an imperative. People visibly not acceding to the new views were simply not interviewed for the few positions that became available. Even for the rest, networking, positioning, "people skills," and the like became far more important survival skills than the sort of accomplishment that produced a Quine or a Wittgenstein, or their highest-quality commentators and protégés. This was the environment in which lost-generation philosophers such as myself went to school, and it was the environment in which we hit the job market in the 1980s or later.

What we didn't see immediately (or, at least, I didn't) was the corporatization of the universities that was occurring in accordance with the Powell Memorandum, mainly because it was happening at the upper administrative echelons, not in the departments where we were. The Powell Memorandum had implied a need to place a more business-oriented mindset in charge. Administrators rose to the occasion (complete with new "subjects" like "educational administration"), and the universities began to adopt the values of the larger culture apparent during the it's-morning-in-America Reagan years: mass consumption, commodification of everything and everyone, and a latent anti-intellectualism perhaps best expressed in Ronald Reagan's remark, years before when he was governor of California, that (to paraphrase) the state "shouldn't be subsidizing intellectual curiosity."

This was the rise of neoliberalism, whose godfathers were the economists Friedrich A. Hayek and Milton Friedman. While long in the making, neoliberalism came of age in the 1980s and became the dominant philosophy of higher education administration (along with much else!) in the 1990s. Its supposed philosophy: let the free market decide everything!

Today, of course, the full corporatization of higher education at the hands of the neoliberal mindset is sufficiently well documented that I can probably assume it here. The move towards hiring part-time faculty and phasing out tenure is one aspect of this mindset; the view of students as consumers is another. The new consumers continued to attend college in huge numbers despite steadily skyrocketing tuition. This was made possible by readily available federally guaranteed student loans. So much for the free market deciding everything. Student loans, which guaranteed that institutions would make money, made it possible to continue raising tuition to levels that would support the lavish salaries being paid to top administrators, the new buildings and campus beautification projects meant to make campuses appear more glitzy and business-friendly, the instructional technology, and so on. To be sure, less money for administration would have made it possible to pay more full-time faculty, but that is water under the bridge now. One could plausibly argue that the continued production of Ph.D.s, as if the golden age of the pre-1975 era still existed or would come back with the projected "wave of retirements," was foolish. Supply and demand again: increase the supply of x beyond all reason, and certainly beyond the actual demand for x, and the price each x can command drops like a rock.

But we should be wary of mechanical appeals to supply and demand. As this scathing blog post notes, once an entire endeavor (such as higher education) has embraced the business, profit-maximization model, it will automatically move towards replacing the more expensive with the less expensive, whether we are talking about technology or human labor. Working conditions will deteriorate across the board. With the rise of MOOCs (massive open online courses) and similar moves towards Internet-based education, which can be offered at very low cost, if students begin to prefer these to tens of thousands of dollars of student loan debt, the shift could begin wiping out tenured faculties all across the country! Tenured faculty appealing to supply and demand as their best defense against the shabby treatment of adjuncts by universities should beware of where these appeals actually point: they, too, could become expendable faster than they think! Among the things we should be thinking about is whether certain ideas, or practices, should be exempt altogether from the whims of the marketplace, if only because the functionality of the marketplace depends on them. But that is another post.

The real tragedy here is the talent lost to academia via a kind of brain drain, as the best minds refuse to be treated like crap simply because they had the misfortune of finishing their Ph.D.s after 1975. Corporatized higher education is actually a very wasteful system. I recently noted this squandering of talent in a comments section on Brian Leiter's philosophy blog, on a thread generated by a poll assessing Harry Frankfurt's comment that academic philosophy is in the "doldrums," the comment that inspired my second post here. It is interesting that of those who responded to the poll, 48% said yes, philosophy is in the doldrums; 36% said no; and 16% were not sure.

I observed again the near-absence of significant figures not in their 60s and 70s, at least in the U.S. Timothy Williamson, in his late 50s, is British; David John Chalmers, in his late 40s, is Australian. To be sure, assessing the long-term stature of a philosopher is not necessarily easy; but let's ask again: when Quine published "Two Dogmas," was there really any doubt that something important had happened? When was the last time we saw something like that? Probably when Chalmers first introduced his "hard problem of consciousness." That was in the 1990s. Nothing of that magnitude has happened since. Possibly I've just missed it, having been an outsider. I don't think so, however. If someone reads this and thinks I am wrong, feel free to post a comment drawing attention to the landmark event in professional philosophy I have missed.

The hollowing out of the discipline, caused in part by the job market collapse of the 1970s, taken further by the political correctness revolution led by the Marcusans beginning in the 1980s and triumphing in the 1990s, and finally by the corporatization of universities, also beginning in the 1980s and beyond, all led to the "doldrums" Frankfurt spoke of.

These changes happened in tandem with the changing technology of the 1990s. Some have retorted, in response to National Adjunct Walkout Day (Wednesday, February 25, 2015), that those who don't like the labor situation in academia ought to do the obvious thing and find another line of work. I submit that many potentially promising philosophers have done just that! There is no way of knowing how many, but they've probably been doing it quietly for at least 20 years now, mentally gauging the hostility of academia and deciding they'd rather be elsewhere! New technology opened a lot of doors, after all, and as every thinking person knows, many of the analytic skills that make a person good at philosophy are adaptable to computer programming, website development, design, and assorted other information technology fields. As someone who walked away from an adjunct position (in a manner of speaking), if I'm ever asked, "Where are your generation's Wittgensteins, Quines, etc.?" I'll answer, "Probably working for Google, or involving themselves in tech start-ups."

Do I need to point out that this is talent permanently lost to professional philosophy, whoever we decide deserves the blame?

My comment got an interesting response from someone wondering where today's imposing figures are in other academic disciplines as well. This poster suggested, plausibly, that philosophy is not alone in its "doldrums," and that academia is in dire straits generally.

To begin summing up: disciplines such as philosophy, history, literature, foreign languages, and so on were once not just the core of liberal arts learning; some mastery of them was at the center of what it meant to be an educated person. Today they have been largely displaced by STEM subjects, these being the subjects employers want, as "education" becomes essentially job training. Students attend unabashedly to get job skills, intelligent enough to know they will graduate as debt slaves. As Napoleon reportedly observed, the borrower is always the slave of the lender.

Higher education faces that sort of problem at the student end. At the administrative end are the misplaced priorities created by top-heavy administrations that keep getting top-heavier, empowered by the neoliberal mindset which speaks of free markets. Given today's structural realities, this means freedom for the point-one-percent to do as they please and rationalize it in free market language, while forcing servitude on the rest of us; for if we haven't devoted our lives to lining our pockets, then so much the worse for us. With its present priorities, is it any wonder that higher education is slowly self-destructing, and that the overall educational level of the public is plummeting?


About Steven Yates

I have a Ph.D. in Philosophy from the University of Georgia and teach Critical Thinking (mostly in English) at Universidad Nacionale Andrés Bello in Santiago, Chile. I moved here in 2012 from South Carolina. My most recent book is entitled Four Cardinal Errors: Reasons for the Decline of the American Republic (2011). I am the author of an earlier book, around two dozen articles & reviews, & still more articles on commentary sites on the Web. I live in Santiago with my wife Gisela & two spoiled cats, Bo & Princesa.
This entry was posted in Higher Education Generally, Where Is Philosophy Going? Bookmark the permalink.
