
Why College Graduates Still Can’t Think

My colleagues in the humanities, especially within English, composition, and rhetoric, often state as accepted Truth that we are the teachers of critical thinking. This bothers me for one reason: every degree program teaches thinking, and a great many of those degrees teach critical thinking effectively. The best book I’ve used on logical fallacies is the work of a computer science professor; we do not want artificial intelligence algorithms falling for poor logic. Reasoning and problem solving are central to most fields, not simply composition and rhetoric.

The humanities major that best demonstrates critical thinking, problem-solving, and overall reasoning skills is philosophy. Philosophy majors consistently outscore all other majors on graduate school entrance exams, including the GRE and LSAT. They understand mathematical logic and abstract thought. According to ETS, GRE scores for philosophy majors topped all humanities majors and most STEM majors from 2011 through 2016. Philosophers led all three score categories: analytical writing, verbal reasoning, and quantitative reasoning (math). Even physicists acknowledge that philosophers lead in these areas.

Unless you teach analytic philosophy (the specialty with the very highest scores), most humanities scholars can claim to teach reasoning skills better than STEM programs only under a curious definition of critical thinking and problem-solving. Apart from philosophers, STEM majors lead. Physics and astronomy students beat journalism and communications majors on the writing portion of these exams. Consider that outcome for a moment.

Employers don’t believe we teach critical thinking, even though many articles claim that employers want humanities majors. (The employers curiously name philosophy as their favorite humanities degree. See above, again, for why.)

How can 60 percent or more of employers conclude that graduates of colleges and universities lack critical thinking skills? They hire graduates and discover that the graduates aren’t prepared to solve problems in the workplace.

As Rob Jenkins writes, the problem rests in different understandings of critical thinking, with conflicting meanings accepted by employers and university humanities programs.

Why College Graduates Still Can’t Think

Rob Jenkins | March 24, 2017

More than six years have passed since Richard Arum and Josipa Roksa rocked the academic world with their landmark book, Academically Adrift: Limited Learning on College Campuses. Their study of more than 2,300 undergraduates at colleges and universities across the country found that many of those students improved little, if at all, in key areas—especially critical thinking.

Since then, some scholars have disputed the book’s findings—notably, Roger Benjamin, president of the Council for Aid to Education, in a 2013 article entitled “Three Principal Questions about Critical Thinking Tests.” But the fact remains that the end users, the organizations that eventually hire college graduates, continue to be unimpressed with their thinking ability.

As recently as May of 2016, professional services firms PayScale and Future Workplace reported that 60 percent of employers believe new college graduates lack critical thinking skills, based on their survey of over 76,000 managers and executives.

Clearly, colleges and universities across the country aren’t adequately teaching thinking skills, despite loudly insisting, to anyone who will listen, that they are.

How do we explain that disconnect? Is it simply that colleges are lazily falling down on the job? Or is it, rather, that they’re teaching something they call “critical thinking” but which really isn’t?

I would argue the latter.

This is a problem of language and rhetoric: employers and humanities programs define critical thinking differently. To an employer, critical thinking is the classic ability to analyze a problem and develop a solution. For too many academics, “critical” carries its contemporary sense of critique: being a critical thinker in a college classroom means being critical of injustices, which are everywhere.

Jenkins notes this divide between employers and academia.

Traditionally, the “critical” part of the term “critical thinking” has referred not to the act of criticizing, or finding fault, but rather to the ability to be objective. “Critical,” in this context, means “open-minded,” seeking out, evaluating and weighing all the available evidence. It means being “analytical,” breaking an issue down into its component parts and examining each in relation to the whole.

Above all, it means “dispassionate,” recognizing when and how emotions influence judgment and having the mental discipline to distinguish between subjective feelings and objective reason—then prioritizing the latter over the former.

I wrote about all this in a recent post on The Chronicle of Higher Education’s Vitae website, mostly as background for a larger point I was trying to make. I assumed that virtually all the readers would agree with this definition of critical thinking—the definition I was taught as a student in the 1980s and which I continue to use with my own students.

To my surprise, that turned out not to be the case. Several readers took me to task for being “cold” and “emotionless,” suggesting that my understanding of critical thinking, which I had always taken to be almost universal, was mistaken.

I found that puzzling, until one helpful reader clued me in: “I share your view of what critical thinking should mean,” he wrote. “But a quite different operative definition has a strong hold in academia. In this view, the key characteristic of critical thinking is opposition to the existing ‘system,’ encompassing political, economic, and social orders, deemed to privilege some and penalize others. In essence, critical thinking is equated with political, economic, and social critique.”

I’ve been told that I am not a critical thinker because I am not an outspoken supporter of specific social justice movements. In academia, it isn’t enough to “support” various groups. As a white male, anything else I might be or believe must be prefaced by apologies for not being sufficiently marginalized. As one colleague told me, “A white male cannot think critically about the system that benefits him.” Then why should a white male bother aspiring to critical thinking? Such attitudes aren’t uncommon. A female professor once told me I could not write about women and minorities because it was impossible for me to analyze their works. If I cannot analyze the works of others, no matter how much effort I invest, then my education was worthless.

Within the humanities, critical thinking means, bluntly, being correctly progressive. If you are insufficiently radical, you must not be a critical thinker.

Such reductive reasoning is how progressives dismiss conservatives, libertarians, neo-liberals, and each other. Yes, I’ve watched outspoken progressive professors, including self-proclaimed Marxists, turn on each other as ideologically impure and therefore obviously incapable of higher-level reasoning. If you don’t agree, you must not be thinking critically enough.

The purity tests applied to junior faculty within the humanities do contribute to the hegemony of left-leaning thought. Critical thinking has become like-minded thinking. Why would anyone insufficiently “critical” want to teach in such departments? Other disciplines (and private industry) are more welcoming.

When local employers told our university program that we needed to address problem-solving, I heard a colleague remark afterward, “If we taught critical thinking well enough, our students would burn down those companies.” I doubt engineering programs or business schools react so negatively to business roundtables.

Such attitudes only reinforce the employer-university divide that exists in the humanities.

Suddenly, it occurred to me that the disconnect between the way most people (including employers) define critical thinking and the way many of today’s academics define it can be traced back to the post-structuralist critical theories that invaded our English departments about the time I was leaving grad school, in the late 1980s. I’m referring to deconstruction and its poorer cousin, reader response criticism.

Both theories hold that texts have no inherent meaning; rather, meaning, to the extent it exists at all, is entirely subjective, based on the experiences and mindset of the reader.

Thomas Harrison of UCLA, in his essay “Deconstruction and Reader Response,” refers to this as “the rather simple idea that the significance of the text is governed by reading.”

That idea has been profoundly influential, not only on English faculty but also on their colleagues in the other humanities and even the social sciences. (Consider, for example, the current popularity of ethnography, a form of social science “research” that combines fieldwork with subjective story-telling.)

Unfortunately, those disciplines are also where most critical thinking instruction supposedly occurs in our universities. (Actually, other fields, such as the hard sciences and engineering, probably do a better job of teaching true thinking skills—compiling and evaluating evidence, formulating hypotheses based on that evidence, testing those hypotheses for accuracy before arriving at firm conclusions. They just don’t brag about it as much.)

The result is that, although faculty in the humanities and social sciences claim to be teaching critical thinking, often they’re not. Instead, they’re teaching students to “deconstruct”—to privilege their own subjective emotions or experiences over empirical evidence in the false belief that objective truth is relative, or at least unknowable.

How do we address this divide between cultures? I’m uncertain that we can, at least in the short term. If employers are the “evil” we must all oppose, then there is no way to meet the demands of the private sector without contributing to that evil. If we teach job skills, we are teaching our students that the system, including the job market, might have some acceptable elements.

Can self-avowed Marxists prepare students for the workplace without severe existential angst?

I have no problem helping students prepare for the workplace. I recognize the problems and challenges in our system and our culture, but I also see ways to address those within the democratic republic and free market systems in which we live. I believe voters can change government and consumers influence businesses. Have I been duped by the system? Not if I can critically analyze its flaws and offer realistic solutions instead of utopian dreams.

One of my colleagues told me that the sign of good critical thinking would be when everyone in the room agrees. That viewpoint concerns me. Quantitative researchers know that 100 percent agreement usually indicates something went wrong with the test. I’d rather have disagreements within a department that supposedly teaches people how to collaborate and coexist.

Apparently, though, we should all agree that everything needs to be destroyed and rebuilt. It’s academic nihilism disguised as critical thinking.

More to the point, that explains why employers keep complaining that college graduates can’t think. They’re not being taught to think. They’re being taught, in too many of their courses, to “oppose existing systems”—without regard for any objective appraisal of those systems’ efficacy—and to demonstrate their opposition by emoting.

That may go over just fine on the quad, but it does not translate well to the workplace.

A more diverse community of scholars might ameliorate the disconnect between employers and academic disciplines. But people more likely to concur with employers aren’t necessarily attracted to teaching in the humanities.

Critical thinking will continue to mean vastly different things to academics and employers.