Why should we study history? Why does the college where I teach require students to take a history course? I asked my students these questions today. A few of them mentioned the phrase “critical thinking” in their answers.
As Georgia State University English professor Rob Jenkins notes in a recent piece at The Chronicle of Higher Education, “critical thinking” is an overused catchphrase in American higher education.
But it can be defined.
Here is a taste:
Critical thinking, as the term suggests, has two components. The first is thinking — actually thinking about stuff, applying your brain to the issues at hand, disciplining yourself (and it does require discipline) to grapple with difficult concepts for as long as necessary in order to comprehend and internalize them.
This is important because we live in a society that increasingly makes it easy for people to get through the day without having to think very much. We have microwaveable food, entertainment at our fingertips, and GPS to get us where we need to go. I’m not saying those things are bad. Ideally, such time-saving devices free up our brains for other, more important pursuits. But the practical effect is that we’ve become accustomed to setting our brains on autopilot.
Actual thinking requires deep and protracted exposure to the subject matter — through close reading, for example, or observation. It entails collecting, examining, and evaluating evidence, and then questioning assumptions, making connections, formulating hypotheses, and testing them. It culminates in clear, concise, detailed, and well-reasoned arguments that go beyond theory to practical application….
The second component of critical thinking is the critical part. In common parlance, “critical” has come to mean simply negative — as in, “I don’t like to be around him, he’s always so critical.” But of course that’s not what it means in an academic context.
Think of movie critics. They cannot simply trash every film they see. Instead, their job is to combine their knowledge of films and filmmaking with their extensive experience (having no doubt seen hundreds, even thousands of films) and provide readers with the most objective analysis possible of a given movie’s merits. In the end, what we’re left with is just one critic’s opinion, true. But it’s an opinion based on substantial evidence.
To be “critical,” then, means to be objective, or as objective as humanly possible. No one is capable of being completely objective — we’re all human, with myriad thoughts, emotions, and subconscious biases we’re not even aware of. Recognizing that fact is a vital first step. Understanding that we’re not objective, by nature, and striving mightily to be objective, anyway, is about as good as most of us can do.
Read the rest here.