
The best books on Critical Thinking

recommended by Nigel Warburton

Do you know your straw man arguments from your weasel words? Nigel Warburton, Five Books philosophy editor and author of Thinking from A to Z, selects some of the best books on critical thinking—and explains how they will help us make better-informed decisions and construct more valid arguments.

Interview by Cal Flyn, Deputy Editor

It’s been just over two years since you explained to us what critical thinking is all about. Could you update us on any books that have come out since we first spoke?

There are two recent books that I’d like to add to my critical thinking reading list.

Calling Bullshit by Carl Bergstrom and Jevin West started life as a course at the University of Washington. It is a book—a handbook really—written with the conviction that bullshit, particularly the kind that is circulated on the Internet, is damaging democracy, and that misinformation and disinformation can have very serious consequences. Bullshitters don’t care about truth. But truth is important, and this book shows why. It is focussed on examples from science and medicine, but ranges more widely too. It’s a lively read. It covers not just verbal bullshit but also bullshit with statistics (particularly in relation to big data) and about causation, and it has a chapter on bullshit data visualisations that distract from the content they are about, or present that data in misleading ways. Like all good books on critical thinking, this one includes some discussion of the psychology of being taken in by misleading contributions to public debate.

In How To Make the World Add Up, Tim Harford gives us ten rules for thinking better about numbers, together with a Golden Rule (‘Be curious’). Anyone who has listened to his long-running radio series More or Less will know how brilliant Tim is at explaining number-based claims – as I read it, I hallucinated Tim’s reassuring, sceptical, reasonable, amused, and patient voice. He draws on a rich and fascinating range of examples to teach us (gently) how not to be taken in by statistics and poorly supported claims. There is some overlap with Calling Bullshit, but they complement each other. Together they provide an excellent training in how not to be bamboozled by data-based claims.

[end of update. The original interview appears below]

___________________________

We’re here to talk about critical thinking. Before we discuss your book recommendations, I wonder if you would first explain: What exactly is critical thinking, and when should we be using it?

There’s a whole cluster of things that go under the label ‘critical thinking’. There’s what you might call formal logic, the most extreme case of abstraction. For example, take the syllogism: if all men are mortal, and Socrates is a man, you can deduce from that structure of argument that Socrates is mortal. You could put anything in the slots of ‘men’, ‘Socrates’ and ‘mortal’, and whatever you put in, the argument structure remains valid. If the premises are true, the conclusion must be true. That kind of logic, which can be represented using letters and signs rather than words, has its place. Formal logic is a quasi-mathematical (some would say mathematical) subject.

But that’s just one element of critical thinking. Critical thinking is broader, though it encompasses that. In recent years, it’s been very common to include discussion of cognitive biases—the psychological mistakes we make in reasoning and the tendencies we have to think in certain patterns which don’t give us reliably good results. That’s another aspect: focussing on cognitive biases is part of what’s sometimes called ‘informal logic’, the sorts of reasoning errors that people make, which can be described as fallacious. Strictly speaking, they’re not always logical fallacies. Some of them are simply psychological tendencies that give us unreliable results.

The gambler’s fallacy is a famous one: somebody throwing a die that isn’t loaded has thrown it three times without getting a six, and then imagines that, by some kind of law of averages, the fourth time they’re more likely to get a six, because they haven’t got one yet. That’s just a bad kind of reasoning, because each time you roll the die, the odds are the same: there’s a one in six chance of throwing a six. There’s no cumulative effect and a die doesn’t have a memory. But we have this tendency, or certainly gamblers often do, to think that somehow the world will even things out and give you a win if you’ve had a series of losses. That’s a kind of informal reasoning error that many of us make, and there are lots of examples like that.
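To make that concrete, here is a minimal simulation sketch (my own illustration, not from any of the books discussed; the function name and the number of trials are arbitrary) showing that, for a fair die, the chance of a six on the next roll stays at one in six even after three rolls without one:

import random

def chance_of_six_after_three_non_sixes(trials=200_000):
    # Estimate P(six on roll 4 | rolls 1-3 were not sixes) for a fair die.
    conditioning_cases = 0
    sixes_on_fourth = 0
    for _ in range(trials):
        rolls = [random.randint(1, 6) for _ in range(4)]
        if all(r != 6 for r in rolls[:3]):  # the gambler's situation: three rolls, no six yet
            conditioning_cases += 1
            if rolls[3] == 6:
                sixes_on_fourth += 1
    return sixes_on_fourth / conditioning_cases

print(chance_of_six_after_three_non_sixes())  # prints roughly 0.167, i.e. still one in six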

I wrote a little book called Thinking from A to Z which was meant to name and explain a whole series of moves and mistakes in thinking. I included logic, some cognitive biases, some rhetorical moves, and also (for instance) the topic of pseudo-profundity, whereby people make seemingly deep statements that are in fact shallow. The classic example is to offer a seeming paradox—to say, for example, ‘knowledge is just a kind of ignorance,’ or ‘virtue is only achieved through vice.’ Actually, that’s just a rhetorical trick, and once you see it, you can generate any number of such ‘profundities’. I suppose that would fall under rhetoric, the art of persuasion: persuading people that you are a deeper thinker than you are. Good reasoning isn’t necessarily the best way to persuade somebody of something, and there are many devious tricks that people use within discussion to persuade people of a particular position. The critical thinker is someone who recognises the moves, can anatomise the arguments, and draw attention to them.

So, in answer to your question: critical thinking is not just pure logic. It’s a cluster of things. But its aim is to be clear about what is being argued, what follows from the given evidence and arguments, and to detect any cognitive biases or rhetorical moves that may lead us astray.

Many of the terms you define and illustrate in Thinking from A to Z—things like ‘straw man’ arguments and ‘weasel words’—have been creeping into general usage. I see them thrown around on Twitter. Do you think that our increased familiarity with debate, thanks to platforms like Twitter, has improved people’s critical thinking or made it worse?

I think that improving your critical thinking can be quite difficult. But one of the ways of doing it is to have memorable labels, which can describe the kind of move that somebody’s making, or the kind of reasoning error, or the kind of persuasive technique they’re using.

For example, you can step back from a particular case and see that somebody’s using a ‘weak analogy’. Once you’re familiar with the notion of a weak analogy, it’s a term that you can use to draw attention to a comparison between two things which aren’t actually alike in the respects that somebody is implying they are. Then the next move of a critical thinker would be to point out the respects in which this analogy doesn’t hold, and so demonstrate how poor it is at supporting the conclusion provided. Or, to use the example of weasel words—once you know that concept, it’s easier to spot them and to speak about them.

Social media, particularly Twitter, is quite combative. People are often looking for critical angles on things that people have said, and you’re limited in words. I suspect that labels are probably in use there as a form of shorthand. As long as they’re used in a precise way, this can be a good thing. But remember that responding to someone’s argument with ‘that’s a fallacy’, without actually spelling out what sort of fallacy it is supposed to be, is a form of dismissive rhetoric itself.

There are also a huge number of resources online now which allow people to discover definitions of critical thinking terms. When I first wrote Thinking from A to Z, there weren’t the same number of resources available. I wrote it in ‘A to Z’ form, partly just as a fun device that allows for lots of cross references, but partly because I wanted to draw attention to the names of things. Naming the moves is important.

“People seem to get a kick out of the idea of sharing irrelevant features—it might be a birthday or it might be a hometown—with somebody famous. But so what?”

The process of writing the book improved my critical thinking quite a lot, because I had to think more precisely about what particular terms meant and find examples of them that were unambiguous. That was the hardest thing, to find clear-cut examples of the various moves, to illustrate them. I coined some of the names myself: there’s one in there which is called the ‘Van Gogh fallacy,’ which is the pattern of thought when people say: ‘Well, Van Gogh had red hair, was a bit crazy, was left-handed, was born on the 30th of March, and, what do you know, I share all those things’—which, as it happens, I do—‘and therefore I must be a great genius too.’

That’s an obviously erroneous way of thinking, but it’s very common. I was originally going to call it the ‘Mick Jagger fallacy,’ because I went to the same primary school as Mick Jagger (albeit not at the same time). People seem to get a kick out of the idea of sharing irrelevant features—it might be a birthday or it might be a hometown—with somebody famous. But so what? It doesn’t mean you’re going to be Mick Jagger, just because you went to the same primary school. In the end I called it the Van Gogh fallacy, and it’s quite amusing to see that it’s actually now got some currency online and elsewhere. People use it as if it were an established term, which I guess it is now.

I love that. Well, another title that deals with psychological biases is the first critical thinking book that you want to discuss, Daniel Kahneman’s Thinking, Fast and Slow. Why did you choose this one?

This is an international bestseller by the Nobel Prize-winning behavioural economist—although he’s principally a psychologist—Daniel Kahneman. He developed the research with Amos Tversky, who unfortunately died young; I think it would have been a co-written book otherwise. It’s a brilliant book that summarizes their psychological research on cognitive biases—patterns of thinking which all of us are prone to, and which aren’t reliable.

There is a huge amount of detail in the book. It summarizes a lifetime of research—two lifetimes, really. But Kahneman is very clear about the way he describes patterns of thought: as using either ‘System One’ or ‘System Two.’ System One is the fast, intuitive, emotional response to situations where we jump to a conclusion very quickly. You know: 2 + 2 is 4. You don’t think about it.

System Two is more analytical, conscious, slower, methodical, deliberative. A more logical process, which is much more energy consuming. We stop and think. How would you answer 27 × 17? You’d have to think really hard, and do a calculation using the System Two kind of thinking. The problem is that we rely on this System One—this almost instinctive response to situations—and often come out with bad answers as a result. That’s a framework within which a lot of his analysis is set.
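Working it through deliberately, for instance as 27 × 17 = (27 × 10) + (27 × 7) = 270 + 189 = 459, is exactly that slow, effortful System Two process.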

I chose this book because it’s a good read, and it’s a book you can keep coming back to—but also because it’s written by a very important researcher in the area. So it’s got the authority of the person who did the actual psychological research. But it’s got some great descriptions of the phenomena he researches, I think. Anchoring, for instance. Do you know about anchoring?

I think so. Is that when you provide an initial example that shapes future responses? Perhaps you’d better explain it.

That’s more or less it. If you present somebody with an arbitrary number and then ask them a question, most people seem prone to move in the direction of that number. For instance, there’s an experiment with judges. They were asked, off the cuff, what would be a good sentence for a particular crime, say shoplifting. Maybe they’d say it would be a six-month sentence for a persistent shoplifter.

But if you prime a judge by giving an anchoring number—if you ask, ‘Should the sentence for shoplifting be more than nine months?’ they’re more likely to say, on average, that the sentence should be eight months than they would have been otherwise. And if you say, ‘Should it be punished by a sentence of longer than three months?’ they’re more likely to come down in the area of five months than they would otherwise.

So by phrasing a question with these numbers in it, you create an anchoring effect: it sways people’s thinking towards that number. If you ask people whether Gandhi was older than 114 years old when he died, they give a higher answer than if you just asked them: ‘How old was Gandhi when he died?’

I’ve heard this discussed in the context of charity donations. Asking if people will donate, say, £20 a month returns a higher average pledge than asking for £1 a month.

People often use this anchoring technique when selling wine on a list, too. If there’s a higher-priced wine at £75, then somehow people are more drawn to one that costs £40 than they would otherwise have been. If the £40 bottle had been the most expensive one on the list, they wouldn’t have been so drawn to it; but just having seen the higher price, they seem to be drawn to a higher number. This phenomenon occurs in many areas.

And there are so many things that Kahneman covers. There’s the sunk cost fallacy, this tendency that we have when we’ve given our energy, or money, or time to a project—we’re very reluctant to stop, even when it’s irrational to carry on. You see this a lot in descriptions of withdrawal from war situations. We say: ‘We’ve given all those people’s lives, all that money, surely we’re not going to stop this campaign now.’ But it might be the rational thing to do. The fact that all that money has already been thrown in that direction doesn’t mean that throwing more will get a good result. It seems that we have a fear of future regret that outweighs everything else. This dominates our thinking.

What Kahneman emphasizes is that System One thinking produces overconfidence based on what’s often an erroneous assessment of a situation. All of us are subject to these cognitive biases, and they’re extremely difficult to remove. Kahneman’s a deeply pessimistic thinker in some respects; he recognizes that even after years of studying these phenomena he can’t eliminate them from his own thinking. I interviewed him for a podcast once, and said to him: ‘Surely, if you teach people critical thinking, they can get better at eliminating some of these biases.’ He was not optimistic about that. I’m much more optimistic than he is. I don’t know whether he had empirical evidence to back that up, about whether studying critical thinking can increase your thinking abilities. But I was surprised how pessimistic he was.

Interesting.

Unlike some of the other authors that we’re going to discuss . . .

Staying on Kahneman for a moment, you mentioned that he’d won a Nobel Prize, not for his research in psychology per se but for his influence on the field of economics. His and Tversky’s ground-breaking work on the irrationality of human behaviour and thinking forms the spine of a new field.

There has been a significant tendency in economics to talk about an ideal subject making rational decisions for him or herself, and that didn’t take into account the kinds of cognitive biases that we’ve been discussing. The discipline of behavioural economics, which is very firmly established now, is a kind of antidote to that. You factor in the patterns of behaviour that actual people have, rather than those of idealized individuals making rational assessments about how to satisfy their desires. That’s probably a caricature of economics, but that’s the gist of it.

Let’s look at Hans Rosling’s book next: Factfulness. What does it tell us about critical thinking?

Rosling was a Swedish statistician and physician, who, amongst other things, gave some very popular TED talks. His book Factfulness, which was published posthumously—his son and daughter-in-law completed the book—is very optimistic, so completely different in tone from Kahneman’s. But he focuses in a similar way on the ways that people make mistakes.

We make mistakes, classically, in being overly pessimistic about things that are changing in the world. In one of Rosling’s examples he asks what percentage of the world population is living on less than $2 a day. People almost always overestimate that number, and misjudge the direction in which things are moving and the speed at which they’re moving. Actually, in 1966 half of the world’s population was in extreme poverty by that measure, but by 2017 it was only 9%, so there has been a dramatic reduction in global poverty. But most people don’t realise this because they don’t focus on the facts, and are possibly influenced by what they may have known about the situation in the 1960s.

If people are asked what percentage of children are vaccinated against common diseases, they almost always underestimate it. The correct answer is a very high proportion, something like 80%. Ask people what the global average life expectancy is for a child born today, and again they get it wrong. It’s over 70 now, another surprisingly high figure. What Rosling’s done as a statistician is look carefully at the way the world actually is.

“Pessimists tend not to notice changes for the better”

People assume that the present is like the past, so when they’ve learnt something about the state of world poverty or they’ve learnt about health, they often neglect to take a second reading and see the direction in which things are moving, and the speed with which things are changing. That’s the message of this book.

It’s an interesting book; it’s very challenging. It may be over-optimistic. But it does have this startling effect on readers of challenging widely held assumptions, much as Steven Pinker’s The Better Angels of Our Nature has done. It’s a plea to look at the empirical data, and not just assume that you know how things are now. But pessimists tend not to notice changes for the better. In many ways, though clearly not in relation to global warming and climate catastrophe, the statistics are actually very good for humanity.

That’s reassuring.

So this is critical thinking of a numerical, statistical kind. It’s a bit different from the more verbally-based critical thinking that I’ve been involved with. I’m really interested to have my assumptions challenged, and Factfulness is a very readable book. It’s lively and thought-provoking.

Coming back to what you said about formal logic earlier, statistics is another dense subject which needs specialist training. But it’s one that has a lot in common with critical thinking and that a lot of people find very difficult—by which I mean, it’s often counter-intuitive.

One of the big problems for an ordinary reader looking at this kind of book is that we are not equipped to judge the reliability of his sources, and so the reliability of the conclusions that he draws. I think we have to take it on trust and authority and hope that, given the division of intellectual labour, there are other statisticians looking at his work and seeing whether he was actually justified in drawing the conclusions that he drew. He made these sorts of public pronouncements for a long time and responded to critics.

But you’re right that there is a problem here. I believe that most people can equip themselves with tools for critical thinking that work in everyday life. They can learn something about cognitive biases; they can learn about reasoning and rhetoric, and I believe that we can put ourselves, as members of a democracy, in a position where we think critically about the evidence and arguments that are being presented to us, politically and in the press. That should be open to all intelligent people, I think. It is not a particularly onerous task to equip yourself with the basic tools of thinking clearly.

But statistics requires a kind of numerical dexterity, a comfort working with numbers, and for some people it’s a difficult thing to get to a level where you can think critically about statistics. But it’s interesting to observe it being done, and that’s what I think you’re being invited to do with this book, to see somebody think critically about statistics, on a number of measures.

Absolutely. Next you wanted to talk about Five Books alumnus Matthew Syed‘s Black Box Thinking.

Yes, quite a different book. Matthew Syed is famous as a former international table tennis player, but—most people probably don’t know this—he has a first-class degree in Philosophy, Politics and Economics (PPE) from Oxford as well.

This book is really interesting. It’s an invitation to think differently about failure. The title, Black Box Thinking, comes from the black boxes which are standardly included in every passenger aircraft, so that if an accident occurs there’s a recording of the flight data and a recording of the audio communications as the plane goes down. When there’s a crash, rescuers always aim to recover these two black boxes. The data is then analysed, the causes of the crash dissected and scrutinised, and the information shared across the aviation industry and beyond.

Obviously, everybody wants to avoid aviation disasters because they’re so costly in terms of loss of human life. They undermine trust in the whole industry. There’s almost always some kind of technical or human error that can be identified, and everybody can learn from particular crashes. This is a model of an industry where, when there is a failure, it’s treated as a very significant learning experience, with the result that airline travel has become a very safe form of transport.

This contrasts with some other areas of human endeavour, such as, sadly, much of healthcare, where the information about failures often isn’t widely shared. This can be for a number of reasons: there may be a fear of litigation—so if a surgeon does something unorthodox, or makes a mistake, and somebody as a result doesn’t survive an operation, the details of exactly what happened on the operating table will not be widely shared, typically, because there is this great fear of legal comeback.

The hierarchical aspects of the medical profession may have a part to play here, too. People higher up in the profession are able to keep a closed book, and not share their mistakes with others, because it might be damaging to their careers for people to know about their errors. There has been, historically anyway, a tendency for medical negligence and medical error to be kept very quiet, kept hidden, hard to investigate.

“You can never fully confirm an empirical hypothesis, but you can refute one by finding a single piece of evidence against it”

What Matthew Syed is arguing is that we need to take a different attitude to failure and see it as the aviation industry does. He’s particularly interested in this being done within the healthcare field, but more broadly too. It’s an idea that’s come partly from his reading of the philosopher Karl Popper, who described how science progresses not by proving theories true, but by trying to disprove them. You can never fully confirm an empirical hypothesis, but you can refute one by finding a single piece of evidence against it. So, in a sense, the failure of the hypothesis is the way by which science progresses: conjecture followed by refutation, not hypothesis followed by confirmation.
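As a rough illustration of that Popperian asymmetry (a hypothetical sketch of my own, not from the book; the ‘all swans are white’ hypothesis and the code are purely illustrative), a universal claim survives any number of confirming observations but falls to a single counterexample:

# A universal hypothesis: "all swans are white".
def hypothesis_holds(swan_colour: str) -> bool:
    return swan_colour == "white"

observations = ["white"] * 10_000 + ["black"]  # ten thousand confirmations, then one black swan

for i, colour in enumerate(observations, start=1):
    if not hypothesis_holds(colour):
        print(f"Refuted by observation {i}: a {colour} swan.")  # one counterexample is enough
        break
else:
    print("Not yet refuted, but not thereby proven either.")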

As Syed argues, the way we progress in all kinds of areas is by making mistakes. He was a superb table-tennis player, and he knows that every mistake that he made was a learning experience, at least potentially, a chance to improve. I think you’d find the same attitude among musicians, or in other areas where practitioners are very attentive to the mistakes that they make, and to how those failures can teach them in a way that allows them to make a leap forward. The book has a whole range of examples, many from industry, about how different ways of thinking about failure can improve the process and the output of particular practices.

When we think of bringing up kids to succeed, and put emphasis on avoiding failure, we may not be helping them develop. Syed’s argument is that we should make failure a more positive experience, rather than treat it as something that’s terrifying, and always to be shied away from. If you’re trying to achieve success, and you think, ‘I have to achieve that by accumulating other successes,’ perhaps that’s the wrong mindset to achieve success at the higher levels. Perhaps you need to think, ‘Okay, I’m going to make some mistakes, how can I learn from this, how can I share these mistakes, and how can other people learn from them too?’

That’s interesting. In fact, just yesterday I was discussing a book by Atul Gawande, the surgeon and New Yorker writer, called The Checklist Manifesto. In that, Gawande also argues that we should draw from the success of aviation (in that case, the checklists that pilots run through before take-off and so on) and apply it to other fields like medicine. A system like this is aiming to get rid of human error, and I suppose that’s what critical thinking tries to do, too: rid us of the gremlins in the machine.

Well, it’s also acknowledging that when you make an error, it can have disastrous consequences. But you don’t eliminate errors just by pretending they didn’t occur. With the Chernobyl disaster, for instance, there was an initial unwillingness to accept the evidence in front of people’s eyes that a disaster had occurred, combined with a fear of being seen to have messed up. There’s that tendency to think that everything’s going well, a kind of cognitive bias towards optimism and a fear of being responsible for error; but there’s also this unwillingness to see that, in certain areas, admission of failure and sharing the knowledge that mistakes have occurred is the best way to minimize failure in the future.

Very Beckettian. “Fail again. Fail better.”

I guess. But that’s a kind of pessimism—that you’re never going to achieve anything. Whereas I think Matthew Syed is a very optimistic person who believes that actually things can be a lot better, and the way they’ll get a lot better is by thinking critically about how we achieve things, about the best way to achieve success. Not to follow established practices which hide failure, but to see failure as probably a condition of success, not just a prelude to more failure. Though, in a way the Popperian line is that progress is a process of failing better, so perhaps you’re right.

Absolutely. Well, shall we move on to Rolf Dobelli’s 2013 book, The Art of Thinking Clearly?

Yes. This is quite a light book in comparison with the others. It’s really a summary of 99 moves in thinking, some of them psychological, some of them logical, some of them social. What I like about it is that he uses lots of examples. Each of the 99 entries is pretty short, and it’s the kind of book you can dip into. I would think it would be very indigestible to read it from cover to cover, but it’s a book to keep going back to.

I included it because it suggests you can improve your critical thinking by having labels for things, recognising the moves, but also by having examples which are memorable, through which you can learn. This is an unpretentious book. Dobelli doesn’t claim to be an original thinker himself; he’s a summariser of other people’s thoughts. What he’s done is bring lots of different things together in one place.

Just to give a flavour of the book: he’s got a chapter on the paradox of choice that’s three pages long called ‘Less is More,’ and it’s the very simple idea that if you present somebody with too many choices, rather than freeing them and improving their life and making them happier, it wastes a lot of their time, even destroys the quality of their life.

“If you present somebody with too many choices, it wastes a lot of their time”

I saw an example of this the other day in the supermarket. I bumped into a friend who was standing in front of about 20 different types of coffee. The type that he usually buys wasn’t available, and he was just frozen in this inability to make a decision between all the other brands that were in front of him. If there’d only been one or two, he’d have just gone for one of those quickly.

Dobelli here is summarising the work of the psychologist Barry Schwartz, who concluded that, generally, a broader selection leads people to make poorer decisions for themselves. We go into the world thinking that what we need is more choice, because that’ll allow us to do the thing we want to do, acquire just the right consumable, or whatever. But perhaps the increased number of choices will lead us to make poorer choices than if we had fewer to choose between.

Now, that’s the descriptive bit, but at the end of this short summary, he asks ‘So what can you do about this practically?’ His answer is that you should think carefully about what you want before you look at what’s on offer. Write down the things you think you want and stick to them. Don’t let yourself be swayed by further choices. And don’t get caught up in a kind of irrational perfectionism. This is not profound advice, but it’s stimulating. And that’s typical of the book.

You can flip through these entries and you can take them or leave them. It’s a kind of self-help manual.

Oh, I love that. A critical thinking self-help book.

It really is in that self-help genre, and it’s nicely done. He gets in and out in a couple of pages for each of these. I wouldn’t expect this to be on a philosophy reading list or anything like that, but it’s been an international bestseller. It’s a clever book, and I think it’s definitely worth dipping into and coming back to. The author is not claiming that it is the greatest or most original book in the world; rather, it’s just a book that’s going to help you think clearly. That’s the point.

He’s optimistic too, unlike Kahneman. Dobelli’s not saying you’re caught up in all these biases and there’s nothing you can do about it. He’s saying there is a sense in which you can do something about all this. That may be just another cognitive bias, an illusion, but I’m biased towards thinking that thinking about things can change the way we behave. It might be difficult, but reflecting on the things that you’re doing is, I believe, the first step towards thinking more clearly.

Absolutely. Let’s move to the final title, Tom Chatfield’s Critical Thinking: Your Guide to Effective Argument, Successful Analysis and Independent Study. We had Tom on Five Books many moons ago to discuss books about computer games. This is rather different. What makes it so good?

Well, this is a different kind of book. I was trying to think about somebody reading this interview who wants to improve their thinking. Of the books I’ve discussed, the ones that are most obviously aimed at that are Black Box Thinking, the Dobelli book, and Tom Chatfield’s Critical Thinking. The others are more descriptive or academic. But this book is quite a contrast with Dobelli’s. The Art of Thinking Clearly is a very short and punchy book, while Tom’s is longer and more of a textbook. It includes exercises, with summaries in the margins, and it’s printed in textbook format. But that shouldn’t put a general reader off, because I think it’s the kind of thing you can work through yourself and dip into.

It’s clearly written and accessible, but it is designed to be used on courses as well. Chatfield teaches a point, then asks you to test yourself to see whether you’ve learnt the moves that he’s described. It’s very wide-ranging: it includes material on cognitive biases as well as more logical moves and arguments. His aim is not simply to help you think better, and to structure arguments better, but also to write better. It’s the kind of book that you might expect a good university to present to the whole first year intake, across a whole array of courses. But I’m including it here more as a recommendation for the autodidact. If you want to learn to think better: here is a course in the form of a book. You can work through this on your own.

Fantastic.

It’s a contrast with the other books as well, so that’s part of my reason for putting it in there, so there’s a range of books on this list.

Definitely. I think Five Books readers, almost by definition, tend towards autodidacticism, so this is a perfect book recommendation. And, finally, to close: do you think that critical thinking is something that more people should make an effort to learn? I suppose the lack of it might help to explain the rise of post-truth politics.

It’s actually quite difficult to teach critical thinking in isolation. In the Open University’s philosophy department, when I worked there writing and designing course materials, we decided in the end to teach critical thinking as it arose in teaching other content: by stepping back from time to time to look at the critical thinking moves being made by philosophers, and the critical thinking moves a good student might make in response to them. Pedagogically, that often works much better than attempting to teach critical thinking as a separate subject in isolation.

This approach can work in scientific areas too. A friend of mine has run a successful university course for zoologists on critical thinking, looking at correlation and cause, particular types of rhetoric that are used in write-ups of experiments, and so on, but all the time driven by real examples from zoology. If you’ve got some subject matter, and you’ve got examples of people reasoning, and you can step back from it, I think this approach can work very well.

But in answer to your question, I think that having some basic critical thinking skills is a prerequisite of being a good citizen in a democracy. If you are too easily swayed by rhetoric, weak at analysing arguments and the ways that people use evidence, and prone to all kinds of biases that you are unaware of, how can you engage politically? So yes, all of us can improve our critical thinking skills, and I do believe that that is an aspect of living the examined life that Socrates was so keen we should all lead.

Interview by Cal Flyn, Deputy Editor

December 4, 2020

Five Books aims to keep its book recommendations and interviews up to date. If you are the interviewee and would like to update your choice of books (or even just what you say about them) please email us at [email protected]

Nigel Warburton

Nigel Warburton is a freelance philosopher, writer and host of the podcast Philosophy Bites. Featuring short interviews with the world's best philosophers on bite-size topics, the podcast has been downloaded more than 40 million times. He is also our philosophy editor here at Five Books, where he has been interviewing other philosophers about the best books on a range of philosophy topics since 2013 (you can read all the interviews he's done here: not all are about philosophy). In addition, he's recommended books for us on the best introductions to philosophy, the best critical thinking books, as well as some of the key texts to read in the Western canon. His annual recommendations of the best philosophy books of the year are among our most popular interviews on Five Books. As an author, he is best known for his introductory philosophy books, listed below: