The best books on Pseudoscience

recommended by Stephen Law

Human beings have a tendency to get sucked into dodgy belief systems, often never to escape, argues the philosopher. From UFOs to homeopathic medicine, Stephen Law chooses the best books on 'pseudoscience.'

Interview by Nigel Warburton

We’re going to be looking at your choice of books about pseudoscience, but before we go into the books themselves, could you explain what pseudoscience is?

Pseudoscience is a practice in which people convince themselves that what they’re doing is science – that it meets scientific standards – but, on closer examination, it turns out that they’re merely aping the methods of science. It’s a kind of fake science. I’m particularly interested in pseudoscience and other dodgy systems of belief. Our cultural landscape contains many belief systems which are intellectual black holes: as you approach them you find yourself getting drawn in. Eventually you pass the event horizon, and there is no escape, or at least it can be extremely difficult to think your way out again. The people that are trapped inside these belief systems are often intelligent, well-educated people. They really believe that what they believe is rational and reasonable and perhaps even scientifically credible. But the truth is that they are duping themselves. I’ve selected some books which illustrate this tendency of human beings to get sucked into these intellectual prisons, often never to escape.

What’s your first book choice?

The first book I’ve chosen is God’s Perfect Child by Caroline Fraser. It’s about the Christian Science Church. Caroline Fraser was, as a child, part of the Christian Science movement, but then she left. This book is partly an autobiography, but it’s also an analysis of what’s fundamentally wrong with Christian Science. There’s the word ‘science’ in the title, of course: those who practise Christian Science believe that what they’re doing is good science. But the truth is that it is not. They’re doing a kind of fake science. Fraser is very good at explaining what’s wrong with Christian Science. The movement was started by a woman called Mary Baker Eddy.

Members of the Christian Science Church believe that matter doesn’t exist, and that illness and disease are ultimately an illusion. They don’t believe in using conventional medicine. Their approach involves prayer. Christian Scientists believe they have a good track record of success. The Christian Science Journal includes many reports of people who have been treated in this way and who have recovered, allegedly as a result of Christian Science methods being applied. There are thousands and thousands and thousands of these reports. What Christian Science does is point to all these cases and say, ‘Here is a great quantity of scientific evidence supporting Christian Science – evidence that shows that it works, that it really does make people better.’

So what’s wrong with that? It does sound like an accumulation of evidence in support of the idea that there is some kind of therapeutic benefit to prayer.

It’s a nice example of a belief system looking like science, but not really being science. Why isn’t it science? It seems to me that what Christian Science is guilty of is something called ‘confirmation bias’. Suppose I show you a small target at which I’ve fired my rifle, and you can see that I hit the target five times. You might be very impressed with my marksmanship…until I show you the barn door on which this target was hanging, and you can see the thousands of misses. Those were just five lucky hits. Now, it’s very important that you don’t simply count the hits when you’re looking for evidence. You must also count all of the misses. Christian Science never counts its misses. It only counts the hits. Now, of course, if you practise Christian Science, many of the people treated will get better anyway. Many have minor ailments from which they will spontaneously recover. In other, more serious cases, the diagnosis may simply have been mistaken, so the person gets better regardless. If you count all these apparent successes, you will be able to count a great many of them in the end: thousands.

Is this good evidence that Christian Science works? No. You must also count the misses, and that is something Christian Science does not do. In fact, there are many cases where Christian Science has been applied and people have not got better. Children have died because, instead of being given orthodox medicine, they were treated by a Christian Science practitioner; they have died of illnesses that could easily have been cured, such as a burst appendix. So this is an illustration of how something that looks like science, and indeed may be called ‘science’, isn’t really science when you look at it more closely. The particular reasoning error here is that Christian Scientists only count the hits, not the misses, committing the classic fallacy of confirmation bias.
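To make the point concrete, here is a minimal Python sketch, not from Fraser’s book and using made-up numbers, of how counting only the hits manufactures impressive-looking evidence for a treatment that does nothing: thousands of genuine recoveries pile up as testimonials, while an untreated comparison group, if anyone bothered to look at one, recovers at exactly the same rate.

```python
import random

random.seed(0)

def spontaneous_recovery(n_patients, recovery_rate):
    """Return a list of booleans: True if the patient recovered anyway."""
    return [random.random() < recovery_rate for _ in range(n_patients)]

# Made-up numbers: 50,000 people with self-limiting ailments, about 60% of whom
# would recover with no treatment at all.
treated = spontaneous_recovery(50_000, 0.60)     # given the worthless therapy
untreated = spontaneous_recovery(50_000, 0.60)   # given nothing

hits = sum(treated)              # testimonials: "I was treated and I got better"
misses = len(treated) - hits     # the cases nobody writes in about

print(f"Reported 'successes' of the therapy: {hits}")
print(f"Unreported failures: {misses}")
print(f"Recovery rate, treated:   {hits / len(treated):.1%}")
print(f"Recovery rate, untreated: {sum(untreated) / len(untreated):.1%}")
# The two recovery rates are indistinguishable: the thousands of 'hits' tell us
# nothing unless the misses, and an untreated comparison group, are counted too.
```

The anecdotes in such a tally are real; it is only the bookkeeping that is selective.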

The second book you’ve chosen is about UFOs. Now, in one sense, UFOs are literally Unidentified Flying Objects, so this description is almost neutral as to what they are. That doesn’t seem to be bad science or false science: it seems to be a description of something which is yet to be explained?

Yes, there are, of course, many UFOs and that’s uncontroversial. What is controversial is the claim that what we are looking at, in some cases, are visitors from other worlds. Many people believe that. Some even believe they’ve been abducted by alien visitors. My second book choice, a favourite of mine, is UFOs: The Public Deceived by Philip J. Klass. It was published back in the mid-1980s, and it’s a trawl through some of the great claims of ufology such as the Delphos case, the Travis Walton case, and cases involving airline pilots who have reported seeing quite extraordinary things in the skies. Klass looks very carefully at the evidence, and, in many cases, successfully debunks the suggestion that what was observed was in fact some sort of flying saucer or piloted vehicle from another world.

I particularly like a story involving a nuclear power plant. Back in 1967, a power station was being built. The security guards reported seeing an extraordinary light hanging over the plant on several nights. The police were called, and they confirmed the presence of the light. The County Deputy Sheriff described ‘a large lighted object’. An auxiliary police officer stated that he saw ‘five objects, they appeared to be burning, an aircraft passed by while I was watching, they seemed to be 20 times the size of the plane’. The Wake County Magistrate saw ‘a rectangular object that looked like it was on fire’. They figured that it was about the size of a football field, and very bright. Newspaper reporters showed up to investigate the object. They then attempted to get closer to it in their car, but they found that as they drove towards it, it receded. No matter how fast and far they drove, they never got any closer. Eventually they stopped the car, got out, and the photographer looked at the object through a long lens on his camera. He said, ‘Yep, that’s the planet Venus alright’. It really was the planet Venus. Everyone had just seen the planet Venus. It seems extraordinary that these things happen.

Here we have a case in which you have police officers, a magistrate, trained eyewitnesses. And there was even hard evidence in the form of an unidentified blip on the local air traffic control radar screen. All of this evidence together, you might think, confirms beyond any doubt that there really was some mysterious object hanging over that nuclear power plant. But the fact is, there wasn’t, despite these numerous eyewitnesses, this multiple attestation. The observers stuck their necks out. They were brave enough to make bold claims, so they clearly thought they were observing something extraordinary, and there was even some hard evidence (the radar blip) to back up the claims. Nevertheless, that turned out to be a coincidence. This case illustrates that you should expect, every now and then, some remarkable claim like this to be made, despite the fact that there’s no truth to it whatsoever. People are duped, they are deceived, they are subject to hallucinations in quite surprising ways. The mere fact that claims like this are made every now and then is not good evidence.

Does Klass have a theory about why people are so prone to be deceived in the ways that they are?

He discusses some of the obvious cognitive biases that may be involved; we are prone to various kinds of optical illusion. One of the most interesting points he makes concerns what some ufologists say in support of the belief that there are bizarre piloted objects floating around the atmosphere: whilst many of these claims can be explained, and many have been debunked, there is a tiny hard core that remains unexplained. It’s that tiny hard core that they think provides good evidence for the presence of such craft. However, we know that this 1967 observation could easily have gone down in the annals of ufology as one of the great unexplained cases. It could easily have been part of that hard core. It was only because we got lucky, because the reporters chased the UFO and the photographer then looked at it through his long lens, that we now know the truth. If that hadn’t happened, this would now be another of those great unsolved cases, and it would be trumpeted as powerful evidence for the presence of strange piloted craft in our skies. But, of course, even if it hadn’t been debunked, it really wouldn’t have been strong evidence. We should expect these kinds of cases to crop up every now and then anyway, and if we know that, we should know they are not really evidence at all for these extraordinary claims.
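As a rough back-of-the-envelope illustration of that last point (the figures below are invented, not Klass’s), even a very high rate of successful debunking leaves an ‘unexplained’ residue purely by chance:

```python
# Invented figures for illustration only. Suppose 5,000 sightings are reported
# over some period, every one of them has a mundane cause, and investigators
# manage to pin that cause down 98% of the time (witnesses move away, photos
# are lost, nobody looks through a long lens at the right moment, and so on).
reports = 5_000
identification_rate = 0.98

unexplained_residue = reports * (1 - identification_rate)
print(f"Expected 'hard core' of unexplained cases, with no aliens at all: "
      f"{unexplained_residue:.0f}")
# Roughly 100 cases end up unexplained purely because the evidence needed to
# resolve them never turned up, so an unexplained residue is exactly what we
# should expect even if every single sighting has a mundane explanation.
```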

What about your third book, Trick or Treatment by Simon Singh and Edzard Ernst?

I really like this book. It’s a modern classic of the sceptic movement. Simon Singh is an excellent science writer. Edzard Ernst was the world’s first professor of complementary medicine; he’s retired now. He started out convinced that there was some truth to the claims made by homeopathy and some other alternative practices. He was trained as a homeopath and was a practising homeopath; he believed homeopathy worked. He then decided to investigate the treatments scientifically, to show that they worked, and he found that they didn’t. The therapies he practised didn’t come out well from this scrutiny. It turned out that the evidence didn’t support the beliefs he previously held, and so he gave them up. Ernst thinks that some complementary medicine may work, but the vast majority of it simply does not. I like Tim Minchin’s question: what do you call alternative medicine that’s been scientifically proven? Answer: medicine. The problem with alternative ‘medicine’, very often, is that there’s very little good scientific evidence for its efficacy. Very often, the ‘evidence’ is anecdotal. We’ve already seen two examples of how weak anecdotal evidence is: the story about the 1967 UFO, and the thousands of anecdotes collected by the Christian Science movement. If you want to know whether something works, don’t rely on anecdotes, which can seem very convincing; look at the data. Run controlled studies, and if those studies reveal that the therapy doesn’t work, then you really shouldn’t believe it does, notwithstanding the fact that you can point to those anecdotes.
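Here, purely for illustration, is a minimal sketch of what ‘look at the data’ amounts to: a simulated two-arm trial of an inert remedy, with a simple permutation test to ask how often the observed difference between the arms would arise by chance. The patient numbers and recovery rates are made up, and the code stands in for a properly designed clinical trial rather than describing one from the book.

```python
import random

random.seed(1)

def run_arm(n, recovery_rate):
    """Simulate one trial arm; returns the number of recoveries."""
    return sum(random.random() < recovery_rate for _ in range(n))

# Made-up trial: 200 patients per arm. The remedy is assumed to be inert, so
# both arms share the same underlying recovery rate (placebo + natural course).
n = 200
treatment_recoveries = run_arm(n, 0.55)
placebo_recoveries = run_arm(n, 0.55)

observed_diff = (treatment_recoveries - placebo_recoveries) / n

# Permutation test: pool the outcomes, reshuffle them many times, and see how
# often a difference at least this large arises by chance alone.
outcomes = (
    [1] * treatment_recoveries + [0] * (n - treatment_recoveries)
    + [1] * placebo_recoveries + [0] * (n - placebo_recoveries)
)

extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(outcomes)
    diff = (sum(outcomes[:n]) - sum(outcomes[n:])) / n
    if abs(diff) >= abs(observed_diff):
        extreme += 1

print(f"Observed difference in recovery rates: {observed_diff:+.1%}")
print(f"Chance of a difference this large if the remedy does nothing: {extreme / trials:.1%}")
# A convincing pile of anecdotes can coexist with a result that says the
# 'effect' is exactly what chance and placebo would produce on their own.
```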

One common explanation for the apparent efficacy of some alternative medicine is the ‘placebo effect’. But if a treatment works as a placebo, it still has a healing effect, so why not go with that? It doesn’t really matter what’s in the black box: the mechanism of healing isn’t the crucial thing; all that matters is that when you take this particular tablet, it relieves your headache.

Yes, a placebo can be very useful, and it may be that some of the effect of conventional medicine is achieved through placebo. But, of course, we know that it’s not just placebo when we’ve done the science. We know that, in fact, these drugs really do have medicinal effects. You might think that it’s worth prescribing placebo just because it’ll make people feel better. Some doctors in the past have done precisely that: they have prescribed sugar pills to alleviate somebody’s depression, say, and the pills have worked just fine. But when it comes to serious diseases that you want cured, a placebo is not going to work. It may improve your mood; you may feel happier; it may reduce the pain to some extent, if you believe it’s going to have that effect. But if you actually want somebody cured of a serious illness such as cancer, a placebo won’t work. There’s also the question of whether state funds should be used to pay for placebo treatments. If lots of people believe blancmange head rubs cure headaches, should the NHS then be funding that kind of blancmange treatment? I think the answer is no, the NHS should not be funding that. If that’s so, then neither should it be funding these alternative medicines, even if they do work as placebos. In many cases it’s clear that that’s all they really are.

There’s a particularly pernicious aspect of this, because people who are ill are often desperate and looking for miracle cures, and some alternative therapies are claimed by their exponents to be superior to conventional medicine in their ability to remove symptoms.

What concerns me most is that in some cases people have forgone conventional medicine that works and instead chosen an alternative therapy that did not work. As a consequence, they died. For example, people have gone to areas in which malaria is rife and treated themselves with a homeopathic medicine to prevent infection; of course it doesn’t work, so they get malaria and die as a result.

But that’s their choice, nevertheless. It’s one thing for adults to make decisions about what they do to themselves; it’s quite another for somebody to decide on behalf of a child that the child should take an alternative medicinal ‘prophylactic’ against malaria.

Yes, it’s your free choice. People can choose to rub blancmange on their heads too, thinking this will make their hair grow, if they wish; but nobody should be allowed to claim that blancmange makes your hair grow unless they can provide very good evidence it really does. That evidence is missing when it comes to many of these alternative approaches.

What about your next book, Intellectual Impostures by Bricmont and Sokal?

This is an entertaining book. It’s very different. The theme is still pseudoscience. Alan Sokal is a scientist, perhaps best known for the Sokal hoax. He became increasingly irritated by the way in which scientific jargon was being used by postmodern writers in a nonsensical or ridiculous way in their publications, so he decided to expose this by writing a spoof postmodern article called ‘Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity’. He peppered it with scientific gobbledygook, and included all of the relevant sexy buzzwords as far as postmodern philosophy was concerned. The leading journal of postmodern philosophy, Social Text, accepted his submission and published it. Many felt this was an emperor’s new clothes moment for the postmodern philosophical movement. The small boy had now pointed and laughed and everyone could see the truth, that the postmodern emperor had no clothes.

This book was written after the hoax, and looks at the work of a number of different postmodern thinkers including Jacques Lacan, Julia Kristeva, Jean Baudrillard and Gilles Deleuze. Sokal and Bricmont explain and illustrate very patiently how these thinkers use scientific terminology in a way that is either wrong or, in many cases, nonsensical, often creating the illusion that they have some deep and profound insight when the truth is that they don’t. There’s a nice quotation about Jean Baudrillard’s work, which is full of references to chaos theory, quantum mechanics, non-Euclidean geometries, and so on. Sokal and Bricmont write:
‘In summary, one finds in Baudrillard’s work a profusion of scientific terms used with total disregard for their meaning, and above all in a context where they are manifestly irrelevant. Whether or not one interprets them as metaphors, it is hard to see what role they could play, except to give an appearance of profundity to trite observations about sociology, or history. Moreover, the scientific terminology is mixed up with a non-scientific vocabulary that is employed with equal sloppiness. When all is said and done, one wonders what would be left of Baudrillard’s thought if the verbal veneer covering it was stripped away’. (p.143)

I guess the reason why many people want to use scientific jargon is because it gives them the illusion of having rigour and intellectual depth. Do you think that’s all that was going on here?

It’s certainly partly that. It’s clear that science has been extraordinarily successful at revealing fundamental truths about reality. No doubt some philosophers would like to make similarly impressive claims. If they can harness the scientific vocabulary, some of the credibility and the authority of science may then rub off on their own work. You can see the attraction of employing that vocabulary, even if it’s in a muddle-headed way.

Your final book has a good title: How to Think About Weird Things by Theodore Schick and Lewis Vaughn. How should we think about weird things?

Carefully and critically, aware of the various cognitive biases to which we are, unfortunately, all very prone. This book explains various fallacies to watch out for: the slippery slope, the straw man, the post hoc fallacy, and so on. It points out all of the problems that we’ve already looked at so far as anecdotal evidence is concerned. It includes many impressive case studies, examples and exercises. It’s a good, enjoyable introduction to critical thinking about the extraordinary and the weird.

What would you say, then, to somebody who said ‘Look, with your obsession with being rational and concern to have conclusive evidence before you believe in anything, aren’t you missing out on some of the mystery of human life, some of the fascination in the unexplained and the potentially occult phenomena around us’?

I’m not wedded to scientism: the view that science can, in principle, answer every legitimate question. I very much doubt that scientism is true, and I want to acknowledge that there remain many mysteries, and that many may be, in principle, beyond our ability to solve. That’s all fine. What I object to is the way in which some appeal to mystery in order to try and get themselves out of trouble, in order to deflect attention away from the fact that there’s no real evidence to suggest that what they’re saying is true (and perhaps the evidence even contradicts what they claim). It’s important to me that if somebody claims to have some kind of medicine that works for a particular illness, for example, they can show that the medicine really works. I don’t think anyone should be making those kinds of claims, and in particular making money from them, unless they can demonstrate that what they claim is, or is very probably, true.

It’s particularly important that we all have some immunity to the kind of bullshit that surrounds us in our everyday lives. When I walk down the high street where I live, I find people promoting all sorts of strange and peculiar beliefs, religious beliefs, alternative medicines, and so on. Many of these people are fairly harmless, but not all of them. Some of them want to lure me and my children into belief systems that are potentially exploitative, and perhaps even dangerous. We all need some immunity to bullshit. We need to make sure that our critical faculties are engaged. We need to be sure that a little red light will come on in our heads as we begin to approach one of these intellectual black holes, so that we don’t fall victim. It’s particularly important that young people have some immunity to pseudoscience, and some awareness of the warning signs. No one is perfectly rational. I suspect I’m less rational than I would like to think I am. But applying reason as best we can has a fantastic track record of success so far as sifting the wheat from the chaff is concerned. I don’t say that you should only ever believe something if you’ve got really good evidence for it, but I do say you shouldn’t pretend that you’ve got good evidence when you haven’t. In particular, don’t explain away good evidence against what you believe by employing dubious intellectual strategies. Be honest with yourself. Be clear about what is good evidence, and what is not.

Interview by Nigel Warburton

March 9, 2015

Stephen Law

Stephen Law is head of the Centre for Inquiry UK, Senior Lecturer in Philosophy at Heythrop College, University of London, and editor of the Royal Institute of Philosophy journal Think. He is the author of Believing Bullshit and The Philosophy Gym.