Irrationality: The Enemy Within by Stuart Sutherland (1992)

The only way to substantiate a belief is to try to disprove it.
(Irrationality: The Enemy Within, page 48)

Sutherland was 65 when he wrote this book, and nearing the end of a prestigious career in psychology research. His aim was to lay out, in 23 themed chapters, all the psychological and sociological research data from hundreds of experiments, which show just how vulnerable the human mind is to a plethora of unconscious biases, prejudices, errors, mistakes, misinterpretations and so on – the whole panoply of ways in which supposedly ‘rational’ human beings can end up making grotesque mistakes.

By the end of the book, Sutherland claims to have defined and demonstrated over 100 distinct cognitive errors humans are prone to (p.309).

I first read this book in 2000 and it made a big impact on me because I didn’t really know that this entire area of study existed, and had certainly never read such a compendium of sociology and psychology experiments before.

I found the naming of the various errors particularly powerful. They reminded me of the lists of weird and wonderful Christian heresies I was familiar with from years of reading early Christian history. And, after all, the two have a lot in common, both being lists of ‘errors’ which the human mind can make as it falls short of a) orthodox theology and b) optimally rational thinking, the great shibboleths of the Middle Ages and of the Modern World, respectively.

Rereading Irrationality now, 20 years later, after having brought up two children, and worked in big government departments, I am a lot less shocked and amazed. I have witnessed at first hand the utter irrationality of small and medium-sized children; and I have seen so many examples of corporate conformity, the avoidance of embarrassment, unwillingness to speak up, deferral to authority, and general mismanagement in the civil service that, upon rereading the book, hardly any of it came as a surprise.

But to have all these errors so carefully named and defined and worked through in a structured way, with so many experiments giving such vivid proof of how useless humans are at even basic logic, was still very enjoyable.

What is rationality?

You can’t define irrationality without first defining what you mean by rationality:

Rational thinking is most likely to lead to the conclusion that is correct, given the information available at the time (with the obvious rider that, as new information comes to light, you should be prepared to change your mind).

Rational action is that which is most likely to achieve your goals. But in order to achieve this, you have to have clearly defined goals. Not only that but, since most people have multiple goals, you must clearly prioritise your goals.

Few people think hard about their goals and even fewer think hard about the many possible consequences of their actions. (p.129)

Cognitive biases contrasted with logical fallacies

Before proceeding it’s important to point out that there is a wholly separate subject of logical fallacies. As part of his Philosophy A-Level my son was given a useful handout with a list of about fifty logical fallacies i.e. errors in thinking. But logical fallacies are not the same as cognitive biases.

A logical fallacy stems from an error in a logical argument; it is specific and easy to identify and correct. Cognitive bias derives from deep-rooted, thought-processing errors which themselves stem from problems with memory, attention, self-awareness, mental strategy and other mental mistakes.

Cognitive biases are, in most cases, far harder to acknowledge and often very difficult to correct.

Fundamentals of irrationality

1. Innumeracy One of the largest causes of all irrational behaviour is that people by and large don’t understand statistics or maths. Thus most people are not intellectually equipped to understand the most reliable type of information available to human beings – data in the form of numbers. Instead they tend to make decisions based on a wide range of faulty and irrational psychological biases.

2. Physiology People are often influenced by physiological factors. Apart from obvious ones like tiredness or hunger, which are universally known to affect people’s cognitive abilities, there are also a) drives (direct and primal) like hunger, thirst, sex, and b) emotions (powerful but sometimes controllable) like love, jealousy, fear and – especially relevant – embarrassment, specifically, the acute reluctance to acknowledge limits to your own knowledge or that you’ve made a mistake.

At a more disruptive level, people might be alcoholics, drug addicts, or prey to a range of other obsessive behaviours, not to mention suffering from a wide range of mental illnesses or conditions which undermine any attempt at rational decision-making, such as stress, anxiety or, at the other end of the spectrum, depression and loss of interest.

3. The functional limits of consciousness Numerous experiments have shown that human beings have a limited capacity to process information. Given that people rarely a) have a sufficient understanding of the relevant statistical data to begin with, and b) have the ‘RAM’ capacity to process all the data required to make the optimum decision, it is no surprise that most of us fall back on all manner of more limited, non-statistical biases and prejudices when it comes to making decisions.

4. The wish to feel good The world is threatening, dangerous and competitive. Humans want to feel safe, secure, calm, and in control. This is fair enough, but it does mean that people have a way of blocking out any kind of information which threatens them. Most people irrationally believe that they are cleverer than they in fact are, and that they are qualified in areas of activity or knowledge where they aren’t; people stick to bad decisions for fear of being embarrassed or humiliated, and for the same reason reject new evidence which contradicts their position.

Named types of error and bias

Jumping to conclusions

Sutherland tricks the reader on page one, by asking a series of questions and then pointing out that, if you tried to answer about half of them, you are a fool since the questions didn’t contain enough information to arrive at any sort of solution. Jumping to conclusions before we have enough evidence is a basic and universal error. One way round this is to habitually use a pen and paper to set out the pros and cons of any decision, which also helps highlight areas where you realise you don’t have enough information.

The availability error

All the evidence is that the conscious mind can only hold a small number of data or impressions at any one time (near the end of the book, Sutherland claims the maximum is seven items, p.319). Many errors are due to people reaching for the most available explanation, using the first thing that comes to mind, and not taking the time to investigate further and make a proper, rational survey of the information.

Many experiments show that you can unconsciously bias people by planting ideas, words or images in their minds which then directly affect decisions they take hours later about supposedly unconnected issues.

Studies show that doctors who have seen a run of a certain condition among their patients become more likely to diagnose it in new patients who don’t have it, because the erroneous diagnosis is more ‘available’.

The news media is hard-wired to publicise shocking and startling stories, which permanently misleads the reading public. One tourist eaten by a shark in Australia eclipses the fact that you are far more likely to die in a car crash than be eaten by a shark.

Thus ‘availability’ is also affected by impact or prominence. Experimenters read out a list of men and women to two groups, without telling them that it contained exactly 25 men and 25 women, and asked them to guess the ratio of the sexes. If the list included some famous men, the group was influenced to think there were more men; if the list included famous women, the group thought there were more women than men. The prominence effect.

The entire advertising industry is based on the availability error in the way it invents straplines, catchphrases and jingles designed to pop to the front of your mind when you consider any type of product, making those products – in other words – super available.

I liked the attribution of the well-known fact that retailers price goods at just under the nearest pound, to the availability error. Most of us find £5.95 much more attractive than £6. It’s because we only process the initial 5, the first digit. It is more available.

Numerous studies have shown that the availability error is hugely increased under stress. Under stressful situations – in an accident – people fixate on the first solution that comes to mind and refuse to budge.

The primacy effect

First impressions. Interviewers make up their minds about a candidate for a job in the first minute of an interview and then spend the rest of the time collecting data to confirm that first impression.

The anchor effect

In picking a number people tend to choose one close to any number they’ve recently been presented with. Two groups were asked to estimate whether the population of Turkey was a) bigger than 5 million b) less than 65 million, and what it was. The group who’d had 5 million planted in their mind hovered around 15 million, the group who’d had 65 million hovered around 35 million. They were both wrong. It is 80 million.

The halo effect

People extrapolate the nature of the whole from just one quality e.g. in tests, people think attractive people must be above average in personality and intelligence although, of course, there is no reason why they should be. Hence this error’s alternative name, the ‘physical attractiveness stereotype’. The halo effect is fundamental to advertising, which seeks to associate images of beautiful men, women, smiling children, sunlit countryside etc with the product being marketed.

The existence of the halo effect and primacy effect are both reasons why interviews are a poor way to assess candidates for jobs or places.

The devil effect

Opposite of the above: extrapolating from negative appearances to the whole. This is why it’s important to dress smartly for an interview or court appearance, it really does influence people. In an experiment examiners were given identical answers, but some in terrible handwriting, some in beautifully clear handwriting. The samples with clear handwriting consistently scored higher marks, despite the identical factual content of the scripts.

Illusory correlation

People find links between disparate phenomena which simply don’t exist, thus:

  • people exaggerate the qualities of people or things which stand out from their environments
  • people associate rare qualities with rare things

This explains a good deal of racial prejudice: a) immigrants stand out b) a handful of immigrants commit egregious behaviour – therefore it is a classic example of illusory correlation to associate the two. What is missing is taking into account all the negative examples, i.e. the millions of immigrants who commit no egregious behaviour and whose inclusion would give you a more accurate statistical picture. Pay attention to negative cases.

Stereotypes

  1. People tend to notice anything which supports their existing opinions.
  2. We notice the actions of ‘minorities’ much more than the actions of the invisible majority.

Projection

People project onto neutral phenomena, patterns and meanings they are familiar with or which bolster their beliefs. This is compounded by –

Obstinacy

Sticking to personal opinions (often formed in haste, from first impressions, and held despite all evidence to the contrary). Aka the boomerang effect: when someone’s opinions are challenged, they just become more obstinate about them. Aka belief persistence. Aka pig-headedness. And this is exacerbated by –

Group think

People associate with others like themselves, which makes them feel safe by a) confirming their beliefs and b) letting them hide in a crowd. Experiments have shown how people in self-supporting groups are liable to become more extreme in their views. Also – and I’ve seen this myself – groups will take decisions that almost everyone in the group, as individuals, know to be wrong – but no-one is prepared to risk the embarrassment or humiliation of pointing it out. The Emperor’s New Clothes. Groups are more likely to make irrational decisions than individuals are.

Confirmation bias

The tendency to search for, interpret, favour, and recall information in a way that confirms one’s pre-existing beliefs or hypotheses. In an experiment people were read out a series of statements about a named person, who had a stated profession and then two adjectives describing them, one that you’d expect, the other less predictable. ‘Carol, a librarian, is attractive and serious’. When asked to do a quiz at the end of the session, participants showed a marked tendency to remember the expected adjective, and forget the unexpected one. Everyone remembered that the air stewardess was ‘attractive’ but remembered the librarian for being ‘serious’.

We remember what we expect to hear. (p.76)

Or: we remember what we remember in line with pre-existing habits of thought, values etc.

We marry people who share our opinions, we make friends with people who share our opinions, we agree with everyone in our circle on Facebook.

Self-serving biases

When things go well, people take the credit, when things go badly, people blame external circumstances.

Avoiding embarrassment

People obey, especially in a group situation, bad orders because they don’t want to stick out. People go along with bad decisions because they don’t want to stick out. People don’t want to admit they’ve made a mistake, in front of others, or even to themselves.

Avoiding humiliation

People are reluctant to admit mistakes in front of others. And rather than make a mistake in front of others, people would rather keep quiet and say nothing (in a meeting situation) or do nothing, if everyone else is doing nothing (in an action situation). Both of these avoidances feed into –

Obedience

The Milgram experiment showed that people will carry out appalling acts at the behest of an authoritative man in a white coat. Subjects agreed to inflict life-threatening levels of electric shock on the ‘victim’, supposedly wired up in the next-door room and emitting blood-curdling (faked) screams of pain. And 72% of Senior House Officers wouldn’t question the decision of a consultant, even if they thought he was wrong.

Conformity

Everyone else is saying or doing it, so you say or do it so as not to stick out / risk ridicule.

Obedience is behaving in a way ordered by an authority figure. Conformity is behaving in a way dictated by your peers.

The wrong length lines experiment

You’re put in a room with half a dozen stooges, and shown a piece of card with a line on it and then another piece of card with three lines of different length on it, and asked which of the lines on card B is the same length as the line on card A. To your amazement, everyone else in the room chooses a line which is obviously wildly wrong. In experiments up to 75% of people in this situation go along with the crowd and choose the line which they are sure – which they can see and know – is wrong, because everyone else did.

Sunk costs fallacy

The belief that you have to continue wasting time and money on a project because you’ve invested x amount of time and money to date. Or ‘throwing good money after bad’.

Sutherland keeps cycling round the same nexus of issues, which is that people jump to conclusions – based on availability, stereotypes, the halo and anchor effects – and then refuse to change their minds, twisting existing evidence to suit them, ignoring contradictory evidence.

Misplaced consistency & distorting the evidence

Nobody likes to admit (especially to themselves) that they are wrong. Nobody likes to admit (especially to themselves) that they are useless at taking decisions.

Our inability to acknowledge our own errors even to ourselves is one of the most fundamental causes of irrationality. (p.100)

And so:

  • people consistently avoid exposing themselves to evidence that might disprove their beliefs
  • on being faced with evidence that disproves their beliefs, they ignore it
  • or they twist new evidence so as to conform to their existing beliefs
  • people selectively remember their own experiences, or misremember the evidence they were using at the time, in order to validate their current decisions and beliefs
  • people will go to great lengths to protect their self-esteem

Sutherland says the best cleanser / solution / strategy to fixed and obstinate ideas is:

  1. to make the time to gather as much evidence as possible and
  2. to try to disprove your own position.

The best solution will be the one you have tried to demolish with all the evidence you have and still remains standing.

People tend to seek confirmation of their current hypothesis, whereas they should be trying to disconfirm it. (p.138)

Fundamental attribution error

Ascribing other people’s behaviour to their character or disposition rather than to their situation. Subjects in an experiment watched two people holding an informal quiz: the first person made up questions (based on what he knew) and asked the second person who, naturally enough, hardly got any of them right. Observers consistently credited the quizzer with higher intelligence than the answerer, completely ignoring the in-built bias of the situation, and instead ascribing the difference to character.

We are quick to personalise and blame in a bid to turn others into monolithic entities which we can then define and control – this saves time and effort, and makes us feel safer and secure – whereas the evidence is that all people are capable of a wide range of behaviours depending on the context and situation.

Once you’ve pigeon-holed someone, you will tend to notice aspects of their behaviour which confirm your view – confirmation bias and/or illusory correlation and a version of the halo/devil effect. One attribute colours your view of a more complex whole.

Actor-Observer Bias

Variation on the above: when we screw up we find all kinds of reasons in the situation to exonerate ourselves: we performed badly because we’re ill, jet-lagged, grandma died, reasons that are external to us. If someone else screws up, it is because they just are thick, lazy, useless. I.e. we think of ourselves as complex entities subject to multiple influences, and others as monolithic types.

False Consensus Effect

Over-confidence that other people think and feel like us, that our beliefs and values are the norm – in my view one of the profound cultural errors of our time.

It is a variation of the ever-present Availability Error because when we stop to think about any value or belief we will tend to conjure up images of our family and friends, maybe workmates, the guys we went to college with, and so on: in other words, the people available to memory – simply ignoring the fact that these people are a drop in the ocean of the 65 million people in the UK. See Facebubble.

The False Consensus Effect reassures us that we are normal, our values are the values, we’re the normal ones: it’s everyone else who is wrong, deluded, racist, sexist, whatever we don’t approve of.

Elsewhere, I’ve discovered some commentators naming this the Liberal fallacy:

For liberals, the correctness of their opinions – on universal health care, on Sarah Palin, on gay marriage – is self-evident. Anyone who has tried to argue the merits of such issues with liberals will surely recognize this attitude. Liberals are pleased with themselves for thinking the way they do. In their view, the way they think is the way all right-thinking people should think. Thus, ‘the liberal fallacy’: Liberals imagine that everyone should share their opinions, and if others do not, there is something wrong with them. On matters of books and movies, they may give an inch, but if people have contrary opinions on political and social matters, it follows that the fault is with the others. (Commentary magazine)

Self-Serving Bias

People tend to give themselves credit for successes but lay the blame for failures on outside causes. If the project is a success, it was all due to my hard work and leadership. If it’s a failure, it’s due to circumstances beyond my control, other people not pulling their weight etc.

Preserving one’s self-esteem 

These three errors are all aspects of preserving our self-esteem. You can see why this has an important evolutionary and psychological purpose. In order to live, we must believe in ourselves, our purposes and capacities, believe our values are normal and correct, believe we make a difference, that our efforts bring results. No doubt it is a necessary belief and a collapse of confidence and self-belief can lead to depression and possibly despair. But that doesn’t make it true.

People should learn the difference between having self-belief to motivate themselves, and developing the techniques to gather the full range of evidence – including the evidence against your own opinions and beliefs – which will enable them to make correct decisions.

Representative error

People estimate the likelihood of an event by comparing it to a prototype or stereotype that already exists in our minds – what we think of as the most relevant or typical example of a particular event or object. This often happens around notions of randomness: people have a notion of what randomness should look like, i.e. utterly scrambled. But in fact plenty of random events or sequences arrange themselves into patterns we find meaningful. So we dismiss them as not really random, i.e. we have judged them against our preconception of what random ought to look like.

Ask a selection of people which of these three sets of six coin tosses where H stands for heads, T for tails is random.

  1. TTTTTT
  2. TTTHHH
  3. THHTTH

Most people will choose 3 because it feels random. But of course all three are equally likely or unlikely.
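This is easy to verify with a few lines of Python (my sketch, not from the book): every specific sequence of six fair tosses has exactly the same probability, however ‘patterned’ it looks.

```python
from itertools import product

# Enumerate all 2**6 equally likely sequences of six fair coin tosses.
sequences = ["".join(s) for s in product("HT", repeat=6)]
p = 1 / len(sequences)

# Each specific sequence, 'patterned' or not, has probability 1/64.
for seq in ("TTTTTT", "TTTHHH", "THHTTH"):
    print(seq, p)  # 0.015625 in every case
```

The ‘random-looking’ sequence only feels more likely because it is representative of the kind of jumble we expect randomness to produce.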

Hindsight

In numerous experiments people have been asked to predict the outcome of an event, then after the event questioned about their predictions. Most people forget their inaccurate predictions and misremember that they were accurate.

Overconfidence

Most professionals have been shown to overvalue their expertise i.e. exaggerate their success rates.


Statistics

A problem with Irrationality, and with John Allen Paulos’s book Innumeracy, is that they mix up cognitive biases and statistics. Now, statistics is a completely separate and distinct area from errors of thought and cognitive biases. You can imagine someone who avoids all of the cognitive and psychological errors named above, but still makes howlers when it comes to statistics simply because they’re not very good at it.

This is because the twin areas of Probability and Statistics are absolutely fraught with difficulty. Either you have been taught the correct techniques, and understand them, and practice them regularly (and both books demonstrate that even experts make terrible mistakes in the handling of statistics and probability) or, like most of us, you have not and do not.

As Sutherland points out, most people’s knowledge of statistics is non-existent. Since we live in a society whose public discourse i.e. politics, is ever more dominated by statistics, there is a simple conclusion: most of us have little or no understanding of the principles and values which underpin modern society.

Errors in estimating probability or misunderstanding samples, opinion polls and so on, are probably a big part of irrationality, but I felt that they are so distinct from the psychological biases discussed above, that they almost require a separate volume, or a separate ‘part’ of this volume.

Briefly, common statistical mistakes are:

  • too small a sample size
  • a biased sample
  • not understanding that any combination of probabilities is less likely than either on its own, which requires an understanding of base rates, i.e. a priori probability
  • not understanding the law of large numbers – the more often a probabilistic event takes place, the closer the overall result will move towards the theoretical probability
  • not being aware of regression to the mean
Gambling

My suggestion that mistakes in handling statistics are not really the same as unconscious cognitive biases, applies even more to the world of gambling. Gambling is a highly specialised and advanced form of probability applied to games. The subject has been pored over by very clever people for centuries. It’s not a question of a few general principles, this is a vast, book-length subject in its own right. A practical point that emerges from Sutherland’s examples is:

  • always work out the expected value of a bet i.e. the amount to be won times the probability of winning it
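In code, the calculation is trivial (the numbers below are hypothetical, not Sutherland’s):

```python
def expected_value(win, p_win, stake):
    """Net expectation of a bet: gain `win` with probability p_win,
    lose `stake` with probability (1 - p_win)."""
    return win * p_win - stake * (1 - p_win)

# Hypothetical bet: stake £10 on a 1-in-6 chance of winning £40.
print(expected_value(40, 1/6, 10))  # about -1.67: a losing bet on average
```

If the expected value is negative, the bet loses money on average, however tempting the payout looks.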

The two-by-two box

It’s taken me some time to understand this principle which is given in both Paulos and Sutherland.

When two elements with a yes/no result are combined, people tend to look at the most striking correlation and fixate on it. The only way to avoid the false conclusions that follow from that is to draw a 2 x 2 box and work through the figures.

Here is a table of 1,000 women who had a mammogram because their doctors thought they had symptoms of breast cancer.

                                  Women with cancer   Women with no cancer   Total
Women with positive mammography                  74                    110     184
Women with negative mammography                   6                    810     816
Total                                            80                    920    1000

Bearing in mind that a conditional probability is the probability of X given Y – i.e. the probability of one event on condition that another has occurred – this table allows us to work out the following conditional probabilities:

1. The probability of getting a positive mammogram or test result, if you do actually have cancer, is 74 out of 80 = .92 (out of the 80 women with cancer, 74 were picked up by the test)

2. The probability of getting a negative mammogram or test result and not having cancer, is 810 out of 920 = .88

3. The probability of having cancer if you test positive, is 74 out of 184 = .40

4. The probability of having cancer if you test negative, is 6 out of 816 = .01

So 92% of women with cancer were picked up by the test. BUT Sutherland quotes a study which showed that a shocking 95% of doctors thought that this figure – 92% – was also the probability of a patient who tested positive having the disease. By far the majority of US doctors thought that, if you tested positive, you had a 92% chance of having cancer. They fixated on the 92% figure and transposed it from one outcome to the other, confusing the two. But this is wrong. The probability of a woman testing positive actually having cancer is given in conclusion 3: 74 out of 184 = 40%. This is because 110 out of the total 184 women who tested positive did not have cancer.

So if a woman tested positive for breast cancer, the chances of her actually having it are 40%, not 92%. Quite a big difference (and quite an indictment of the test, by the way). And yet 95% of doctors thought that if a woman tested positive she had a 92% likelihood of having cancer.
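Working through the same table in code makes the doctors’ confusion obvious: P(positive test, given cancer) and P(cancer, given a positive test) are different numbers (a sketch using the figures above):

```python
# The four cells of the 2 x 2 table of 1,000 mammograms.
pos_cancer, pos_clear = 74, 110   # positive test: with / without cancer
neg_cancer, neg_clear = 6, 810    # negative test: with / without cancer

cancer   = pos_cancer + neg_cancer   # 80 women with cancer
clear    = pos_clear + neg_clear     # 920 women without
positive = pos_cancer + pos_clear    # 184 positive tests
negative = neg_cancer + neg_clear    # 816 negative tests

p_pos_given_cancer = pos_cancer / cancer    # 0.92 - the figure the doctors fixated on
p_cancer_given_pos = pos_cancer / positive  # 0.40 - the one that matters to the patient
p_neg_given_clear  = neg_clear / clear      # 0.88
p_cancer_given_neg = neg_cancer / negative  # 0.01
```

Transposing the first number onto the second is exactly the mistake 95% of the doctors made.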

Sutherland goes on to quote a long list of other situations where doctors and others have comprehensively misinterpreted the results of studies like this, with sometimes very negative consequences.

The moral of the story is if you want to determine whether one event is associated with another, never attempt to keep the co-occurrence of events in your head. It’s just too complicated. Maintain a written tally of the four possible outcomes and refer to these.


Deep causes

Sutherland concludes the book by speculating that all the hundred or so types of irrationality he has documented can be attributed to five fundamental causes:

  1. Evolution We evolved to make snap decisions, we are brilliant at processing visual information and responding before we’re even aware of it. Conscious thought is slower, and the conscious application of statistics, probability, regression analysis and so on, is slowest of all. Most people never acquire it.
  2. Brain structure As soon as we start perceiving, learning and remembering the world around us our brain cells make connections. The more the experience is repeated, the stronger the connections become. Routines and ruts form, which are hard to budge.
  3. Heuristics Everyone develops mental short-cuts, techniques to help make quick decisions. Not many people bother with the laborious statistical techniques for assessing relative benefits which Sutherland describes.
  4. Failure to use elementary probability and elementary statistics Ignorance is another way of describing this, mass ignorance. Sutherland (being an academic) blames the education system. I, being a pessimist, attribute it to basic human nature. Lots of people just are lazy, lots of people just are stupid, lots of people just are incurious.
  5. Self-serving bias In countless ways people are self-centred, overvalue their judgement and intelligence, overvalue the beliefs of their in-group, refuse to accept it when they’re wrong, refuse to make a fool of themselves in front of others by confessing error or pointing out errors in others (especially the boss) and so on.

I would add two more:

Suggestibility

Humans are just tremendously suggestible. Say a bunch of positive words to test subjects, then ask them questions on an unrelated topic: they’ll answer positively. Take a different representative sample of subjects and run a bunch of negative words past them, then ask them the same unrelated questions, and their answers will be measurably more negative. Everyone is easily suggestible.

Ask subjects how they get a party started and they will talk and behave in an extrovert manner to the questioner. Ask them how they cope with feeling shy and ill at ease at parties, and they will tend to act shy and speak more quietly. Same people, but their thought patterns have been completely determined by the questions asked: the initial terms or anchor defines the ensuing conversation.

In one experiment a set of subjects were shown one photo of a car crash. Half were asked to describe what they think happened when one car hit another; the other half were asked to describe what they thought happened when one car smashed into the other. The ones given the word ‘smashed’ gave much more melodramatic accounts. Followed up a week later, the subjects were asked to describe what they remembered of the photo. The subjects given the word ‘hit’ described it fairly accurately, whereas the subjects given the word ‘smashed’ invented all kinds of details – like a sea of broken glass around the vehicles – which simply weren’t there, all at the prompting of one word.

Many of the experiments Sutherland quotes demonstrate what you might call higher-level biases: but underlying many of them is this common-or-garden observation: that people are tremendously easily swayed, by both external and internal causes, away from the line of cold logic.

Anthropomorphism 

Another big underlying cause is anthropomorphism, namely the attribution of human characteristics to objects, events, chances, odds and so on. In other words, people really struggle to accept the high incidence of random accidents. Almost everyone attributes a purpose or intention to almost everything that happens. This means our perceptions of almost everything in life are skewed from the start.

During the war Londoners devised innumerable theories about the pattern of German bombing. After the war, analysis of Luftwaffe records showed the bombing had been more or less random.

The human desire to make sense of things – to see patterns where none exists or to concoct theories… can lead people badly astray. (p.267)

Suspending judgement is about the last thing people are capable of. People are extremely uneasy if things are left unexplained. Most people rush to judgement like water into a sinking ship.

Cures

  • keep an open mind
  • reach a conclusion only after reviewing all the possible evidence
  • it is a sign of strength to change one’s mind
  • seek out evidence which disproves your beliefs
  • do not ignore or distort evidence which disproves your beliefs
  • never make decisions in a hurry or under stress
  • where the evidence points to no obvious decision, don’t take one
  • learn basic statistics and probability
  • substitute mathematical methods (cost-benefit analysis, regression analysis, utility theory) for intuition and subjective judgement
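The last suggestion can be made concrete with a toy example. This is a minimal sketch of the kind of expected-value comparison Sutherland has in mind; the two options and all the figures are invented for illustration, not taken from the book.

```python
# Toy cost-benefit comparison: choose between options by expected value
# instead of gut feeling. All figures below are invented for illustration.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs, probabilities summing to 1."""
    return sum(p * payoff for p, payoff in outcomes)

# Option A: a safe project - 90% chance of a 10,000 gain, 10% chance of nothing.
option_a = [(0.9, 10_000), (0.1, 0)]
# Option B: a risky project - 30% chance of 40,000, 70% chance of a 5,000 loss.
option_b = [(0.3, 40_000), (0.7, -5_000)]

print(round(expected_value(option_a)))  # 9000
print(round(expected_value(option_b)))  # 8500
```

On these made-up numbers the intuitively scarier option B turns out only slightly worse; the point is simply that the comparison is made explicit rather than left to intuition.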

Comments on the book

Out of date

Irrationality was first published in 1992 and this makes the book dated in several ways (maybe this is why the first paperback edition was published by upmarket mass publisher Penguin, whereas the most recent edition was published by the considerably more niche publisher, Pinter & Martin).

In the chapter about irrational business behaviour Sutherland quotes quite a few examples from the 1970s and the oil crisis of 1974. These and other examples – such as the long passage about how inefficient the civil service was in the early 1970s – feel incredibly dated now.

And the whole thing was conceived, researched and written before there was an internet or any of the digital technology we take for granted nowadays. You can’t help wondering whether the digital age has solved, or merely added to, the long list of biases, prejudices and faulty thinking which Sutherland catalogues – and what errors of reasoning have emerged specific to our fabulous digital technology.

On the other hand, out of date though the book in many ways is, it’s surprising to see how some hot button issues haven’t changed at all. In the passage about the Prisoners’ Dilemma, Sutherland takes as a real life example the problem the nations of the world were having in 1992 in agreeing to cut back carbon dioxide emissions. Sound familiar? He states that the single biggest factor undermining international co-operation against climate change was America’s refusal to sign global treaties to limit global warming. In 1992! Plus ça change.

Grumpy

The book also has passages where Sutherland gives his personal opinions about things, and some of these sound more like the grousing of a grumpy old man than anything based on evidence.

Thus Sutherland whole-heartedly disapproves of ‘American’ health fads, dismisses health foods as masochistic fashion and is particularly scathing about jogging.

He thinks ‘fashion’ in any sphere of life is ludicrously irrational. He is dismissive of doctors as a profession, who he accuses of rejecting statistical evidence, refusing to share information with patients, and wildly over-estimating their own diagnostic abilities.

Sutherland thinks the publishers of learned scientific journals are more interested in making money out of scientists than in ‘forwarding the progress of science’ (p.185).

He thinks the higher average pay that university graduates tend to get is unrelated to their attendance at university and more to do with having well connected middle- and upper-middle-class parents, and thus considers the efforts of successive Education Secretaries to introduce student loans to be unscientific and innumerate (p.186).

Surprisingly, he criticises Which? consumer magazine for using samples that are too small in its testing (p.215).

In an extended passage he summarises Leslie Chapman’s blistering (and very out of date) critique of the civil service, Your Disobedient Servant published in 1978 (pp.69-75).

Sutherland really has it in for psychoanalysis, which he accuses of all sorts of irrational thinking such as projecting, false association, refusal to investigate negative instances, failing to take into account the likelihood that the patient would have improved anyway, and so on. Half-way through the book he gives a thumbnail summary:

Self-deceit exists on a massive scale: Freud was right about that. Where he went wrong was in attributing it all to the libido, the underlying sex drive. (p.197)

In other words, the book is liberally sprinkled with Sutherland’s own grumpy personal opinions, which sometimes risk giving it a crankish feel.

Against stupidity the gods themselves contend in vain

Neither this book nor John Allen Paulos’s takes into account the obvious fact that lots of people are, how shall we put it, of low educational achievement. They begin with poor genetic material, are raised in families where no one cares about education, are let down by poor schools, and are excluded or otherwise demotivated by the whole educational experience, with the result that:

  • the average reading age in the UK is 9
  • about one in five Britons (over ten million people) are functionally illiterate, and probably about the same proportion innumerate

His book, like all books of this type, is targeted at a relatively small proportion of the population, the well-educated professional classes. Most people aren’t like that. You want proof? Trump. Brexit. Boris Johnson landslide.

Trying to keep those pesky cognitive errors at bay (in fact The Witch by Pieter Bruegel the Elder)



Innumeracy by John Allen Paulos (1988)

Our innate desire for meaning and pattern can lead us astray… (p.81)

Giving due weight to the fortuitous nature of the world is, I think, a mark of maturity and balance. (p.133)

John Allen Paulos is an American professor of mathematics who won fame beyond his academic milieu with the publication of this short (134-page) but devastating book thirty years ago, the first of a series of books popularising mathematics in a range of spheres from playing the stock market to humour.

As Paulos explains in the introduction, the world is full of humanities graduates who blow a fuse if you misuse ‘infer’ and ‘imply’, or end a sentence with a dangling participle, but are quite happy to believe and repeat the most hair-raising errors in maths, statistics and probability.

The aim of this book was:

  • to lay out examples of classic maths howlers and correct them
  • to teach readers to be more alert when maths, stats and data need to be used
  • and to provide basic rules in order to understand when innumerate journalists, politicians, tax advisors and other crooks are trying to pull the wool over your eyes, or are just plain wrong

There are five chapters:

  1. Examples and principles
  2. Probability and coincidence
  3. Pseudoscience
  4. Whence innumeracy
  5. Statistics, trade-offs and society

Many common themes emerge:

Don’t personalise, numeratise

One contention of this book is that innumerate people characteristically have a strong tendency to personalise – to be misled by their own experiences, or by the media’s focus on individuals and drama… (p.1)

Powers

The first chapter uses lots of staggering statistics to get the reader used to very big and very small numbers, and how to compute them.

1 million seconds is 11 and a half days. 1 billion seconds is 32 years.

He suggests you come up with personal examples of numbers for each power up to 12 or 13 i.e. meaningful embodiments of thousands, tens of thousands, hundreds of thousands and so on to help you remember and contextualise them in a hurry.

A snail moves at 0.005 miles an hour, Concorde at 2,000 miles per hour. Escape velocity from Earth is about 7 miles per second, or 25,000 miles per hour. The mass of the Earth is 5.98 × 10²⁴ kg.

Early on he tells us to get used to the nomenclature of ‘powers’ – using 10 to the power 3, or 10³, instead of 1,000, or 10 to negative powers to express numbers below 1. (In fact, right at this early stage I found myself stumbling, because ‘one thousand’ means more to me than 10³ and ‘a thousandth’ means more than 10⁻³ – but if you keep at it, it is a trick you can acquire quite quickly.)

The multiplication principle

He introduces us to basic ideas like the multiplication principle (aka the rule of product), which states that if some choice can be made in M different ways and some subsequent choice can be made in N different ways, then there are M × N different ways these choices can be made in succession – which can be applied to combinations of multiple items of clothing, combinations of dishes on a menu, and so on.

Thus the number of results you can get from rolling a die is 6. If you roll two dice, you can get 6 × 6 = 36 possible outcomes; three dice, 216. If you want the number on each successive die to be different, the number of ways of rolling two different numbers on two dice is 6 × 5, of rolling three different numbers on three dice 6 × 5 × 4, and so on.

Thus: Baskin Robbins advertises 31 different flavours of ice cream. Say you want a triple scoop cone. If you’re happy to have any combination of flavours, including where any 2 or 3 flavours are the same – that’s 31 x 31 x 31 = 29,791. But if you ask how many combinations of flavours there are, without a repetition of the same flavour in any of the cones – that is 31 x 30 x 29 = 26,970 ways of combining.
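Both ice-cream counts can be checked in a couple of lines of Python, just multiplying the number of choices at each step:

```python
# Counting by multiplying the number of choices at each step.

# Two dice: 6 outcomes for the first roll times 6 for the second.
print(6 * 6)          # 36

# Triple scoop from 31 flavours, repeats allowed:
print(31 * 31 * 31)   # 29791

# Triple scoop, no flavour repeated (order still counted):
print(31 * 30 * 29)   # 26970
```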

Probability

I struggled with even the basics of probability. I understand a 1 in 5 chance of something happening, and reasonably understand a 20% chance of something happening, but I struggled when probability was expressed as a decimal, e.g. 0.2 as a way of writing a 20 per cent or 1 in 5 chance.

With the result that he lost me on page 16, at or about the place where he explains the following example.

Apparently a noted 17th century gambler asked the famous mathematician Pascal which is more likely to occur: obtaining at least one 6 in four rolls of a single die, or obtaining at least one 12 in twenty four rolls of a pair of dice. Here’s the solution:

Since 5/6 is the probability of not rolling a 6 on a single roll of a die, (5/6)⁴ is the probability of not rolling a 6 in four rolls of the die. Subtracting this number from 1 gives us the probability that this latter event (no 6s) doesn’t occur; in other words, of there being at least one 6 rolled in four tries: 1 – (5/6)⁴ = .52. Likewise, the probability of rolling at least one 12 in twenty-four rolls of a pair of dice is seen to be 1 – (35/36)²⁴ = .49.

a) He loses me in the second sentence, which I’ve read half a dozen times and still don’t understand – it’s where he says ‘the probability that this latter event doesn’t occur’: something about the phrasing there, about the double negative, loses me completely, with the result that b) I have no idea whether .52 is more likely or less likely than .49.
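The double negative becomes easier to follow in code: first compute the probability that the event never happens, then subtract from 1. A minimal sketch (and, for the record, .52 is the larger probability, so ‘at least one 6 in four rolls’ is the likelier event):

```python
# Chevalier de Mere's question: which is more likely?
# P(at least once) = 1 - P(never happens in n independent trials)

# At least one 6 in four rolls of a single die:
p_one_six = 1 - (5/6) ** 4        # 1 minus P(no 6 in any of the 4 rolls)

# At least one 12 (double 6) in twenty-four rolls of a pair of dice:
p_double_six = 1 - (35/36) ** 24  # 1 minus P(no double 6 in any of the 24 rolls)

print(round(p_one_six, 2))     # 0.52
print(round(p_double_six, 2))  # 0.49
```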

He goes on to give another example: if 20% of drinks dispensed by a vending machine overflow their cups, what is the probability that exactly three of the next ten will overflow?

The probability that the first three drinks overflow and the next seven do not is (.2)³ × (.8)⁷. But there are many different ways for exactly three of the ten cups to overflow, each way having probability (.2)³ × (.8)⁷. It may be that only the last three cups overflow, or only the fourth, fifth and ninth cups, and so on. Thus, since there are altogether (10 × 9 × 8) / (3 × 2 × 1) = 120 ways for us to pick three out of the ten cups, the probability of some collection of exactly three cups overflowing is 120 × (.2)³ × (.8)⁷.

I didn’t understand the need for the (10 x 9 x 8) / (3 x 2 x 1) equation – I didn’t understand what it was doing, and so didn’t understand what it was measuring, and so didn’t understand the final equation. I didn’t really have a clue what was going on.
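For what it’s worth, the (10 × 9 × 8) / (3 × 2 × 1) term is simply counting how many ways there are to choose which three of the ten cups overflow – and Python’s standard library has a function for exactly that:

```python
from math import comb

# P(exactly 3 of the next 10 cups overflow), given P(overflow) = 0.2 per cup.
ways = comb(10, 3)                  # 120 ways to pick which 3 cups overflow
p = ways * (0.2 ** 3) * (0.8 ** 7)  # each such way has probability .2^3 * .8^7

print(ways)         # 120
print(round(p, 3))  # 0.201
```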

In fact, by page 20, he’d done such a good job of bamboozling me with examples like this that I sadly concluded that I must be innumerate.

More than that, I appear to have ‘maths anxiety’ because I began to feel physically unwell as I read that problem paragraph again and again and again and didn’t understand it. I began to feel a tightening of my chest and a choking sensation in my throat. Rereading it now is making it feel like someone is trying to strangle me.

Maybe people don’t like maths because being forced to confront something you don’t understand, but which everyone around you is saying is easy-peasy, makes you feel ill.

2. Probability and coincidence

Having more or less given up on trying to understand Paulos’s maths demonstrations in the first twenty pages, I can at least latch on to his verbal explanations of what he’s driving at, in sentences like these:

A tendency to drastically underestimate the frequency of coincidences is a prime characteristic of innumerates, who generally accord great significance to correspondences of all sorts while attributing too little significance to quite conclusive but less flashy statistical evidence. (p.22)

It would be very unlikely for unlikely events not to occur. (p.24)

There is a strong general tendency to filter out the bad and the failed and to focus on the good and the successful. (p.29)

Belief in the… significance of coincidences is a psychological remnant of our past. It constitutes a kind of psychological illusion to which innumerate people are particularly prone. (p.82)

Slot machines light up and make a racket when people win, there is unnoticed silence for all the failures. Big winners on the lottery are widely publicised, whereas every one of the tens of millions of failures is not.

One result is ‘Golden Age’ thinking when people denigrate today’s sports or arts or political figures, by comparison with one or two super-notable figures from the vast past, Churchill or Shakespeare or Michelangelo, obviously neglecting the fact that there were millions of also-rans and losers in their time as well as ours.

The expected value of a quantity is the average of its values weighted according to their probabilities. I understood these words, but I didn’t understand any of the five examples he gave.
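The definition is less forbidding in code. This example (a single roll of a fair die) is mine, not one of Paulos’s five:

```python
# Expected value: each possible value weighted by its probability, summed.
# Example: the expected score from one roll of a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probabilities = [1/6] * 6

expected = sum(v * p for v, p in zip(values, probabilities))
print(round(expected, 2))  # 3.5
```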

The likelihood of improbability In many situations, improbability is to be expected. The probability of being dealt a particular hand of 13 cards in bridge is less than 1 in 600 billion. And yet it happens every time someone is dealt a hand in bridge. The improbable can happen. In fact it happens all the time.

The gambler’s fallacy The belief that, because a tossed coin has come up tails for a number of tosses in a row, it becomes steadily more likely that the next toss will be a head.

3. Pseudoscience

Paulos rips into Freudianism and Marxism for the way they can explain away any result counter to their ‘theories’. The patient gets better due to therapy: therapy works. The patient doesn’t get better during therapy, well the patient was resisting, projecting their neuroses on the therapist, any of hundreds of excuses.

But this is just warming up before he rips into a real bugbear of his: the wrong-headedness of parapsychology, the paranormal, predictive dreams, astrology, UFOs, pseudoscience and so on.

As with predictive dreams, winning the lottery or miracle cures, many of these practices continue to flourish because it’s the handful of successes which stand out and grab our attention and not the thousands of negatives.

Probability

As Paulos steams on with examples from tossing coins, rolling dice, playing roulette, or poker, or blackjack, I realise all of them are to do with probability or conditional probability, none of which I understand.

This is why I have never gambled on anything, and can’t play poker. When he explains precisely how accumulating probabilities can help you win at blackjack in a casino, I switch off. I’ve never been to a casino. I don’t play blackjack. I have no intention of ever playing blackjack.

When he says that probability theory began with gambling problems in the seventeenth century, I think, well since I don’t gamble at all, on anything, maybe that’s why so much of this book is gibberish to me.

Medical testing and screening

Apart from gambling, the two most ‘real world’ areas where probability is important appear to be medicine and risk and safety assessment. Here’s an extended example he gives of how even doctors can get the odds wrong.

Assume there is a test for cancer which is 98% accurate i.e. if someone has cancer, the test will be positive 98 percent of the time, and if one doesn’t have it, the test will be negative 98 percent of the time. Assume further that .5 percent – one out of two hundred people – actually have cancer. Now imagine that you’ve taken the test and that your doctor sombrely informs you that you have tested positive. How depressed should you be? The surprising answer is that you should be cautiously optimistic. To find out why, let’s look at the conditional probability of your having cancer, given that you’ve tested positive.

Imagine that 10,000 tests for cancer are administered. Of these, how many are positive? On the average, 50 of these 10,000 people (.5 percent of 10,000) will have cancer, and, so, since 98 percent of them will test positive, we will have 49 positive tests. Of the 9,950 cancerless people, 2 percent of them will test positive, for a total of 199 positive tests (.02 x 9,950 = 199). Thus, of the total of 248 positive tests (199 + 49 = 248), most (199) are false positives, and so the conditional probability of having cancer given that one tests positive is only 49/248, or about 20 percent! (p.64)

I struggled to understand this explanation. I read it four or five times, controlling my sense of panic, and did, eventually, I think, follow the argument.
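Re-running Paulos’s tally in code may make the surprising 20 per cent easier to believe:

```python
# Paulos's cancer-screening example: a 98%-accurate test, 0.5% base rate.
population = 10_000
with_cancer = 50                                     # 0.5% of 10,000
true_positives = with_cancer * 0.98                  # 49 of the 50 test positive
false_positives = (population - with_cancer) * 0.02  # 199 of the 9,950 healthy

p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(round(p_cancer_given_positive, 2))  # 0.2 - i.e. about 20%
```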

However, worse in a way, when I think I did finally understand it, I realised I just didn’t care. It’s not just that the examples he gives are hard to follow. It’s that they’re hard to care about.

Whereas his descriptions of human psychology and cognitive errors in human thinking are crystal clear and easy to assimilate:

If we have no direct evidence or theoretical support for a story, we find that detail and vividness vary inversely with likelihood; the more vivid details there are to a story, the less likely the story is to be true. (p.84)

4. Whence innumeracy?

It came as a vast relief when Paulos stopped trying to explain probability and switched to a long chapter puzzling over why innumeracy is so widespread in society, which kicks off by criticising the poor level of teaching of maths in school and university.

This was like the kind of hand-wringing article you can read any day of the week in a newspaper or online, and so felt reassuringly familiar and easy to assimilate. I stopped feeling so panic-stricken.

This puzzling over the disappointing level of innumeracy goes on for quite a while. Eventually it ends with a digression about what appears to be a pet idea of his: the notion of introducing a safety index for activities and illnesses.

Paulos’s suggestion is that his safety index would be on a logarithmic scale, like the Richter scale – so straightaway he has to explain what a logarithm is: the logarithm of 100 is 2 because 100 is 10², the logarithm of 1,000 is 3 because 1,000 is 10³. I’m with him so far, as he goes on to explain that the logarithm of 700 – i.e. of a number between 100 and 1,000 – is about 2.8. Since 1 in 5,300 Americans dies in a car crash each year, the safety index for driving would be 3.7, the logarithm of 5,300. And so on with numerous more examples, whose relative risks or dangers he reduces to figures like 4.3 and 7.1.
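The quoted values do check out against a base-10 logarithm. A minimal sketch of the scheme as described:

```python
from math import log10

def safety_index(one_in_n):
    """Paulos's proposed safety index: log10 of the annual 1-in-N odds of death."""
    return round(log10(one_in_n), 1)

print(safety_index(100))    # 2.0 - a 1-in-100 annual risk
print(safety_index(5_300))  # 3.7 - roughly the US annual odds of a fatal car crash
```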

I did understand his aim and the maths of this. I just thought it was bonkers:

1. What is the point of introducing a universal index which you would have to explain every time anyone wanted to use it? Either it is designed to be usable by the widest possible number of citizens, or it is a neat exercise in maths to please other mathematicians and statisticians.

2. And here’s the bigger objection – what Paulos, like most of the university-educated, white, liberal intellectuals I read in papers, magazines and books, fails to take into account is that a large proportion of the population is thick.

Up to a fifth of the adult population of the UK is functionally innumerate – that is, they don’t know what a ‘25% off’ sign on a shop window means. For me, an actual social catastrophe brought about by this attitude is the introduction of Universal Credit by the Conservative government – a system designed, from top to bottom, by middle-class, highly educated people who’ve all got internet accounts and countless apps on their smartphones, and who have shown a breathtaking ignorance of what life is like for the poor, sick, disabled, illiterate and innumerate people the system is aimed at.

Same with Paulos’s scheme. Smoking is one of the most dangerous and stupid things any human can do. Packs of cigarettes have for years now carried pictures of disgusting cancerous growths and the words SMOKING KILLS. And yet, despite this, about a fifth of adults – getting on for 10 million people – still smoke.

Do you really think that introducing a system using ornate logarithms will get people to make rational assessments of the risks of common activities and habits?

Paulos then goes on to complicate the idea by suggesting that, since the media is always more interested in danger than safety, maybe it would be more effective, instead of creating a safety index, to create a danger index.

You would do this by

  1. working out the risk of an activity (i.e. number of deaths or accidents per person doing the activity)
  2. converting that into a logarithmic value (just to make sure that nobody understands it) and then
  3. subtracting the logarithmic value of the safety index from 10, in order to create a danger index

He goes on to say that driving a car and smoking would have ‘danger indices’ of 3.7 and 2.9, respectively. The trouble was that by this point I had completely ceased to understand what he was saying. I felt as if I had stepped off the edge of a tall building into thin air. I began to have that familiar choking sensation, as if someone were squeezing my chest. Maths anxiety.

Under this system being kidnapped would have a safety index of 6.7. Playing Russian roulette once a year would have a safety index of 0.8.

It is symptomatic of the uselessness of the whole idea that Paulos has to remind you what the values mean (‘Remember that the bigger the number, the smaller the risk.’ Really? You expect people to run with this idea?)

Having completed the danger index idea, Paulos returns to his extended lament on why people don’t like maths. He gives a long list of reasons why he thinks people are so innumerate, a condition which is, for him, a puzzling mystery.

For me this lament is a classic example of what you could call intellectual out-of-touchness. He is genuinely puzzled why so many of his fellow citizens are innumerate, can’t calculate simple odds and fall for all sorts of paranormal, astrology, snake-oil blether.

He proposes typically academic, university-level explanations for this phenomenon – such as that people find maths too cold and analytical and worry that it prevents them thinking about the big philosophical questions in life. He worries that maths has an image problem.

In other words, he fails to consider the much more obvious explanation that maths, probability and numeracy in general might be a combination of fanciful, irrelevant and deeply, deeply boring.

I use the word ‘fanciful’ deliberately. When he writes that the probability of drawing two aces in succession from a pack of cards is not (4/52 x 4/52) but (4/52 x 3/51) I do actually understand the distinction he’s making (having drawn one ace there are only 3 left and only 52 cards left) – I just couldn’t care less. I really couldn’t care less.

Or take this paragraph:

Several years ago Pete Rose set a National League record by hitting safely in forty-four consecutive games. If we assume for the sake of simplicity that he batted .300 (30 percent of the time he got a hit, 70 percent of the time he didn’t) and that he came to bat four times a game, the chances of his not getting a hit in any given game were, assuming independence, (.7)⁴ = .24… [at this point Paulos has to explain what ‘independence’ means in a baseball context: I couldn’t care less]… So the probability he would get at least one hit in any game was 1 – .24 = .76. Thus, the chances of him getting a hit in any given sequence of forty-four consecutive games were (.76)⁴⁴ = .0000057, a tiny probability indeed. (p.44)

I did, in fact, understand the maths and the working out in this example. I just don’t care about the problem or the result.
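For any reader who does care, the quoted figures are straightforward to verify:

```python
# Pete Rose's 44-game hitting streak, under Paulos's simplifying assumptions:
# a .300 hitter with four independent at-bats per game.
p_no_hit_in_game = 0.7 ** 4            # chance of going hitless in one game
p_hit_in_game = 1 - p_no_hit_in_game   # chance of at least one hit in a game
p_streak = p_hit_in_game ** 44         # chance of hitting in 44 straight games

print(round(p_no_hit_in_game, 2))  # 0.24
print(round(p_hit_in_game, 2))     # 0.76
print(f"{p_streak:.7f}")           # 0.0000057
```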

For me this is a – maybe the – major flaw of the book: in the blurbs on the front and back, in the introduction and all the way through the text, Paulos goes on and on about how we as a society need to be mathematically numerate because maths (and particularly probability) impinges on so many areas of our lives.

But when he tries to show this – when he gets the opportunity to show us what all these areas of our lives actually are – he completely fails.

Almost all of the examples in the book are not taken from everyday life, they are remote and abstruse problems of gambling or sports statistics.

  • which is more likely: obtaining at least one 6 in four rolls of a single die, or obtaining at least one 12 in twenty four rolls of a pair of dice?
  • if 20% of drinks dispensed by a vending machine overflow their cups, what is the probability that exactly three of the next ten will overflow?
  • Assume there is a test for cancer which is 98% accurate i.e. if someone has cancer, the test will be positive 98 percent of the time, and if one doesn’t have it, the test will be negative 98 percent of the time. Assume further that .5 percent – one out of two hundred people – actually have cancer. Now imagine that you’ve taken the test and that your doctor sombrely informs you that you have tested positive. How depressed should you be?
  • What are the odds on Pete Rose getting a hit in a sequence of forty-four games?

Are these the kinds of problems you are going to encounter today? Or tomorrow? Or ever?

No. The longer the book went on, the more I realised just how small a role maths plays in my everyday life. In fact, more or less the only role maths plays in my life is looking at the prices in supermarkets, where I am attracted to goods which have a temporary reduction on them – but I do that because their labels are coloured red, not because I calculate the savings. That, and being aware of the time, so that I know when to do household chores or be somewhere punctually. Those were the only times I used numbers today.

5. Statistics, trade-offs and society

This feeling that the abstruseness of the examples utterly contradicts the bold claims that reading the book will help us with everyday experiences was confirmed in the final chapter, which begins with the following example.

Imagine four dice, A, B, C and D, strangely numbered as follows: A has 4 on four faces and 0 on two faces; B has 3s on all six faces; C has four faces with 2 and two faces with 6; and D has 5 on three faces and 1 on three faces…

I struggled to the end of this sentence and just thought: ‘No, no more, I don’t have to make myself feel sick and unhappy any more’ – and skipped the couple of pages detailing the fascinating and unexpected results you can get from rolling such a collection of dice.
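For readers less queasy than the author, the skipped punchline is worth recovering: these are Efron’s non-transitive dice, and each die beats the next one in the cycle A > B > C > D > A with probability 2/3. A short sketch:

```python
from itertools import product

# Efron's non-transitive dice, numbered as in the quotation above.
dice = {
    'A': [4, 4, 4, 4, 0, 0],
    'B': [3, 3, 3, 3, 3, 3],
    'C': [6, 6, 2, 2, 2, 2],
    'D': [5, 5, 5, 1, 1, 1],
}

def p_beats(x, y):
    """Probability that die x rolls higher than die y (36 equally likely pairs)."""
    wins = sum(a > b for a, b in product(dice[x], dice[y]))
    return wins / 36

for x, y in [('A', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'A')]:
    print(f"{x} beats {y} with probability {p_beats(x, y):.3f}")  # 0.667 each time
```

So whichever die your opponent picks, you can always pick one that beats it – the ‘unexpected result’ the author skipped.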

This chapter goes on to a passage about the Prisoner’s Dilemma, a well-known problem in logic, which I have read about and instantly forgotten scores of times over the years.

Paulos gives us three or four variations on the idea, including:

  • Imagine you are locked up in prison by a philanthropist with 20 other people.

Or:

  • Imagine you are locked in a dungeon by a sadist with 20 other people.

Or:

  • Imagine you are one of two drug traffickers making a quick transaction on a street corner and forced to make a quick decision.

Or:

  • Imagine you are locked in a prison cell, and another prisoner is locked in an identical cell down the corridor.

Well, I’m not any of these things, I’m never likely to be, and I am not really interested in these fanciful speculations.

Moreover, I am well into middle age, have travelled round the world, had all sorts of jobs in companies small, large and enormous – and I am not aware of having ever been in any situation which remotely resembled any variation of the Prisoner’s Dilemma I’ve ever heard of.

In other words, to me, it is another one of the endless pile of games and puzzles which logicians and mathematicians love to spend all day playing but which have absolutely no impact whatsoever on any aspect of my life.
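For what it’s worth, the dilemma the author keeps forgetting fits in a dozen lines. The sentence lengths below are conventional textbook values, not Paulos’s:

```python
# Prisoner's Dilemma payoffs: years in prison (lower is better).
# Key is (my choice, other prisoner's choice); value is (my years, their years).
payoff = {
    ('silent',  'silent'):  (1, 1),    # both co-operate: light sentences
    ('silent',  'confess'): (10, 0),   # I'm betrayed: I serve 10, they walk
    ('confess', 'silent'):  (0, 10),   # I betray: I walk, they serve 10
    ('confess', 'confess'): (5, 5),    # both defect: heavy sentences
}

# Whatever the other prisoner does, confessing shortens my own sentence...
for theirs in ('silent', 'confess'):
    assert payoff[('confess', theirs)][0] < payoff[('silent', theirs)][0]

# ...and yet mutual confession leaves both worse off than mutual silence.
print(payoff[('confess', 'confess')], 'vs', payoff[('silent', 'silent')])
```

That tension – individually rational defection producing a collectively worse outcome – is the whole point, and is why the 1992 climate-treaty stand-off mentioned earlier is a real-world instance of it.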

Pretty much all of his examples conclusively prove how remote mathematical problems and probabilistic calculations are from the everyday lives you and I lead. When he asks:

How many people would there have to be in a group in order for the probability to be half that at least two people in it have the same birthday? (p.23)

Imagine a factory which produces small batteries for toys, and assume the factory is run by a sadistic engineer… (p.117)

It dawns on me that my problem might not be that I’m innumerate so much as that I’m just uninterested in trivial or frivolous mental exercises.

Someone offers you a choice of two envelopes and tells you one has twice as much money in it as the other. (p.127)

Flip a coin continuously until a tail appears for the first time. If this doesn’t happen until the twentieth (or later) flip, you win $1 billion. If the first tail occurs before the twentieth flip, you pay $100. Would you play? (p.128)

No. I’d go and read an interesting book.

Thoughts

If Innumeracy: Mathematical Illiteracy and Its Consequences is meant to make its readers more numerate, it failed with me.

This is for a number of reasons:

  1. crucially – because he doesn’t explain maths very well; the way he explained probability had lost me by about page 16 – in other words, if this is meant to be a primer for innumerate people it’s a fail
  2. because the longer it went on, the more convinced I became that I rarely use maths, arithmetic or probability in my day-to-day life: whole days go by when I don’t do a single sum, and so I lost all motivation to submit myself to the brain-hurting ordeal of trying to understand his examples
  3. because the structure and presentation of the book is a mess: it meanders through a fog of jokes, anecdotes and maths trivia, baseball stories and gossip about American politicians – before suddenly unleashing a fundamental aspect of probability theory on the unwary reader

I’d have preferred the book to have had a clear, didactic structure, with an introduction and chapter headings explaining just what he was going to do, an explanation, say, of how he was going to take us through some basic concepts of probability one step at a time.

And then for the concepts to have been laid out very clearly and explained very clearly, from a number of angles, giving a variety of different examples until he and we were absolutely confident we’d got it – before we moved on to the next level of complexity.

The book is nothing like this. Instead it sacrifices any attempt at logical sequencing or clarity for anecdotes about Elvis Presley or UFOs, for digressions about Biblical numerology, the silliness of astrology, the long and bewildering digression about introducing a safety index for activities (summarised above), or prolonged analyses of baseball or basketball statistics. Oh, and a steady drizzle of terrible jokes.

Which two sports have face-offs?
Ice hockey and leper boxing.

Halfway through the book, Paulos tells us that he struggles to write long texts (‘I have a difficult time writing at extended length about anything’, p.88), and I think it really shows.

It certainly explains why:

  • the blizzard of problems in coin tossing and dice rolling stopped without any warning, as he switched tone completely, giving us first a long chapter about all the crazy irrational beliefs people hold, and then another chapter listing all the reasons why society is innumerate
  • the last ten pages of the book give up any attempt at a coherent narrative and disintegrate into a bunch of miscellaneous odds and ends he couldn’t find a place for in the main body of the text

Also, I found that the book was not about numeracy in the broadest sense, but mostly about probability. Again and again he reverted to examples of tossing coins and rolling dice. One enduring effect of reading this book is going to be that, the next time I read a description of someone tossing a coin or rolling a die, I’m just going to skip right over the passage, knowing that if I read it I’ll either be bored to death (if I understand it) or have an unpleasant panic attack (if I don’t).

In fact in the coda at the end of the book Paulos explicitly says it has mostly been about probability – God, I wish he’d explained that at the beginning.

Right at the very, very end he briefly lists key aspects of probability theory which he claims to have explained in the book – but he hasn’t: some of them are only briefly referred to, with no explanation at all, including statistical tests and confidence intervals, cause and correlation, conditional probability, independence, the multiplication principle, and the notions of expected value and probability distribution.

These are now names I have at least read about, but they are all concepts I am nowhere near understanding, and light years away from being able to use in practical life.
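For the record, the two simplest ideas on that list, the multiplication principle and expected value, really can be stated in a few lines. Here is my own sketch in Python (not an example from the book), using exact fractions rather than simulation:

```python
from fractions import Fraction
from itertools import product

# Multiplication principle: for independent events, probabilities multiply.
# P(two fair coins both land heads) = 1/2 * 1/2 = 1/4.
p_heads = Fraction(1, 2)
p_two_heads = p_heads * p_heads
print(p_two_heads)  # 1/4

# Sanity check: enumerate the whole sample space of two coin tosses.
outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT
assert Fraction(outcomes.count(("H", "H")), len(outcomes)) == p_two_heads

# Expected value: the probability-weighted average of the outcomes.
# For one fair six-sided die: (1 + 2 + ... + 6) / 6 = 3.5.
expected = sum(Fraction(face, 6) for face in range(1, 7))
print(expected)  # 7/2
```

That’s the whole of it: multiply probabilities of independent events, and weight outcomes by their probabilities to get an average. Whether a reader could extract either rule from Paulos’s coin-and-dice anecdotes is another matter.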

Innumeracy – or illogicality?

Also there was an odd disconnect between the broadly psychological and philosophical prose explanations of what makes people so irrational, and the incredibly narrow scope of the coin-tossing, baseball-scoring examples.

What I’m driving at is that, in the long central chapter on Pseudoscience, when he stopped to explain what makes people so credulous, so gullible, he didn’t really use any mathematical examples to disprove Freudianism or astrology and so on: he had to appeal to broad principles of psychology, such as:

  • people are drawn to notable exceptions, instead of considering the entire field of entities
  • people filter out the bad and the failed and focus on the good and the successful
  • people seize hold of the first available explanation, instead of considering every single possible permutation
  • people humanise and personalise events (‘bloody weather, bloody buses’)
  • people over-value coincidences

My point is that the long chapter about pseudoscience reveals a fundamental conceptual confusion in the book: his complaint, deep down, right at bottom, is not that people are innumerate; it is that people are hopelessly irrational and illogical.

Now this subject – the fundamental ways in which people are irrational and illogical – is dealt with much better, at much greater length, in a much more thorough, structured and comprehensible way in Stuart Sutherland’s great book, Irrationality, which I’ll be reviewing and summarising later this week.

Innumeracy amounts to random scratches on the surface of the vast iceberg which is the deep human inability to think logically.

Conclusion

In summary, for me at any rate, this was not a good book: badly structured, meandering in direction, unable to explain even basic concepts yet packed with digressions, hobby horses and cul-de-sacs, unsure of its real purpose, stopping for a long rant against pseudosciences and an even longer lament on why maths is taught so badly – a weird curate’s egg of a text.

Its one positive effect was to make me want to track down and read a good book about probability.

