Tips For Trying To Think Less Irrationally

Professor Stuart Sutherland divides his book Irrationality: The Enemy Within into 23 chapters, each addressing a different aspect of why human beings are so prone to irrational, illogical, biased and erroneous thinking.

Each chapter, having trotted through its allotted subject, ends with a few tentative suggestions of how to address the various biases and errors described in it.

This blog post is a summary of that advice. I have omitted tips which are so tied to specific examples that they’re incomprehensible out of context, and trimmed most of them down (and expanded a few).

The Wrong Impression

  1. Never base a judgement or decision on a single case, no matter how striking.
  2. In forming an impression of a person (or object) try to break your judgement down into his (or its) separate qualities without letting any strikingly good or bad qualities influence your opinion about the remainder: especially in interviews or medical diagnoses.
  3. When exposed to a train of evidence or information, suspend judgement until the end: try to give as much weight to the last piece of evidence as the first.
  4. Try to resist the temptation to seek out only information which reinforces the decision you have already taken. Try to seek out all the relevant information needed to make a decision.

Obedience

  1. Think before obeying.
  2. Question whether an order is justified.

Conformity

  1. Think carefully before announcing a decision or commitment in front of others. Once done, these are hard to change.
  2. Ask yourself whether you are doing or saying something merely because others are doing or saying it. If you have doubts, really reflect on them and gather evidence for them.
  3. Don’t be impressed by advice on a subject from someone just because you admire them, unless they are an expert on the matter in hand.
  4. Don’t be stampeded into acting by crowds. Stand aloof.

In-groups and out-groups

  1. Don’t get carried away by group decisions. Consciously formulate arguments against the group decision.
  2. If you’re forming a team or committee, invite people with different beliefs or skill sets.
  3. Reflect on your own prejudices and the ‘types’ of people you dislike or despise.

Organisational folly

(A list of errors in large organisations, which are difficult to cure, hence there are no tips at the end of this chapter.)

Misplaced consistency

  1. Beware of over-rating the results of a choice you’ve made (because the human tendency is to slowly come to believe all your decisions have been perfect).
  2. Try not to move by small steps to an action or attitude you would initially have disapproved of.
  3. No matter how much time, effort or money you have invested in a project, cut your losses if the future looks uncertain / risky.

Misuse of Rewards and Punishments

  1. If you want someone to value a task and perform well, do not offer material rewards. Appeal to their sense of responsibility and pride.
  2. If you are a manager, adopt as participatory and egalitarian a style as possible.
  3. If you want to stop children (and anyone else) from doing something, try to persuade rather than threatening them with punishment.

Drive and Emotion

  1. Don’t take important decisions when under stress or strong emotions.
  2. Every time you subdue an impulse, it becomes easier to do so.

Ignoring the Evidence (Pearl Harbour)

  1. Search for the evidence against your hypothesis, decision, beliefs.
  2. Try to entertain hypotheses which are antagonistic to each other.
  3. Respect beliefs and ideas which conflict with your own. They might be right.

Distorting the Evidence (Battle of Arnhem)

  1. If new evidence comes in don’t distort it to support your existing actions or views. The reverse: consider carefully whether it disproves your position.
  2. Don’t trust your memory. Countless experiments prove that people remember what they need to remember to justify their actions and bolster their self-esteem.
  3. Changing your mind in light of new evidence is a sign of strength, not weakness.

Making the Wrong Connections

  1. If you want to determine whether one event is associated with another, never attempt to keep the co-occurrence of events in your head. Maintain a written tally of the four possible outcomes in a 2 x 2 box.
  2. Remember that A is only associated with B if B occurs a higher percentage of the time in the presence of A than in its absence.
  3. Pay particular attention to negative cases.
  4. In particular, do not associate things together because you expect them to be, or because they are unusual.
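Tips 1 and 2 can be sketched in a few lines of Python. The counts below are invented for illustration; the point is that the tally covers all four cells, including the easily ignored negative cases:

```python
# A 2 x 2 tally of co-occurrence, with invented counts for illustration.
# Rows: A present / A absent; columns: B present / B absent.
a_and_b     = 30   # A present, B present
a_not_b     = 10   # A present, B absent
not_a_and_b = 45   # A absent,  B present
not_a_not_b = 15   # A absent,  B absent

p_b_given_a     = a_and_b / (a_and_b + a_not_b)              # 30/40 = 0.75
p_b_given_not_a = not_a_and_b / (not_a_and_b + not_a_not_b)  # 45/60 = 0.75

# B occurs no more often when A is present than when it is absent, so
# despite 30 striking co-occurrences there is no association at all.
print(p_b_given_a, p_b_given_not_a)
```

With these numbers, someone keeping only the co-occurrences in their head would 'see' a strong link; the written tally shows there is none.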

Mistaking Connections in Medicine

(Focuses on doctors’ failure to use 2 x 2 tables to establish correct probabilities in diagnosis, so perhaps the tip should be: don’t try to calculate conditional probabilities in your head – write them down.)

Mistaking the Cause

  1. Suspect any explanation of an event in which the cause and the effect are similar to one another.
  2. Suspect all epidemiological findings unless they are supported by more reliable evidence.
  3. Consider whether an event could have causes other than the one you first think of.
  4. In allocating cause and effect, consider that they might happen in the opposite direction to that you first choose.
  5. Be sceptical of any causal relationship unless there is an underlying theory that explains it.
  6. In apportioning responsibility for an action, do not be influenced by the magnitude of its effect.
  7. Don’t hold someone responsible for an action without first considering what others would have done in their place.
  8. Don’t assume that other people are like you.

Misinterpreting the Evidence

  1. Do not judge solely by appearances. If someone looks more like an X than a Y, they may still be a Y if there are many more Ys than Xs.
  2. A statement containing two or more pieces of information is always less likely to be true than one containing only one piece of information.
  3. Do not believe a statement is true just because part of it is true.
  4. If you learn the probability of Y given X, you cannot infer the probability of X given Y without also knowing the base rate of X.
  5. Don’t trust small samples.
  6. Beware of biased samples.
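Tip 4 is Bayes’ theorem in disguise, and a worked example makes it concrete. The test accuracy and base rate below are invented numbers, chosen only to show how strongly the base rate matters:

```python
# Base-rate illustration (invented numbers): a test that is '90% accurate'
# for a condition with a 1% base rate.
base_rate      = 0.01   # P(X)
sensitivity    = 0.90   # P(positive | X)
false_positive = 0.10   # P(positive | not X)

# Total probability of testing positive, then Bayes' theorem.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_x_given_positive = sensitivity * base_rate / p_positive

# Despite the '90% accurate' test, a positive result means X is only
# about 8% likely, because X is rare to begin with.
print(round(p_x_given_positive, 3))  # 0.083
```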

Inconsistent decisions and Bad Bets

  1. Always work out the expected value of a gamble before accepting it.
  2. Before accepting any form of gamble be clear what you want from it – high expected value, the remote possibility of winning a large sum with a small outlay, a probable but small gain, or just the excitement of gambling and damn the expense. If you seriously intend solely to make money, work out the expected value of a gamble before accepting it.
  3. Don’t be anchored by the first figure you hear; ignore it and reason from scratch.
  4. Many connected or conditional probabilities make an event more unlikely with every new addition. Conversely, the sum of numerous independent probabilities may add up to make something quite likely.
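Tips 1 and 4 are both simple arithmetic. A hypothetical gamble (the stake and prize below are invented) shows the expected-value calculation, and the second snippet shows how chaining conditions multiplies probabilities downwards:

```python
# Expected value of a hypothetical gamble: pay 2 for a ticket that wins
# 1000 with probability 1/1000, and nothing otherwise.
stake = 2.0
outcomes = [(1 / 1000, 1000.0), (999 / 1000, 0.0)]  # (probability, payout)

expected_value = sum(p * v for p, v in outcomes) - stake
print(expected_value)  # on average you lose 1 per ticket

# Tip 4: a chain of four individually likely conditions is already
# well below the likelihood of any single one of them.
chain = 0.9 * 0.9 * 0.9 * 0.9
print(round(chain, 3))  # 0.656
```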

Overconfidence

  1. Distrust anyone who says they can predict the present from the past.
  2. Be wary of anyone who claims to be able to predict the future.
  3. Try to control your own over-confidence e.g.
    • wherever possible, try to write out and calculate probabilities rather than using ‘intuition’
    • always think of arguments which contradict your position and work them through

Risks

  1. People are liable to ignore risks if told to i.e. it is managers’ responsibility to assess the risks for their staff.
  2. Insidious chronic dangers may kill more people than dramatic accidents i.e. coal pollution kills more people than nuclear accidents.

False Inferences

  1. Regression to the mean: remember that whenever anything extreme happens, chances are the next thing will be a lot less extreme. This explains why second novels, albums or sports seasons are often disappointing after an award-winning first novel, album or season.
  2. If two pieces of evidence always agree, you only need one of them to make a prediction.
  3. Avoid the gambler’s fallacy i.e. the belief that a certain random event is less likely or more likely, given a previous event or a series of events i.e. if you toss a coin long enough heads must come up. No. Each new toss is a new event, uninfluenced by previous events.
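The gambler’s fallacy can even be checked by brute force: enumerate every equally likely sequence of six fair tosses, keep only those beginning with five tails, and count how often the sixth toss is heads.

```python
# Brute-force check of the gambler's fallacy: all 64 sequences of six
# fair-coin tosses are equally likely, so conditional frequencies can
# simply be counted.
from itertools import product

sequences = list(product("HT", repeat=6))  # all 2**6 = 64 outcomes
after_five_tails = [s for s in sequences if s[:5] == ("T",) * 5]

heads_next = sum(1 for s in after_five_tails if s[5] == "H")
print(heads_next, len(after_five_tails))  # 1 of 2: still exactly 50/50
```

A run of five tails leaves exactly two equally likely continuations, one ending in heads and one in tails: the coin has no memory.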

The Failure of Intuition

  1. Suspect anyone who claims to have good intuition.
  2. If you are in a profession, consider using mathematical models of decision making instead of trusting your ‘intuition’.

Utility

  1. When the importance of a decision merits the expenditure of time, use Utility Theory.
  2. Before making an important decision decide what your overall aim is, whether it be to maximise the attainment of your goals, to save yourself from loss, to make at least some improvement to your situation etc.
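A minimal sketch of the utility calculation (the options, probabilities and utility numbers below are all invented): score each option by the sum of probability × utility over its outcomes, then pick the option with the highest expected utility.

```python
# Expected-utility sketch with invented options and numbers.
# Each option maps to a list of (probability, utility) outcomes.
options = {
    "safe job":      [(1.0, 50)],                # certain, modest utility
    "risky venture": [(0.2, 300), (0.8, 10)],    # big win, but unlikely
}

def expected_utility(outcomes):
    """Sum of probability x utility over an option's outcomes."""
    return sum(p * u for p, u in outcomes)

best = max(options, key=lambda name: expected_utility(options[name]))
print(best, expected_utility(options[best]))
```

The arithmetic is trivial; the hard part, as tip 2 says, is deciding what utilities to assign in the first place.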

Causes, cures and costs

  • keep an open mind
  • reach a conclusion only after reviewing all the possible evidence
  • it is a sign of strength to change one’s mind
  • seek out evidence which disproves your beliefs
  • do not ignore or distort evidence which disproves your beliefs
  • never make decisions in a hurry or under stress
  • where the evidence points to no obvious decision, don’t take one
  • learn basic statistics and probability
  • substitute mathematical methods (cost-benefit analysis, regression analysis, utility method) for intuition and subjective judgement

Thoughts

This is all very good advice, and I’d advise anyone to read Sutherland’s book. However, I can see scope for improvement or taking it further.

The structure above reflects Sutherland’s, i.e. it arranges the field in terms of the errors people make – each chapter is devoted to a type of error, with various kinds of evidence describing experiments which show how common it is.

In a sense this is an easy approach. There exist nowadays numerous lists of cognitive errors and biases.

Arguably, it would be more helpful to try and make a book or helpsheet arranged by problems and solutions, in which – instead of beginning another paragraph ‘Imagine you toss a coin a thousand times…’ in order to demonstrate another common misunderstanding of probability theory – each chapter focused on types of real-world situation and how to handle them.

It would be titled something like How to think more clearly about… and then devote a chapter each to meeting new people, interviews, formal meetings and so on. There would be a standalone chapter devoted just to probability theory, since this stands out to me as being utterly different from the psychological biases – and maybe another one devoted solely to gambling since this, also, amounts to a specialised area of probability.

There would be one on financial advisers and stock brokers, giving really detailed advice on what to look for before hiring one, and whether you need one at all.

There would be one solely about medical statistics i.e. explaining how to understand the risks and benefits of medical treatment, if you ever need some.

Currently, although Sutherland’s book and the list of tips listed above are useful, it is impossible to remember all of them. A more practical approach would be to have a book (or website) of problems or situations where you could look up the situation and be reminded of the handful of simple but effective principles you should bear in mind.


Irrationality: The Enemy Within by Stuart Sutherland (1992)

The only way to substantiate a belief is to try to disprove it. (p.48)

Sutherland was 65 when he wrote this book, nearing the end of a prestigious career in psychology research. His aim was to lay out, in 23 themed chapters, the psychological and sociological research data from hundreds of experiments which show just how prey the human mind is to a plethora of unconscious biases, prejudices, errors, mistakes, misinterpretations and so on – the whole panoply of ways in which supposedly rational human beings can end up making grotesque mistakes. By the end he claims to have defined and demonstrated over 100 distinct cognitive errors humans are prone to (p.309).

I first read it in 2000 and it made a big impact on me because I didn’t really know that this entire area of study existed, and had certainly never read such a compendium of sociology and psychology experiments before.

I found the naming of the various errors particularly powerful. They reminded me of the lists of weird and wonderful Christian heresies I was familiar with from years of reading medieval history. And, after all, the two have a lot in common, both being lists of ‘errors’ which the human mind can make as it falls short of a) orthodox theology and b) optimally rational thinking, the great shibboleths of the Middle Ages and of the Modern World, respectively.

 

Reading it now, 20 years later, having brought up a couple of children and worked for a while in big government departments, I am a lot less shocked and amazed. I have witnessed at first hand the utter irrationality of small and medium-sized children – and then so many examples of the corporate conformity, avoidance of embarrassment, unwillingness to speak up, deferral to authority, and general mismanagement to be found in the civil service that, upon rereading the book, hardly any of it came as a surprise, more a confirmation of what I’ve witnessed at first hand.

But to have the errors so carefully named and defined and worked through in a structured way, with so many experiments giving such vivid proof of how useless humans are at even basic logic was still very enjoyable.

What is rationality?

You can’t define irrationality without first defining what you mean by rationality:

Rational thinking is most likely to lead to the conclusion that is correct, given the information available at the time (with the obvious rider that, as new information comes to light, you should be prepared to change your mind).

Rational action is that which is most likely to achieve your goals. But in order to achieve this, you have to have clearly defined goals. Not only that but, since most people have multiple goals, you must clearly prioritise your goals.

Few people think hard about their goals and even fewer think hard about the many possible consequences of their actions. (p.129)

Cognitive biases contrasted with logical fallacies

Before proceeding it’s important to point out that there is a wholly separate subject of logical fallacies. As part of his Philosophy A-Level my son was given a useful handout with a list of about fifty of these. But logical fallacies are not the same as cognitive biases.

A logical fallacy stems from an error in a logical argument; it is specific and easy to identify and correct. Cognitive bias derives from deep-rooted, thought-processing errors which themselves stem from problems with memory, attention, self-awareness, mental strategy and other mental mistakes. Far harder to acknowledge, in many cases, very hard to correct.

Fundamentals of irrationality

1. Innumeracy One of the largest causes of all irrational behaviour is that people by and large don’t understand statistics or maths. Thus most people are not intellectually equipped to understand the most reliable type of information available to human beings – data in the form of numbers. Instead they tend to make decisions based on a wide range of faulty and irrational psychological biases.

2. Physiology People are often influenced by physiological factors. Apart from obvious ones like tiredness or hunger, which are universally known to affect people’s cognitive abilities, there are also a) drives (direct and primal) like hunger, thirst, sex, and b) emotions (powerful but sometimes controllable) like love, jealousy, fear and – especially relevant – embarrassment: acute reluctance to acknowledge limits to your own knowledge or that you’ve made a mistake.

More seriously people can be alcoholics, drug addicts, and prey to a wide range of other obsessive behaviours, not to mention suffering from a wide range of mental illnesses or conditions which undermine any attempt at rational decision-making, such as stress, anxiety or, at the other end of the spectrum, depression and loss of interest.

3. The functional limits of consciousness Numerous experiments have shown that human beings have a limited capacity to process information. Given that people rarely have a) a sufficient understanding of the relevant statistical data, and b) the RAM capacity to process all the data required to make the optimum decision, it is no surprise that most of us fall back on all manner of more limited, non-statistical biases and prejudices when it comes to making decisions.

4. The wish to feel good The world is threatening, dangerous and competitive. Humans want to feel safe, secure, calm, in control. This is fair enough, but it does mean that people have a way of blocking out any kind of information which threatens them. People irrationally believe they are cleverer than they in fact are, believe they are qualified in areas of activity or knowledge where they aren’t, stick to bad decisions for fear of being embarrassed or humiliated, and for the same reason reject new evidence which contradicts their position.

Named types of error and bias

Jumping to conclusions Sutherland tricks the reader on page one by asking a series of questions and then pointing out that, if you tried to answer about half of them, you are a fool, since they don’t contain enough information to arrive at any sort of solution. Jumping to conclusions before we have enough evidence is a basic and universal error. One way round this is to habitually use a pen and paper to set out the pros and cons of any decision, which also helps highlight areas where you realise you don’t have enough information.

The availability error All the evidence is that the conscious mind can only hold a small number of data or impressions at any one time (near the end of the book, Sutherland claims the maximum is seven items, p.319). Many errors are due to people reaching for the most available explanation, using the first thing that comes to mind, and not taking the time to investigate further and make a proper, rational survey of the information.

Many experiments show that you can unconsciously bias people by planting ideas, words or images in their minds which then directly affect decisions they take hours later about supposedly unconnected issues.

Studies show that doctors who have seen a run of a certain condition among their patients become more likely to diagnose it in patients who don’t have it. The diagnosis is more ‘available’.

The news media is hard-wired to publicise shocking and startling stories which leads to the permanent misleading of the reading public. One tourist eaten by a shark in Australia eclipses the fact that you are far more likely to die in a car crash than be eaten by a shark.

Thus ‘availability’ is also affected by impact or prominence. Experimenters read out a list of men’s and women’s names to two groups, without telling them that the list contained exactly 25 men and 25 women, and asked them to guess the ratio of the sexes. If the list included some famous men, the group was influenced to think there were more men; if it included famous women, the group thought there were more women.

The entire advertising industry is based on the availability error in the way it invents straplines, catchphrases and jingles designed to pop to the front of your mind when you consider any type of product, to be – in other words – super available.

I liked the attribution to the availability error of the well-known fact that retailers price goods at just under the nearest pound. Most of us find £5.95 much more attractive than £6 because we only process the initial 5, the first digit: it is more available.

Numerous studies have shown that the effect is hugely increased under stress. In stressful situations – in an accident, say – people fixate on the first solution that comes to mind and refuse to budge.

The primacy effect First impressions. Interviewers make up their minds in the first minute of an interview and then spend the rest of the time collecting data to confirm that first impression.

The anchor effect In picking a number people tend to choose one close to any number they were presented with. Two groups were asked to estimate whether the population of Turkey was a) bigger than 5 million b) less than 65 million, and what it was. The group who’d had 5 million planted in their mind hovered around 15 million, the group who’d had 65 million hovered around 35 million. They were both wrong. It is 80 million.

The halo effect People extrapolate the nature of the whole from just one quality e.g. in tests, people think attractive people must be above average in personality and intelligence although of course there is no reason why they should be. Hence this error’s alternative name, the ‘physical attractiveness stereotype’. The halo effect is fundamental to advertising which seeks to associate images of beautiful men, women, smiling children, sunlit countryside etc with the product.

The existence of the halo effect and primacy effect are both reasons why interviews are a poor way to assess candidates for jobs or places.

The devil effect Opposite of the above: extrapolating from negative appearances to the whole. This is why it’s important to dress smartly for an interview or court appearance: it really does influence the outcome. In an experiment examiners were given identical answers, some in terrible handwriting, some in beautifully clear handwriting. The clear handwriting consistently scored higher marks despite the identical factual content of the scripts.

Illusory correlation People find links between disparate phenomena which simply don’t exist, thus:

  • people exaggerate the qualities of people or things which stand out from their environments
  • people associate rare qualities with rare things

This explains a good deal of racial prejudice: a) immigrants stand out, b) a handful of immigrants commit egregious behaviour – therefore it is a classic example of illusory correlation to associate the two. What is missing is taking into account all the negative examples, i.e. the millions of immigrants who commit no egregious behaviour and whose inclusion would give you a more accurate statistical picture. Pay attention to negative cases.

Stereotypes 1. People tend to notice anything which supports their existing opinions. 2. We notice the actions of ‘minorities’ much more than the actions of the invisible majority.

Projection People project onto neutral phenomena patterns and meanings they are familiar with or which bolster their beliefs. Compounded by –

Obstinacy Sticking to personal opinions (often formed in haste / on first impressions / despite all evidence to the contrary), aka the boomerang effect: when someone’s opinions are challenged, they just become more obstinate about them. Aka belief persistence. Aka pig-headedness. Exacerbated by –

Group think People associate with others like themselves, which makes them feel safe by a) confirming their beliefs and b) letting them hide in a crowd. Experiments have shown how people in self-supporting groups are liable to become more extreme in their views. Also – and I’ve seen this myself – groups will take decisions that almost everyone in the group, as individuals, know to be wrong – but no-one is prepared to risk the embarrassment or humiliation of pointing it out. The Emperor’s New Clothes. Groups are more likely to make irrational decisions than individuals are.

Confirmation bias The tendency to search for, interpret, favour, and recall information in a way that confirms one’s pre-existing beliefs or hypotheses. In an experiment people were read out a series of statements about a named person, who had a stated profession and then two adjectives describing them, one what you’d expect, the other less predictable. ‘Carol, a librarian, is attractive and serious’. When asked to do a quiz at the end of the session, participants showed a marked tendency to remember the expected adjective, and forget the unexpected one. Everyone remembered that the air stewardess was ‘attractive’ but remembered the librarian for being ‘serious’.

We remember what we expect to hear. (p.76)

Or: we remember what we remember in line with pre-existing habits of thought, values etc.

We marry people who share our opinions, we have friends with people who share our opinions, we agree with everyone in our circle on Facebook.

Self-serving biases When things go well, people take the credit, when things go badly, people blame external circumstances.

Avoiding embarrassment People obey, especially in a group situation, bad orders because they don’t want to stick out. People go along with bad decisions because they don’t want to stick out. People don’t want to admit they’ve made a mistake, in front of others, or even to themselves.

Avoiding humiliation People are reluctant to admit mistakes in front of others. And rather than make a mistake in front of others, people would rather keep quiet and say nothing (in a meeting situation) or do nothing, if everyone else is doing nothing (in an action situation). Both of these avoidances feed into –

Obedience The Milgram experiment showed that ordinary people will inflict what they believe to be appalling suffering when instructed to by an authoritative man in a white coat. Around two-thirds of his subjects administered what they thought were life-threatening levels of electric shock to the victim, supposedly wired up in the next-door room and emitting blood-curdling (faked) screams of pain. 72% of Senior House Officers wouldn’t question the decision of a consultant, even if they thought he was wrong.

Conformity Everyone else is saying or doing it, so you say or do it so as not to stick out / risk ridicule.

Obedience is behaving in a way ordered by an authority figure. Conformity is behaving in a way dictated by your peers.

The wrong length lines experiment. You’re put in a room with half a dozen stooges, and shown a piece of card with a line on it, then another piece of card with three lines of different lengths on it, and asked which of the lines on card B is the same length as the line on card A. To your amazement, everyone else in the room chooses a line which is obviously wildly wrong. In experiments up to 75% of people in this situation go along with the crowd at least once and choose the line which they are sure, can see, know is wrong – because people are that easily swayed.

Sunk costs fallacy The belief that you have to continue wasting time and money on a project because you’ve invested x amount of time and money to date. Or ‘throwing good money after bad’.

Sutherland keeps cycling round the same nexus of issues, which is that people jump to conclusions – based on availability, stereotypes, the halo and anchor effects – and then refuse to change their minds, twisting existing evidence to suit them, ignoring contradictory evidence.

Misplaced consistency & distorting the evidence Nobody likes to admit (especially to themselves) that they are wrong. Nobody likes to admit (especially to themselves) that they are useless at taking decisions.

Our inability to acknowledge our own errors even to ourselves is one of the most fundamental causes of irrationality. (p.100)

And so:

  • people consistently avoid exposing themselves to evidence that might disprove their beliefs
  • on being faced with evidence that disproves their beliefs, they ignore it
  • or they twist new evidence so as to confirm their existing beliefs
  • people selectively remember their own experiences, or misremember the evidence they were using at the time, in order to validate their current decisions and beliefs
  • people will go to great lengths to protect their self-esteem

Sutherland says the best cleanser / solution / strategy for fixed and obstinate ideas is to make the time to gather as much evidence as possible and to try to disprove your own position. The best solution will be the one which you have tried to demolish with all the evidence you have and which is still left standing.

People tend to seek confirmation of their current hypothesis, whereas they should be trying to disconfirm it. (p.138)

Fundamental attribution error Ascribing other people’s behaviour to their character or disposition rather than to their situation. Subjects in an experiment watched two people holding an informal quiz: the first person made up questions (based on what he knew) and asked the second person who, naturally enough, hardly got any of them right. Observers consistently credited the quizzer with higher intelligence than the answerer, completely ignoring the in-built bias of the situation, and instead ascribing the difference to character.

We are quick to personalise and blame in a bid to turn others into monolithic entities which we can then define and control – this saves time and effort, and makes us feel safer and secure – whereas the evidence is that all people are capable of a wide range of behaviours depending on the context and situation.

Once you’ve pigeon-holed someone, you will tend to notice aspects of their behaviour which confirm your view – confirmation bias and/or illusory correlation and a version of the halo/devil effect. One attribute colours your view of a more complex whole.

Actor-Observer Bias Variation on the above: when we screw up we find all kinds of reasons in the situation to exonerate ourselves – we performed badly because we’re ill, jet-lagged, grandma died: reasons that are external to us. If someone else screws up, it is because they are just thick, lazy, useless. I.e. we think of ourselves as complex entities subject to multiple influences, and of others as monolithic types.

False Consensus Effect Over-confidence that other people think and feel like us, that our beliefs and values are the norm – in my view one of the greatest errors of our time.

It is a variation of the ever-present Availability Error because when we stop to think about any value or belief we will tend to conjure up images of our family and friends, maybe workmates, the guys we went to college with, and so on: in other words, the people available to memory – simply ignoring the fact that these people are a drop in the ocean of the 65 million people in the UK. See Facebubble.

The False Consensus Effect reassures us that we are normal, our values are the values, we’re the normal ones: it’s everyone else who is wrong, deluded, racist, sexist, whatever we don’t approve of.

Not in Sutherland’s book, I’ve discovered some commentators naming this the Liberal fallacy:

For liberals, the correctness of their opinions – on universal health care, on Sarah Palin, on gay marriage – is self-evident. Anyone who has tried to argue the merits of such issues with liberals will surely recognize this attitude. Liberals are pleased with themselves for thinking the way they do. In their view, the way they think is the way all right-thinking people should think. Thus, “the liberal fallacy”: Liberals imagine that everyone should share their opinions, and if others do not, there is something wrong with them. On matters of books and movies, they may give an inch, but if people have contrary opinions on political and social matters, it follows that the fault is with the others. (Commentary magazine)

Self-Serving Bias People tend to give themselves credit for successes but lay the blame for failures on outside causes. If the project is a success, it was all due to my hard work and leadership. If it’s a failure, it’s due to circumstances beyond my control, other people not pulling their weight etc.

Preserving one’s self-esteem These three errors are all aspects of preserving our self-esteem. You can see why this has an important evolutionary and psychological purpose. In order to live, we must believe in ourselves, our purposes and capacities, believe our values are normal and correct, believe we make a difference, that our efforts bring results. No doubt it is a necessary belief and a collapse of confidence and self-belief can lead to depression and possibly despair. But that doesn’t make it true. People should learn the difference between having self-belief to motivate themselves, and developing the techniques to gather the full range of evidence – including the evidence against your own opinions and beliefs – which will enable them to make correct decisions.

Representative error People estimate the likelihood of an event by comparing it to a prototype or stereotype that already exists in their minds – what we think of as the most typical example of a particular event or object. This often happens around notions of randomness: people have a notion of what randomness should look like, i.e. utterly scrambled. But in fact plenty of random events or sequences arrange themselves into patterns we find meaningful, so we dismiss them as not really random – i.e. we have judged them against our preconception of what random ought to look like.

Ask a selection of people which of these three sequences of six coin tosses (where H stands for heads and T for tails) is random:

  1. TTTTTT
  2. TTTHHH
  3. THHTTH

Most people will choose 3 because it feels random. But of course all three are equally likely: each specific sequence of six tosses has probability (1/2)^6 = 1/64.
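A quick way to convince yourself of this (a small Python sketch of my own, not from the book): enumerate all 64 possible sequences of six tosses and check that each of the three candidates appears exactly once among them.

```python
from itertools import product

# Each specific sequence of six fair tosses has probability (1/2)**6 = 1/64,
# whatever pattern it appears to show.
sequences = ["TTTTTT", "TTTHHH", "THHTTH"]
all_outcomes = ["".join(p) for p in product("HT", repeat=6)]

print(len(all_outcomes))  # 64 equally likely outcomes
for seq in sequences:
    # each sequence occurs exactly once among the 64 possibilities
    print(seq, "->", all_outcomes.count(seq) / len(all_outcomes))  # 0.015625
```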

Hindsight In numerous experiments people have been asked to predict the outcome of an event, then after the event questioned about their predictions. Most people forget their inaccurate predictions and misremember that they were accurate.

Overconfidence Most professionals have been shown to overvalue their expertise i.e. exaggerate their success rates.


Statistics

The trouble with this and Paulos’s books is that the entire area of statistics is separate and distinct from errors of thought and cognitive biases. That is, you can imagine someone who avoids all of the cognitive and psychological errors named above but still makes howlers when it comes to statistics, simply because they’re not very good at it.

This is because the twin areas of probability and statistics are absolutely fraught with difficulty. Either you have been taught the correct techniques, understand them and practise them regularly, or, like most of us, you have not (and both books demonstrate that even experts make howling mistakes in handling statistics and probability).

As Sutherland points out, most people’s knowledge of statistics is non-existent. Since we live in a society whose public discourse, i.e. politics, is ever more dominated by statistics…

Errors in estimating probability or misunderstanding samples, opinion polls and so on are probably a big part of irrationality, but I felt that they’re so distinct from the psychological biases discussed above, that they almost require a separate volume, or a separate ‘part’ of this volume. Briefly, common mistakes are:

  • too small a sample size
  • biased sample
  • not understanding that a combination of probabilities can never be more likely than either on its own, which requires an understanding of base rates or a priori probability
  • ignoring the law of large numbers – the more often a probabilistic event is repeated, the closer the overall result moves towards the theoretical probability
  • ignoring regression to the mean
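The law of large numbers is easy to see in simulation. Here is a hedged sketch of my own (not an example from either book): as the number of fair coin flips grows, the observed proportion of heads drifts towards the theoretical 0.5.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate batches of fair coin flips of increasing size and watch the
# observed proportion of heads converge on the theoretical probability.
for n in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} flips: proportion of heads = {heads / n:.4f}")
```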

Gambling

This is even more true of gambling. It is a highly specialised and advanced form of probability applied to games which have been pored over by very clever people for centuries. It’s not a question of a few general principles: this is a vast, book-length subject in its own right. Some points that emerge:

  • always work out the expected value of a bet, i.e. the amount to be won times the probability of winning it, minus the stake times the probability of losing
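As a worked illustration (my own example, not one from the book): a single-number bet in European roulette pays 35 to 1 and there are 37 pockets, so the expected value per unit staked is negative.

```python
# Expected value of a bet = (payout if you win) * P(win) - stake * P(lose).
# Hypothetical example: a single-number bet in European roulette pays
# 35-to-1 and there are 37 equally likely pockets.
p_win = 1 / 37
payout = 35   # units won on a 1-unit stake
stake = 1

expected_value = payout * p_win - stake * (1 - p_win)
print(f"Expected value per unit staked: {expected_value:.4f}")  # about -0.027
```

In other words, the house edge means you lose about 2.7% of every unit staked, on average, however the individual spins turn out.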

The two-by-two box

It’s taken me some time to understand this principle which is given in both Paulos and Sutherland.

When two elements with a yes/no result are combined, people tend to look at the most striking correlation and fixate on it. The only way to avoid the false conclusions that follow from that is to draw a 2 x 2 box and work through the figures.

Here is a table of 1,000 women who had a mammogram because their doctors thought they had symptoms of breast cancer.

                                  Women with cancer   Women with no cancer   Total
Women with positive mammography          74                   110             184
Women with negative mammography           6                   810             816
Total                                    80                   920            1000

Bearing in mind that a conditional probability is the probability of X given Y – i.e. the chance of X on condition that Y holds – this table allows us to work out the following conditional probabilities:

1. The probability of getting a positive mammogram or test result, if you do actually have cancer, is 74 out of 80 = .92 (out of the 80 women with cancer, 74 were picked up by the test)

2. The probability of getting a negative mammogram or test result, if you do not have cancer, is 810 out of 920 = .88 (out of the 920 women without cancer, 810 were correctly cleared by the test)

3. The probability of having cancer if you test positive, is 74 out of 184 = .40

4. The probability of having cancer if you test negative, is 6 out of 816 = .01

So 92% of women with cancer were picked up by the test. BUT Sutherland quotes a study which showed that a shocking 95% of doctors thought that this figure – 92% – was also the probability of a patient who tested positive having the disease. By far the majority of US doctors thought that, if you tested positive, you had a 92% chance of having cancer. They fixated on the 92% figure and transposed it from one outcome to the other, confusing the two. But this is wrong. The probability of a woman who tests positive actually having cancer is given in conclusion 3: 74 out of 184 = 40%. This is because 110 of the 184 women who tested positive did not have cancer.

So if a woman tested positive for breast cancer, the chances of her actually having it are 40%, not 92%. Quite a big difference (and quite an indictment of the test, by the way). And yet 95% of doctors thought that if a woman tested positive she had a 92% likelihood of having cancer.

Sutherland goes on to quote a long list of other situations where doctors and others have comprehensively misinterpreted the results of studies like this, with varied and sometimes very negative consequences.

The moral of the story is if you want to determine whether one event is associated with another, never attempt to keep the co-occurrence of events in your head. It’s just too complicated. Maintain a written tally of the four possible outcomes and refer to these.
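The four-cell tally can also be worked through in a few lines of code. This is my own sketch, using the figures from the table above, and it makes the doctors’ error obvious: sensitivity (92%) and the probability of cancer given a positive test (40%) are entirely different quantities.

```python
# The four cells of the mammogram table: rows are test results,
# columns are cancer / no cancer.
true_pos, false_pos = 74, 110   # positive test
false_neg, true_neg = 6, 810    # negative test

total_pos = true_pos + false_pos          # 184 women tested positive
total_cancer = true_pos + false_neg       # 80 women actually had cancer
total_no_cancer = false_pos + true_neg    # 920 did not

p_pos_given_cancer = true_pos / total_cancer        # sensitivity, 74/80
p_neg_given_no_cancer = true_neg / total_no_cancer  # specificity, 810/920
p_cancer_given_pos = true_pos / total_pos           # what the doctors confused, 74/184
p_cancer_given_neg = false_neg / (false_neg + true_neg)  # 6/816

print(f"P(positive | cancer)    = {p_pos_given_cancer:.3f}")    # about 0.92
print(f"P(negative | no cancer) = {p_neg_given_no_cancer:.3f}")
print(f"P(cancer | positive)    = {p_cancer_given_pos:.3f}")    # about 0.40
print(f"P(cancer | negative)    = {p_cancer_given_neg:.3f}")
```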


Deep causes

Sutherland concludes the book by speculating that all the hundred or so types of irrationality he has documented can be attributed to five fundamental causes:

  1. Evolution We evolved to make snap decisions, we are brilliant at processing visual information and responding before we’re even aware of it. Conscious thought is slower, and the conscious application of statistics, probability, regression analysis and so on is very slow and laborious. Most people never acquire it.
  2. Brain structure As soon as we start perceiving, learning and remembering the world around us brain cells make connections. The more the experience is repeated, the stronger the connections become. Routines and ruts form, which are hard to budge.
  3. Heuristics Everyone develops mental short-cuts, techniques to help make quick decisions. Not many people bother with the laborious statistical techniques for assessing relative benefits which Sutherland describes.
  4. Failure to use elementary probability and elementary statistics Another way of describing this is ignorance – mass ignorance. Sutherland (being an academic) blames the education system. I, being a pessimist, attribute it to basic human nature: lots of people just are lazy, lots of people just are stupid, lots of people just are incurious.
  5. Self-serving bias In countless ways people are self-centred, overvalue their judgement and intelligence, overvalue the beliefs of their in-group, refuse to accept it when they’re wrong, refuse to make a fool of themselves in front of others by confessing error or pointing out errors in others (especially the boss) and so on.

I would add two more:

Suggestibility. Humans are just tremendously suggestible.

Say a bunch of positive words to test subjects, then ask them questions on an unrelated topic: they’ll answer positively. Take a different representative sample of subjects and run a bunch of negative words past them, then ask them the same unrelated questions, and their answers will be measurably more negative.

Ask subjects how they get a party started and they will talk and behave in an extroverted way towards the questioner. Ask them how they cope with feeling shy and ill at ease at parties, and they will tend to act shy and speak more quietly. The initial terms, or anchor, define the ensuing conversation.

In one experiment a set of subjects were shown a photo of a car crash. Half were asked to describe what they thought happened when one car hit another; the other half were asked to describe what they thought happened when one car smashed into the other. The ones given the word ‘smashed’ gave much more melodramatic accounts. Followed up a week later and asked to describe what they remembered of the photo, the subjects given the word ‘hit’ described it fairly accurately, whereas the subjects given the word ‘smashed’ invented all kinds of details – like a sea of broken glass around the vehicles – which simply weren’t there, all at the prompting of one word.

Many of the experiments Sutherland quotes demonstrate what you might call higher-level biases: but underlying many of them is this common-or-garden observation, that people are tremendously easily swayed, by both external and internal causes, away from the line of cold logic.

Anthropomorphism Another big underlying cause is anthropomorphism, namely the attribution of human characteristics to objects, events, chances, odds and so on. In other words, people really struggle to accept the high incidence of random accidents. Almost everyone attributes a purpose or intention to almost everything that happens. This means our perceptions of almost everything in life are skewed from the start.

During the war Londoners devised innumerable theories about the pattern of German bombing. After the war, when Luftwaffe records were analysed, they showed the bombing had been more or less random.

The human desire to make sense of things – to see patterns where none exists or to concoct theories… can lead people badly astray. (p.267)

Suspending judgement is about the last thing people are capable of. People are extremely uneasy if things are left unexplained. Most people rush to judgement like water into a sinking ship.

Cures

  • keep an open mind
  • reach a conclusion only after reviewing all the possible evidence
  • it is a sign of strength to change one’s mind
  • seek out evidence which disproves your beliefs
  • do not ignore or distort evidence which disproves your beliefs
  • never make decisions in a hurry or under stress
  • where the evidence points to no obvious decision, don’t take one
  • learn basic statistics and probability
  • substitute mathematical methods (cost-benefit analysis, regression analysis, utility theory) for intuition and subjective judgement

Comments on the book

Out of date

Irrationality was first published in 1992 and this makes the book dated in several ways (maybe this is why the first paperback edition was published by upmarket mass publisher Penguin, whereas the most recent edition was published by the considerably more niche publisher, Pinter & Martin).

In the chapter about irrational business behaviour he quotes quite a few examples from the 1970s and the oil crisis of 1974. These and other examples – such as the long passage about how inefficient the civil service was in the early 1970s – feel incredibly dated now.

And the whole thing was conceived, researched and written before there was an internet or any of the digital technology we take for granted nowadays. I can’t help wondering how the digital age has changed or added to the long list of biases, prejudices and faulty thinking he gives, and what errors of reason have emerged specific to our fabulous digital technology.

Grumpy

But it also has passages where Sutherland extrapolates out to draw general conclusions and some of these sound more like the grumblings of a grumpy old man than anything based on evidence.

Thus Sutherland whole-heartedly disapproves of ‘American’ health fads, dismisses health foods as masochistic fashion and is particularly scathing about jogging. He thinks ‘fashion’ in any sphere of life is ludicrously irrational. He is dismissive of doctors, whom he accuses of rejecting statistical evidence, refusing to share information with patients and wildly over-estimating their own diagnostic abilities.

He thinks the publishers of learned scientific journals are more interested in making money out of scientists than in ‘forwarding the progress of science’ (p.185). He thinks the higher average pay that university graduates tend to get is unrelated to their attendance at university and more to do with having well connected middle and upper middle class parents, and thus considers the efforts of successive Education Secretaries to introduce student loans to be unscientific and innumerate (p.186). He criticises Which? consumer magazine for using samples that are too small in its testing (p.215). In an extended passage he summarises Leslie Chapman’s blistering (and very out of date) critique of the civil service, Your Disobedient Servant, published in 1978 (pp.69-75).

He really has it in for psychoanalysis, which he accuses of all sorts of irrational thinking: projection, false association, refusal to investigate negative instances, failure to take into account the likelihood that the patient would have improved anyway, and so on. Halfway through the book he gives a thumbnail summary:

Self-deceit exists on a massive scale: Freud was right about that. Where he went wrong was in attributing it all to the libido, the underlying sex drive. (p.197)

In other words, the book is liberally sprinkled with Sutherland’s own personal opinions, which sometimes risk giving it a crankish feel.

On the other hand it’s surprising to see how some hot button issues haven’t changed at all. In the passage about the Prisoners’ Dilemma, Sutherland takes as a real life example the problem the nations of the world were having in 1992 in agreeing to cut back carbon dioxide emissions. Sound familiar?

He also states that the single biggest factor undermining international co-operation was America’s refusal to sign global treaties to limit global warming. In 1992! Plus ça change.

Against stupidity the gods themselves contend in vain

And finally, these are the mistakes made by the most intelligent and best educated among us, people trained to assess and act on evidence.

Neither this nor John Allen Paulos’s books take into account the obvious fact that lots of people are stupid. They begin with poor genetic material, are raised in families where no-one cares about education, are let down by poor schools, and are excluded or otherwise demotivated, with the result that:

  • the average reading age in the UK is 9
  • about one in five Britons (over ten million) are functionally illiterate, and probably about the same proportion innumerate

which all adds to the general festival of idiocy.

Trying to keep those pesky cognitive errors at bay (in fact The Witch by Pieter Bruegel the Elder)




A Mathematician Reads the Newspaper: Making Sense of the Numbers in the Headlines by John Allen Paulos (1995)

Always be smart. Seldom be certain. (p.201)

Mathematics is not primarily a matter of plugging numbers into formulas and performing rote computations. It is a way of thinking and questioning that may be unfamiliar to many of us, but is available to almost all of us. (p.3)

John Allen Paulos

John Allen Paulos is an American professor of mathematics who came to wider fame with the publication of his short (130-page) primer, Innumeracy: Mathematical Illiteracy and its Consequences, in 1988.

It was followed by Beyond Numeracy: Ruminations of a Numbers Man in 1991 and this book, A Mathematician Reads the Newspaper in 1995.

Structure

The book is made up of about 50 short chapters. He explains that each one of them will take a topic in the news in 1993 and 1994 and show how it can be analysed and understood better using mathematical tools.

The subjects of the essays are laid out under the same broad headings that you’d encounter in a newspaper, with big political stories at the front, giving way to:

  • Local, business and social issues
  • Lifestyle, spin and soft news
  • Science, medicine and the environment
  • Food, book reviews, sports and obituaries

Response

The book is disappointing in all kinds of ways.

First and foremost, he does not look at specific stories. All the headlines are invented. Each 4- or 5-page essay may or may not touch on aspects of various topics in the news, but none looks at one major news story and carefully deconstructs how it was created and publicised in disregard of basic mathematics, probability and statistics. (This alone is highly suggestive of the possibility that, despite all his complaints to the contrary, specific newspaper stories in which specific mathematical howlers are made and can be corrected are, in fact, surprisingly rare.)

The second disappointment is that, even though these essays are very short, they cannot stay focused on one idea or example for much more than a page. I hate to say it and I don’t mean to be rude, but Paulos’s text has some kind of attention deficit disorder: the essays skitter all over the place, quickly losing whatever thread they ever had in a blizzard of references to politics, baseball, pseudoscience and a steady stream of bad jokes. He is so fond of digressions, inserts, afterthoughts and tangents that it is often difficult to say what any given essay is about.

I was hoping that each essay would take a specific news story and show how journalists had misunderstood the relevant data and maths to get it wrong, and would then show the correct way to analyse and interpret it. I was hoping that the 50 or so examples would have been carefully chosen to build up for the reader an armoury of techniques of arithmetic, probability, calculus, logarithms and whatever else is necessary to immediately spot, deconstruct and correct articles with bad maths in them.

Nope. Not at all.

Lani ‘Quota Queen’ Guinier

Take the very first piece, Lani ‘Quota Queen’ Guinier. For a start he doesn’t tell us who Lani ‘Quota Queen’ Guinier is. I deduce from his introduction that she was President Clinton’s nomination for the post of assistant attorney general for civil rights. We can guess, then, that the nickname ‘quota queen’ implies she was a proponent of quotas, though whether for black people, women or what is not explained.

Why not?

Paulos introduces us to the Banzhaf power index, devised in 1965 by lawyer John F. Banzhaf.

The Banzhaf power index of a group, party or person is defined to be the number of ways in which that group, party or person can change a losing coalition into a winning coalition or vice versa. (p.10)

He gives examples of companies where three or four shareholders hold different percentages of voting rights, and shows how some coalitions of shareholders always have decisive voting rights while others never do (a shareholder who is never decisive is called a dummy), and how even quite small shareholders can hold disproportionate power. For example, in a situation where three shareholders hold 45%, 45% and 10% of the shares, the 10% party can often have the decisive say. With 45%, 45%, 8% and 2%, the 2% shareholder is the dummy.
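The Banzhaf index is easy to compute by brute force. The following is a sketch of my own (the function name is mine) using the share percentages from the examples above, with a simple-majority quota of 51 out of 100:

```python
from itertools import combinations

def banzhaf_counts(weights, quota):
    """For each voter, count the winning coalitions that would lose
    without them. A voter whose count is zero is a 'dummy'."""
    counts = [0] * len(weights)
    voters = range(len(weights))
    for size in range(1, len(weights) + 1):
        for coalition in combinations(voters, size):
            total = sum(weights[i] for i in coalition)
            if total >= quota:
                for i in coalition:
                    if total - weights[i] < quota:  # i is decisive here
                        counts[i] += 1
    return counts

# Three shareholders holding 45%, 45% and 10%: all three are equally powerful.
print(banzhaf_counts([45, 45, 10], 51))  # [2, 2, 2]

# With 45%, 45%, 8% and 2%, the 2% holder is a dummy (count 0), while the
# 8% holder is, perhaps surprisingly, as powerful as either 45% holder.
print(banzhaf_counts([45, 45, 8, 2], 51))  # [4, 4, 4, 0]
```

Note how raw share size and voting power come apart completely: the 10% holder is the equal of the 45% holders, and the 2% holder counts for nothing.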

He then moves on to consider voting systems in some American states, including: cumulative voting, systems where votes don’t count as 1 but are proportionate to population, Borda counts (where voters rank the candidates and award progressively more points to those higher up the rankings), approval voting (where voters have as many votes as they want and can vote for as many candidates as they approve of), before going on to conclude that all voting systems have their drawbacks.
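As an illustration of one such drawback (a toy example of my own, not one from the book), here is a minimal Borda count in which the candidate with the most first-place votes still loses:

```python
from collections import Counter

def borda(ballots):
    """Borda count: with n candidates, a voter's first choice scores
    n - 1 points, the second n - 2, and so on down to 0 for last."""
    scores = Counter()
    for ranking in ballots:
        n = len(ranking)
        for points, candidate in zip(range(n - 1, -1, -1), ranking):
            scores[candidate] += points
    return scores

# Hypothetical ballots: A has the most first-place votes (two of four),
# but B, everyone's solid second choice, wins on points.
ballots = [
    ["A", "B", "C"],
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "B", "A"],
]
print(borda(ballots))  # B scores 5, A scores 4, C scores 3
```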

The essay ends with a typical afterthought: a one-paragraph coda suggesting how the Supreme Court could end up being run by a cabal of just three judges. There are nine judges on the U.S. Supreme Court. Imagine (a key word for Paulos) that a group of five judges agree to always discuss issues among themselves first, before the vote of the entire nine, and always to vote according to whatever their majority (three) decides. Then imagine that a sub-group of just three of those five secretly agree to always vote together. The three command the majority of the five, and the five command the majority of the nine, so the three would dictate the outcome of every Supreme Court decision.

So:

1. I had no idea who Lani ‘Quota Queen’ Guinier was or, more precisely, I had to do a bit of detective work to figure it out, and still wasn’t entirely sure.

2. This is a very sketchy introduction to the issue of democratic voting systems. This is a vast subject, which Paulos skates over quickly and thinly.

Thus, in these four and a bit pages you have the characteristic Paulos experience of feeling you are wandering all over the place, not quite at random, but certainly not in a carefully planned sequential way designed to explore a topic thoroughly and reach a conclusion. You are introduced to a number of interesting ideas, with some maths formulae, but not in enough detail or at sufficient length to really understand them. And because he’s not addressing any particular newspaper report or article, there are no particular misconceptions to clear up: the essay is a brief musing, a corralling of thoughts on an interesting topic.

This scattergun approach characterises the whole book.

Psychological availability and anchoring effects

The second essay is titled Psychological availability and anchoring effects. He explains what the availability error, the anchor effect and the halo effect are. If this is the first time you’ve come across these notions, they’re powerful new ideas. But I recently reread Irrationality by Stuart Sutherland which came out three years before Paulos’s book and spends over three hundred pages investigating these and all the other cognitive biases which afflict mankind in vastly more depth than Paulos, with many more examples. Next to it, Paulos’s three-minute essay seemed sketchy and superficial.

General points

Rather than take all 50 essays to pieces, here are notes on what I actually did learn. Note that almost none of it is about maths: it is general-purpose caution about how the news media work and how to counter their errors of logic. In fact, all of it could have come from a media studies course without any maths at all:

  • almost all ‘news’ reinforces conventional wisdom
  • because they’re so brief, almost all headlines must rely on readers’ existing assumptions and prejudices
  • almost all news stories relate something new back to similar examples from the past, even when the comparison is inappropriate, again reinforcing conventional wisdom and failing to recognise the genuinely new
  • all economic forecasts are rubbish: this is because economics (like the weather and many other aspects of everyday life) is a non-linear system. Chaos theory shows that non-linear systems are highly sensitive to even minuscule differences in starting conditions, which has been translated into pop culture as the Butterfly Effect
  • the same goes for ‘futurologists’: the further ahead they look, the less reliable their predictions
  • the news is deeply biased by always assuming human agency is at work in any outcome: if any disaster happens anywhere the newspapers always go searching for a culprit; in the present Brexit crisis lots of news outlets are agreeing to blame Theresa May. But often things happen at random or as an accumulation of unpredictable factors. Humans are not good at acknowledging the role of chance and randomness.

There is a tendency to look primarily for culpability and conflicts of human will rather than at the dynamics of a natural process. (p.160)

  • Hence so many newspapers endlessly playing the blame game. The Grenfell Tower disaster was, first and foremost, an accident in the literal sense of ‘an unfortunate incident that happens unexpectedly and unintentionally, typically resulting in damage or injury’ – but you won’t find anybody who doesn’t fall in with the prevailing view that someone must be to blame. There is always someone to blame. We live in a Blame Society.
  • personalising beats stats, data or probability: nothing beats ‘the power of dramatic anecdote’ among the innumerate: ‘we all tend to be unduly swayed by the dramatic, the graphic, the visceral’ (p.82)
  • if you combine human beings’ tendency to personalise everything, and to look for someone to blame, you come up with Donald Trump, who dominates every day’s news
  • so much is happening all the time – in a world with more people and incidents than ever before, in which we are bombarded with more information via more media than ever before – that it would be extraordinary if all manner of extraordinary coincidences, correspondences and correlations didn’t happen all the time
  • random events can sometimes present a surprisingly ordered appearance
  • because people imbue meaning into absolutely everything, then the huge number of coincidences and correlations are wrongfully interpreted as meaningful
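The classic illustration of how cheap coincidences are is the birthday problem (a standard example, not one taken from this book): in a room of only 23 people, the chance that some two share a birthday already exceeds 50 percent.

```python
def p_shared_birthday(n, days=365):
    """Probability that at least two of n people share a birthday,
    assuming birthdays are independent and uniform over the year."""
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (days - k) / days
    return 1 - p_all_distinct

for n in (10, 23, 50):
    print(f"{n} people: P(shared birthday) = {p_shared_birthday(n):.3f}")
```

Most people guess you would need far more than 23 people, which is exactly the kind of misjudgement of coincidence the bullet points above describe.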

Tips and advice

I was dismayed at the poor quality of many of the little warnings which each chapter ends with. Although Paulos warns against truisms (on page 54) his book is full of them.

Local is not what it used to be, and we shouldn’t be surprised at how closely we’re linked. (p.55)

In the public realm, often the best we can do is to stand by and see how events unfold. (p.125)

Chapter three warns us that predictions about complex systems (the weather, the economy, big wars) are likely to be more reliable the simpler the system they’re describing and the shorter the period they cover. Later he says we should be sceptical about all long-term predictions by politicians, economists and generals.

It didn’t need a mathematician to tell us that.
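Still, the underlying chaos-theory point is easy to demonstrate. Here is a sketch of my own using the logistic map, a textbook chaotic system (not an example from the book): two starting values differing only in the ninth decimal place soon produce completely different trajectories, which is why long-range forecasts of non-linear systems fail.

```python
def logistic_orbit(x0, r=4.0, steps=40):
    """Return the trajectory of x -> r*x*(1-x) starting from x0.
    At r = 4.0 the map is fully chaotic on [0, 1]."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(r * orbit[-1] * (1 - orbit[-1]))
    return orbit

a = logistic_orbit(0.123456789)
b = logistic_orbit(0.123456790)  # differs only in the ninth decimal place

# The two trajectories start indistinguishable but soon bear no resemblance.
for step in (0, 10, 20, 30, 40):
    print(f"step {step:>2}: {a[step]:.6f} vs {b[step]:.6f}")
```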

A lot of it just sounds like a grumpy old man complaining about society going to the dogs:

Our increasingly integrated and regimented society undermines our sense of self… Meaningless juxtapositions and coincidences replace conventional narratives and contribute to our dissociation… (pp.110-111)

News reports in general, and celebrity coverage in particular, are becoming ever-more self-referential. (p.113)

We need look no further than the perennial appeal of pseudoscientific garbage, now being presented in increasingly mainstream forums… (p.145)

The fashion pages have always puzzled me. In my smugly ignorant view, they appear to be so full of fluff and nonsense as to make the astrology columns insightful by comparison. (p.173)

Another aspect of articles in the society pages or in the stories about political and entertainment figures is the suggestion that ‘everybody’ knows everybody else. (p.189)

Sometimes his liberal earnestness topples into self-help book touchy-feeliness.

Achieving personal integration and a sense of self is for the benefit of ourselves and those we’re close to. (p.112)

But just occasionally he does say something unexpected:

The attention span created by television isn’t short; it’s long, but very, very shallow. (p.27)

That struck me as an interesting insight but, as with all his interesting comments, no maths was involved. You or I could have come up with it from general observation.

Complexity horizon

The notion that human laws, conventions, events, politics, and general information overlap and interact at ever-increasing speeds, eventually producing situations so complex as to appear unfathomable. Individuals, groups and societies have limits of complexity beyond which they cannot cope, but have to stand back and watch. Reading this made me think of Brexit.

He doesn’t mention it, but a logical spin-off would be that every individual has a complexity quotient like an intelligence quotient or IQ. Everyone could take a test in which they are faced with situations of slowly increasing complexity – or presented with increasingly complex sets of information – to find out where their understanding breaks off – which would become their CQ.

Social history

The book was published in 1995 and refers back to stories current in the news in 1993 and 1994. The run of domestic political subjects he covers in the book’s second quarter powerfully support my repeated conviction that it is surprising how little some issues have changed, how little movement there has been on them, and how they have just become a settled steady part of the social landscape of our era.

Thus Paulos has essays on:

  • gender bias in hiring
  • homophobia
  • accusations of racism arising from the lack of ethnic minorities in top jobs (the problem of race crops up numerous times: pp.59-62, p.118)
  • the decline in educational standards
  • the appallingly high incidence of gun deaths, especially in black and minority communities
  • the fight over abortion

I feel increasingly disconnected from contemporary politics, not because it is addressing new issues I don’t understand, but for the opposite reason: it seems to be banging on about the same issues which I found old and tiresome twenty-five years ago.

The one topic which stood out as having changed is AIDS. In Innumeracy and in this book he mentions the prevalence or infection rates of AIDS and is obviously responding to numerous news stories which, he takes it for granted, report it in scary and alarmist terms. Reading these repeated references to AIDS made me realise how completely and utterly it has fallen off the news radar in the past decade or so.

In the section about political correctness he makes several good anti-PC points:

  • democracy is about individuals, the notion that everyone votes according to their conscience and best judgement; as soon as you start making it about groups (Muslims, blacks, women, gays) you start undermining democracy
  • racism and sexism and homophobia are common enough already without making them the standard go-to explanations for social phenomena which often have more complex causes; continually attributing all aspects of society to just a handful of inflammatory issues, keeps the issues inflammatory
  • members of groups often vie with each other to assert their loyalty and proclaim their commitment to the party line, which suggests a powerful idea: the more opinions are expressed, the more extreme those opinions will tend to become. This idea is very relevant to our times, when the ubiquity of social media has a) brought about a wonderful spirit of harmony and consensus, or b) divided society into ever more polarised and angry groupings

Something bad is coming

I learned to fear several phrases which indicate that a long, possibly incomprehensible and frivolously hypothetical example is about to appear:

‘Imagine…’

Imagine flipping a penny one thousand times in succession and obtaining some sequence of heads and tails… (p.75)

Imagine a supercomputer, the Delphic-Cray 1A, into which has been programmed the most complete and up-to-date scientific knowledge, the initial condition of all particles, and sophisticated mathematical techniques and formulas. Assume further that… Let’s assume for argument’s sake that… (p.115)

Imagine if a computer were able to generate a random sequence S more complex than itself. (p.124)

Imagine the toast moistened, folded, and compressed into a cubical piece of white dough… (p.174)

Imagine a factory that produces, say, diet food. Let’s suppose that it is run by a sadistic nutritionist… (p.179)

‘Assume that…’

Let’s assume that each of these sequences is a billion bits long… (p.121)

Assume the earth’s oceans contain pristinely pure water… (p.141)

Assume that there are three competing healthcare proposals before the senate… (p.155)

Assume that the probability of your winning the coin flip, thereby obtaining one point, is 25 percent. (p.177)

Assume that these packages come off the assembly line in random order and are packed in boxes of thirty-six. (p.179)

Jokes and Yanks

All the examples are taken from American politics (President Clinton), sports (baseball) and wars (Vietnam, First Gulf War), and from precisely 25 years ago (on page 77, he says he is writing in March 1994), both of which emphasise the sense of disconnect and irrelevance for a British reader in 2019.

As my kids know, I love corny, bad old jokes. But not as bad as the ones the book is littered with:

And then there was the man who answered a matchmaking company’s computerised personals ad in the paper. He expressed his desire for a partner who enjoys company, is comfortable in formal wear, likes winter sports, and is very short. The company matched him with a penguin. (pp.43-44)

The moronic inferno and the liberal fallacy

The net effect of reading this book carefully is something that the average person on the street knew long ago: don’t believe anything you read in the papers.

And especially don’t believe any story in a newspaper which involves numbers, statistics, percentages, data or probabilities. It will always be wrong.

More broadly his book simply fails to take account of the fact that most people are stupid and can’t think straight, even very, very educated people. All the bankers whose collective efforts brought about the 2008 crash. All the diplomats, strategists and military authorities who supported the Iraq War. All the well-meaning liberals who supported the Arab Spring in Egypt and Libya and Syria. Everyone who voted Trump. Everyone who voted Brexit.

Most books of this genre presuppose readers who are white, university-educated, liberal middle class and interested in news and current affairs, the arts etc, and – in my opinion – grotesquely over-estimate both their value and their relevance to the rest of the population, because this section of the population – the liberal, university-educated elite – is demonstrably in a minority.

Over half of Americans believe in ghosts, and a similar number believes in alien abductions. A third of Americans believe the earth is flat, and that the theory of evolution is a lie. About a fifth of British adults are functionally illiterate and innumerate. This is what Saul Bellow referred to as ‘the moronic inferno’.

On a recent Radio 4 documentary about Brexit, one contributor who worked in David Cameron’s Number Ten commented that he and colleagues went out to do focus groups around the country to ask people whether we should leave the EU and that most people didn’t know what they were talking about. Many people they spoke to had never heard of the European Union.

On page 175 he says the purpose of reading a newspaper is to stretch the mind, to help us envision distant events, different people and unusual situations, and broaden our mental landscape.

Is that really why he thinks people read newspapers? As opposed to checking the sports results, catching up with celebrity gossip, checking what’s happening in the soaps, reading interviews with movie and pop stars, looking at fashion spreads, reading about health fads and, if you’re one of the minority who bother with political news, having all your prejudices about how wicked and stupid the government, the poor, the rich or foreigners etc are, and despising everyone who disagrees with you (Guardian readers hating Daily Mail readers; Daily Mail readers hating Guardian readers; Times readers feeling smugly superior to both).

This is a fairly entertaining, if very dated, book – although all the genuinely useful bits are generalisations about human nature which could have come from any media studies course.

But if it was intended as any kind of attempt to tackle the illogical thinking and profound innumeracy of Western societies, it is pissing in the wind. The problem is vastly bigger than anything this chatty, scattergun and occasionally impenetrable book can hope to tackle. On page 165 he says that a proper understanding of mathematics is vital to the creation of ‘an informed and effective citizenry’.

‘An informed and effective citizenry’?


Related links

Reviews of other science books

Genetics

  • The Double Helix by James Watson (1968)

Psychology

  • Irrationality: The Enemy Within by Stuart Sutherland (1992)