John Christopher on the changing face of science fiction (2003)

Christopher’s preface

When his young adult novel The White Mountains was reissued by Penguin in 2003, John Christopher was asked to write a new introduction to it. The resulting preface is only eight pages long and mostly explains a bit about the book’s conception and execution. But it also includes quite a passage describing how science fiction developed during his lifetime, which I think is worth publicising and pondering.

Christopher tells us that he was a well-established author of a dozen or more novels for adults when he received a letter from his agent telling him a publisher was asking whether he would consider writing a novel for children.

But what sort of book was it going to be? The publisher obviously wanted science fiction, but I was getting tired of destroying the world – by famine or freezing or earthquakes – and I was no longer interested in exploring the universe outside our planet. There was a reason for that.

When I was the age of the boys and girls for whom it was now proposed I write, I’d been very excited about the possibilities of space travel, but those had been different days. In the early thirties we knew just about enough about the solar system for its possibilities to be a magnet to the imagination. The moon might be cold and dead, but the planets offered scope for dreaming. Mars, for instance, was colder than our earth and had a thinner atmosphere, but possibly not too cold or airless to support life.

And Mars had those canals. An Italian astronomer called Schiaparelli, looking through his telescope in the nineteenth century, said he had seen canali on Mars’s rust-red surface. In Italian that just means ‘channels’, but it got translated as ‘canals’, which was much more intriguing. Maybe in that thin but breathable atmosphere there were long waterways, built by an ancient race of Martians, dotted with Martian cities that were lit by day by a smaller sun and at night by the magic gleam of two low-lying moons. An ancient race, because one might suppose that on that chillier planet the process of life’s evolution had been in advance of ours. Apart from being older, the Martians might well be wiser and able to pass on to us the fruit of their knowledge. Or, if they were so ancient as to have become extinct, the ruins of their cities might still be there to be explored.

Then there was Venus – closer to the sun and much hotter than the earth – with its permanent blanket of clouds. What might lie beneath the clouds? Perhaps a planet in an earlier period of evolution, as Mars was in a later one. Something like our own Carboniferous era, perhaps. Did tropical swamps teeming with dinosaurs and hovering pterodactyls await the arrival of our first spaceship?

Because that was something else we felt confident about: early experiments with rockets had already made the eventual conquest of space more than plausible. It could happen in our lifetime, and with it bring unthinkable wonders. It was a bit like being in Elizabethan England, reading stories about what might be found in the new world which was opening up on the far side of the barely explored western ocean.

But in three short decades everything changed. By the 1960s we knew more about the universe and the solar system – but what we’d learned was much less interesting than what we’d imagined. We knew that Mars was not just cold but an altogether hostile environment, Venus a choking oven of poisonous gases. The chance of any kind of life existing on either planet – or anywhere within reach of our probing rockets – was incredibly remote.

A couple of years after I wrote The White Mountains, space itself was finally conquered. The landing on the moon was televised around the world, timed to coincide with prime-time US television viewing. That meant the early hours of the morning in the Channel Islands, where I then lived. The boy I had been at fourteen would never have believed that I couldn’t be bothered to stay up to watch.

I had seen the future, and found it disappointing: so what remained? Well, there was the past. The colour which had bleached out of our interplanetary speculations was still bright in human history and there was life there, and romance and action… The publisher wanted the future: I was more interested in the past…

The Tripod trilogy reconciles future and past

Christopher then goes on to explain how he conceived a way of combining the two – the publisher’s request for science fiction with his own disillusion with science fiction tropes and growing fondness for history. He imagined a future earth which has been conquered by futuristic machines, the tripods (very similar to the Martians of H.G. Wells’s The War of the Worlds). But the invaders have realised that the best way of controlling human society is to take it back to the Middle Ages, creating small rural communities of serfs who obey the local lord of the manor, who in turn owes fealty to the king, who is himself guided by the tripods.

And hence the odd atmosphere of Christopher’s Tripod trilogy, which combines futuristic alien masters with a society which is thoroughly feudal and medieval in feel.

Disillusionment with space travel

So much for the origins of this particular novel; but the point of quoting his words in full is to convey Christopher’s eye-witness testimony to how massively the attitudes of young, science-fiction-minded writers changed between, say, 1930 and 1970.

The just-enough knowledge of the solar system which he describes in the 1930s is the imaginative backdrop to the Flash Gordon, space rocket and ray gun, bubble gum sci fi stories of the 1940s and 50s and on into the 60s. It explains the early space fiction of John Wyndham, two of whose novels are set on a Mars where humans can breathe the ‘air’, settle and meet the native ‘Martians’ – as they do in Ray Bradbury’s Martian Chronicles, the first of which was written as long ago as 1946, and as they do in thousands and thousands of other travelling-to-Mars and colonising-Venus stories.

I wonder if we could delve deeper and locate just when that sense of disillusionment kicked in. Immediately after the Second World War science fiction received a boost from at least two specific inventions. One was the atom bomb, with its ramifications for a new ‘atomic power’ which imaginative writers speculated could be turned into engines to drive spaceships across the solar system. The second was the practical application of rocket technology by the Nazis, who developed the V1 flying bomb and the V2 rocket, the latter the prototype for the countless cigar-shaped rockets to the moon, to Mars or to Venus which infest the science fiction magazines of the period.

And behind these specifically sci fi-friendly inventions there lay the enormous psychological boost of America’s post-war economic boom, when cars and bras got bigger and bigger and the consumer revolution delivered fridges, washing machines and so on, fuelling the widespread expectation that pretty soon gadgets would be developed to solve every household or lifestyle problem – including ones for teleporting round the planet or jetting off to the stars.

Is it possible, I wonder, to date precisely when the sense of disillusion which Christopher so eloquently describes began to kick in? Or did it happen to different people at different times? I grew up in the late 1960s and early 1970s and remember watching Tomorrow’s World with James Burke, who also covered the Apollo moon landings, and there was still plenty of optimism about building a space station and using it as a jumping-off point for Mars and all the rest of it.

J.G. Ballard was a relatively lone voice when he declared in about 1973 that the Space Age was over. That seemed a mad thing to say, but what he was specifically referring to was the fact that the later moon landings were not covered live by American TV because ratings fell off. By the time of the final mission, Apollo 17 in 1972, the programme itself was being wound down: the remaining Apollo flights had been cancelled, and with them the television spectacle which had presented the landings to a worldwide audience.

People were bored. Although we then went on to decades of the space shuttle and the creation of the International Space Station (the 1980s and 90s), Ballard was, I think, right to realise that these developments no longer captured widespread popular attention. They relapsed into being the special interest of a diminishing band of fans, with occasional flare-ups of wider interest whenever a rocket or shuttle blew up (as Challenger did on 28 January 1986) or a little buggy landed on Mars (as with the current Mars rover mission).

Anthropomorphism and Western chauvinism

But more than just shedding light on the trajectory from optimism to indifference about space travel in the mind of Christopher and by extension his generation (he was born in 1922), this passage also tells us something else about the sociological shape of the human imagination.

What I mean is the incredibly anthropomorphic nature of the speculations Christopher found so exciting. He expected there to be cities, or ruins of cities, or ‘wise old civilisations’ which could teach us newbies the secrets of the universe. Or maybe Venus would be at the other end of the evolutionary scale and just like earth in the age of the dinosaurs.

Either way you can see how these are obviously entirely human, anthropomorphic imaginings.

Digging a bit deeper, the notion that there might be ‘ruins’ on Mars is not only anthropomorphic but very Anglocentric. The 1920s and 30s were a great era for finding the ruins of lost civilisations, crystallised by the publicity surrounding the discovery of Tutankhamen’s tomb in 1922. But the point is that these finds – along with discoveries made along the Silk Road in Asia, aboriginal holy sites in Australia, Aztec sites in Central America and Inca sites in South America, or the imperial cities of Zimbabwe or Chad – were all made by Europeans and Americans, and so became part of our culture: the relics were brought back to our countries and became part of our colonial ownership of the rest of the world.

The ruins might be in Central America or Asia but the discoveries were made by white men, written up in white men’s journals for white men’s organisations, and popularised through the newspapers, tabloids and magazines of the West, percolating down to schoolboys like Christopher and his contemporaries as controlled, ordered, structured narratives of heroic Western exploration, discovery and understanding.

And it’s this ordered, directed, pro-Western structuring of narratives of discovery which underpins thousands and thousands of science fiction planetary stories from the 30s, 40s and 50s. They rest on the basic assumption that we earthlings, generally American earthlings, have a God-given right to colonise, inhabit, discover, communicate with, define and categorise and generally own the rest of the solar system, if not the galaxy.

Which makes all the narratives which share this basic underpinning or ideological framework – no matter how disturbing their surface details and gaudy monsters might be – at their core, reassuring and comforting because they reinforce the notions of order and civilisation and morality and hierarchy and category which underpinned Western discourse (i.e. the aggregated total of the news media, scientific research, history and the humanities and all types of fiction) during that era.

Christopher’s youthful notions about the solar system and aliens were human-friendly and Western-friendly.

Moving from adult to children’s fiction

In this respect Christopher’s transition from writing for adults to writing for children at just the time he did makes perfect sense, because the adult world, at the end of the 1960s, was ceasing to be the homogeneous world of the 30s, 40s and 50s and morphing into something else, something harsher and more fragmented.

Of course the Great Depression of the 1930s and then the vast calamity of the Second World War were physically and economically much more disastrous than anything which happened in the 60s and 70s. But the late 1960s and 70s saw the breakdown of the ideological, moral and cultural consensus which had dominated the West since 1945.

John Wyndham’s science fiction novels are ‘cosy’ because the protagonists all share the same values and worldview, even when they’re taking potshots at each other. To take a tiny example, Coker, the ostensible ‘baddie’ who stages the attack on Senate House in The Day of the Triffids, later candidly admits that his was the wrong solution to the plight of a world gone blind, and ends up becoming the leader of a new community. Deep down everyone is on the same side, believes the same things, shares the same values.

J.G. Ballard’s fiction represents, from the start, the collapse of this consensus. In Ballard’s early works the characters go mad and have psychotic breakdowns. To be precise, his characters’ response to some environmental catastrophe is to withdraw into private worlds and fantasies and to cease altogether to share values with anyone else. The moral consensus apparent in all Wyndham’s novels vanishes like morning dew, leaving a ruined landscape of wandering psychotics – not psychotic killers, just people living entirely inside their own heads, by their own made-up values.

In the mid- to late 1960s Ballard’s novels featured a lot of casual sex, violence and psychological breakdown, which outraged the philistines and traditionalists. What is not so often commented on is that, as the 1970s progressed – the decade Tom Wolfe labelled the Me Decade, ‘characterised by narcissism, self-indulgence, and a lack of social concern’ – Ballard’s fictions came to seem prophetic of the widespread collapse of communitarianism and the rise of atomised individualism widely observed in that decade.

By the time Thatcher was elected in 1979 and Reagan in 1980, although Ballard had carried on writing pretty much the same kind of thing, society had so completely transformed its values that he came to seem like the prophet of smug, gated, amoral, rich sybarites, the subjects of his final (and, to me, deeply unsatisfying) novels: Running Wild (1988), Cocaine Nights (1996), Super-Cannes (2000), Millennium People (2003) and Kingdom Come (2006).

These all describe ‘transgressive’ behaviour among upper-middle-class professional types. They’re often described as satires, but they’re not: they’re more like shopping lists or role models for the era of the Sunday Times Rich List and the never-ending series of lifestyle magazines which arose during the 1980s.

Thus to read in chronological order the novels of John Wyndham in the 30s, 40s, 50s, of John Christopher in the 50s and 60s, the optimistic techno-novels of Arthur C. Clarke from the 1950s through the 1970s, and then onto the stories and novels of J.G. Ballard is to watch the decline of Western optimism and consensus, to observe the death and burial of any sense of shared values and morals.

Now we are living in the aftermath of that collapse, with ever-increasing fragmentation of Western societies into angry tribes all convinced that they are the hard-done-by ones, and demanding restitution, justice and compensation from everyone else – the splintering of shared progressive ideas on the left into a welter of special interest and identity groups which itself mirrors the anger of right-wing communities who perceive their own white ethnic and traditional (cis-) gender identities under attack.

Sometimes reading the media, especially social media, feels like watching wild ferrets snapping at each other’s throats, against the darkening backdrop of the never-ending pandemic and the relentless environmental catastrophe of global warming.

We have come a long, long way from the innocently triumphalist vision of space-suited chaps rocketing off to colonise Venus and Mars. Now, far from colonising any other planets, it looks like we don’t even know how to hold democratic elections any more, and can’t agree what they’re for (this piece was written soon after the Proud Boys invaded the US Capitol building on 6 January 2021).

We certainly don’t know how to manage the planet we live on, let alone set ourselves up to ‘conquer’ and run others.



Irrationality: The Enemy Within by Stuart Sutherland (1992)

The only way to substantiate a belief is to try to disprove it.
(Irrationality: The Enemy Within, page 48)

Sutherland was 65 when he wrote this book, and nearing the end of a prestigious career in psychology research. His aim was to lay out, in 23 themed chapters, all the psychological and sociological research data from hundreds of experiments, which show just how vulnerable the human mind is to a plethora of unconscious biases, prejudices, errors, mistakes, misinterpretations and so on – the whole panoply of ways in which supposedly ‘rational’ human beings can end up making grotesque mistakes.

By the end of the book, Sutherland claims to have defined and demonstrated over 100 distinct cognitive errors humans are prone to (p.309).

I first read this book in 2000 and it made a big impact on me because I didn’t really know that this entire area of study existed, and had certainly never read such a compendium of sociology and psychology experiments before.

I found the naming of the various errors particularly powerful. They reminded me of the lists of weird and wonderful Christian heresies I was familiar with from years of reading early Christian history. And, after all, the two have a lot in common, both being lists of ‘errors’ which the human mind can make as it falls short of a) orthodox theology and b) optimally rational thinking – the great shibboleths of the Middle Ages and of the Modern World, respectively.

Rereading Irrationality now, 20 years later, after having brought up two children, and worked in big government departments, I am a lot less shocked and amazed. I have witnessed at first hand the utter irrationality of small and medium-sized children; and I have seen so many examples of corporate conformity, the avoidance of embarrassment, unwillingness to speak up, deferral to authority, and general mismanagement in the civil service that, upon rereading the book, hardly any of it came as a surprise.

But to have all these errors so carefully named and defined and worked through in a structured way, with so many experiments giving such vivid proof of how useless humans are at even basic logic, was still very enjoyable.

What is rationality?

You can’t define irrationality without first defining what you mean by rationality:

Rational thinking is most likely to lead to the conclusion that is correct, given the information available at the time (with the obvious rider that, as new information comes to light, you should be prepared to change your mind).

Rational action is that which is most likely to achieve your goals. But in order to achieve this, you have to have clearly defined goals. Not only that but, since most people have multiple goals, you must clearly prioritise your goals.

Few people think hard about their goals and even fewer think hard about the many possible consequences of their actions. (p.129)

Cognitive biases contrasted with logical fallacies

Before proceeding it’s important to point out that there is a wholly separate subject of logical fallacies. As part of his Philosophy A-level my son was given a useful handout listing about fifty logical fallacies, i.e. errors in reasoning. But logical fallacies are not the same as cognitive biases.

A logical fallacy stems from an error in a logical argument; it is specific and easy to identify and correct. Cognitive bias derives from deep-rooted, thought-processing errors which themselves stem from problems with memory, attention, self-awareness, mental strategy and other mental mistakes.

Cognitive biases are, in most cases, far harder to acknowledge and often very difficult to correct.

Fundamentals of irrationality

1. Innumeracy One of the largest causes of all irrational behaviour is that people by and large don’t understand statistics or maths. Thus most people are not intellectually equipped to understand the most reliable type of information available to human beings – data in the form of numbers. Instead they tend to make decisions based on a wide range of faulty and irrational psychological biases.

2. Physiology People are often influenced by physiological factors. Apart from obvious ones like tiredness or hunger, which are universally known to affect people’s cognitive abilities, there are also a) drives (direct and primal) like thirst and sex, and b) emotions (powerful but sometimes controllable) like love, jealousy and fear and – especially relevant – embarrassment, specifically the acute reluctance to acknowledge limits to your own knowledge or that you’ve made a mistake.

At a more disruptive level, people might be alcoholics, drug addicts, or prey to a range of other obsessive behaviours, not to mention suffering from a wide range of mental illnesses or conditions which undermine any attempt at rational decision-making, such as stress, anxiety or, at the other end of the spectrum, depression and loss of interest.

3. The functional limits of consciousness Numerous experiments have shown that human beings have a limited capacity to process information. Given that people a) rarely have a sufficient understanding of the relevant statistical data to begin with, and b) lack the RAM capacity to process all the data required to make the optimum decision, it is no surprise that most of us fall back on all manner of more limited, non-statistical biases and prejudices when it comes to making decisions.

4. The wish to feel good The world is threatening, dangerous and competitive. Humans want to feel safe, secure, calm and in control. This is fair enough, but it does mean that people have a way of blocking out any kind of information which threatens them. Most people irrationally believe that they are cleverer than they in fact are, or that they are qualified in areas of activity or knowledge where they aren’t; people stick to bad decisions for fear of being embarrassed or humiliated, and for the same reason reject new evidence which contradicts their position.

Named types of error and bias

Jumping to conclusions

Sutherland tricks the reader on page one, by asking a series of questions and then pointing out that, if you tried to answer about half of them, you are a fool since the questions didn’t contain enough information to arrive at any sort of solution. Jumping to conclusions before we have enough evidence is a basic and universal error. One way round this is to habitually use a pen and paper to set out the pros and cons of any decision, which also helps highlight areas where you realise you don’t have enough information.

The availability error

All the evidence is that the conscious mind can only hold a small number of data or impressions at any one time (near the end of the book, Sutherland claims the maximum is seven items, p.319). Many errors are due to people reaching for the most available explanation, using the first thing that comes to mind, and not taking the time to investigate further and make a proper, rational survey of the information.

Many experiments show that you can unconsciously bias people by planting ideas, words or images in their minds which then directly affect decisions they take hours later about supposedly unconnected issues.

Studies show that doctors who have seen a run of a certain condition among their patients become more likely to diagnose it in new patients who don’t have it, because the erroneous diagnosis is more ‘available’.

The news media is hard-wired to publicise shocking and startling stories which leads to the permanent misleading of the reading public. One tourist eaten by a shark in Australia eclipses the fact that you are far more likely to die in a car crash than be eaten by a shark.

Thus ‘availability’ is also affected by impact or prominence. Experimenters read out a list of men and women to two groups, without telling them that it contained exactly 25 men and 25 women, and asked them to guess the ratio of the sexes. If the list included some famous men, the group was influenced to think there were more men; if it included famous women, the group thought there were more women than men. The prominence effect.

The entire advertising industry is based on the availability error in the way it invents straplines, catchphrases and jingles designed to pop to the front of your mind when you consider any type of product, making those products – in other words – super available.

I liked the attribution to the availability error of the well-known fact that retailers price goods at just under the nearest pound. Most of us find £5.95 much more attractive than £6 because we only process the first digit, the 5. It is more available.

Numerous studies have shown that the availability error is hugely increased under stress. In stressful situations – in an accident, say – people fixate on the first solution that comes to mind and refuse to budge.

The primacy effect

First impressions. Interviewers make up their minds about a candidate for a job in the first minute of an interview and then spend the rest of the time collecting data to confirm that first impression.

The anchor effect

In picking a number people tend to choose one close to any number they’ve recently been presented with. Two groups were asked to estimate the population of Turkey: one group was first asked whether it was bigger than 5 million, the other whether it was less than 65 million, and then both were asked for a figure. The group who’d had 5 million planted in their minds hovered around 15 million; the group who’d had 65 million hovered around 35 million. Both were wrong: it is around 80 million.

The halo effect

People extrapolate the nature of the whole from just one quality e.g. in tests, people think attractive people must be above average in personality and intelligence although, of course, there is no reason why they should be. Hence this error’s alternative name, the ‘physical attractiveness stereotype’. The halo effect is fundamental to advertising, which seeks to associate images of beautiful men, women, smiling children, sunlit countryside etc with the product being marketed.

The halo effect and the primacy effect are both reasons why interviews are a poor way to assess candidates for jobs or places.

The devil effect

Opposite of the above: extrapolating from negative appearances to the whole. This is why it’s important to dress smartly for an interview or court appearance – it really does influence people. In an experiment, examiners were given identical answers, some in terrible handwriting, some in beautifully clear handwriting. The scripts with clear handwriting consistently scored higher marks, despite their identical factual content.

Illusory correlation

People find links between disparate phenomena which simply don’t exist, thus:

  • people exaggerate the qualities of people or things which stand out from their environments
  • people associate rare qualities with rare things

This explains a good deal of racial prejudice: a) immigrants stand out, and b) a handful of immigrants behave egregiously – so it is a classic example of illusory correlation to associate the two. What is missing is taking into account all the negative examples, i.e. the millions of immigrants who commit no egregious behaviour and whose inclusion would give you a more accurate statistical picture. Pay attention to negative cases.

Stereotypes

  1. People tend to notice anything which supports their existing opinions.
  2. We notice the actions of ‘minorities’ much more than the actions of the invisible majority.

Projection

People project onto neutral phenomena, patterns and meanings they are familiar with or which bolster their beliefs. This is compounded by –

Obstinacy

Sticking to personal opinions (often formed in haste, on first impressions, and held despite all evidence to the contrary). Aka the boomerang effect: when someone’s opinions are challenged, they just become more obstinate about them. Aka belief persistence. Aka pig-headedness. And this is exacerbated by –

Group think

People associate with others like themselves, which makes them feel safe by a) confirming their beliefs and b) letting them hide in a crowd. Experiments have shown how people in self-supporting groups are liable to become more extreme in their views. Also – and I’ve seen this myself – groups will take decisions that almost everyone in the group, as individuals, know to be wrong – but no-one is prepared to risk the embarrassment or humiliation of pointing it out. The Emperor’s New Clothes. Groups are more likely to make irrational decisions than individuals are.

Confirmation bias

The tendency to search for, interpret, favour, and recall information in a way that confirms one’s pre-existing beliefs or hypotheses. In an experiment people were read a series of statements about a named person, each giving a profession plus two adjectives, one that you’d expect, the other less predictable: ‘Carol, a librarian, is attractive and serious’. When asked to do a quiz at the end of the session, participants showed a marked tendency to remember the expected adjective and forget the unexpected one. Everyone remembered that the air stewardess was ‘attractive’ but remembered the librarian for being ‘serious’.

We remember what we expect to hear. (p.76)

Or: we remember what we remember in line with pre-existing habits of thought, values etc.

We marry people who share our opinions, we make friends with people who share our opinions, we agree with everyone in our circle on Facebook.

Self-serving biases

When things go well, people take the credit, when things go badly, people blame external circumstances.

Avoiding embarrassment

People obey, especially in a group situation, bad orders because they don’t want to stick out. People go along with bad decisions because they don’t want to stick out. People don’t want to admit they’ve made a mistake, in front of others, or even to themselves.

Avoiding humiliation

People are reluctant to admit mistakes in front of others. And rather than make a mistake in front of others, people would rather keep quiet and say nothing (in a meeting situation) or do nothing, if everyone else is doing nothing (in an action situation). Both of these avoidances feed into –

Obedience

The Milgram experiment showed that ordinary people will commit apparent atrocities at the behest of an authoritative man in a white coat. A large majority of subjects – around two-thirds in the baseline study – agreed to inflict life-threatening levels of electric shock on the victim, supposedly wired up in the next-door room and emitting blood-curdling (faked) screams of pain. And 72% of Senior House Officers wouldn’t question the decision of a consultant, even if they thought he was wrong.

Conformity

Everyone else is saying or doing it, so you say or do it so as not to stick out / risk ridicule.

Obedience is behaving in a way ordered by an authority figure. Conformity is behaving in a way dictated by your peers.

The wrong length lines experiment

You’re put in a room with half a dozen stooges and shown a piece of card with a line on it, then another card with three lines of different lengths, and asked which of the lines on card B is the same length as the line on card A. To your amazement, everyone else in the room chooses a line which is obviously wildly wrong. In experiments, up to 75% of people in this situation go along with the crowd at least once and choose the line which they can see perfectly well is wrong – because everyone else did.

Sunk costs fallacy

The belief that you have to continue wasting time and money on a project because you’ve invested x amount of time and money to date. Or ‘throwing good money after bad’.

Sutherland keeps cycling round the same nexus of issues, which is that people jump to conclusions – based on availability, stereotypes, the halo and anchor effects – and then refuse to change their minds, twisting existing evidence to suit them, ignoring contradictory evidence.

Misplaced consistency & distorting the evidence

Nobody likes to admit (especially to themselves) that they are wrong. Nobody likes to admit (especially to themselves) that they are useless at taking decisions.

Our inability to acknowledge our own errors even to ourselves is one of the most fundamental causes of irrationality. (p.100)

And so:

  • people consistently avoid exposing themselves to evidence that might disprove their beliefs
  • on being faced with evidence that disproves their beliefs, they ignore it
  • or they twist new evidence so as to conform to their existing beliefs
  • people selectively remember their own experiences, or misremember the evidence they were using at the time, in order to validate their current decisions and beliefs
  • people will go to great lengths to protect their self-esteem

Sutherland says the best cleanser / solution / strategy to fixed and obstinate ideas is:

  1. to make the time to gather as much evidence as possible and
  2. to try to disprove your own position.

The best solution will be the one which you have tried to demolish with all the evidence you have and which still remains standing.

People tend to seek confirmation of their current hypothesis, whereas they should be trying to disconfirm it. (p.138)

Fundamental attribution error

Ascribing other people’s behaviour to their character or disposition rather than to their situation. Subjects in an experiment watched two people holding an informal quiz: the first person made up questions (based on what he knew) and asked the second person who, naturally enough, hardly got any of them right. Observers consistently credited the quizzer with higher intelligence than the answerer, completely ignoring the in-built bias of the situation, and instead ascribing the difference to character.

We are quick to personalise and blame in a bid to turn others into monolithic entities which we can then define and control – this saves time and effort, and makes us feel safer and more secure – whereas the evidence is that all people are capable of a wide range of behaviours depending on the context and situation.

Once you’ve pigeon-holed someone, you will tend to notice aspects of their behaviour which confirm your view – confirmation bias and/or illusory correlation and a version of the halo/devil effect. One attribute colours your view of a more complex whole.

Actor-Observer Bias

Variation on the above: when we screw up we find all kinds of reasons in the situation to exonerate ourselves: we performed badly because we’re ill, jet-lagged, grandma died, reasons that are external to us. If someone else screws up, it is because they just are thick, lazy, useless. I.e. we think of ourselves as complex entities subject to multiple influences, and others as monolithic types.

False Consensus Effect

Over-confidence that other people think and feel like us, that our beliefs and values are the norm – in my view one of the profound cultural errors of our time.

It is a variation of the ever-present availability error: when we stop to think about any value or belief we tend to conjure up images of our family and friends, maybe workmates, the guys we went to college with, and so on – in other words, the people available to memory – simply ignoring the fact that these people are a drop in the ocean of the 65 million people in the UK. See Facebubble.

The False Consensus Effect reassures us that we are normal, our values are the values, we’re the normal ones: it’s everyone else who is wrong, deluded, racist, sexist, whatever we don’t approve of.

Elsewhere, I’ve discovered some commentators naming this the Liberal fallacy:

For liberals, the correctness of their opinions – on universal health care, on Sarah Palin, on gay marriage – is self-evident. Anyone who has tried to argue the merits of such issues with liberals will surely recognize this attitude. Liberals are pleased with themselves for thinking the way they do. In their view, the way they think is the way all right-thinking people should think. Thus, ‘the liberal fallacy’: Liberals imagine that everyone should share their opinions, and if others do not, there is something wrong with them. On matters of books and movies, they may give an inch, but if people have contrary opinions on political and social matters, it follows that the fault is with the others. (Commentary magazine)

Self-Serving Bias

People tend to give themselves credit for successes but lay the blame for failures on outside causes. If the project is a success, it was all due to my hard work and leadership. If it’s a failure, it’s due to circumstances beyond my control, other people not pulling their weight etc.

Preserving one’s self-esteem 

These three errors are all aspects of preserving our self-esteem. You can see why this has an important evolutionary and psychological purpose. In order to live, we must believe in ourselves, our purposes and capacities, believe our values are normal and correct, believe we make a difference, that our efforts bring results. No doubt it is a necessary belief and a collapse of confidence and self-belief can lead to depression and possibly despair. But that doesn’t make it true.

People should learn the difference between having self-belief to motivate themselves, and developing the techniques to gather the full range of evidence – including the evidence against your own opinions and beliefs – which will enable them to make correct decisions.

Representative error

People estimate the likelihood of an event by comparing it to a prototype or stereotype which already exists in their minds – what they take to be the most typical example of that kind of event or object. This often happens around notions of randomness: people have a notion of what randomness should look like, i.e. utterly scrambled. But in fact plenty of random events or sequences arrange themselves into patterns we find meaningful, so we dismiss them as not really random – we have judged them against our preconception of what random ought to look like.

Ask a selection of people which of these three sets of six coin tosses (where H stands for heads and T for tails) is random.

  1. TTTTTT
  2. TTTHHH
  3. THHTTH

Most people will choose 3 because it feels random. But of course all three are equally likely or unlikely.
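To spell out the arithmetic (my addition, not Sutherland’s): the tosses are independent and each comes up heads or tails with probability 1/2, so every specific sequence of six tosses is exactly as likely as every other:

    P(TTTTTT) = P(TTTHHH) = P(THHTTH) = (1/2)^6 = 1/64

What differs is not the probability of each sequence but how many sequences of each general ‘shape’ there are: jumbled-looking sequences vastly outnumber orderly ones, which is why pattern feels like evidence of non-randomness.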

Hindsight

In numerous experiments people have been asked to predict the outcome of an event, then after the event questioned about their predictions. Most people forget their inaccurate predictions and misremember that they were accurate.

Overconfidence

Most professionals have been shown to overvalue their expertise i.e. exaggerate their success rates.


Statistics

A problem with Irrationality, as with John Allen Paulos’s book about innumeracy, is that they mix up cognitive biases and statistics. Now, statistics is a completely separate and distinct area from errors of thought and cognitive biases. You can imagine someone who avoids all of the cognitive and psychological errors named above but still makes howlers when it comes to statistics, simply because they’re not very good at it.

This is because the twin areas of Probability and Statistics are absolutely fraught with difficulty. Either you have been taught the correct techniques, and understand them, and practice them regularly (and both books demonstrate that even experts make terrible mistakes in the handling of statistics and probability) or, like most of us, you have not and do not.

As Sutherland points out, most people’s knowledge of statistics is non-existent. Since we live in a society whose public discourse i.e. politics, is ever more dominated by statistics, there is a simple conclusion: most of us have little or no understanding of the principles and values which underpin modern society.

Errors in estimating probability or misunderstanding samples, opinion polls and so on, are probably a big part of irrationality, but I felt that they are so distinct from the psychological biases discussed above, that they almost require a separate volume, or a separate ‘part’ of this volume.

Briefly, common statistical mistakes are (a short simulation illustrating two of them follows the list):

  • too small a sample size
  • a biased sample
  • not understanding that any combination of probabilities is less likely than either on its own, which requires an understanding of base rates, i.e. a priori probability
  • ignoring the law of large numbers – the more often a probabilistic event takes place, the closer the overall result will approach the theoretical probability
  • ignoring regression to the mean
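A minimal sketch in Python (my illustration, not the book’s): estimates from small samples scatter wildly around the true value while large samples converge on it (the law of large numbers), and the probability of two things both happening can never exceed the probability of either alone:

    import random

    # A fair coin: the true probability of heads is 0.5.
    def estimates(sample_size, trials=5):
        # Estimate P(heads) several times from samples of the given size.
        return [sum(random.random() < 0.5 for _ in range(sample_size)) / sample_size
                for _ in range(trials)]

    print('n = 10:   ', estimates(10))     # scatters widely, e.g. 0.2 to 0.8
    print('n = 10000:', estimates(10000))  # clusters tightly around 0.5

    # Combining probabilities: for independent events P(A and B) = P(A) * P(B),
    # always smaller than either probability on its own.
    p_a, p_b = 0.6, 0.5
    print(p_a * p_b)  # 0.3, less than both 0.6 and 0.5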

Gambling

My suggestion that mistakes in handling statistics are not really the same as unconscious cognitive biases applies even more to the world of gambling. Gambling is a highly specialised and advanced form of probability applied to games. The subject has been pored over by very clever people for centuries. It’s not a question of a few general principles; this is a vast, book-length subject in its own right. A practical point that emerges from Sutherland’s examples is:

  • always work out the expected value of a bet, i.e. the amount to be won times the probability of winning it (a worked example follows)
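For instance (my illustration, not Sutherland’s): a single-number bet in European roulette pays 35 to 1, but with 37 pockets on the wheel you win only 1 time in 37, so the bet has a negative expected value:

    # Expected value of a £1 single-number bet in European roulette:
    # win £35 with probability 1/37, lose the £1 stake with probability 36/37.
    p_win = 1 / 37
    expected_value = p_win * 35 - (1 - p_win) * 1
    print(round(expected_value, 4))  # -0.027: an average loss of about 2.7p per £1 staked

Any bet with a negative expected value loses money in the long run, however exciting the occasional win.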

The two-by-two box

It’s taken me some time to understand this principle which is given in both Paulos and Sutherland.

When two elements with a yes/no result are combined, people tend to look at the most striking correlation and fixate on it. The only way to avoid the false conclusions that follow from that is to draw a 2 x 2 box and work through the figures.

Here is a table of 1,000 women who had a mammogram because their doctors thought they had symptoms of breast cancer.

                                     Women with cancer   Women with no cancer   Total
  Women with positive mammography           74                    110            184
  Women with negative mammography            6                    810            816
  Total                                     80                    920           1000

Bearing in mind that a conditional probability is the probability of one event given another – the chances of X, given Y – this table allows us to work out the following conditional probabilities:

1. The probability of getting a positive mammogram or test result, if you do actually have cancer, is 74 out of 80 = .92 (out of the 80 women with cancer, 74 were picked up by the test)

2. The probability of getting a negative mammogram or test result if you do not have cancer, is 810 out of 920 = .88

3. The probability of having cancer if you test positive, is 74 out of 184 = .40

4. The probability of having cancer if you test negative, is 6 out of 816 = .01

So 92% of women with cancer were picked up by the test. BUT Sutherland quotes a study which showed that a shocking 95% of doctors thought that this figure – 92% – was also the probability of a patient who tested positive having the disease. By far the majority of US doctors thought that, if you tested positive, you had a 92% chance of having cancer. They fixated on the 92% figure and transposed it from one outcome to the other, confusing the two. But this is wrong. The probability of a woman who tests positive actually having cancer is given in conclusion 3: 74 out of 184 = 40%. This is because 110 of the 184 women who tested positive did not have cancer.

So if a woman tested positive for breast cancer, the chances of her actually having it are 40%, not 92%. Quite a big difference (and quite an indictment of the test, by the way). And yet 95% of doctors thought that if a woman tested positive she had a 92% likelihood of having cancer.

Sutherland goes on to quote a long list of other situations where doctors and others have comprehensively misinterpreted the results of studies like this, with sometimes very negative consequences.

The moral of the story is: if you want to determine whether one event is associated with another, never attempt to keep the co-occurrence of events in your head. It’s just too complicated. Maintain a written tally of the four possible outcomes and refer to that.
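That tally is mechanical enough to put into a few lines of code. A minimal sketch (mine, not Sutherland’s), using the figures from the table above:

    # The four cells of the 2 x 2 box from the mammogram example.
    true_pos, false_pos = 74, 110  # tested positive: with cancer / without cancer
    false_neg, true_neg = 6, 810   # tested negative: with cancer / without cancer

    with_cancer = true_pos + false_neg       # 80 women actually have cancer
    tested_positive = true_pos + false_pos   # 184 women tested positive

    print(true_pos / with_cancer)      # 0.925  = P(positive test | cancer)
    print(true_pos / tested_positive)  # 0.402… = P(cancer | positive test)
    # The doctors' error was to read the first figure as if it were the second.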


Deep causes

Sutherland concludes the book by speculating that all the hundred or so types of irrationality he has documented can be attributed to five fundamental causes:

  1. Evolution We evolved to make snap decisions, we are brilliant at processing visual information and responding before we’re even aware of it. Conscious thought is slower, and the conscious application of statistics, probability, regression analysis and so on, is slowest of all. Most people never acquire it.
  2. Brain structure As soon as we start perceiving, learning and remembering the world around us our brain cells make connections. The more the experience is repeated, the stronger the connections become. Routines and ruts form, which are hard to budge.
  3. Heuristics Everyone develops mental short-cuts, techniques to help make quick decisions. Not many people bother with the laborious statistical techniques for assessing relative benefits which Sutherland describes.
  4. Failure to use elementary probability and elementary statistics Ignorance is another way of describing this, mass ignorance. Sutherland (being an academic) blames the education system. I, being a pessimist, attribute it to basic human nature. Lots of people just are lazy, lots of people just are stupid, lots of people just are incurious.
  5. Self-serving bias In countless ways people are self-centred, overvalue their judgement and intelligence, overvalue the beliefs of their in-group, refuse to accept it when they’re wrong, refuse to make a fool of themselves in front of others by confessing error or pointing out errors in others (especially the boss) and so on.

I would add two more:

Suggestibility

Humans are just tremendously suggestible. Say a bunch of positive words to test subjects, then ask them questions on an unrelated topic: they’ll answer positively. Take a different representative sample of subjects and run a bunch of negative words past them, then ask them the same unrelated questions, and their answers will be measurably more negative. Everyone is easily suggestible.

Ask subjects how they get a party started and they will talk and behave in an extrovert manner with the questioner. Ask them how they cope with feeling shy and ill at ease at parties, and they will tend to act shy and speak more quietly. Same people, but their thought patterns have been completely determined by the questions asked: the initial terms, or anchor, define the ensuing conversation.

In one experiment two sets of subjects were shown the same photo of a car crash. Half were asked to describe what they thought happened when one car hit another; the other half were asked to describe what they thought happened when one car smashed into the other. The ones given the word ‘smashed’ gave much more melodramatic accounts. Followed up a week later and asked to describe what they remembered of the photo, the subjects given the word ‘hit’ described it fairly accurately, whereas the subjects given the word ‘smashed’ invented all kinds of details – like a sea of broken glass around the vehicles – which simply weren’t there, and which their imaginations had supplied at the prompting of one word.

Many of the experiments Sutherland quotes demonstrate what you might call higher-level biases: but underlying many of them is this common-or-garden observation, that people are tremendously easily swayed, by both external and internal causes, away from the line of cold logic.

Anthropomorphism 

Another big underlying cause is anthropomorphism, namely the attribution of human characteristics to objects, events, chances, odds and so on. In other words, people really struggle to accept the high incidence of random accidents. Almost everyone attributes a purpose or intention to almost everything that happens. This means our perceptions of almost everything in life are skewed from the start.

During the war Londoners devised innumerable theories about the pattern of German bombing. After the war, when Luftwaffe records were analysed, they showed that the bombing had been more or less random.

The human desire to make sense of things – to see patterns where none exists or to concoct theories… can lead people badly astray. (p.267)

Suspending judgement is about the last thing people are capable of. People are extremely uneasy if things are left unexplained. Most people rush to judgement like water into a sinking ship.

Cures

  • keep an open mind
  • reach a conclusion only after reviewing all the possible evidence
  • it is a sign of strength to change one’s mind
  • seek out evidence which disproves your beliefs
  • do not ignore or distort evidence which disproves your beliefs
  • never make decisions in a hurry or under stress
  • where the evidence points to no obvious decision, don’t take one
  • learn basic statistics and probability
  • substitute mathematical methods (cost-benefit analysis, regression analysis, utility theory) for intuition and subjective judgement – see the toy example below
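As a toy illustration of that last point (my example, with invented numbers): even a crude expected-value comparison forces you to write down probabilities and outcomes instead of relying on gut feel.

    # Crude cost-benefit comparison of two options under uncertainty.
    # All probabilities and costs are invented, purely for illustration.
    options = {
        'repair the old car': [(0.7, -500), (0.3, -2000)],  # (probability, cost in £)
        'buy a newer car':    [(1.0, -1500)],
    }
    for name, outcomes in options.items():
        expected = sum(p * cost for p, cost in outcomes)
        print(name, expected)  # repair: -950.0, buy: -1500.0

On these made-up numbers the repair is the better gamble; the point is only that writing the sums down exposes the assumptions which intuition keeps hidden.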

Comments on the book

Out of date

Irrationality was first published in 1992 and this makes the book dated in several ways (maybe this is why the first paperback edition was published by upmarket mass publisher Penguin, whereas the most recent edition was published by the considerably more niche publisher, Pinter & Martin).

In the chapter about irrational business behaviour Sutherland quotes quite a few examples from the 1970s and the oil crisis of 1974. These and other examples – such as the long passage about how inefficient the civil service was in the early 1970s – feel incredibly dated now.

And the whole thing was conceived, researched and written before there was an internet or any of the digital technology we take for granted nowadays. I can’t help wondering whether the digital age has solved, or merely added to, the long list of biases, prejudices and faulty thinking which Sutherland catalogues, and what errors of reason have emerged specific to our fabulous digital technology.

On the other hand, out of date though the book in many ways is, it’s surprising to see how some hot button issues haven’t changed at all. In the passage about the Prisoners’ Dilemma, Sutherland takes as a real life example the problem the nations of the world were having in 1992 in agreeing to cut back carbon dioxide emissions. Sound familiar? He states that the single biggest factor undermining international co-operation against climate change was America’s refusal to sign global treaties to limit global warming. In 1992! Plus ça change.

Grumpy

The book also has passages where Sutherland gives his personal opinions about things, and some of these sound more like the grousing of a grumpy old man than anything based on evidence.

Thus Sutherland whole-heartedly disapproves of ‘American’ health fads, dismisses health foods as masochistic fashion and is particularly scathing about jogging.

He thinks ‘fashion’ in any sphere of life is ludicrously irrational. He is dismissive of doctors as a profession, whom he accuses of rejecting statistical evidence, refusing to share information with patients, and wildly over-estimating their own diagnostic abilities.

Sutherland thinks the publishers of learned scientific journals are more interested in making money out of scientists than in ‘forwarding the progress of science’ (p.185).

He thinks the higher average pay that university graduates tend to get is unrelated to their attendance at university and more to do with having well connected middle- and upper-middle-class parents, and thus considers the efforts of successive Education Secretaries to introduce student loans to be unscientific and innumerate (p.186).

Surprisingly, he criticises Which?, the consumer magazine, for using too small samples in its testing (p.215).

In an extended passage he summarises Leslie Chapman’s blistering (and very out of date) critique of the civil service, Your Disobedient Servant published in 1978 (pp.69-75).

Sutherland really has it in for psychoanalysis, which he accuses of all sorts of irrational thinking such as projecting, false association, refusal to investigate negative instances, failing to take into account the likelihood that the patient would have improved anyway, and so on. Half-way through the book he gives a thumbnail summary:

Self-deceit exists on a massive scale: Freud was right about that. Where he went wrong was in attributing it all to the libido, the underlying sex drive. (p.197)

In other words, the book is liberally sprinkled with Sutherland’s own grumpy personal opinions, which sometimes risk giving it a crankish feel.

Against stupidity the gods themselves contend in vain

Neither this nor John Allen Paulos’s books take into account the obvious fact that lots of people are, how shall we put it, of low educational achievement. They begin with poor genetic material, are raised in families where no-one cares about education, are let down by poor schools, and are excluded or otherwise demotivated by the whole educational experience, with the result that:

  • the average reading age in the UK is 9
  • about one in five Britons (over ten million) are functionally illiterate, and probably a similar proportion innumerate

His book, like all books of this type, is targeted at a relatively small proportion of the population, the well-educated professional classes. Most people aren’t like that. You want proof? Trump. Brexit. Boris Johnson landslide.

Trying to keep those cognitive errors at bay (otherwise known as The Witch by Pieter Bruegel the Elder)


