We need more acceptance of error, of being wrong. This might sound an odd proposition. Most of us strive to avoid mistakes, at work and at home. We bring up our children to answer exam questions correctly rather than incorrectly. And yet, despite our desire to be right, error is necessary. It is part of what makes us human. We resist this. After all, the pleasure we take in being right is one of the most fundamental we have. The urge to say, or at least think, “I told you so” exists in just about everyone.
And apart from being right about specific events – an outcome in foreign policy, say, or the winner of the first race at Randwick – we have an even more fundamental feeling that we are right about pretty well everything. This point is well made in an unusual book called Being Wrong by American journalist Kathryn Schulz.
It’s one of those books that states plainly things you have often felt but never put into words.
Evolutionary psychology suggests why being right is so important to us. During evolution, those who were right, about practical matters such as where to find game and when a big storm was coming, survived, while those who were wrong did not. We evolved as individuals who appreciated being right, in small matters as well as big ones. What we tend to overlook is that, despite this yearning for truth, the road to it is a maze through many errors, and to reach our destination it is necessary not to ignore those errors but to acknowledge and understand them.
We may need to learn to love our mistakes. “Far from being a sign of intellectual inferiority,” observes Schulz, “the capacity to err is crucial to human cognition. Far from being a moral flaw, it is inextricable from some of our most humane and honourable qualities: empathy, optimism, imagination, conviction, and courage” (all of which are frequently based on delusions). “And far from being a mark of indifference or intolerance, wrongness is a vital part of how we learn and change. Thanks to error, we can revise our understanding of ourselves and amend our ideas about the world.”
Error does not exist only in our own actions, of course, but is all around us. It pervades our view of the world. Acknowledging this enables us to see the world less clearly, but more accurately. This is tough to accept, so much so that on most days we prefer a wrong explanation to no explanation at all. But as Schulz notes, “even the most seemingly bullet-proof scientific theories of times past eventually proved wrong [so] we must assume that today’s theories will someday prove wrong as well”. I should stress that Schulz is not some deconstructionist arguing there is no such thing as truth. The truth is out there but we know it far less often than we like to think. That’s a big idea and not one most of us learned in science class. And the mutability of truth in science also applies to many other areas of knowledge. The implications of this vary with different fields.
In science we can at least hope for some sort of improvement through the ages. We don’t know everything yet but we know a lot more than we did a century ago. So, even though each new theory might one day be replaced, as Schulz reminds us, it is still closer to the truth than the one that came before. Rather than saying the latest theory is right, it might be more modest to say it is less wrong than the last one. But at least scientists tend to stand on their predecessors’ shoulders and see further. In the humanities the situation is more controversial. Some believe in the notion of progress, for example in economics or education. Matt Ridley, in his book The Rational Optimist, suggests the process of natural selection has occurred in culture as well as nature. In his formulation, “ideas have sex with each other” and the most useful offspring survive.
He thinks we humans have got a lot better at doing a lot of things – such as governing and feeding ourselves – over the centuries. Others would disagree, for example pointing to secularisation in the West or the big wars of the 20th century as indications of decline. But even Ridley would agree with the need for a certain amount of error. We need mistakes to thrive, in nature and culture. Genes need to mutate to give natural selection something to work on. And in culture, lots of ideas need to be dreamed up, most of them duds, for the few useful ones to be identified, nourished and distributed. Benjamin Franklin was wise on the subject of mistakes, writing in 1784, “Perhaps the history of the errors of mankind, all things considered, is more valuable and interesting than that of their discoveries. Truth is uniform and narrow; it constantly exists, and does not seem to require so much an active energy, as a passive aptitude of soul in order to encounter it.
But error is endlessly diversified; it has no reality, but is the pure and simple creation of the mind that invents it. In this field, the soul has enough room to expand herself, to display all her boundless faculties, and all her beautiful and interesting extravagancies and absurdities.” In a less whimsical vein, British economist William Stanley Jevons observed, “In all probability the errors of the great mind exceed in number those of the less vigorous one. Fertility of imagination and abundance of guesses at truth are among the first requisites of discovery; but the erroneous guesses must be many times as numerous as those that prove well-founded.” Approving this, Steven Johnson wrote in Where Good Ideas Come From, “error is not simply a phase you have to suffer through on the way to genius. Error often creates a path that leads you out of your comfortable assumptions … Being right keeps you in place. Being wrong forces you to explore.”
It’s often noted that Australians are less tolerant of failure – that personal form of error – than Americans, who believe passionately in the individual’s right to reinvention. Possibly this is one reason why, despite the occasional claim to the contrary, we are not a particularly innovative society. It is hard to imagine an Australian using that great motto of American business start-ups: “Fail faster.” Or as Tallulah Bankhead said, “If I had to live my life again, I’d make the same mistakes, only sooner.” The sciences provide an interesting example of what can happen when we ignore the frequency of error. In December, respected science writer Jonah Lehrer published an article in The New Yorker about what he calls “the decline effect”. Basically, a lot of stuff we thought we knew because of experiments and trials is turning out to be not true. Examples quoted included the efficacy of antipsychotic drugs and the theory that animals – and humans – are more attracted to members of the opposite sex if their features are symmetrical. “All sorts of well-established, multiply confirmed findings have started to look increasingly uncertain,” Lehrer wrote.
“It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon [is] … occurring across a wide range of fields, from psychology to ecology. In the field of medicine, the phenomenon seems extremely widespread.” There appear to be several explanations for the decline effect. One is publication bias, the enthusiasm of scientific journals to publish positive data and unexpected and exciting results. Another is so-called “selective reporting”, where scientists unconsciously allow their prejudices and their needs to shape their experiments. This is probably why between 1966 and 1995 each of 47 studies of acupuncture in Asia found it was an effective treatment, while only 56 per cent of the 94 studies in the West found it had any therapeutic benefit. According to John Ioannidis, an epidemiologist at Stanford University, selective reporting often occurs because of “significance chasing”, when scientists scan huge amounts of data in a desperate search for the connections that could enable them to produce a paper for publication.
A final factor identified by Lehrer that feeds back into the others is sheer randomness. Often significance is attributed to results that are really just the outcome of chance. He tells the story of a scientist named John Crabbe who in the late 1990s decided to examine this and conducted identical studies using mice at three different locations. Each aspect of the studies was standardised, yet some of the results varied hugely – and for no apparent reason. Crabbe concluded that a lot of interesting scientific data are nothing but noise. Noted Lehrer, “The problem, of course, is that such dramatic findings are also the most likely to get published in prestigious journals, since the data are both statistically significant and entirely unexpected. Grants get written, follow-up studies are conducted. The end result is a scientific accident that can take years to unravel.” Recently Lehrer told me that despite the criticisms he has marshalled, the scientific method is still the best we have for reaching for truth.
But even that method, with all its checks and balances, has been unable to extinguish our immense capacity for error. This frequency of mistakes in science is reflected in many other areas of knowledge. The ones least prone to error are those where theory grew from technology, rather than vice versa. Each time you build a bridge or set a bone or fly a plane you are effectively replicating an experiment to prove one or more important ideas about how the world works. These experiments have been done so often we’re now pretty sure about what’s going on. Elsewhere though, we need to make friends with our mistakes and understand them at least as well as we understand our successes, because they’re marbled through everything we do. Because they are so prevalent, it’s a fair bet they’re part of what makes us human, and maybe there’s a good reason for that. Theodore Roosevelt said, “The man who makes no mistakes does not usually make anything.” Hence the advice of Samuel Beckett: “Try again. Fail again. Fail better.”