The principle of ‘cause and effect’ is an important one in science. Science is about understanding, and in particular about understanding how events and eventualities arise. Science demands that the explanations it formulates be founded upon evidence. Events and eventualities are ‘effects’ that can be witnessed, and the presumption within science is that all ‘effects’ have ‘causes’, so the task at hand is to determine the ‘cause(s)’ wherever possible. On the occasions when science has the evidence before it, arriving at an interpretation of that evidence ought not to be too difficult.
Understanding doesn’t have to be scientific, and it doesn’t have to resort to lots of data and equations. Understanding can be based upon reason. When Charles Darwin wrote and published ‘On the Origin of Species’ his explanation contained little in the way of data and not a single equation, but the book was no less scientific as a result. Darwin pointed to something profound, and that essential profundity is still rippling through science today.
Evolution is decreasingly thought to be the sole preserve of living things and their diversity. Although Darwin’s theory was subjected to ridicule and was slow to catch on, cosmologists now increasingly regard evolution as pertinent to the development of the universe; a facet acknowledged by Professor Brian Cox in the Wonders of Life series. Economists, in turn, could do well to look to Darwin and evolution for inspiration.
Daniel Kahneman is an accomplished academic who has spent a lifetime thinking about cause and effect and the ability to make predictions based upon understanding. Kahneman was awarded a Nobel Prize for one of his leading achievements, so he’s evidently no slouch. He has a book out, and the book has been selling well.
In the course of his career Kahneman cultivated a persistent off-shoot interest: he became intrigued by ‘heuristics’.
Heuristics are simple rules of thumb that often seem intuitive and that guide our thinking without us having to think too hard. ‘Red sky at night…’ is an example of a heuristic. Knowing this, you’d easily recognise that ‘Red sky in the morning…’ is another. ‘Coughs and sneezes spread diseases’ is one of the best.
Heuristics are often usefully summary, but often they are too summary to explain or reveal whatever rationale sits behind them. ‘Money is the root of all evil’ can be argued to be a case in point, since money could equally be the root of possibilities. My favourite is just four words long: ‘All flesh is grass’. At first glance this simple statement seems truly enigmatic, yet it is no more than a very summary description of the grand order of the food chain that persists in any ecology. Creatures feed off plants, and some creatures feed off other creatures that in turn dined on plants.
Kahneman worked with a close colleague, Amos Tversky, and the relationship was one of those in which two heads seemed better than one. They bounced ideas around and sparked off each other. The core theme they explored was whether heuristics could be deemed reliable. Did simple heuristics permit predictions, and how frequently would those predictions be correct? These questions connect with psychology because they help analyse the basis of human intuition and whether human intuition is reliable.
Kahneman’s book, ‘Thinking, Fast and Slow’, explores this lifelong obsession and makes for an interesting contribution to real-world psychology. Our intuition can be reliable, he directs, but it can let us down without our being aware. We can think fast, but there are occasions when we need to grip the reins and think slow to avoid certain pitfalls.
People are often praised for being quick thinkers, and the workplace is an environment that encourages economy in execution. Kahneman suggests the psyche is essentially lazy, and if thinking fast rewards us we persist with the strategy even though the policy may sometimes let us down. Such instances are an entry point for bias. Kahneman also says people are ‘statistically punished for being nice and rewarded for being nasty’. The feedback to which life exposes us is perverse, he directs, and that’s discounting the bias that’s introduced by aspects of the money system.
‘Lots of different examples of bias are described by Kahneman. There is a discussion of whether rewards for improved performance work better than punishment for poor results – a proposition supported by much evidence, but not, apparently, by many bosses. Kahneman relays the story of how an instructor in the Israeli Air Force told him this didn’t work for flight cadets: those he praised for good performance usually did worse the next time, whereas those he screamed at for poor performance usually did better the next time. What the instructor was observing was regression to the mean, due to random fluctuations in performance. The chances are, when we do something really well, next time it won’t go so well, and vice versa. The error the instructor made was ascribing a “causal influence to the inevitable fluctuations in a random process.”’
When I read this it was in a short review and recommendation, not in the book itself. Something about it offended my intuition, and yet I didn’t know why. I’ve underlined the offending statement. I sought out the book and wondered how it was wrong and why it offended. It took a while to figure out, but I got there in the end. Kahneman says he experienced a moment of inspiration when he saw new light in a statistical principle he had been teaching for years. The principle is known as regression to the mean, and although it sounds grandiose it can be explained simply.
Suppose you have a die and you roll it. You’ll return one score in the range 1 to 6. You could get lucky and roll a ‘6’. If you roll again you have five times greater prospects of rolling less than ‘6’ than of repeating the ‘6’. If you roll a ‘1’ the same applies, but in the converse. So in erratic systems ‘good’ results are likely to be followed by ones that are less good. The thing about rolling a die is that if you average successive rolls they will quickly trend towards the mean score possible, and that is 3.5.
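Both claims can be checked with a short simulation. This is a minimal sketch of my own (not from Kahneman’s book): it rolls a fair die many times, confirms the running average trends to 3.5, and confirms that a ‘6’ is usually followed by something lower.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# Simulate many rolls of a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(100_000)]

# The average of successive rolls trends towards the mean score, 3.5.
mean = sum(rolls) / len(rolls)
print(f"mean of {len(rolls)} rolls: {mean:.2f}")  # close to 3.5

# After a '6', how often is the next roll lower? Theory says 5 times in 6.
after_six = [nxt for cur, nxt in zip(rolls, rolls[1:]) if cur == 6]
lower = sum(1 for r in after_six if r < 6) / len(after_six)
print(f"fraction of rolls after a '6' that are lower: {lower:.2f}")  # close to 0.83
```

Nothing clever is happening here: the apparent ‘decline’ after a lucky ‘6’ is pure arithmetic, which is exactly what regression to the mean describes.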
Trainee pilots who put in a bum performance are erratic performers whose next performance has every prospect of being better. Those who did well may not have a full grasp of the skills; they may have got lucky that one time, so their next performance could well be worse. But although the performance of anybody on a learning curve may be erratic, it isn’t strictly random. Kahneman wasn’t wrong to apply regression to the mean to a pseudo-random setting, but he wasn’t strictly correct either.
You see, trainee pilots can be erratic, but they favour lower performances and scores. Suppose you roll that die but the results returned fall only in the range 1 to 4. The results can still be erratic and pseudo-random, but you’d soon become suspicious if ‘5s’ and ‘6s’ were never to arise. Regression to the mean could still apply, in that the scores would quickly trend to an average of 2.5, but the system is not strictly random because two real possibilities have been excluded.
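The truncated die can be sketched the same way. Again this is my own illustrative simulation, not anything from the book: the rolls are still erratic and still regress to a mean, but the mean is 2.5 rather than 3.5, and the total absence of ‘5s’ and ‘6s’ is the giveaway that the system isn’t strictly random.

```python
import random

random.seed(2)  # fixed seed so the run is repeatable

# A 'truncated' die: erratic results, but only ever in the range 1 to 4,
# as if two real possibilities (5 and 6) had been excluded.
rolls = [random.randint(1, 4) for _ in range(100_000)]

mean = sum(rolls) / len(rolls)
print(f"mean: {mean:.2f}")             # trends to 2.5, not 3.5
print(f"highest roll seen: {max(rolls)}")  # never a 5 or 6
```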
Kahneman suffered a rush of blood to the head. In his rush to apply regression to the mean he applied the principle correctly but then took his conclusions one step too far. A simple heuristic reveals the lie: ‘practice makes perfect’. Erratic performance by trainee pilots is simple evidence that the cadets need more flying hours to hone new and demanding skills. Fluctuations in the performance of newly acquired skills are inevitable, up to a point, but they are not so random as Kahneman’s conclusion summarily suggests.
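The ‘practice makes perfect’ point can be sketched too. The model below is entirely hypothetical, with made-up numbers: a cadet’s underlying skill rises steadily with each flight, while day-to-day noise makes individual scores erratic. Any single flight still regresses towards the current mean, but the mean itself is climbing, so the fluctuations are erratic rather than purely random.

```python
import random

random.seed(3)  # fixed seed so the run is repeatable

# Hypothetical trainee on a learning curve: observed score is
# underlying skill (which grows with practice) plus random noise.
scores = []
for flight in range(1, 101):
    skill = 0.5 * flight               # steady improvement from practice
    noise = random.uniform(-10, 10)    # erratic flight-to-flight variation
    scores.append(skill + noise)

early = sum(scores[:20]) / 20
late = sum(scores[-20:]) / 20
print(f"average of first 20 flights: {early:.1f}")
print(f"average of last 20 flights:  {late:.1f}")
```

A purely random process would show no such upward drift between the early and late averages; the drift is the learning curve that the bare statistics overlook.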
So the flight instructor was right… and completely wrong. He could bark at pilots who did badly and they would do better next time, but they may well have done better without the instructor barking anything at all. The instructor has been placed in charge and has a responsibility to deliver results. He is acting under influences that encourage him to believe his involvement has more bearing upon the outcome than it actually does.
In turn Kahneman was correct, but not so correct as it is possible to be. Erratic performance is inevitable in the instance he describes, but the Nobel Prize-winner’s mind made a leap too far when the application of a statistical principle had him confuse ‘erratic’ with ‘random’. None of us is perfect.
This example shows the difficulty of discerning ‘cause’ in what are really fairly simple eventualities. Three people can arrive at three interpretations and descriptions arising from exactly the same set of observations. That casts light on the difficulty.
The harsh reality is that there are occasions when a human may be quite summary and quite correct, if not for all the right reasons. There are occasions when a human can be correct, yet not so correct as it is possible to be, and there are occasions when humans can be just plain wrong. Recognising the distinctions between these cases isn’t so plain or easy as it may seem.
The introduction to Kahneman’s book ought to be all the more interesting for anyone knowing in advance that Kahneman’s rush to apply a statistical method overshadowed the simple heuristic that works fine in this case: ‘practice makes perfect’. Performance and ‘scores’ would attain higher ranges and become more consistent as the cadets advance up the learning curve. There’s no real need to take a sledgehammer to the task of cracking a nut.
Life has become complicated, with complications arising in many a corner. There’s a danger, when we have a heavy workload and complications abound around us, that the simplest of things that can inform and simplify get overlooked. The human is an intelligent species, but sometimes eventualities conspire so that the human cannot readily access some of the best of the intelligence, or capacity for intelligent thinking, it possesses.
1. Thinking, Fast and Slow, Daniel Kahneman, Penguin, 2012
2. What’s your science book of the year? Posted by Andrew Wadge on 21 December 2012. http://blogs.food.gov.uk/science/entry/what_s_your_science_book
Chris Palmer, 23rd May 2013.