I care whether what I believe is true. I care whether or not my perceptions equal reality. I want the correlation between how I think things are and how things really are to be 100%. This is much like approaching infinity, I know, but I am going there nonetheless.
As much as possible, I try to ‘decorrelate errors’ when forming an opinion, to be aware of personal biases, and to acknowledge the limitations of my current position. I read instead of watching the news; I don’t multitask while listening to talks; I make efforts to read things I don’t agree with — all to ensure what goes into my mind passes through conscious thought and deliberation. (Of course, this also means my biases serve as a filter – it’s not a perfect mechanism.) Reading something like Thinking, Fast and Slow, specifically on the downfalls of intuition and snap judgment, only affirms my scrutiny of information.
Certainly it’s not possible to live like this all the time, and it’s not good to do so. For high-stakes matters, it’s not OK to let mere feelings guide judgments. But for most everyday things, reliance on System 1 is perfectly OK. Kahneman himself says that ‘System 1 is the hero of the book’ and is very efficient in its functions (e.g., comparing distances, detecting sounds, associating ideas, discerning nuances). It has its limitations, but so does System 2.
Yet, the engineer in me is biased to think that System 2 must win; it is the way to get that maximum perception-reality correlation. If I could summarize my previous thought on this matter, it’d be something like this:
If only we could employ System 2 all the time, we’d get reality as it is. System 1 is permitted to operate, but it really is optional, because System 2 can do everything that System 1 does, though slower. In fact, using System 1 sometimes introduces errors, so whenever possible and practical, just bypass it and employ System 2.
So when Kahneman says matter-of-factly, “The world in our heads is not a precise replica of reality,” I paused.
I mean, I know the statement is true, but I still somehow thought there was a way around it if we perfected the coherence of our thinking. Moreover, the statement comes from someone who studies all these mechanisms. Isn’t it not OK to remain in an imperfect reality?
I guess, No, it’s not OK, and Yes, it’s OK.
It’s fine to live with an imprecise replica of reality, and it’s also fine to strive for reality. It’s just the way things are. Different situations call for different intensities of pursuit. One must maintain the quest for what’s true while accepting its asymptotic nature, without descending into nihilism.
And oh yeah, System 1 is important too. It’s neither superior nor inferior to System 2; it just serves different functions. The two systems complement each other and more importantly, can correct each other.
So this acceptance will not really change my pursuit of that unhindered perception. But it also gives me assurance that life is not inferior when I take a break from it. Embracing the tensions between ideas—isn’t that the wisdom of living in an imperfect world?
For now we see through a glass, darkly; but then face to face:
now I know in part; but then shall I know even as also I am known.
1 Corinthians 13:12, King James Version
I will live, for now, in between the OK and Not OK.
 “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.” (Kahneman, from chapter 1 of Thinking, Fast and Slow).
Over the next year and a half, the North American news media will be inundated with polls: who leads in which poll, and by what margin. There is nothing wrong with polls, except when we believe strongly in their results without questioning their assumptions.
As the recipients of numeric information, especially when presented as percentages, we should all be aware of the quality of the number. Then, our trust in what the number says should be adjusted accordingly.
7 out of 10 is 70%.
70 out of 100 is also 70%.
And so is 700 out of 1000.
They are all equal to 70%, but we know that the quality of the 70% differs. We know (or should know) that larger sample sizes yield more trustworthy results. This is the more intuitive principle from chapter 10, “The Law of Small Numbers.” The second principle—that extreme outcomes tend to come from small samples—is less intuitive, although completely equivalent to the first.
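To make the difference in quality concrete, here is a quick sketch of my own (not from the book), using the textbook standard error of a sample proportion, √(p(1−p)/n):

```python
import math

p = 0.70  # the reported proportion in each case
for n in (10, 100, 1000):
    se = math.sqrt(p * (1 - p) / n)      # standard error of a sample proportion
    print(f"n = {n:4d}: 70% ± {1.96 * se:.1%}")  # rough 95% interval
```

All three samples report 70%, but the 95% interval shrinks from roughly ±28 points at n = 10 to about ±3 points at n = 1000—which is exactly how differently we should trust each figure.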
The best schools are small.
That is an example Kahneman uses in chapter 10. He summarizes a research finding on the characteristics of the most successful schools: the study concluded that the most successful schools are small, which motivated the Gates Foundation to make large investments in creating small schools. Other organizations and government programs followed suit.
There are many reasons we can postulate for why small schools would be more successful. Most of us would find this storyline perfectly acceptable, or even subscribe strongly to the conclusion.
Unfortunately, as Kahneman explains, this analysis is wrong. Had the researchers asked what characterizes the worst schools, they would have found that those schools also tend to be small. And as he notes elsewhere, a cause that can explain two opposite outcomes explains nothing at all.
The reason small schools can produce two opposite, extreme results is that they have small populations. Small samples have more variation—they swing more. You are far more likely to get a figure like 98% or 1.7% from a small sample, not because the school is the best or the worst, but because the sample is small. It is simply a feature of statistics: any phenomenon that is very high or very low, best or worst, will most likely come from a small sample.
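A toy simulation (my own illustration, not Kahneman’s) shows the swing directly: give every simulated school the identical true pass rate of 70%, and the extreme observed rates still come from the small schools.

```python
import random

random.seed(42)  # fixed seed for a reproducible run

def pass_rate(n_students):
    """Observed pass rate of one simulated school whose true rate is 70%."""
    return sum(random.random() < 0.70 for _ in range(n_students)) / n_students

# 500 small schools (20 students) and 500 large schools (1000 students),
# all drawn from the identical underlying 70% rate.
small = [pass_rate(20) for _ in range(500)]
large = [pass_rate(1000) for _ in range(500)]

print(f"small schools range: {min(small):.0%} to {max(small):.0%}")
print(f"large schools range: {min(large):.0%} to {max(large):.0%}")
```

The small schools span a far wider range of observed rates—the same mechanism that puts small schools at both the top and the bottom of real rankings, with no causal story required.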
“But nooo, there must be a reason why small schools perform better.”
Does anyone else have that small voice speaking in his or her head? I do. There is something inside of me that needs an explanation for this phenomenon.
Kahneman says this is System 1 talking: it likes to find causal relationships to build a coherent belief system, even though some things are mere randomness. Random occurrences can be marked as causal relationships, leading to beliefs that are, in fact, completely spurious.
The subject of this chapter is not easy to explain and this post is definitely far from sufficient. It takes time to internalize the principles it describes, but its importance is paramount, if you care about the truthfulness of what you think is true.
Is this cause-and-effect that I believe true, or am I assigning causal relationships to something that is, in fact, random?
Always consider the sample size of any statistical study.
When someone claims a causal relationship, ask whether the same cause can give alternate/opposing/inconsistent outcomes. Then adjust the trustworthiness of that information accordingly.
I’m thoroughly enjoying my journey through Thinking, Fast and Slow by Daniel Kahneman. This book is a must-read for anyone who cares whether the mechanism by which they think, process information, and synthesize conclusions is valid or not.
Thinking, Fast and Slow is the kind of reading material that sparks fireworks in the reader’s brain. I could not hold my thoughts until the end of the book, so this is the first of what I expect will be a series (at least two) of reflections on its chapters.
Among its fascinating chapters—concise chapters I should add (yay)—one of my favorites so far is chapter 7. In this chapter titled “A Machine for Jumping to Conclusions”, Kahneman describes our brain’s tendencies to ignore ambiguities or suppress doubts in the information we are presented with.
One revealing example is the statement “Ann approached the bank.”
Most likely, the image prompted by the statement is a lady with some money business walking toward the edifice we call a “bank.” For me, the moment I saw the statement, my mind did not conjure any other image—it was that plain.
Yet, if it were revealed that the statement preceding “Ann approached the bank” was “They were floating gently down the river,” the image would have to change drastically.
What just happened? It turns out that our mind has a mechanism, dubbed System 1 in the book, that likes to jump to conclusions in the absence of a clear context (System 2, on the other hand, is the mechanism responsible for deliberate thinking, logic, and reasoning). When there is limited information, System 1 generates its own context and is unaware of the ambiguity of the given information. Unless someone had just gone canoeing or read about rivers recently, it is unlikely that he or she would have automatically applied the river-related meaning to the word ‘bank.’
This is so fascinating to me because it shows how interpretation can reveal more about the interpreter than the facts themselves. I automatically equated the word ‘bank’ with the money-related institution because, well, I’m a 21st-century urban woman who spends more time seeing Chase or Bank of America than canoeing down rivers. Thus my mind did not even search for other possible meanings and resolved the statement accordingly.
But the statement itself provided little information about what it actually meant.
I will admit that when the second statement was given, it disoriented me quite a bit—what was it talking about? I was that unaware of the alternate meaning.
To push it even further, if one insists that the word ‘bank’ means a money-bank, he or she would have to get creative in reconciling “They were floating gently down the river” with “Ann approached the bank.” Something needs to happen between the two sentences, and the interpreter would have to supply further speculation where no further information is provided. The speculation may turn out to be correct, but it would just be another jump to another conclusion.
What I take from this chapter is that slowing down and suspending judgment is a good idea, especially for high-stakes matters. If you don’t slow down, you may not perceive ambiguities and may miss the true meaning altogether!