6 Things I Learned From the Book "Thinking Fast and Slow"
I've read 287 books in the past four years. Some were awful. Others were incredible. But I took notes on all of them.
I've never known what to do with these notes. Then I got an idea: I'll dump some of the highlights into a weekly article.
Thinking Fast and Slow is one of the best psychology books ever published. The life's work of Nobel Prize-winning psychologist Daniel Kahneman, it outlines, better than anything else, the ways in which humans fool themselves and err when thinking.
(Kahneman and I sat down for an hour-long interview last year. Watch it here, complete with transcript).
The book makes more thought-provoking points than I can count, but here are six passages that caught my attention:
1. The most important things in life are unpredictable:
The idea that large historical events are determined by luck is profoundly shocking, although it is demonstrably true. It is hard to think of the history of the twentieth century, including its large social movements, without bringing in the role of Hitler, Stalin, and Mao Zedong. But there was a moment in time, just before an egg was fertilized, when there was a fifty-fifty chance that the embryo that became Hitler could have been a female. Compounding the three events, there was a probability of one-eighth of a twentieth century without any of the three great villains and it is impossible to argue that history would have been roughly the same in their absence. The fertilization of these three eggs had momentous consequences, and it makes a joke of the idea that long-term developments are predictable.
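Kahneman's one-in-eight figure is just independent probabilities multiplying. As a quick sanity check (a trivial sketch, not from the book):

```python
# Each of the three fertilizations had a roughly 50/50 chance of
# producing a female embryo instead. Treating the three as
# independent events, the probabilities multiply.
p_female = 0.5
p_no_villains = p_female ** 3
print(p_no_villains)  # 0.125, i.e. one chance in eight
```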
2. We are gullible:
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact. But it was psychologists who discovered that you do not have to repeat the entire statement of a fact or idea to make it appear true. People who were repeatedly exposed to the phrase "the body temperature of a chicken" were more likely to accept as true the statement that "the body temperature of a chicken is 144°" (or any other arbitrary number). The familiarity of one phrase in the statement sufficed to make the whole statement feel familiar, and therefore true. If you cannot remember the source of a statement, and have no way to relate it to other things you know, you have no option but to go with the sense of cognitive ease.
3. Our minds choose the familiar over the true:
People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public's mind. It is no accident that authoritarian regimes exert substantial pressure on independent media. Because public interest is most easily aroused by dramatic events and by celebrities, media feeding frenzies are common. For several weeks after Michael Jackson's death, for example, it was virtually impossible to find a television channel reporting on another topic. In contrast, there is little coverage of critical but unexciting issues that provide less drama, such as declining educational standards or overinvestment of medical resources in the last year of life. (As I write this, I notice that my choice of "little-covered" examples was guided by availability. The topics I chose as examples are mentioned often; equally important issues that are less available did not come to my mind.)
4. Really smart people can fool themselves:
The author pointed out that psychologists commonly chose samples so small that they exposed themselves to a 50% risk of failing to confirm their true hypotheses! No researcher in his right mind would accept such a risk. A plausible explanation was that psychologists' decisions about sample size reflected prevalent intuitive misconceptions of the extent of sampling variation. The article shocked me, because it explained some troubles I had had in my own research.
Like most research psychologists, I had routinely chosen samples that were too small and had often obtained results that made no sense. Now I knew why: the odd results were actually artifacts of my research method. My mistake was particularly embarrassing because I taught statistics and knew how to compute the sample size that would reduce the risk of failure to an acceptable level. But I had never chosen a sample size by computation. Like my colleagues, I had trusted tradition and my intuition in planning my experiments and had never thought seriously about the issue.
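The computation Kahneman says he skipped is a standard power calculation. Here is a rough sketch using the normal approximation for a two-sample comparison; the 80% power and 0.05 significance defaults are conventional choices, not figures from the book:

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate subjects needed per group to detect a standardized
    effect (Cohen's d) in a two-sample comparison, via the normal
    approximation: n = 2 * ((z_alpha + z_beta) / d)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# Detecting a "medium" effect (d = 0.5) with 80% power takes roughly
# 63 subjects per group -- far more than intuition tends to suggest.
print(sample_size_per_group(0.5))
```

The point matches Kahneman's confession: intuition suggests small samples are fine, while the arithmetic says a typical 20-subject group leaves a large chance of missing a real effect.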
5. We are blind to our own blindness:
The most dramatic demonstration was offered by Christopher Chabris and Daniel Simons in their book The Invisible Gorilla. They constructed a short film of two teams passing basketballs, one team wearing white shirts, the other wearing black. The viewers of the film are instructed to count the number of passes made by the white team, ignoring the black players. This task is difficult and completely absorbing.
Halfway through the video, a woman wearing a gorilla suit appears, crosses the court, thumps her chest, and moves on. The gorilla is in view for 9 seconds. Many thousands of people have seen the video, and about half of them do not notice anything unusual. It is the counting task—and especially the instruction to ignore one of the teams—that causes the blindness. No one who watches the video without that task would miss the gorilla. Seeing and orienting are automatic functions of System 1, but they depend on the allocation of some attention to the relevant stimulus. The authors note that the most remarkable observation of their study is that people find its results very surprising. Indeed, the viewers who fail to see the gorilla are initially sure that it was not there—they cannot imagine missing such a striking event. The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
6. Studying our own faults is incredibly hard. This is probably the most important line in the book:
The premise of this book is that it is easier to recognize other people's mistakes than our own.
Go buy his book here. You won't regret it.
The article 6 Things I Learned From the Book "Thinking Fast and Slow" originally appeared on Fool.com. Contact Morgan Housel at email@example.com. The Motley Fool has a disclosure policy.
Copyright © 1995 - 2014 The Motley Fool, LLC. All rights reserved.