So, You Think You're Rational?
Israel and Palestine have been in bitter conflict for decades. Stanford psychologist Lee Ross once showed why problems like this last so long with so little progress.
Ross and his colleagues took peace proposals written by Israeli and Palestinian negotiators and swapped the authors' names. He then asked Israeli citizens what they thought of each proposal. "The Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians," Ross said. Palestinians analyzing proposals attributed to the wrong author did the same. The two sides in Ross's studies weren't fighting each other. They were fighting a more complicated enemy: their own opinions.
Another psychologist, Geoffrey Cohen, found the same effect in U.S. politics. Democratic voters supported Republican proposals more when those proposals were attributed to fellow Democrats than they supported Democratic proposals attributed to Republicans, and Republican voters showed the mirror image.
People disagree with each other because each side assumes the other is being led by bias into bad decisions. They rarely suspect that they, themselves, might be just as biased. Psychologists have a name for this: the bias blind spot. It's a bias that prevents us from realizing how biased we are. And it is pervasive in investing.
Behavioral finance is one of the fastest-growing branches of psychology. People love reading about flaws people fall for when handling money. But few of them admit, or even realize, that they're reading about themselves.
Everyone wants to think they are rational, and that biases are things that afflict other people. "The brain is designed with blind spots," Carol Tavris and Elliot Aronson write in their book Mistakes Were Made (But Not by Me), "and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any." This is why so many of us are not only bad with money, but make the same mistakes over and over again. We're blind to our blindness.
"People see themselves as less susceptible to bias than others," writes Princeton psychologist Emily Pronin. Part of this is because we judge others based solely on their actions, but when judging ourselves we're flooded with internal dialogue justifying our own bad decisions. If I see you buying stocks when the market is booming and selling after a crash, I assume you're an emotional klutz. But if I did the same thing, I could tell myself a story about how this new market is rigged, and it's rational to get out now before things go even lower. Because people can reason and tell themselves stories, they're able to make up all kinds of excuses to justify their mistakes - even the same mistakes they criticize others for.
Ironically, the smarter you are, the worse this gets. In one study, blind-spot bias was positively correlated with SAT scores and other measures of intelligence: the smarter you are, the more blind you are to your own biases. Why? Because the smarter you are, the more elaborate and sophisticated stories you can tell yourself to justify your bad decisions. An average investor could never convince themselves that leveraging their balance sheet 30-to-1 with subprime mortgages was a good idea. You need to be Harvard stupid to do that. The more rational we think we are, the more self-delusion we engage in, and the more biased we become.
The sad truth is, there might not be much we can do about this. Some biases are hardwired from birth. Michael Foster of the University of Washington and a colleague compared 4,600 sets of identical twins, who are nearly genetic duplicates, to fraternal twins, who aren't. Looking at investing behaviors like risk aversion, lack of diversification, and portfolio turnover, identical twins were about twice as likely as fraternal twins to behave like their sibling. "Genetic differences explain up to 45% of the remaining variation across individual investors, after controlling for observable individual characteristics," the researchers wrote. "Investment biases are manifestations of innate and evolutionary ancient features of human behavior." Daniel Kahneman, who won the Nobel Prize studying biases, once wrote: "Despite 45 years of work in the field, I am still inclined to make over-confident predictions." It's just part of who he is, and who most of us are.
Those least susceptible to biases share a common trait: they're skeptical of their own beliefs. Humility is the ultimate antidote to bias, and the most rational people are those who realize how irrational they can be. Investors with the best track records are often the ones willing to say "I don't know," "I screwed up," "Maybe you're right," and "I was wrong." They're comfortable changing their minds and abandoning past beliefs. George Soros once said: "I think that my conceptual framework, which basically emphasizes the importance of misconceptions, makes me extremely critical of my own decisions. I know that I am bound to be wrong, and therefore am more likely to correct my own mistakes." It's a neat trick, if you can pull it off.
The article So, You Think You're Rational? originally appeared on Fool.com. Contact Morgan Housel at firstname.lastname@example.org. The Motley Fool has a disclosure policy.
Copyright © 1995 - 2014 The Motley Fool, LLC. All rights reserved.