Daniel Kahneman: Four Keys to Better Decision Making
Nobel laureate Daniel Kahneman has transformed the fields of economics and investing. At their most basic, his revelations demonstrate that human beings and the decisions they make are much more complicated — and much more fascinating — than previously thought.
He delivered a captivating mini seminar on some of the key ideas that have driven his scholarship, exploring intuition, expertise, bias, noise, how optimism and overconfidence influence the capitalist system, and how we can improve our decision making, at the 71st CFA Institute Annual Conference in Hong Kong.
“Optimism is the engine of capitalism,” Kahneman said. “Overconfidence is a curse. It’s a curse and a blessing. The people who make great things, if you look back, they were overconfident and optimistic — overconfident optimists. They take big risks because they underestimate how big the risks are.”
But by studying only the success stories, people are learning the wrong lesson.
“If you look at everyone,” he said, “there is lots of failure.”
The Perils of Intuition
Intuition is a form of what Kahneman calls fast, or System 1, thinking, and we often base our decisions on what it tells us.
“We trust our intuitions even when they’re wrong,” he said.
But we can trust our intuitions — provided they’re based on real expertise. And while we develop expertise through experience, experience alone isn’t enough.
In fact, research demonstrates that experience increases the confidence with which people hold their ideas, but not necessarily the accuracy of those ideas. Expertise requires a particular kind of experience: one gained in a context that provides regular feedback and is effectively testable.
“Is the world in which the intuition comes up regular enough so that we have an opportunity to learn its rules?” Kahneman asked.
When it comes to the finance sector, the answer is probably no.
“It’s very difficult to imagine from the psychological analysis of what expertise is that you can develop true expertise in, say, predicting the stock market,” he said. “You cannot because the world isn’t sufficiently regular for people to learn rules.”
That doesn’t stop people from confidently predicting financial outcomes based on their experience.
“This is psychologically a puzzle,” Kahneman said. “How could one learn when there’s nothing to learn?”
That sort of intuition is really superstition. Which means we shouldn’t assume we have expertise in all the domains where we have intuitions. And we shouldn’t assume others do either.
“When somebody tells you that they have a strong hunch about a financial event,” he said, “the safe thing to do is not to believe them.”
Even in testable domains where causal relationships are readily discernible, noise can distort the results.
Kahneman described a study of underwriters at a well-run insurance company. While underwriting is not an exact science, it is a domain with learnable rules where expertise can be developed. The underwriters all read the same file and set a premium. Everyone understood that the premiums would diverge; the question was by how much.
“What percentage would you expect?” Kahneman asked. “The number that comes to mind most often is 10%. It’s fairly high and a conservative judgment.”
Yet when the average was computed, the divergence was 56%.
“Which really means that those underwriters are wasting their time,” he said. “How can it be that people have that amount of noise in judgment and not be aware of it?”
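Kahneman's divergence measure can be sketched in a few lines. The premium figures below are hypothetical, invented purely for illustration; the calculation itself simply averages, over every pair of underwriters, the absolute difference between their quotes relative to the pair's mean.

```python
from itertools import combinations

# Hypothetical premiums (in dollars) quoted by five underwriters
# who all read the same file. These numbers are invented.
premiums = [9800, 13500, 16000, 8700, 12000]

def pairwise_divergence(values):
    """Average relative difference across all pairs of judgments:
    |a - b| divided by the pair's mean."""
    diffs = [abs(a - b) / ((a + b) / 2) for a, b in combinations(values, 2)]
    return sum(diffs) / len(diffs)

print(f"{pairwise_divergence(premiums):.0%}")  # roughly 30% for this sample
```

Even these made-up numbers, none of which looks unreasonable on its own, produce a divergence several times the 10% most people expect.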
Unfortunately, the noise problem isn’t limited to underwriting. And it doesn’t require multiple people. One is often enough. Indeed, even in more binary disciplines, using the same data and the same analyst, results can differ.
“Whenever there is judgment there is noise and probably a lot more than you think,” Kahneman said.
For example, radiologists were given a series of X-rays and asked to diagnose them. Sometimes they were shown the same X-ray.
“In a shockingly high number of cases, the diagnosis is different,” he said.
The same held true for DNA and fingerprint analysts. So even in cases where there should be one foolproof answer, noise can render certainty impossible.
“We use the word bias too often.”
While Kahneman has spent much of his career studying bias, he is now focused on noise. Bias, he believes, may be overdiagnosed, and he recommends assuming noise is the culprit in most decision-making errors.
“We should think about noise as a possible explanation because noise and bias lead you to different remedies,” he said.
Hindsight, Optimism, and Loss Aversion
Of course, when we make mistakes, they tend to skew in two opposing directions.
“People are very loss averse and very optimistic. They work against each other,” he said. “People, because they are optimistic, they don’t realize how bad the odds are.”
As Kahneman’s research on loss aversion has shown, we feel losses more acutely than gains.
“Our estimate in many situations is 2 to 1,” he said.
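That 2-to-1 ratio can be made concrete with a minimal sketch, assuming a simple piecewise value function in which losses are weighted twice as heavily as equivalent gains (the coefficient is illustrative, not a universal constant):

```python
# Illustrative loss-aversion coefficient: losses loom about twice as
# large as gains, per Kahneman's "2 to 1" estimate.
LOSS_AVERSION = 2.0

def subjective_value(outcome):
    """Felt value of a monetary outcome under simple loss aversion."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# A 50/50 gamble: win $100 or lose $100. Expected monetary value is zero,
# but the expected *felt* value is negative, so most people decline it.
gamble = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(gamble)  # -50.0
```

A fair coin flip for $100 is a break-even bet in dollars but feels like a losing one, which is why such gambles are routinely refused.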
Yet we tend to overestimate our chances of success, especially during the planning phase. And then whatever the outcome, hindsight is 20/20: Why things did or didn’t work out is always obvious after the fact.
“When something happens, you immediately understand how it happens. You immediately have a story and an explanation,” he said. “You have that sense that you learned something and that you won’t make that mistake again.”
These conclusions are usually wrong. The takeaway should not be that a clear causal relationship exists.
“What you should learn is that you were surprised again,” Kahneman said. “You should learn that the world is more uncertain than you think.”
So in the world of finance and investing, where there is so much noise and bias and so little trustworthy intuition and expertise, what can professionals do to improve their decision making?
Kahneman proposed four simple strategies for better decision making that can be applied to both finance and life.
1. Don’t Trust People, Trust Algorithms
Whether it’s predicting parole violators and bail jumpers or who will succeed as a research analyst, algorithms tend to be preferable to independent human judgment.
“Algorithms beat individuals about half the time. And they match individuals about half the time,” Kahneman said. “There are very few examples of people outperforming algorithms in making predictive judgments. So when there’s the possibility of using an algorithm, people should use it. We have the idea that it is very complicated to design an algorithm. An algorithm is a rule. You can just construct rules.”
And when we can’t use an algorithm, we should train people to simulate one.
“Train people in a way of thinking and in a way of approaching problems that will impose uniformity,” he said.
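A "rule" in Kahneman's sense can be this simple. The sketch below ranks hypothetical research-analyst candidates with an equal-weight sum over a few pre-chosen attributes; the attribute names and ratings are invented for illustration, and no fitting or machine learning is involved:

```python
def score(candidate):
    """Equal-weight linear rule over 1-5 ratings; attributes are illustrative."""
    attributes = ("analytical_skill", "diligence", "communication")
    return sum(candidate[a] for a in attributes)

# Hypothetical candidates rated on the same fixed attributes.
candidates = [
    {"name": "A", "analytical_skill": 4, "diligence": 5, "communication": 3},
    {"name": "B", "analytical_skill": 5, "diligence": 2, "communication": 4},
]
ranked = sorted(candidates, key=score, reverse=True)
print([c["name"] for c in ranked])  # ['A', 'B']
```

Because every candidate is scored on the same attributes with the same weights, the rule imposes exactly the uniformity Kahneman recommends, with none of the noise of case-by-case holistic judgment.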
2. Take the Broad View
Don’t view each problem in isolation.
“The single best advice we have in framing is broad framing,” he said. “See the decision as a member of a class of decisions that you’ll probably have to take.”
3. Test for Regret
“Regret is probably the greatest enemy of good decision making in personal finance,” Kahneman said.
So assess how prone clients are to it. The more potential for regret, the more likely they are to churn their account, sell at the wrong time, and buy when prices are high. High-net-worth individuals are especially risk averse, he said, so try to gauge just how risk averse.
“Clients who have regrets will often fire their advisers,” he said.
4. Seek Out Good Advice
Part of getting a wide-ranging perspective is to cultivate curiosity and to seek out guidance.
So who is the ideal adviser? “A person who likes you and doesn’t care about your feelings,” Kahneman said.
For him, that person is fellow Nobel laureate Richard H. Thaler.
“He likes me,” Kahneman said. “And couldn’t care less about my feelings.”
All posts are the opinion of the author. As such, they should not be construed as investment advice, nor do the opinions expressed necessarily reflect the views of CFA Institute or the author’s employer.