Practical analysis for investment professionals
08 June 2018

Daniel Kahneman: Four Keys to Better Decision Making

Nobel laureate Daniel Kahneman has transformed the fields of economics and investing. At their most basic, his revelations demonstrate that human beings and the decisions they make are much more complicated — and much more fascinating — than previously thought.

At the 71st CFA Institute Annual Conference in Hong Kong, he delivered a captivating mini seminar on some of the key ideas that have driven his scholarship, exploring intuition, expertise, bias, noise, how optimism and overconfidence influence the capitalist system, and how we can improve our decision making.


“Optimism is the engine of capitalism,” Kahneman said. “Overconfidence is a curse. It’s a curse and a blessing. The people who make great things, if you look back, they were overconfident and optimistic — overconfident optimists. They take big risks because they underestimate how big the risks are.”

But by studying only the success stories, people are learning the wrong lesson.

“If you look at everyone,” he said, “there is lots of failure.”

The Perils of Intuition

Intuition is a form of what Kahneman calls fast, or System 1, thinking, and we often base our decisions on what it tells us.

“We trust our intuitions even when they’re wrong,” he said.

But we can trust our intuitions — provided they’re based on real expertise. And while we develop expertise through experience, experience alone isn’t enough.

In fact, research demonstrates that experience increases the confidence with which people hold their ideas, but not necessarily the accuracy of those ideas. Expertise requires a particular kind of experience, one that exists in a context that gives regular feedback, that is effectively testable.


“Is the world in which the intuition comes up regular enough so that we have an opportunity to learn its rules?” Kahneman asked.

When it comes to the finance sector, the answer is probably no.

“It’s very difficult to imagine from the psychological analysis of what expertise is that you can develop true expertise in, say, predicting the stock market,” he said. “You cannot because the world isn’t sufficiently regular for people to learn rules.”

That doesn’t stop people from confidently predicting financial outcomes based on their experience.

“This is psychologically a puzzle,” Kahneman said. “How could one learn when there’s nothing to learn?”

That sort of intuition is really superstition. Which means we shouldn’t assume we have expertise in all the domains where we have intuitions. And we shouldn’t assume others do either.

“When somebody tells you that they have a strong hunch about a financial event,” he said, “the safe thing to do is not to believe them.”


Noise Alert

Even in testable domains where causal relationships are readily discernible, noise can distort the results.

Kahneman described a study of underwriters at a well-run insurance company. While not an exact science, underwriting is a domain with learnable rules in which expertise can be developed. The underwriters all read the same file and set a premium. That their premiums would diverge was understood; the question was by how much.

“What percentage would you expect?” Kahneman asked. “The number that comes to mind most often is 10%. It’s fairly high and a conservative judgment.”

Yet when the average was computed, there was 56% divergence.

“Which really means that those underwriters are wasting their time,” he said. “How can it be that people have that amount of noise in judgment and not be aware of it?”
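One plausible way to quantify that sort of divergence, sketched below with hypothetical numbers, is to average the relative difference across every pair of judgments (the approach Kahneman and his co-authors have described for measuring noise; the premiums here are invented for illustration):

```python
from itertools import combinations

def noise_index(judgments):
    """Average relative difference between every pair of judgments,
    where each pair's difference is scaled by the pair's mean."""
    pairs = list(combinations(judgments, 2))
    diffs = [abs(a - b) / ((a + b) / 2) for a, b in pairs]
    return sum(diffs) / len(diffs)

# Five hypothetical premiums (in dollars) set for the same file:
premiums = [9500, 12000, 16000, 8000, 13500]
print(f"noise index: {noise_index(premiums):.0%}")
```

If the judgments were consistent, the index would sit near zero; widely scattered premiums push it toward the kind of figure Kahneman reported.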

Unfortunately, the noise problem isn’t limited to underwriting. And it doesn’t require multiple people. One is often enough. Indeed, even in more binary disciplines, using the same data and the same analyst, results can differ.

“Whenever there is judgment there is noise and probably a lot more than you think,” Kahneman said.

For example, radiologists were given a series of X-rays and asked to diagnose them. Sometimes they were shown the same X-ray.

“In a shockingly high number of cases, the diagnosis is different,” he said.

The same held true for DNA and fingerprint analysts. So even in cases where there should be one foolproof answer, noise can render certainty impossible.




“We use the word bias too often.”

While Kahneman has spent much of his career studying bias, he is now focused on noise. Bias, he believes, may be overdiagnosed, and he recommends assuming noise is the culprit in most decision-making errors.

“We should think about noise as a possible explanation because noise and bias lead you to different remedies,” he said.

Hindsight, Optimism, and Loss Aversion

Of course, when we make mistakes, they tend to skew in two opposing directions.

“People are very loss averse and very optimistic. They work against each other,” he said. “People, because they are optimistic, they don’t realize how bad the odds are.”

As Kahneman’s research on loss aversion has shown, we feel losses more acutely than gains.

“Our estimate in many situations is 2 to 1,” he said.
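That 2-to-1 asymmetry can be made concrete with a simplified version of the Kahneman-Tversky value function (curvature omitted; the loss-aversion coefficient of 2 follows the figure quoted in the talk):

```python
def value(x, loss_aversion=2.0):
    """Simplified prospect-theory value function: losses are
    weighted roughly twice as heavily as equivalent gains."""
    return x if x >= 0 else loss_aversion * x

# A 50/50 bet to win or lose $100 feels like a losing proposition:
expected_feel = 0.5 * value(100) + 0.5 * value(-100)
print(expected_feel)  # -50.0
```

Even though the bet's expected monetary value is zero, its expected felt value is negative, which is why most people refuse it.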

Yet we tend to overestimate our chances of success, especially during the planning phase. And then whatever the outcome, hindsight is 20/20: Why things did or didn’t work out is always obvious after the fact.

“When something happens, you immediately understand how it happens. You immediately have a story and an explanation,” he said. “You have that sense that you learned something and that you won’t make that mistake again.”

These conclusions are usually wrong. The takeaway should not be a clear causal relationship.

“What you should learn is that you were surprised again,” Kahneman said. “You should learn that the world is more uncertain than you think.”

So in the world of finance and investing, where there is so much noise and bias and so little trustworthy intuition and expertise, what can professionals do to improve their decision making?

Kahneman proposed four simple strategies for better decision making that can be applied to both finance and life.


1. Don’t Trust People, Trust Algorithms

Whether it’s predicting parole violators and bail jumpers or who will succeed as a research analyst, algorithms tend to be preferable to independent human judgment.

“Algorithms beat individuals about half the time. And they match individuals about half the time,” Kahneman said. “There are very few examples of people outperforming algorithms in making predictive judgments. So when there’s the possibility of using an algorithm, people should use it. We have the idea that it is very complicated to design an algorithm. An algorithm is a rule. You can just construct rules.”

And when we can’t use an algorithm, we should train people to simulate one.

“Train people in a way of thinking and in a way of approaching problems that will impose uniformity,” he said.
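Kahneman's point that "an algorithm is a rule" can be illustrated with an equal-weight scoring rule in the spirit of Robyn Dawes's "improper linear models." The attributes and ratings below are hypothetical:

```python
def score_candidate(ratings):
    """Equal-weight linear rule: rate each attribute on a common
    scale and average them. No optimization, no fitted weights."""
    return sum(ratings.values()) / len(ratings)

# Hypothetical analyst-hiring example: ratings on a 1-5 scale.
alice = {"technical_skill": 4, "communication": 5, "judgment": 3}
bob   = {"technical_skill": 5, "communication": 2, "judgment": 3}
print(score_candidate(alice))  # 4.0
print(score_candidate(bob))    # ~3.33
```

The rule's value is not sophistication but uniformity: every candidate is judged on the same attributes in the same way, which is exactly the discipline Kahneman suggests training people to simulate.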

2. Take the Broad View

Don’t view each problem in isolation.

“The single best advice we have in framing is broad framing,” he said. “See the decision as a member of a class of decisions that you’ll probably have to take.”


3. Test for Regret

“Regret is probably the greatest enemy of good decision making in personal finance,” Kahneman said.

So assess how prone clients are to it. The more potential for regret, the more likely they are to churn their accounts, sell at the wrong time, and buy when prices are high. High-net-worth individuals are especially risk averse, he said, so try to gauge just how risk averse.

“Clients who have regrets will often fire their advisers,” he said.

4. Seek Out Good Advice

Part of getting a wide-ranging perspective is to cultivate curiosity and to seek out guidance.

So who is the ideal adviser? “A person who likes you and doesn’t care about your feelings,” Kahneman said.

For him, that person is fellow Nobel laureate Richard H. Thaler.

“He likes me,” Kahneman said. “And couldn’t care less about my feelings.”



All posts are the opinion of the author. As such, they should not be construed as investment advice, nor do the opinions expressed necessarily reflect the views of CFA Institute or the author’s employer.

Image courtesy of IMAGEIN


About the Author(s)
Paul McCaffrey

Paul McCaffrey is the editor of Enterprising Investor at CFA Institute. Previously he served as an editor at the H.W. Wilson Company. His writing has appeared in Financial Planning and On Wall Street, among other publications. He is a graduate of Vassar College and the Craig Newmark Graduate School of Journalism at CUNY.

8 thoughts on “Daniel Kahneman: Four Keys to Better Decision Making”

  1. Joseph Steiner says:

    Can you provide some definition around what Prof. Kahneman means by “noise”?

    1. Paul McCaffrey says:

      Hi Joseph,

      Thanks for your comment and apologies for not clarifying that in the article.

      Kahneman and company define noise as “the chance variability of judgments.” A little vague perhaps, but they dig into the concept in this Harvard Business Review article. https://hbr.org/2016/10/noise

      Best Regards,

      Paul

  2. Chuck t says:

He exposes two key shortcomings that humans possess: our knowledge of any subject is more limited than we realize, and we cannot predict the future because noise will always get in the way.

There is no such thing as a Nobel prize in economic sciences. Beginning in 1901, Nobel Prizes have been awarded in the following five categories: literature, peace, physics, chemistry, and “physiology or medicine.” In 1969, in an effort to improve the image of economists, enthusiasts managed to establish the confusingly named “Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel.”

    The Wall Street Journal and The New York Times carefully refer to this award as the Nobel Memorial Prize in Economic Science. Recipients are not Nobel Laureates.

    Amartya Sen, after receiving the prize in 1998, said in an interview: “I’ve always been skeptical of the prize, but it’s difficult to express that until you get it because people think it’s sour grapes.”

  4. John LaVine says:

    I found this piece humorously informative, referring to the mental patterns that affect our risk taking and assessment. I do believe that my best successes in the market have come from some long shots, albeit with a reason for the gamble taken.

    Despite some losses, often at the suggestion of investment by learned professionals, rather than my own gut, I have to feel optimistic and think that since the market is so often driven by knee-jerk emotion, that we’re all better off by too much optimism than cautious pessimism, unless we’re set for life. I, of course, am, as long as I don’t live to see 2019.

  5. Martin Colwell says:

I would suggest readers read Thinking, Fast and Slow by Messrs Kahneman and Tversky to fully understand how to control bias and ‘gut feeling’ so that it informs, but does not drive, your decisions.

  6. Mark says:

    I am a convert to rule-based decision-making in wealth management and enjoyed this article. But I would have liked to see a better explanation of the “Test for regret” advice: how does one do this? What is actually meant by “regret”?
Finally, I was surprised by Kahneman’s comment: “When somebody tells you that they have a strong hunch about a financial event, the safe thing to do is not to believe them.”
    Assuming that one can boil down any such “hunch” to having a “win” or “lose” or “profit” or “loss” outcome, and assuming the largely unpredictable nature of human beings acting in stock and other markets, shouldn’t the actual “safe” thing to think here be not to disbelieve the hunch but to accept that it has a roughly 50/50 chance of being right or wrong?

    1. Bharadwaj Kulkarni says:

      I think what he meant by “not believe them” is that we should disregard this “hunch” – which is often just an assertion – and revert to the baseline probability of the event, which in this case as you have put it, is 50/50. The fact that the hunch itself then also has a 50/50 probability of being right is incidental.
      Or at least, that’s how I understand it.
