Practical analysis for investment professionals
03 June 2019

Fabozzi: Finance Must Modernize or Face Irrelevancy

Frank J. Fabozzi, CFA, is one of the most prolific, compelling, and insightful voices in modern finance. As an academic, researcher, author, and editor, he has helped shape our understanding of the discipline, and his contributions have earned him the James R. Vertin Award from the CFA Institute Research Foundation and the C. Stewart Sheppard Award from CFA Institute, among other accolades. Indeed, CFA charterholders and those who have studied for the exam will be familiar with his work. He is responsible for creating a sizable portion of the curriculum.

A common theme in his scholarship has been the state of academic finance and financial theory. He has long been an eloquent critic of how finance and economics are taught in colleges and universities and how conventional theory fails to explain actual market behavior. To learn more about his perspective, we spoke with him in person about the shortcomings he sees and their potential fixes.

Below is a lightly edited transcript of our conversation.

CFA Institute: Over the past two decades, you have been highly critical of academic economics and finance. What is wrong with these disciplines?

Frank J. Fabozzi, CFA: My criticism of academic economics is that the models built by economists basically treat market agents as robots. They make decisions according to defined rules, and the constructed models are labeled “rational models.” Since finance is a field within economics, the same criticism applies to the models built by financial economists. The key tools used by economists are calculus and higher-level mathematical analysis.

The “rational models” in finance have been attacked by the behavioral finance camp, which has demonstrated the disconnect between model behavior and real-world investor behavior. The concern with academic economics also comes from practitioners. For example, in 2003, Charlie Munger pointed to the failure to take psychology into account in the development of economic models: “If you want to go through life like a one-legged man in an ass-kicking contest, why, be my guest. But if you want to succeed, like a strong man with two legs, you have to pick up these tricks, including doing economics while knowing psychology.”

The problem with relying on rational models and treating them as the foundation of finance is that new findings that are inconsistent with the bedrock theories are dismissed. This is the major point that Sergio M. Focardi and I made when we argued that economics in its current form does not describe empirical reality but an idealized rational economic world. It is revealing that in financial economics, deviations in empirical prices or returns from theoretical models are referred to as “anomalies.” A true empirical science would revise its models so that they fit empirical data. Financial economics, however, takes the opposite approach and considers deviations from an idealized economic rationality to be anomalies of the true empirical price processes.

In the 1970s and 1980s, an academic couldn’t get published in a peer-reviewed finance journal if their research conflicted with prevailing theory, such as the capital asset pricing model (CAPM). For example, in the late 1970s, a prestigious financial journal sought papers written jointly by academics and practitioners. Thinking that the journal’s editorial board was sincere, I co-authored a paper with then-chairman of Merrill Lynch White Weld, Tom Chrystie. Our thesis was that securities can be structured/customized for investors using the asset side of the balance sheet. Basically, it provided the general blueprint for structured finance. The review we received in response was short and went something like this: the ideas in the paper did not make any sense because they were inconsistent with CAPM!


Earlier, you described the misuse of calculus and higher-level mathematical analysis in economics. Why are these the wrong tools?

The over-reliance on calculus is symptomatic of the subject’s stagnation and a disservice to the students who aspire to work in asset management. Economists should combine sophisticated mathematical tools and empirical techniques while recognizing the limitations of a field where experiments are rarely possible. In “Who Needs a Newtonian Finance?” Marcos López de Prado and I explained why the adoption of calculus by economists was a historical accident and questioned economists’ mechanical vision of the world.

Basically, economists recognized that calculus was extremely successful in physics and engineering, where it acquired its track record. They hoped to repeat that extraordinary success by embracing the same conceptual framework. And the cumulative knowledge in applying calculus to real problems is impressive. Charlie Munger, in his list of academic economics’ weaknesses, referred to this as “physics envy.” He pointed out that “that term has been borrowed from [another type of] envy as described by one of the world’s great idiots, Sigmund Freud.”

Ultimately, calculus has not been effective in describing economic and financial phenomena. Focardi and I offer several explanations as to why economists seem to prefer the safe ground of calculus over the unsafe ground of reality. When a physicist inquired of Kenneth Arrow, a winner of the 1972 Nobel Prize in Economics, as to why economists used such sophisticated mathematics given that they have scarce supporting data, Professor Arrow responded, “It is just because we do not have enough data that we use sophisticated mathematics.” He went on to say, “We have to ensure the logical consistency of our arguments.” That proposition is less defensible today than ever, as all kinds of datasets have become available in recent years.

At present, there is no excuse for not using alternative datasets, which inform us in great detail about the daily activities of hundreds of millions of individuals.

Econometricians apply statistics to all sorts of data. Is their approach evidence based?

This is a false impression. Econometric models are utterly inappropriate to model the sheer complexity of economic systems. Economists cannot blindly adopt statistical techniques that were designed for experimental biology. As López de Prado and I explained, economics does not allow for experiments based on large, independently drawn samples of data from a stationary system. It takes 50 years to produce a new 50-year-long dataset, and by then, the system will have “evolved” much faster than natural systems.

The paradox in economics is that researchers either use non-empirical tools — calculus and sophisticated math — or paleo-statistical tools that were designed before the advent of computers. Compare a popular textbook in econometrics, like William H. Greene’s, with a chemometrics textbook, like Matthias Otto’s. Other fields have embraced machine learning and other computational methods. But these methods are rejected in economic journals as “black boxes.”

Econometrics has missed the train of innovation and has instead become a stagnant subject, to the surprise of many statisticians outside our field. It is as if economists chose to use only econometrics because it’s the one toolkit that lets them confirm their CAPM or factor investing biases.

Theories in hard sciences — for example, Einstein’s theory of relativity — are models that predict and are not contradicted by reality. CAPM and other economic theories fail practitioners all the time. Why then does the Nobel Prize in Economics always have the word “Science” in the title?

What my co-author Sergio Focardi and I argued is that mainstream economics as it is known today is not a science in the sense of the physical sciences, because it does not describe the real-world economy but rather an idealized “economically rational” world.

The failure to popularize econophysics, a discipline championed by the physicist H. Eugene Stanley in the mid-1990s, is rather telling. Instead of embracing an interdisciplinary approach that adheres strictly to the principles of empirical science in its research, economists dismissed it as “non-mainstream.”

In the idealized pseudo-rational world of current economic theory, there is no real place for major crises. Financial economics, in particular, is based on the assumption that economic quantities might deviate from their theoretical value, but that market forces will quickly realign them with theoretical values. This assumption has proved to be inadequate. This failure prevented economics from helping asset management to establish itself as something different from a casino. An empirically validated, scientific view of economics is what’s required.

No wonder rebuilding investor confidence — as we demonstrated in Investment Management after the Global Financial Crisis from the CFA Institute Research Foundation — remains one of the profession’s greatest challenges.

Would you say then that economics is a science in the making?

Essentially, we need to rebuild economics as an empirical science. Some results have been obtained. Network theory has made significant progress in representing interactions among economic agents. Chaotic models and their relationships to statistics are now better understood. Machine learning methods have been able to deliver portfolios that outperform Markowitz’s solution out-of-sample. A new kind of statistics may be needed to work with the level of uncertainty that characterizes economics and finance. We make a distinction between robust statistics for the bulk of the data and extreme value theory to model the tails. We have learned how to make rough predictions of possibly very large outcomes never experienced in the past. But we do not have tools to deal with very high levels of uncertainty.
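As a toy illustration of the Markowitz point (all parameters here are invented for the sketch and are not drawn from the studies mentioned), the following compares a minimum-variance portfolio optimized on a noisy sample covariance against a naive 1/N portfolio, with both evaluated under the true covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

def min_variance_weights(cov):
    """Closed-form minimum-variance weights subject to sum(w) = 1: w proportional to inv(Cov) @ 1."""
    inv = np.linalg.inv(cov)
    w = inv @ np.ones(cov.shape[0])
    return w / w.sum()

# Hypothetical "true" market: 10 assets, equal variance, equal pairwise correlation.
n, var, corr = 10, 0.04, 0.3
true_cov = np.full((n, n), corr * var)
np.fill_diagonal(true_cov, var)

# A short in-sample window yields a noisy covariance estimate.
returns = rng.multivariate_normal(np.zeros(n), true_cov, size=60)
est_cov = np.cov(returns, rowvar=False)

w_opt = min_variance_weights(est_cov)  # Markowitz weights fit to the noisy estimate
w_naive = np.ones(n) / n               # 1/N benchmark

# Out-of-sample risk: evaluate both weight vectors under the TRUE covariance.
var_opt = w_opt @ true_cov @ w_opt
var_naive = w_naive @ true_cov @ w_naive
print(f"optimized: {var_opt:.5f}  naive 1/N: {var_naive:.5f}")
```

In this symmetric setup, 1/N happens to be the true minimum-variance portfolio, so the optimizer’s estimation error shows up directly as extra out-of-sample variance. Hierarchical and machine-learning-based allocation methods are, in large part, attempts to tame exactly this estimation error.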

Recently, the emphasis has been on getting more data science into finance program curricula, what López de Prado, Joseph Simonian, and I refer to as “Financial Data Science.” We highlight some of the advantages of this field to practical investment management. This year, Marcos, Joe, and I co-founded the Journal of Financial Data Science, published by Pageant Media. The inaugural issue came out in January.

Machine learning, a branch of data science, comprises a family of computational techniques that facilitate the automated learning of patterns and the formation of predictions from data. Although there is no universal definition of data science, it combines statistics and computing to discover or impose order in complex data to enhance informed decision making. It is thus an inherently practical endeavor, just like finance, and so is especially suited to investment applications that should be in the curriculum of all finance programs.
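The definition above reduces to a minimal sketch (the data-generating process here is invented purely for illustration): “learning” is estimating a pattern from a training sample, and “prediction” is applying that pattern to unseen data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented data-generating process: an outcome driven by three observable signals.
X_train = rng.normal(size=(200, 3))
true_coef = np.array([1.5, -2.0, 0.5])
y_train = X_train @ true_coef + rng.normal(scale=0.1, size=200)

# "Learning": recover the pattern from the data alone, here via least squares.
coef_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# "Prediction": apply the learned pattern to observations never seen in training.
X_new = rng.normal(size=(50, 3))
y_pred = X_new @ coef_hat
```

Modern machine learning replaces the linear fit with far more flexible function families, but the learn-then-predict structure is the same.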

As Marcos and I noted in our Newtonian finance editorial, there are some useful subjects in addition to data science that are rarely taught in economics and finance programs, including combinatorics, graph theory/networks, kernel theory, information theory, experimental math, algorithms, complexity theory, and data structures. We believe computer scientists may be better trained to deal with problems in finance than finance students. That is one reason why banks and hedge funds are hiring data scientists and physicists for positions previously reserved for finance graduates.

How should academia change how it teaches finance?

This is an open question that university economics and finance departments need to have a conversation about. Typically, university curricula in economics and finance today are divided: There are programs with mathematics and programs without mathematics. Those with mathematics teach sophisticated calculus and stochastic calculus. Those without mathematics still feel it’s necessary and try to teach diluted and simplified versions of calculus and stochastic calculus, mostly in the form of econometrics. This situation is unsatisfactory. Students of highly mathematical curricula end up feeling like they are in an ivory tower and do not develop the hard data discipline of the empirical sciences. In contrast, students of non-mathematical curricula come to believe that logic and mathematics are optional and do not apply to real life.

In practice, both positions are unreasonable. In investment management, highly sophisticated calculus is used primarily in the financial derivatives business. Today, students who want to be “quants” need to know calculus and stochastic calculus. But they should keep in mind that the evolution of modern economies and theories of financial markets will likely require new, possibly different, mathematical concepts. They should keep a very open mind to new ideas.

But the opposite position, that mathematics is a useless option, is also very dangerous. Investment management requires rigorous logical thinking and the processing of huge amounts of unstructured data. The challenge to universities and business schools is, as Gilbert Strang, a world-renowned MIT mathematics professor says, to “present the mathematics that is most useful to the most students.” These teachings will help students reason rigorously without the constraining straitjacket of calculus.

For more from Frank J. Fabozzi, CFA, don’t miss Equity Valuation: Science, Art, or Craft? co-authored with Sergio M. Focardi and Caroline Jonas, the latest of many contributions to the CFA Institute Research Foundation.

If you liked this post, don’t forget to subscribe to the Enterprising Investor.

All posts are the opinion of the author. As such, they should not be construed as investment advice, nor do the opinions expressed necessarily reflect the views of CFA Institute or the author’s employer.

Image credit: ©Getty Images/hannahgleg

Continuing Education for CFA Institute Members

Select articles are eligible for continuing education (CE) credit. Record credits easily using the CFA Institute Members App, available on iOS and Android.

About the Author(s)
Paul Kovarsky, CFA

Paul Kovarsky, CFA, is a director, Institutional Partnerships, at CFA Institute.

2 thoughts on “Fabozzi: Finance Must Modernize or Face Irrelevancy”

  1. Jonathan G. Harris says:

    Machine learning merely extends the realm of functions considered from the typical linear or simple non-linear ones to more flexible ones built in a systematic fashion. For massive data sets this automates the identification of complex relationships

    Perhaps the author can explain how this would address the problem he cites:

    ” It takes 50 years to produce a new 50-year-long dataset, and by then the system will have “evolved” much faster than natural systems. “

  2. Asmarelda Chivore says:

    I support these insights. Increasingly we are faced with grey areas that cannot be absolutely captured by rigid quantitative tools.
