MIT’s Constantinos Daskalakis was awarded the Rolf Nevanlinna Prize by the International Mathematical Union last month. The prize, awarded only once every four years, is the highest recognition for computer scientists under 40. The award was for fundamental results that he and his collaborators proved in economic game theory. Interestingly, to most researchers in economics and finance, Daskalakis’ name hardly rang a bell when the award was announced! So what exactly did Daskalakis do, and why do computer scientists think it will fundamentally change economics and finance?

**The Origins of Modern Economic Theory**

Finance and economics have been practiced for hundreds of years, and the history of these fields is replete with seminal books of persuasive prose making heuristic arguments. Kautilya’s *Arthashastra*, Adam Smith’s *Wealth of Nations* and John Maynard Keynes’ *General Theory* were all epoch-making books in one way or another, yet to any well-trained modern economist they would read like Aesop’s fables—astute observations, but with foundations that are vague at best. The origins of the rigor that makes economics a modern science are usually traced to Léon Walras, a French economist of the late 1800s, but the most important contributions came in the mid-20th century from a group of academics working at Princeton. Led by the polymath John von Neumann, this school included luminaries like Nash, Shapley, Bellman, Blackwell, Kuhn and Tucker, as well as others not physically present at Princeton but inspired by similar ideas, like Arrow, Debreu and McKenzie. In a span of three to four decades, this group revolutionized not only economics but many related disciplines too, such as operations research and industrial organization.

A key approach advocated by adherents of this school was rigorously proving the existence of equilibria. Until then, economists had argued about what might or might not happen in various economic situations without ever checking whether the situation they were arguing about could actually come to be. So, for instance, economists would say that an “invisible hand” would balance out the forces in the market—without any clear notion of why this must be so. Often the predicted situation never came about, and economists were at a loss. Existence results showed clearly whether economic forces balanced out and whether there was a consequent equilibrium. Only once an equilibrium was shown to exist for the situation under study was economic analysis meaningful and worthwhile.

Among the earliest equilibrium existence results was von Neumann’s minimax theorem for zero-sum games. This was followed by existence proofs in many other domains—most famously Nash’s results for non-cooperative game theory and Arrow and Debreu’s general equilibrium results. Most top PhD programs in economics or finance nowadays begin their coverage of the field at precisely this point. The main tool for showing equilibrium existence was (and continues to be) a family of results called fixed-point theorems, first discovered in the early 1900s in an area of mathematics called topology.
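For the technically inclined, von Neumann’s theorem can be seen in action on a toy game. The sketch below is our own illustration, not anything from the prize citation: it applies the textbook closed-form solution for 2×2 zero-sum games to matching pennies.

```python
# Mixed-strategy equilibrium of a 2x2 zero-sum game (matching pennies),
# illustrating von Neumann's minimax theorem on a toy example.
# Entries are the row player's payoffs; the column player gets the negation.

A = [[1.0, -1.0],
     [-1.0, 1.0]]  # matching pennies: row wins when the coins match

a, b = A[0]
c, d = A[1]

# Closed-form optimal mixtures for a 2x2 zero-sum game with no saddle point:
denom = a - b - c + d
p = (d - c) / denom              # probability row plays its first strategy
q = (d - b) / denom              # probability column plays its first strategy
value = (a * d - b * c) / denom  # the game's minimax value

print(p, q, value)  # 0.5 0.5 0.0
```

Both players randomize 50/50 and the value of the game is zero: neither player can gain by deviating, which is exactly the equilibrium property von Neumann proved must always exist in zero-sum games.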

Now, most fixed-point theorems are, by nature, non-constructive. To see what this means, let’s consider a simple example. Suppose I wrote the numbers 1 to 100 on pieces of paper, folded them, and asked you to choose any 10 pieces. Then, without allowing you to unfold them and see the numbers, suppose I asked you two questions: 1. *Is there* a maximum value among the 10 pieces you selected? 2. *What is* the maximum value among the 10 pieces you selected? You can answer the first question without opening the folds, because you know, of course, that any set of 10 numbers has a maximum; you don’t need to know the actual numbers to be sure of that. This is analogous to non-constructive techniques in math—one can make claims without actually constructing anything explicitly. To answer the second question, however, you need to open the folds and read the numbers. That is analogous to constructive techniques in math. In other words, constructive techniques not only make claims, but also give you an explicit way to make a construction that verifies the claim.
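For the curious, the folded-paper thought experiment translates almost literally into a few lines of code; the setup below is purely illustrative.

```python
import random

# Ten hidden numbers drawn from 1..100, mimicking the folded papers.
hidden = random.sample(range(1, 101), 10)

# Question 1 (non-constructive): does a maximum exist?
# We can answer without looking at a single value: any finite,
# non-empty set of numbers has a maximum.
exists = len(hidden) > 0
print(exists)  # True

# Question 2 (constructive): what IS the maximum?
# Now we must "unfold the papers", i.e. inspect every value.
maximum = max(hidden)
print(maximum)
```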

Thus, since fixed-point theorems were non-constructive, the existence results that used them also ended up being non-constructive. In effect, economists had many situations where they knew that there must be an equilibrium, but weren’t sure how the economic forces actually got there. Over the years, constructive techniques were discovered for many kinds of equilibria. Some, however, remained stubbornly out of reach. Chief among them was the Nash equilibrium, the bedrock of modern economic theory, immortalized in the bar scene in the movie “A Beautiful Mind” where Nash’s friends want to ask girls to dance.
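The contrast is easiest to see with a fixed point one *can* construct. Banach’s fixed-point theorem, unlike the theorems economists typically need, comes with a recipe: just iterate the map. The sketch below is our own illustration, with an arbitrarily chosen function.

```python
import math

# Banach's fixed-point theorem is constructive: for a contraction f,
# iterating x -> f(x) converges to the unique fixed point, and the
# iteration itself is the construction. Brouwer's theorem, by contrast,
# asserts that a fixed point exists for any continuous self-map of a
# compact convex set, but gives no such recipe.

def find_fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    """Iterate f from x0 until successive values agree to within tol."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

# f(x) = cos(x) is a contraction near its fixed point on [0, 1].
x_star = find_fixed_point(math.cos, 0.5)
print(x_star)  # ~0.739085, the solution of cos(x) = x
```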

**Computational Hardness**

John von Neumann, the brilliant polymath we met in the last section, was a pivotal figure not only in economics, but also in computer science, quantum physics and mathematical logic, among other fields. Princeton University had been a leader in all these areas since the early 1900s, and many of the best minds in the world congregated in Fine Hall—the home of the math department—every evening for tea. A distinct presence in that group was a PhD student named Alan Turing. Regarded by many as the father of modern computer science and immortalized in the movie “The Imitation Game”, Turing in those years was an awkward graduate student working under Alonzo Church, one of the giants of mathematical logic. Building on Kurt Gödel’s earlier research, Turing, in his ground-breaking work, showed that there were problems that were inherently unsolvable. In other words, mechanical devices could only be expected to solve a limited subset of the problems that humans could formulate; outside this subset, problems were undecidable and hopeless.

Over the years, computer scientists extended Turing’s results in many directions, creating a vast sub-field of computer science called computational complexity. In fact, the previous Rolf Nevanlinna Prize was awarded to Subhash Khot, an Indian computer scientist at New York University, for his fundamental work in this very area. Research in computational complexity classifies problems according to their inherent difficulty: certain problems are easy to solve, while others might take longer than the age of the universe! Work in this area has led to detailed dictionaries that tell us how to identify the complexity of a problem from tell-tale signs. Despite tremendous advances, however, many questions in this area still remain open; for instance, the famous P versus NP question.

**Daskalakis’ Contribution**

Christos Papadimitriou, Daskalakis’ PhD adviser at Berkeley, had already made many fundamental contributions to theoretical computer science when Daskalakis joined the graduate program in the early 2000s. Kenneth Arrow, the famous economist at Stanford, was a good friend of Papadimitriou’s, and it was Arrow who introduced Papadimitriou to the peculiarities of economic equilibrium construction in the 1980s. Over the years, Papadimitriou had created a number of beautiful tools to address open problems about the complexity of equilibrium constructions. The computational complexity of finding a Nash equilibrium, however, had continued to evade him, and when Daskalakis sought a problem to work on, it was this question that Papadimitriou posed to him.

The precise insights that led to Daskalakis’ result are hard to explain without advanced mathematics: in technical terms, he showed that finding a Nash equilibrium is PPAD-complete. In simple terms, this means that Nash equilibria are, in general, very hard to compute.
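A toy simulation hints at where the difficulty lies. Nothing below comes from Daskalakis’ proof; it merely shows that the most natural search procedure, repeatedly playing a best response to the opponent, never settles down even in matching pennies, whose only equilibrium is mixed.

```python
# Naive best-response dynamics in matching pennies: each player, in turn,
# switches to the best pure reply to the opponent's current choice.
# Play cycles forever and never reaches the (mixed) Nash equilibrium,
# hinting at why *finding* equilibria is a genuine computational problem.

A = [[1, -1],
     [-1, 1]]  # row player's payoffs; the column player gets the negation

row, col = 0, 0
history = []
for _ in range(8):
    history.append((row, col))
    row = max((0, 1), key=lambda r: A[r][col])   # row's best reply
    col = max((0, 1), key=lambda c: -A[row][c])  # column's best reply

print(history)
```

Play cycles endlessly among pure strategy profiles; the only equilibrium has both players randomizing 50/50, and these naive dynamics never find it. Daskalakis’ result says something far stronger: in general games, no efficient procedure is likely to exist at all.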

**Why Computational Complexity Matters**

Large parts of economics and finance depend on the computation of Nash equilibria and general equilibria in realistic time-frames. For example, the area of asset pricing in finance starts by assuming a general equilibrium framework and builds on it. Following Daskalakis’ work, other researchers showed that computing many types of general equilibria is also extremely hard. If it takes a computer longer than the age of planet Earth to compute an equilibrium, human traders obviously cannot compute it in real time in the din of financial markets! What, then, is the steady state that ensues during trading? How must traders price financial products if it is not an equilibrium that they find themselves in? How should regulators create policy if they don’t know whether markets can ever be nudged to an equilibrium? It is these sorts of questions that Daskalakis’ beautiful result has suddenly thrown open. One thing is certain: the coming few years promise to be an exciting time for researchers in finance, economics and computational theory as they grapple with the many implications of Daskalakis’ work.
