by Roger Koppl
Ali Wyne of the Big Think blog “Power Games” recently posted an interesting set of comments on the theme “Empirics and Psychology: Eight of the World’s Top Young Economists Discuss Where Their Field Is Going.” George Mason’s own Peter Leeson was among the eight “top young economists” sharing their views.
Over at New APPS, the philosopher Eric Schliesser summarizes the eight comments. “Bottom line: due to low cost computing and a data rich environment the future of economics is data-mining (this was clear from at least four of the comments). This is especially so because the young stars have lost faith in homo economicus (due to behavioral work and the crisis).”
Eric’s summary seems about right to me. There were eight fine minds sharing eight different visions, but two related themes dominated the comments. 1) The old rationality assumption is in trouble and we don’t quite know what to do about it. 2) Economics should be more data-driven now that we have what William Brock has labeled “dirt-cheap computing.”
The “top young economists” seem to realize that our handling of these two issues will influence how we evaluate the consequences of market vs. control. Glen Weyl, for example, considers Hayek’s 1945 essay “The Use of Knowledge in Society” in light of modern information technology and concludes, “it is increasingly hard to see how dispersed information poses the challenge it once did to centralized planning.” Peter Leeson says that it matters whether economists interpret “economic crises as the product of markets responding rationally to poor policy” or as “the product of endemic irrational decision-making.” He says, “the way in which the status of the rationality postulate is resolved will not merely shape what economists are doing. It will shape the kind of society we inhabit.”
I think theoretical computer science, in particular computability theory (recursion theory) and the theory of computational complexity, can help us sort these issues out. The complexity revolution got going when professors got desktop computers. It is a bit of a truism, I suppose, that science is moving to ever more computational methods. Presumably, science for the foreseeable future will make ever-greater use of information theory, complexity theory, and computational methods. In such an environment, theoretical computer science should be an ever more important tool of any science, including economics.
We should think more about some of the results of theoretical computer science. For example, Rice’s theorem shows that it is impossible to write a program that can always decide whether an arbitrarily given computer program is a virus. Nor can we always decide ahead of time whether a program has a specific bug. We may have to run the program to find out. Chaitin et al. say, “Rice’s theorem makes computer science into a near empirical, trial and error endeavor” (Chaitin, da Costa, and Doria, 2012. Gödel’s Way. CRC Press). If we think of “regulation” as programming the economy, Rice’s theorem says that we cannot in general know what sort of effects the regulation will have. (In an earlier ThinkMarkets post I discuss the difficulty of distinguishing “regulation” from laissez faire.)
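To see why, here is a minimal Python sketch of the standard reduction behind Rice’s theorem. Everything in it is hypothetical: `is_virus` is the perfect detector the theorem rules out, and `payload` stands for whatever behavior defines the property we want to detect.

```python
# Hedged sketch of the reduction behind Rice's theorem.
# `is_virus` and `payload` are hypothetical names, not a real API.

def is_virus(source: str) -> bool:
    """A supposed total, always-correct decider for the semantic
    property 'this program behaves as a virus.'"""
    raise NotImplementedError("Rice's theorem: no such decider exists")

def would_halt(source: str) -> bool:
    """If is_virus existed, we could decide the halting problem:
    build a program that runs `source` first and misbehaves only
    afterwards. It 'is a virus' exactly when `source` halts."""
    trojan = (
        f"exec({source!r})\n"  # step 1: run the target program
        "payload()\n"          # step 2: virus behavior, reached only if step 1 halts
    )
    return is_virus(trojan)

# The halting problem is undecidable, so is_virus cannot exist.
```

The same trick works for “has this bug,” “violates this regulation,” or any other non-trivial claim about what a program does rather than what it says, which is the sense in which we may have to run the program to find out.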
Computability theory may help to show why Weyl’s comment is too optimistic about the potential of information technology. K. Vela Velupillai (2007. “The impossibility of an effective theory of policy in a complex economy.” In: Salzano, M., Colander, D. (Eds.), Complexity Hints for Economic Policy. Springer, Berlin) shows that if the economy is complex enough to achieve computational universality, then an effective theory of policy is impossible. Basically, this means that you cannot really predict the consequences of policy if the economy is sufficiently complex. And that means that quicker and fancier computers won’t save the day for planning. (A toy illustration of how little it takes to reach computational universality follows the references below.) There are several distinct results along these lines, including:
Tsuji, M., da Costa, N.C.A., Doria, F.A., 1998. The incompleteness of theories of games. Journal of Philosophical Logic 27, 553–564.
Markose, S.M., 2005. Computability and evolutionary complexity: Markets as complex adaptive systems. Economic Journal 115, F159–F192.
I would personally include
Wolpert, D.H., 2001. Computational capabilities of physical systems. Physical Review E 65, 016128.
although it takes some arguing to draw the link.
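To make “computational universality” concrete, here is a minimal Python sketch of Rule 110, an elementary cellular automaton known to be Turing-complete (Cook 2004). It is only an analogy, not Velupillai’s construction: the point is that a system whose parts follow an update rule this simple can already encode arbitrary computation, and so inherits the unpredictability of arbitrary programs.

```python
# Rule 110: each cell updates from its (left, self, right) neighborhood.
# Despite this trivial local rule, the system is Turing-complete, so
# questions about its long-run behavior are in general undecidable.

RULE = 110  # the 8-bit lookup table, one output bit per neighborhood

def step(cells):
    """One synchronous update of a ring of 0/1 cells."""
    n = len(cells)
    return [
        (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

state = [0] * 40 + [1] + [0] * 39  # a single seed cell on a ring of 80
for _ in range(20):
    print("".join(".#"[c] for c in state))
    state = step(state)
```

If an economy’s adjustment processes are at least this expressive, Velupillai’s point follows: no effective procedure can predict, for every policy “input,” what the system will do.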
Computability and the theory of computational complexity are important for the theory of rationality as well. Computability theory gives (relatively) absolute limits on rationality, and the surprise is how severe those limits are. (See the Chaitin et al. book mentioned earlier.) The theory of computational complexity provides insight into further restrictions on rationality. I suppose you could very loosely say that computability theory and the theory of computational complexity give us a relatively “rigorous” theory of “bounded rationality.” Indeed, Rob Axtell, a student of Herbert Simon, has applied the theory of computational complexity to economics. (Axtell, R., 2005. The complexity of exchange. Economic Journal 115, F193–F210.)
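A back-of-the-envelope calculation, not Axtell’s model, shows why complexity bounds bite even in “small” economies: an agent who tried to be unboundedly rational by exhaustively comparing all ways of allocating goods faces a search space that grows exponentially.

```python
# Counting argument only (not Axtell's 2005 model): the number of ways to
# assign G indivisible goods to N agents is N**G, so exhaustive "perfect"
# rationality is infeasible long before economies get large.

def allocations(n_agents: int, n_goods: int) -> int:
    """Number of distinct assignments of n_goods items to n_agents."""
    return n_agents ** n_goods

for goods in (10, 50, 100):
    print(f"10 agents, {goods:>3} goods: {allocations(10, goods):.2e} allocations")
```

At 10 agents and 100 goods the count already exceeds the number of atoms in the observable universe, which is one loose sense in which complexity theory makes “bounded rationality” rigorous.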
Theoretical computer science can help us sort out what “dirt-cheap computing” can and cannot do, what the logical limits of rationality are, what policy makers can and cannot figure out, and what economic agents can and cannot know about one another. It is high time we gave it a closer look.