by Roger Koppl
Ali Wyne of the Big Think blog “Power Games” recently posted an interesting set of comments on the theme “Empirics and Psychology: Eight of the World’s Top Young Economists Discuss Where Their Field Is Going.” George Mason’s own Peter Leeson was among the eight “top young economists” sharing their views.
Over at New APPS, the philosopher Eric Schliesser summarizes the eight comments. “Bottom line: due to low cost computing and a data rich environment the future of economics is data-mining (this was clear from at least four of the comments). This is especially so because the young stars have lost faith in homo economicus (due to behavioral work and the crisis).”
Eric’s summary seems about right to me. There were eight fine minds sharing eight different visions, but two related themes dominated the comments. 1) The old rationality assumption is in trouble and we don’t quite know what to do about it. 2) Economics should be more data-driven now that we have what William Brock has labeled “dirt-cheap computing.”
The “top young economists” seem to realize that our handling of these two issues will influence how we evaluate the consequences of market vs. control. Glen Weyl, for example, considers Hayek’s 1945 essay “The Use of Knowledge in Society” in light of modern information technology and concludes, “it is increasingly hard to see how dispersed information poses the challenge it once did to centralized planning.” Peter Leeson says that it matters whether economists interpret “economic crises as the product of markets responding rationally to poor policy” or as “the product of endemic irrational decision-making.” He says, “the way in which the status of the rationality postulate is resolved will not merely shape what economists are doing. It will shape the kind of society we inhabit.”
I think theoretical computer science, in particular computability theory (recursion theory) and the theory of computational complexity, can help us sort these issues out. The complexity revolution got going when professors got desktop computers. It is a bit of a truism, I suppose, that science is moving to ever more computational methods. Presumably, science for the foreseeable future will make ever-greater use of information theory, complexity theory, and computational methods. In such an environment, theoretical computer science should be an ever more important tool of any science, including economics.
We should think more about some of the results of theoretical computer science. For example, Rice’s theorem shows that it is impossible to write a program that can always decide whether an arbitrarily given computer program is a virus. Nor can we always decide ahead of time whether a program has a specific bug. We may have to run the program to find out. Chaitin et al. say, “Rice’s theorem makes computer science into a near empirical, trial and error endeavor” (Chaitin, da Costa, and Doria. 2012. Gödel’s Way. CRC Press). If we think of “regulation” as programming the economy, Rice’s theorem says that we cannot in general know what sort of effects the regulation will have. (In an earlier ThinkMarkets post I discuss the difficulty of distinguishing “regulation” from laissez faire.)
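To see the flavor of the argument, here is a minimal sketch of my own (an illustration, not from Chaitin et al.) of the diagonal trick behind Rice’s theorem and the halting problem: assume a perfect program-analyzer exists, then construct a program that does the opposite of whatever the analyzer predicts about it. The names halts() and paradox() are hypothetical.

```python
# Illustrative sketch only: halts() and paradox() are hypothetical names.

def halts(program, argument) -> bool:
    """Hypothetical oracle: True iff program(argument) eventually halts.
    The construction below shows no such total, always-correct function exists."""
    raise NotImplementedError

def paradox(program):
    # Do the opposite of whatever the oracle predicts about running
    # `program` on itself.
    if halts(program, program):
        while True:
            pass  # loop forever
    return "halted"

# paradox(paradox) halts exactly when halts() says it does not -- a
# contradiction. Rice's theorem generalizes this to every nontrivial
# behavioral property: virus-checking, bug-finding, or predicting what a
# "regulation" will do to a program-like economy.
```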
Computability theory may help to show why Weyl’s comment is too optimistic about the potential of information technology. K. Vela Velupillai (2007. “The impossibility of an effective theory of policy in a complex economy.” In: Salzano, M., Colander, D. (Eds.), Complexity Hints for Economic Policy. Springer, Berlin) shows that if the economy is complex enough to achieve computational universality, then an effective theory of policy is impossible. Basically, this means that you cannot really predict the consequences of policy if the economy is sufficiently complex. And that means that quicker and fancier computers won’t save the day for planning. There are several distinct results along these lines, including
Tsuji, M., da Costa, N.C.A., Doria, F.A., 1998. The incompleteness of theories of games. Journal of Philosophical Logic 27, 553–564.
and
Markose, S. M., 2005. Computability and evolutionary complexity: Markets as complex adaptive systems. Economic Journal 115, F159-F192.
I would personally include
Wolpert, David H. 2001. “Computational Capabilities of Physical Systems.” Physical Review E, 65: 016128 1-27.
although it takes some arguing to draw the link.
Computability and the theory of computational complexity are important for the theory of rationality as well. Computability theory gives (relatively) absolute limits on rationality, and the surprise is how severe those limits are. (See the Chaitin et al. book mentioned earlier.) The theory of computational complexity provides insight into further restrictions on rationality. I suppose you could very loosely say that computability theory and the theory of computational complexity give us a relatively “rigorous” theory of “bounded rationality.” Indeed, Rob Axtell, a student of Herbert Simon, has applied the theory of computational complexity to economics. (Axtell, R., 2005. The complexity of exchange. Economic Journal 115, F193–F210.)
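To put a concrete face on the complexity-theoretic side of bounded rationality, here is a toy illustration of my own (not from Axtell): an agent who insists on globally optimal choice over n yes/no options must, in the worst case, examine 2^n bundles, which becomes hopeless long before n reaches the size of any real decision problem.

```python
# Illustrative sketch only: brute-force "perfect rationality" over bundles
# of indivisible goods is exponential in the number of goods (0/1 knapsack).
from itertools import product

def best_bundle(values, prices, budget):
    """Exhaustively check every 0/1 bundle -- 2**len(values) candidates."""
    best, best_value = None, float("-inf")
    for bundle in product((0, 1), repeat=len(values)):
        cost = sum(q * p for q, p in zip(bundle, prices))
        value = sum(q * v for q, v in zip(bundle, values))
        if cost <= budget and value > best_value:
            best, best_value = bundle, value
    return best, best_value

# Fine for 20 goods (about a million bundles), utterly infeasible for 200:
# the complexity-theoretic sense in which real agents must fall back on
# heuristics rather than global optimization.
print(best_bundle(values=[3, 5, 2, 7], prices=[2, 4, 1, 5], budget=7))
```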
Theoretical computer science can help us sort out what “dirt-cheap computing” can and cannot do, what the logical limits of rationality are, what policy makers can and cannot figure out, and what economic agents can and cannot know about one another. It is high time we give it a closer look.
This is all very embarrassing for economic science … more evidence of a ‘science’ in a death spiral due to its phobic hostility to thinking seriously about the nature of the problem-raising patterns it has to explain and the sort of contingent explanatory causal mechanism appropriate to making sense of these problem-raising phenomena.
As a side note, many of the elements of “the complexity revolution” were known more than 100 years ago — the story of _why_ these elements and this knowledge were suppressed or forgotten is not unrelated to the incentives and perspectives that gave us an economic science which suppressed many of the core insights & the causal explanatory core of Menger, Smith, Mises, Knight, Hayek, and others. In both cases the loss of knowledge was the product jointly of a false image of ‘science’ derived from the philosophical tradition & an incentive structure which rewards work on simple phenomena and reproduces a problem set for the textbooks which includes only simple, non-complex problems.
Follow up on complexity.
The issue of sensitivity to initial conditions was known at least since the time of work on the three-body problem — a problem which defeats Laplacean determinism and the simple model of “prediction,” and of ‘science’ as prediction or testing according to simple linear ‘laws’.
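For readers who want to see the point rather than take it on authority, a toy example of my own (the logistic map, not the three-body problem itself) shows the same sensitive dependence: two starting values differing by one part in a billion diverge completely within a few dozen steps.

```python
# Illustrative sketch only: sensitive dependence on initial conditions in
# the chaotic logistic map x -> 4x(1 - x).

def logistic(x: float) -> float:
    return 4.0 * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9   # two almost identical initial conditions
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 15 == 0:
        print(f"step {step:2d}: |difference| = {abs(a - b):.3e}")

# The gap grows roughly exponentially, so tiny measurement errors swamp any
# long-run point prediction even though the governing rule is simple and
# fully deterministic -- the sense in which Laplacean prediction fails.
```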
But into the 1960s ‘explanation’ itself was defined as the prediction of particular events according to law.
This tradition was part and parcel of the dogma dictating what ‘science’ and ‘knowledge’ was in the field of economics — helping produce the dogmas of “how to produce science in economics” of Friedman, Samuelson, and the whole profession.
I am with Greg that at least some of this is re-discovering old truths. The comment that computers obviate Hayek’s knowledge problem has been around my entire professional career. I was dealing with that argument in the 1970s.
Hayek’s decentralized knowledge exists only if dispersed individuals have an incentive to create it. The incentive is the ability to capture profits from utilizing it, i.e., there is a system of private property rights. The argument has absolutely nothing to do with the cost of computing. This is Lange warmed over.
Cross-post from Big ideas thread:
George Stigler argued that if the problems of economic life changed frequently and radically, and lacked a large measure of continuity in their essential nature, there could not be a science of economics.
He said that an essential element of a science is the cumulative growth of knowledge, and that cumulative character could not arise if each generation of economists faced fundamentally new problems calling for entirely new methods of analysis. A science requires for its very existence a set of fundamental and durable problems.
Stigler proposed that a viable and healthy science requires both the persistent and almost timeless theories that naturally ignore the changing conditions of their society, and the unsettled theories that encounter much difficulty in attempting to explain current events.
He concluded that without the base of persistent theory, there would be no body of slowly evolving knowledge to constitute the science. Without the challenges of unsolved, important problems, the science would become sterile.
Empirical data is history encapsulated in numbers. Economics is a priori knowledge, but combined with empirical knowledge, such as the diversity of resources and institutions. (See Rothbard, Man, Economy, and State.)
Economic theory can help us understand history and data, but empirical data is useless in building economic theory.
Proof of the fallacy of econometrics:
1. Econometrics is mathematical history.
2. Mathematics is a priori knowledge.
3. Therefore, econometrics is a priori history.
4. A priori history is a fallacy.
5. Therefore, econometrics is a fallacy.
Class dismissed.
Greg & Jerry:
Yes to rediscovery. There is a sense in which you could view Adam Smith as a complexity theorist, as others have noted. There is a great line in Smith’s discussion of the “woollen coat” of the “common artificer.” (Click here: http://www.econlib.org/library/Smith/smWN1.html#I.1.11) Smith says:
——-
Observe the accommodation of the most common artificer or day-labourer in a civilized and thriving country, and you will perceive that the number of people of whose industry a part, though but a small part, has been employed in procuring him this accommodation, exceeds all computation. The woollen coat, for example, which covers the day-labourer, as coarse and rough as it may appear, is the produce of the joint labour of a great multitude of workmen.
———–
He says the division of labor “exceeds all computation.” What a wonderful coincidence of language! In modern terms, there is a sense in which the division of labor is mathematically uncomputable. Beyond this charming coincidence of language, Smith is clear about many points that we can now back up with complexity theory. His famous discussion of the “man of system” is an example in my opinion. (Click here: http://www.econlib.org/library/Smith/smMS6.html#VI.II.42)
Eric Schliesser’s post comments on the Lange connection. He says, “But without apparently realizing it, Weyl rediscovers Oskar Lange’s famous response to Hayek.” I think we should take this argument seriously. In November 2010, David Warsh called Weyl “the hottest prospect on the job market this year.” (Click here: http://www.economicprincipals.com/issues/2010.11.22/1202.html) Warsh says Weyl’s great uncle was none other than Hermann Weyl. If someone like that is saying that modern information technology makes central planning viable, we’d better take it seriously. I can’t help thinking that theoretical computer science may be a way to show the argument to be mistaken, and that this way of doing so may be more persuasive to the elites of the profession than many other equally valid arguments. You know: you gotta speak a language your audience can understand.
A few random comments.
1. I met Glen Weyl a couple of times. He is probably more informed about the history of economic thought than 90% of economists his age. So if he doesn’t know the calculation debate in a satisfactory way, that is very poor news indeed.
2. The failure to know the history of economics is particularly evident in the current discussion about rationality and psychology.
3. People are always, it seems, looking for some magical new technique that will solve intellectual problems. The first step, I believe, has not yet been made. This is “simply” clear thinking about basics. Unfortunately, many dismiss this as “philosophy” or “methodology” — two of the most insulting words top economists can use against each other.
4. In view of much of the above I see too many top economists (by no means all) as idiot savants.
David Colander in a series of studies has documented the fact that econ grad students effectively are idiot savants — and the 1992 report on econ grad school education by the AEA from a committee of top economists called them exactly that.
Studies show that the graduates of top PhD programs in economics know almost nothing about:
1. how businesses work
2. most of the basic, everyday facts about the economy you can read in a newspaper
3. anything about the economics produced by economists prior to, say, 1970
4. anything about the philosophy or history of economic science
5. much of anything about history, political philosophy, law, the history of psychology, neuroscience, population biology, sociology, or the history of ideas, etc. — all of the conjoined domains relevant to understanding social phenomena and the full context of the human sciences.
Greg,
Axel Leijonhufvud taught a very theory-rich graduate macro course. He was so concerned about the issues you raise that he insisted we all subscribe to St. Louis Fed publications. In one class he told us to bring our chart books in and we reviewed what was going on in the actual economy — not the model of the economy taught in the classroom.
Bob Clower was a big advocate of stock-flow analysis, and the importance of analyzing changes in wealth. His classes have stood me in good stead during the financial crisis. In fact my whole UCLA education has proved valuable, as valuable in some ways as Hayek (not that they were in conflict), for understanding the financial crisis.
[…] Computability problems screw up econometrics more than most economists think. […]
Xavier Gabaix (one of the young economists commenting): “Low levels of growth are in part due to misapplied cognitive heuristics that lead people to be timid, inert, and gullible.”
So, ups and downs in economic growth are caused by changing fashions in cognitive heuristics? Maybe sunspots induce cycles of timidity and gullibility that modulate economic vitality?
Glen Weyl: “While these information systems are mostly nongovernmental, they are sufficiently centralized that it is increasingly hard to see how dispersed information poses the challenge it once did to centralized planning.”
This comes remarkably close to suggesting that the ability of Soviet planners to use market prices from the West to do economic calculation shows that central planning can work. Suppose the central planners take over and continue to implement what entrepreneurs already created; where do the new ideas, possibilities, and methods come from? How do the incentives work in a political system? A centrally planned stagnant economy might be possible. So what? Who wants that? Does Weyl see what is at stake?
Sounds like all of these stars could use a course in the philosophy of science.
@ Greg: at UCLA 20 years ago Demsetz was frustrated that grad students had no sense of what their calculations meant in terms of simple behavior. There was a very strong theory camp at UCLA then. But, when Demsetz asked the students to explain simple things like constant returns to scale they had no idea what he was talking about. He’d get really angry and spell it all out.
Interesting to see someone whose job is basically predicting the future…predict THEIR future.