Is economics looking at itself?

Patricia Cohen recently wrote a piece for the New York Times: “Ivory Tower Unswayed by Crashing Economy”.

The article contains precisely what you might expect from a title like that.  This snippet gives you the idea:

The financial crash happened very quickly while “things in academia change very, very slowly,” said David Card, a leading labor economist at the University of California, Berkeley. During the 1960s, he recalled, nearly all economists believed in what was known as the Phillips curve, which posited that unemployment and inflation were like the two ends of a seesaw: as one went up, the other went down. Then in the 1970s stagflation — high unemployment and high inflation — hit. But it took 10 years before academia let go of the Phillips curve.

James K. Galbraith, an economist at the Lyndon B. Johnson School of Public Affairs at the University of Texas, who has frequently been at odds with free marketers, said, “I don’t detect any change at all.” Academic economists are “like an ostrich with its head in the sand.”

“It’s business as usual,” he said. “I’m not conscious that there is a fundamental re-examination going on in journals.”

Unquestioning loyalty to a particular idea is what Robert J. Shiller, an economist at Yale, says is the reason the profession failed to foresee the financial collapse. He blames “groupthink,” the tendency to agree with the consensus. People don’t deviate from the conventional wisdom for fear they won’t be taken seriously, Mr. Shiller maintains. Wander too far and you find yourself on the fringe. The pattern is self-replicating. Graduate students who stray too far from the dominant theory and methods seriously reduce their chances of getting an academic job.
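As an aside on the Phillips curve mentioned above, for anyone who hasn’t met it: the textbook way to write it down, in its simple 1960s form and the expectations-augmented form that replaced it, is roughly the following (a standard textbook sketch, not anything taken from the article):

\text{1960s trade-off:}\qquad \pi_t = -\beta\,(u_t - u^{*}), \quad \beta > 0

\text{Expectations-augmented:}\qquad \pi_t = \pi_t^{e} - \beta\,(u_t - u^{*})

In the first, inflation \pi_t and unemployment u_t really do sit at opposite ends of a seesaw pivoting on the natural rate u^{*}. In the second, once expected inflation \pi_t^{e} drifts upward, high inflation and high unemployment can coexist — which is roughly how stagflation broke the original version.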

My reaction is to say “Yes.  And No.”  Here, for example, is a small list of prominent economists thinking about economics (the position is that author’s ranking according to ideas.repec.org):

There are plenty more. The point is that there is internal reflection occurring in economics; it’s just not happening at the level of the journals.  That’s for a simple enough reason – there is, on average, a two-year lead time for getting an article into a journal.  You can pretty safely bet a dollar that the American Economic Review is planning a special issue questioning the direction and methodology of economics.  Since it takes so long to get anything into the journals, the discussion, where it is being made public at all, is occurring on the internet.  This is a reason to love blogs.

Another important point is that we are mostly talking about macroeconomics.  As I’ve mentioned previously, I pretty firmly believe that if you were to stop an average person on the street – hell, even an educated and well-read person – and ask them what economics is, they’d supply a list of topics that falls squarely within Macroeconomics and Finance.

The swathes of work in microeconomics – contract theory, auction theory, game theory, behavioural economics – and all the work in development (90% of development economics for the last 10 years has been applied micro), not to mention the work in econometrics: none of that would get a mention.  The closest that the person on the street might get to recognising it would be to remember hearing about (or possibly reading) Freakonomics a couple of years ago.

Formalism and synthesis of methodology

Robert Gibbons [MIT] wrote, in a 2004 essay:

When I first read Coase’s (1984: 230) description of the collected works of the old-school institutionalists – as “a mass of descriptive material waiting for a theory, or a fire” – I thought it was (a) hysterically funny and (b) surely dead-on (even though I had not read this work). Sometime later, I encountered Krugman’s (1995: 27) assertion that “Like it or not, … the influence of ideas that have not been embalmed in models soon decays.” I think my reaction to Krugman was almost as enthusiastic as my reaction to Coase, although I hope the word “embalmed” gave me at least some pause. But then I made it to Krugman’s contention that a prominent model in economic geography “was the one piece of a heterodox framework that could easily be handled with orthodox methods, and so it attracted research out of all proportion to its considerable merits” (p. 54). At this point, I stopped reading and started trying to think.

This is really important, fundamental stuff.  I’ve been interested in it for a while (e.g. my previous thoughts on “mainstream” economics and the use of mathematics in economics).  Beyond the movement of economics as a discipline towards formal (i.e. mathematical) models as a methodology, there is even a movement towards certain types or styles of model.  See, for example, the summary – and the warnings given – by Olivier Blanchard [MIT] regarding methodology in his recent paper “The State of Macro”:

That there has been convergence in vision may be controversial. That there has been convergence in methodology is not: Macroeconomic articles, whether they be about theory or facts, look very similar to each other in structure, and very different from the way they did thirty years ago.

[M]uch of the work in macro in the 1960s and 1970s consisted of ignoring uncertainty, reducing problems to 2×2 differential systems, and then drawing an elegant phase diagram. There was no appealing alternative – as anybody who has spent time using Cramer’s rule on 3×3 systems knows too well. Macro was largely an art, and only a few artists did it well. Today, that technological constraint is simply gone. With the development of stochastic dynamic programming methods, and the advent of software such as Dynare – a set of programs which allows one to solve and estimate non-linear models under rational expectations – one can specify large dynamic models and solve them nearly at the touch of a button.

Today, macro-econometrics is mainly concerned with system estimation … Systems, characterized by a set of structural parameters, are typically estimated as a whole … Because of the difficulty of finding good instruments when estimating macro relations, equation-by-equation estimation has taken a back seat – probably too much of a back seat …

DSGE models have become ubiquitous. Dozens of teams of researchers are involved in their construction. Nearly every central bank has one, or wants to have one. They are used to evaluate policy rules, to do conditional forecasting, or even sometimes to do actual forecasting. There is little question that they represent an impressive achievement. But they also have obvious flaws. This may be a case in which technology has run ahead of our ability to use it, or at least to use it best:

  • The mapping of structural parameters to the coefficients of the reduced form of the model is highly non linear. Near non-identification is frequent, with different sets of parameters yielding nearly the same value for the likelihood function – which is why pure maximum likelihood is nearly never used … The use of additional information, as embodied in Bayesian priors, is clearly conceptually the right approach. But, in practice, the approach has become rather formulaic and hypocritical.
  • Current theory can only deliver so much. One of the principles underlying DSGEs is that, in contrast to the previous generation of models, all dynamics must be derived from first principles. The main motivation is that only under these conditions, can welfare analysis be performed. A general characteristic of the data, however, is that the adjustment of quantities to shocks appears slower than implied by our standard benchmark models. Reconciling the theory with the data has led to a lot of unconvincing reverse engineering

    This way of proceeding is clearly wrong-headed: First, such additional assumptions should be introduced in a model only if they have independent empirical support … Second, it is clear that heterogeneity and aggregation can lead to aggregate dynamics which have little apparent relation to individual dynamics.
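To make Blanchard’s first bullet a bit more concrete, here is a deliberately tiny toy of my own (nothing to do with any actual DSGE model): two “structural” parameters enter the likelihood only through their product, so the likelihood is nearly flat along a ridge of observationally equivalent parameter pairs, and it is the Bayesian prior, not the data, that ends up pinning the estimates down.

import numpy as np

# Toy illustration of near non-identification: the data only pin down the
# product a*b, so the likelihood is nearly flat along a ridge of (a, b)
# pairs, and a prior ends up doing the identifying work.

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
a_true, b_true, sigma = 0.5, 2.0, 1.0
y = a_true * b_true * x + rng.normal(scale=sigma, size=n)

def log_likelihood(a, b):
    resid = y - a * b * x
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

grid = np.linspace(0.1, 4.0, 80)
ll = np.array([[log_likelihood(a, b) for b in grid] for a in grid])

# Count how many very different (a, b) pairs come within a whisker of the
# maximised likelihood: that flat ridge is the identification problem.
ridge = [(round(a, 2), round(b, 2))
         for i, a in enumerate(grid) for j, b in enumerate(grid)
         if ll[i, j] > ll.max() - 0.5]
print(len(ridge), "parameter pairs are nearly likelihood-equivalent, e.g.", ridge[:3])

# Add a prior on a alone -- Normal(0.5, 0.1), say -- and the posterior mode
# is suddenly well defined, courtesy of the prior rather than the data.
log_prior_a = -0.5 * ((grid - 0.5) / 0.1) ** 2
log_post = ll + log_prior_a[:, None]
i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
print("posterior mode: a =", round(grid[i], 2), ", b =", round(grid[j], 2))

The only point of the toy is that the shape of the posterior can owe more to the prior than to anything in the data, which is precisely what makes a formulaic use of priors worrying.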

There are, of course and as always, more heterodox criticisms of the current synthesis of macroeconomic methodology. See, for example, the book “Post Walrasian Macroeconomics: Beyond the Dynamic Stochastic General Equilibrium Model” edited by David Colander.

I’m not sure where all of that leaves us, but it makes you think …

(Hat tip:  Tyler Cowen)

Post Walrasian Macroeconomics

In part because it’s the sort of stuff that I’ve always been interested in anyway, in part because people like Crighton, Luke and Nic (you know who you are) have always advocated this sort of thing, and in part because it serves as a practical example of my thoughts on moving the mainstream, I have picked up (well, borrowed) a copy of “Post Walrasian Macroeconomics: Beyond the Dynamic Stochastic General Equilibrium Model”, edited by David Colander [Amazon, Cambridge].

I’ve not had any serious exposure to DSGE models (LSE touches on them only briefly at the M.Sc. level, with pen-and-paper examples of Real Business Cycle theory; it’s only at the M.Res. level, this coming year, that we really get our teeth into them), but I’ve been drawn to agent-based modelling in economics ever since my Computer Systems Engineering degree, when artificial neural networks and the like were attracting attention.
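For anyone who hasn’t come across the term: agent-based modelling here just means simulating a population of simple, heterogeneous decision rules and seeing what aggregate behaviour emerges, rather than solving one representative agent’s problem in closed form. A throwaway sketch of the flavour (entirely my own, nothing from the book) might look like this:

import numpy as np

# A throwaway agent-based market: every agent carries its own price forecast
# and updates it adaptively at its own speed.  The market price each period
# falls in the average of those forecasts (a crude cobweb-style supply rule)
# and is buffeted by a demand shock.  There is no representative agent and no
# closed-form equilibrium -- we just simulate and look at the aggregate.

rng = np.random.default_rng(1)
n_agents, n_periods = 500, 200
forecasts = rng.normal(10.0, 2.0, size=n_agents)  # heterogeneous initial beliefs
speeds = rng.uniform(0.05, 0.5, size=n_agents)    # heterogeneous learning speeds

prices = []
for t in range(n_periods):
    # Higher average expected prices mean more planned supply, which pushes
    # the realised price down; add a small demand shock on top.
    price = 20.0 - forecasts.mean() + rng.normal(scale=0.3)
    # Each agent nudges its forecast towards the realised price at its own speed.
    forecasts += speeds * (price - forecasts)
    prices.append(price)

print("mean price over the last 50 periods:", round(float(np.mean(prices[-50:])), 2))
print("price volatility over the last 50 periods:", round(float(np.std(prices[-50:])), 2))

Even in something this crude, whether the price settles down or keeps oscillating is a property of the population of learning rules, not of any individual agent’s optimisation problem.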

The first 80 pages or so seem to be trying to recast the Classical economics movement of the start of the 20th Century as a precursor, not of the modern neoClassical/neoKeynesian hybrids that still take formal Walrasian general equilibrium as their basis, but of what they call Post-Walrasian thinking. In that view, nonlinear dynamics and the multiple equilibria they imply are entry requirements, and institutions and nominal frictions serve to constrain the resulting chaos, rather than simply limiting the move to an intertemporal general equilibrium as they do in DSGE work.
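The “multiple equilibria” requirement, at least, is easy to illustrate with a toy of my own (nothing from the book): give a single variable an S-shaped law of motion and you get three steady states, two of them stable, so where the economy ends up depends on where it starts rather than on fundamentals alone — exactly the sort of history-dependence that a unique Walrasian equilibrium rules out by construction.

import numpy as np

# Toy nonlinear dynamic x_{t+1} = f(x_t) with an S-shaped f.  It has three
# steady states: two stable ones (roughly 0.02 and 0.98) and an unstable one
# at 0.5, so the long-run outcome is determined by the initial condition.

def f(x, steepness=8.0):
    return 1.0 / (1.0 + np.exp(-steepness * (x - 0.5)))

def long_run(x0, periods=200):
    x = x0
    for _ in range(periods):
        x = f(x)
    return round(x, 3)

for x0 in (0.10, 0.45, 0.55, 0.90):
    print(f"start at {x0:.2f} -> settle at {long_run(x0)}")
# Starting points below 0.5 converge to the low steady state, those above
# to the high one.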

No, I’m not sure I understand all of that either. I certainly need to find a decent (and ideally, neutral) summary of mainstream economic thought over the last century. If anybody has any suggestions, I’d be grateful.

Update: Well, it turns out that there was indeed a neoClassical/neoKeynesian synthesis, but it is by no means current mainstream thinking, which is — according to the authors — better described as a New Classical/New Keynesian synthesis. More to come …