Wednesday, 12 March 2014

Papers from the macroeconomics frontier

I spent last Friday and Saturday at the Southern Workshop in Macroeconomics (SWIM), held at the University of Auckland. It's an event that has attracted some very high-level speakers since it got underway in 2005 - Prescott (2006), Lucas (2008), Barro (2011), Turnovsky (2012) - and I thought it would be a good opportunity to catch up with the state of play towards the bleeding edge of modern macro. And I'd like to acknowledge the sponsors who helped make it happen: our Reserve Bank, the Australian National University's Centre for Applied Macroeconomic Analysis, and our Productivity Commission, as well as the organising team from the University of Auckland (El-hadj Bah, Debasis Bandyopadhyay, Prasanna Gai, Daryna Grechyna) and from Victoria University (Martin Berka).

What struck me most about the current state of play, going by the papers presented, is how much more theoretical and microeconomics-grounded macro has become since the days when I first studied it (a long time ago). Partly, that reflected the nature of SWIM, which prefers "papers in any area of macroeconomics that has a strong theoretical foundation", but it also extends to macro as a whole. Back in the day, we tended to work with the aggregates or the sub-aggregates in the national accounts (consumption, investment, stocks, what have you): now, models are typically micro-based, with (for example) the consumer modelled as a "representative agent", explicitly optimising a lifetime series of choices between work, consumption, and leisure, subject to a budget constraint. As an aside, I was also struck by the high intellectual calibre of the presenters, who were a seriously bright bunch, as they need to be these days to handle these highly complex models.
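To make the contrast concrete, here is a minimal sketch (in Python, with made-up parameter values, and nothing to do with any of the conference papers) of the kind of optimisation problem a representative-agent model puts at its core:

```python
# A purely illustrative sketch, not any model presented at SWIM: a single
# consumer chooses consumption and leisure in each of T periods to maximise
# discounted utility, subject to a lifetime budget constraint. All parameter
# values are invented for illustration.
import numpy as np
from scipy.optimize import minimize

T = 3            # number of periods (kept tiny so the sketch stays readable)
beta = 0.96      # discount factor
wage = 1.0       # wage per unit of work (time endowment normalised to 1)
rate = 0.03      # interest rate used to discount the budget constraint

def utility(x):
    # x holds consumption c[0..T-1] followed by leisure l[0..T-1]
    c, l = x[:T], x[T:]
    return sum(beta**t * (np.log(c[t]) + np.log(l[t])) for t in range(T))

def budget(x):
    # Present value of consumption must not exceed the present value of
    # labour income, where work = 1 - leisure.
    c, l = x[:T], x[T:]
    pv_income = sum(wage * (1 - l[t]) / (1 + rate)**t for t in range(T))
    pv_spend = sum(c[t] / (1 + rate)**t for t in range(T))
    return pv_income - pv_spend   # constraint: must be >= 0

x0 = np.full(2 * T, 0.5)
res = minimize(lambda x: -utility(x), x0,
               constraints=[{"type": "ineq", "fun": budget}],
               bounds=[(1e-6, None)] * T + [(1e-6, 1 - 1e-6)] * T)
print(res.x[:T], res.x[T:])   # the optimal consumption and leisure paths
```

That toy problem is trivially small, but it is the template: the DSGE models presented at SWIM wrap the same kind of explicit optimisation in uncertainty, general equilibrium, and whatever frictions the author has chosen.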

As you may know, there's a lively debate about these modern families of models - Dynamic Stochastic General Equilibrium (DSGE) models being the archetype - and whether they are any good, a debate principally fuelled by their alleged inability to spot the GFC before it happened, or even to explain it within the model after the event.

I lean towards the sceptical view myself, and not only because I'm a "show me the R squared and are the signs on the coefficients right" sort of guy. These models are intellectually elegant, even beautiful (if I can get carried away for a moment), but they tend to come with two disadvantages: some of their predictions are at odds with the real world being modelled, and they tend to oversimplify (or ignore) important aspects of the world (like credit or finance).

To be fair, many modern macro modellers know this, but their typical solution (introduce a "friction" of some kind into one or more of the sectors in the model) comes with two more problems. There is an arbitrarily large number of frictions that can be introduced, and how the model subsequently behaves is very heavily dependent on exactly how each friction is characterised. The upshot is that you can make a DSGE model say anything you like, depending on how you've modified it. Some might see that as a plus: I don't.

All that said, there were some papers that especially appealed even to an empirically minded curmudgeon. I liked Diego Restuccia's paper on land misallocation and productivity, which among other things showed the inefficiencies wrought by land "reforms" in Malawi and the Philippines (I blogged earlier about his presentation on productivity to the Government Economics Network). I also liked the work that Robert Ductor at Massey (with a co-author) is doing on establishing the degree and pathways of global business cycle interdependencies, and the work that Tim Robinson at the Reserve Bank of Australia (again with co-authors) presented on the effect of "forward guidance" (central banks' new practice of signalling when interest rates are likely to change) on market participants' expectations.

Top billing, for me, went to the paper "A theory of factor shares", by Sephorah Mangin from Monash University. Yes, it was one of those mathematically complex models with an elaborate microfoundation and a "friction" (the process of matching workers and firms). But not only was it consistent with the microeconomics, it did a fine job (when calibrated to US factor shares 1951-2003) of explaining the real world (though it did appear to break down in 2004 and 2005).

As it happened, the results from the fully microeconomics-consistent model boiled down to what you would have got had you simply regressed factor shares on the unemployment rate (which proxies the bargaining power of employers) and eligibility for unemployment insurance (which proxies the reservation wage of the employed, in that if insurance is sufficiently available they will tell an employer pushing his luck with a low-ball wage offer to get stuffed). So for once the fancy model and the palaeolithic econometrics got you to the same place.
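For the empirically minded, that regression is the sort of thing you could run in a few lines. The sketch below uses placeholder data purely to show its form - it does not reproduce the paper's series or results:

```python
# The "palaeolithic econometrics" version, sketched in Python. The data are
# random placeholders standing in for the actual US labour-share,
# unemployment-rate and unemployment-insurance series, so the output means
# nothing; it just shows the shape of the regression described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 53  # roughly the length of a 1951-2003 annual sample
df = pd.DataFrame({
    "labour_share": 0.62 + 0.02 * rng.standard_normal(n),      # placeholder
    "unemployment_rate": 5.5 + 1.5 * rng.standard_normal(n),   # proxies employer bargaining power
    "ui_eligibility": 0.4 + 0.1 * rng.standard_normal(n),      # proxies the reservation wage
})

X = sm.add_constant(df[["unemployment_rate", "ui_eligibility"]])
model = sm.OLS(df["labour_share"], X).fit()
print(model.summary())   # the R squared and coefficient signs the sceptic asks for
```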
