SED 2022 - Notes on contemporary macro

I just got back from an excellent meeting of the Society for Economic Dynamics, a top conference for work in dynamic economics, principally but not exclusively in macroeconomics. As one of the first in-person conferences I’ve been to since 2020 (last year they were hybrid and I presented from home), it was a chance to catch up not just with colleagues and friends but also with the state of modern academic macro, after some time focusing more on other things. While the conference is fresh in my mind, I thought I’d jot down a few bigger picture notes to start thinking about where the field is and where it might be headed.

Of course, the conference has so many parallel sessions that I’m sure no two people had the same experience, aside from the plenary talks, and my particular focus, mostly on computational and econometric methods, is a specialized niche within the whole. But since it’s particularly valuable for methodologists to have a sense of what applied problems people are currently working on and how they’re going about them, I did try to explore a little more broadly. Even so, what follows are just first impressions and themes from what I happened to see.

HANK models have matured

Research literatures in macroeconomics seem to have a life cycle that goes in stages.

First, some creative thinkers come up with a concept and implement an early version showing it can be done. For the heterogeneous agent New Keynesian (HANK) literature, this happened around the early to mid 2010s: the idea was to merge our benchmark incomplete markets models of inequality and individual spending and saving behavior with our benchmark New Keynesian models of monetary policy, inflation, and business cycles, to start answering questions about how they interact.
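To make that merger a little more concrete, here is a stylized sketch of the two blocks being combined, in my own shorthand rather than any particular paper’s specification: an incomplete-markets household problem with idiosyncratic income risk and a borrowing limit on one side, a textbook New Keynesian pricing and policy block on the other, linked by replacing the representative-agent Euler equation with the aggregated consumption function of the heterogeneous households (the symbols, including the aggregate consumption functional, are purely illustrative notation).

```latex
% Household block (incomplete markets): idiosyncratic income risk e_{it},
% a borrowing limit \underline{a}, and a real return r_t taken as given
\max_{\{c_{it},\, a_{it+1}\}} \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t u(c_{it})
\quad \text{s.t.} \quad c_{it} + a_{it+1} = (1 + r_t)\, a_{it} + y_t e_{it},
\qquad a_{it+1} \ge \underline{a}

% Aggregation over the household distribution replaces the representative-agent
% Euler equation with a consumption function of the whole path of prices and incomes
C_t = \int c_{it}\, di \;=\; \mathcal{C}_t\big(\{r_s, y_s\}_{s \ge 0}\big)

% New Keynesian block: Phillips curve, monetary policy rule, Fisher equation
\pi_t = \kappa\, \hat{y}_t + \beta\, \mathbb{E}_t \pi_{t+1}, \qquad
i_t = r^* + \phi_\pi \pi_t, \qquad
r_t = i_t - \mathbb{E}_t \pi_{t+1}
```

The interesting interactions come precisely from that substitution: the distribution of households across income and wealth now matters for how aggregate demand responds to interest rates and transfers.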

Second, the research community enters a stage of twiddling with all the knobs, investigating all the features of a type of model and coming to understand which features are important for which outcomes and why, and how they interact. Some of the choices in the initial model may have been just placeholder first guesses that, after a period of trial and error over different specifications, get swapped out for something more robust or tractable, until the literature settles on a small set of benchmarks. HANK was actively in this stage in the late 2010s, with many competing variations working out questions of specification in terms of fiscal transfers, portfolio choices, preferences, and so on, along with the role of methods (discrete vs continuous time, MIT shocks vs Krusell-Smith vs perturbation, sequence versus state space, etc.). There’s still some of this settling of basic questions going on, but there seemed to be more of it at the previous two or three SED meetings.

Third, after enough knob twiddling, people understand the framework well enough to put the model to work as a tool for basic measurement and for policy analysis. HANK now seems to be entering this mature stage, what Kuhnians would call “normal science”, with lots of applications to understanding the effects of particular policy proposals or shocks, measuring and quantifying different sources of inequality, and serving as a baseline for incorporating newly proposed deviations or frictions. I went to a lot of talks where the format was “here’s a policy issue or fact to explain: we motivate with some simple empirics and maybe a toy model with just the one force, then embed it in a quantitative HANK model to measure that it explains x percent of this trend, or implies that this policy is x percent more/less effective,” and fewer that were trying to resolve basic issues of the form “what happens if we take a HANK model and swap out sticky prices vs wages, or real vs nominal debt, etc.”

I suppose that beyond those stages, other literatures in macroeconomics that have trodden the path before (representative agent DSGE?) maintain a long lifetime of continued routine use, often fading a bit from the academic spotlight but continuing to be useful to policy-makers, both for day-to-day measurement and as a mature and reliable way to get a first pass at the pressing issues of the day. Beyond that, either continued probing of points of empirical dissatisfaction or merging with ideas from some previously disjoint strand leads to new strands of research ideas. For HANK, I suspect much of the original interest came from the desire to reconcile the profession’s incompatible benchmark models of individual behavior and of business cycle aggregates, with the most notable empirical problem being dissatisfaction with the MPC implications of the typical representative agent Euler equation. I think there are many years of both basic exploration and normal science left to do with HANK, though pattern matching suggests that something will eventually come along to form the next generation. It seems too early to specify what that might be.

Despite widespread unease with other aspects of the New Keynesian paradigm for monetary economics, and many proposals for modifications or replacements, it has proved remarkably persistent by serving as a baseline framework for encompassing divergent views on mechanisms and policies. While disagreements remain, the days when “freshwater” and “saltwater” macroeconomists expressed disagreement mostly through laughter have largely been replaced by conversations about parameter values in shared model classes, between researchers who cannot be clustered nearly so easily into ideological camps.

Plenaries

The plenary talks were a good chance to get bigger picture overviews of different subfields, at a range of stages of maturity.

  • Giuseppe Moscarini gave a talk on his work over the past decade in the area of cross-sectional wage dynamics, which has developed since the seminal work of Burdett and Mortensen into a mature area providing a foundation for studies of matched employer-employee data, wage inequality, monopsony, career progression, and so on. He started with an overview of his work with Postel-Vinay on the role of job-to-job switching in wage growth. Then, continuing the theme of the New-Keynesianization of everything, he presented new work merging disaggregated “job-ladder” style models into an NK framework (the more tractable Diamond-Mortensen-Pissarides model of aggregate labor market flows had already been merged into monetary models), suggesting that aggregate job-to-job recruiting, and not just unemployment, is an important and cyclically distinct determinant of aggregate wage inflation.

  • Esteban Rossi-Hansberg presented work that strikes me as very much in the new-paradigm stage, on integrating regional heterogeneity into integrated assessment climate models. The question is compelling: while the carbon cycle is global, impacts and adaptation efforts are highly diverse across places, and figuring out how locations which may face very different flooding, extreme weather, temperature changes, and so on will adapt economically is important for measuring global costs and coordinating mitigation efforts. With recent progress in quantitative spatial economics and in computational methods applicable to high-resolution heterogeneity, models can now incorporate detailed spatial economic data along with high-resolution climate data and simulations, and Rossi-Hansberg and collaborators have provided some noteworthy examples. But as he emphasized in the talk, there’s still a lot to learn about how the basic economic mechanisms work, given how difficult these models currently are to work with, and I suspect there is a lot of “knob-twiddling” work to do just to figure out which aspects are important to put into such a model and how to specify and solve them, before this literature reaches the normal science stage where we can focus on arguing over a few crucial parameters, as climate macroeconomists working with aggregate models have been doing for years now. This talk inspired me, though I currently don’t do any work in climate, to attend some of the climate sessions later in the conference, where young researchers are working hard to figure it out.

  • On the last day, IMF chief economist Gita Gopinath gave a talk on how open economy macroeconomic research informs the current work of the Fund and its policy framework. The talk was surprisingly academic in style, with a discussion of models and empirics of a kind you don’t usually get from public-facing speeches by policy-makers, but directed at informing working economists about the role this research plays in the policy process. This involves aggregating a long history of work on available policy tools to synthesize policy recommendations: not any single model or study but a systematic review of many, with some modeling work done mainly to quantify and reconcile competing effects that had each been described individually. The resulting framework reflects a very gradual evolution of the Fund’s views, from its 1990s Mundell-Fleming-inspired recommendations for exchange rate flexibility as a stabilizing buffer, to incorporating decades of work since the Asian Financial Crisis of 1997-98 on models of borrowing constraints, sudden stops, and financial frictions, suggesting that in some contingent cases capital controls may be a desirable measure. This stance was already broadly conventional wisdom by the time I graduated with an International Economics degree in 2009, but putting it in an official IMF policy document collecting a large number of careful studies of pros and cons represents a long process. As a bonus, she also gave a brief overview of one of her pre-IMF research contributions, on dominant currency pricing in open economy New Keynesian models. This is a clear example of valuable knob-twiddling research, showing that the symmetric pricing assumption used in early models largely out of convenience was not only implausible but also consequential, with likely implications for global trade volumes during the current Fed tightening cycle.

Miscellaneous thoughts

  • On the econometrics side, after watching a presentation by Ashesh Rambachan on IRF interpretation (paper, my notes), I saw the implications all over other talks. Micro people have been reckoning with the need to precisely define the counterfactual path of a shock in dynamic models, since a measured IRF can end up being a mixture of responses to rather different counterfactual experiments. Some talks with IRFs gave this serious thought, formally or informally; others not so much, though thankfully audiences seemed willing to provide helpful feedback in those cases.

  • Sequence space methods for heterogeneous-agent models are seeing remarkably fast adoption, going from a fairly technical 2021 Econometrica paper to a relatively common approach. In addition to speed, I think this in part reflects interpretability, since the approach lets economists derive equilibrium conditions that can be informative even before the model is fully solved numerically (a toy sketch of the idea follows after this list).

  • Wisconsin cheese curds are better than I expected.
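Since the sequence space approach comes up so often now, here is a minimal sketch of the basic stacked-equations idea, run on a deliberately simple toy model rather than a heterogeneous-agent one: write the equilibrium conditions over a truncated horizon as a system H(X, Z) = 0 in the paths of endogenous variables X and shocks Z, compute the sequence-space Jacobians H_X and H_Z, and read off linearized impulse responses as dX = -H_X^{-1} H_Z dZ. The model and parameter values below are made up for illustration (a perfect-foresight RBC block with log utility and full depreciation, chosen only because its steady state is known in closed form); the real payoff of the Auclert, Bardóczy, Rognlie, and Straub method is the efficient computation of the heterogeneous-agent block’s Jacobians, which this sketch does not attempt.

```python
# Toy sketch of the sequence-space idea: stack equilibrium conditions over a
# truncated horizon T as H(X, Z) = 0, differentiate to get T x T Jacobians,
# and solve for linearized impulse responses dX = -H_X^{-1} H_Z dZ.
# Illustrative only: a perfect-foresight RBC block, not a HANK model.
import numpy as np

alpha, beta = 0.36, 0.96
T = 200                                        # truncation horizon
K_ss = (alpha * beta) ** (1 / (1 - alpha))     # steady-state capital with Z = 1

def H(K, Z):
    """Stacked residuals for the capital path K = (K_1, ..., K_T),
    given the TFP path Z = (Z_0, ..., Z_{T-1}); K_0 starts at steady state."""
    K_full = np.concatenate(([K_ss], K))            # K_0, K_1, ..., K_T
    C = Z * K_full[:-1] ** alpha - K_full[1:]       # resource constraint, t = 0..T-1
    res = np.empty(T)
    # Euler equation for t = 0..T-2: 1/C_t = beta * alpha * Z_{t+1} * K_{t+1}^(alpha-1) / C_{t+1}
    res[:-1] = 1 / C[:-1] - beta * alpha * Z[1:] * K_full[1:-1] ** (alpha - 1) / C[1:]
    res[-1] = K[-1] - K_ss                          # terminal condition: return to steady state
    return res

# Sequence-space Jacobians by finite differences (each is a T x T matrix).
Z_ss, K0, eps = np.ones(T), np.full(T, K_ss), 1e-6
base = H(K0, Z_ss)
H_X, H_Z = np.empty((T, T)), np.empty((T, T))
for j in range(T):
    dK = K0.copy(); dK[j] += eps
    H_X[:, j] = (H(dK, Z_ss) - base) / eps
    dZ_col = Z_ss.copy(); dZ_col[j] += eps
    H_Z[:, j] = (H(K0, dZ_col) - base) / eps

# Impulse response of the capital path to a 1% AR(1) TFP shock.
dZ = 0.01 * 0.9 ** np.arange(T)
dK = -np.linalg.solve(H_X, H_Z @ dZ)
print(dK[:5])                                      # first few periods of the response
```

Even in this toy setting you can see where the interpretability comes from: each column of H_X or H_Z is itself an economically meaningful object, the response of every stacked equilibrium condition to a perturbation of one variable at one date, which is exactly the kind of intermediate quantity one can inspect and reason about before fully solving the model.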
