The Crisis & Economics, Part 5: “Shhh! We’re Working On It”

This is part 5 in my series on how the financial crisis is relevant for economics (parts 1, 2, 3 & 4 are here). Each part explores an argument economists have made against the charge that the crisis exposed fundamental failings of their discipline. This post explores the possibility that macroeconomics, even if it failed before the crisis, has responded to its critics and is moving forward.

Argument #5: “We got this one wrong, sure, but we’ve made (or are making) progress in macroeconomics, so there’s no need for a fundamental rethink.”

Many macroeconomists deserve credit for their mea culpa and subsequent refocus following the financial crisis. Nevertheless, the nature of the rethink, particularly the unwillingness to abandon certain modelling techniques and ideas, leads me to question whether progress can be made without a more fundamental upheaval. To see why, it will help to have a brief overview of how macro models work.

In macroeconomic models, the optimisation of agents means that economic outcomes such as prices, quantities, wages and rents adjust to the conditions imposed by input parameters such as preferences, technology and demographics. A consequence of this is that sustained inefficiency, unemployment and other chaotic behaviour usually occur when something ‘gets in the way’ of this adjustment. Hence economists introduce ad hoc modifications such as sticky prices, shocks and transaction costs to generate sub-optimal behaviour: for example, if a firm’s cost of changing prices exceeds the benefit, prices will not be changed and the outcome will not be Pareto efficient. Since there are countless ways in which the world ‘deviates’ from the perfectly competitive baseline, it’s mathematically troublesome (or impossible) to include every possible friction. The result is that macroeconomists tend to decide which frictions are important based on real world experience: since the crisis, the focus has been on finance. On the surface this sounds fine – who isn’t for informing our models with experience? However, it is my contention that this approach does not offer us any more understanding than would experience alone.
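The menu-cost logic described above can be sketched in a few lines. This is purely illustrative (the function, the quadratic profit approximation and all numbers are my own, not from any particular model): a firm resets its price only when the profit gain from doing so exceeds the cost of changing it.

```python
# Illustrative menu-cost sketch: a firm keeps its price fixed whenever
# the profit gain from adjusting falls short of the cost of changing it.
# The quadratic loss and all numbers are hypothetical.

def adjusts_price(current_price, optimal_price, menu_cost, profit_sensitivity=1.0):
    """Return True if the firm finds it worthwhile to reset its price."""
    # Quadratic approximation of the profit lost by charging a non-optimal price
    gain_from_adjusting = profit_sensitivity * (optimal_price - current_price) ** 2
    return gain_from_adjusting > menu_cost

# A small gap: not worth paying the menu cost, so the price stays "sticky"
print(adjusts_price(1.00, 1.02, menu_cost=0.01))  # False
# A large gap: adjusting is worthwhile, so the price moves
print(adjusts_price(1.00, 1.50, menu_cost=0.01))  # True
```

The point of the post survives the sketch: the friction is bolted on as an extra parameter (`menu_cost`), and the modeller chooses which such parameters to include.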

Perhaps an analogy will illustrate this better. I was once walking past a field of cows as it began to rain, and I noticed some of them start to sit down. It occurred to me that there was no use in their doing this after the storm had started; they are supposed to give us adequate warning by sitting down before it happens. Sitting down during a storm just tells us what we already know. Similarly, although the models used by economists and policy makers did not predict and could not account for the crisis before it happened, they have since built models that try to do so. They generally do this by attributing the crisis to frictions that revealed themselves to be important during the crisis. Ex post, a friction can always be found to make models behave a certain way, but the models do not make identifying the source of problems before they happen any easier, and they don’t add much afterwards, either – we certainly didn’t need economists to tell us finance was important following 2008. In other words, when a storm comes, macroeconomists promptly sit down and declare that they’ve solved the problem of understanding storms. It becomes difficult to escape the circularity of defining the relevant friction by its outcome, which strips the idea of ‘frictions’ of predictive power or falsifiability.


There is also the open question of whether understanding the impact of a ‘friction’ relative to a perfectly competitive baseline entails understanding its impact in the real world. As theorists from Joe Stiglitz to Yanis Varoufakis have argued, neoclassical economics is trapped in a permanent fight against indeterminacy: the quest to understand things relative to a perfectly competitive, microfounded baseline leads to aggregation problems and intractable complexities that, if included, result in “anything goes” conclusions. To put it another way, the real world is so complex and full of frictions that whichever mechanics would be driving the perfectly competitive model are swamped. The actions of individual agents are so intertwined that their aggregate behaviour cannot be predicted from each of their ‘objective functions’. Consequently, our knowledge of the real world must be informed either by models which use different methodologies or, more crucially, by historical experience.

Finally, the ad hoc approach also contradicts another key aspect of contemporary macroeconomics: microfoundations. The typical justification for these is that, to use the words of the ECB, they impose “theoretical discipline” and are “less subject to the Lucas critique” than a simple VAR, Old Keynesian model or another more aggregative framework. Yet even if we take those propositions to be true, the modifications and frictions that are so crucial to making the models more realistic are often not microfounded, sometimes taking the form of entirely arbitrary, exogenous constraints. Even worse is when the mechanism is profoundly unrealistic, such as prices being sticky because firms are randomly unable to change them for some reason. In other words, macroeconomics starts by sacrificing realism in the name of rigour, but reality forces it in the opposite direction, and the end result is that it has neither.
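The random-repricing mechanism derided above is usually attributed to Calvo pricing, in which each firm is permitted to reset its price in any given period only with some fixed probability, regardless of its circumstances. A minimal simulation makes the arbitrariness concrete (the parameter values and function are my own illustration, not taken from any published model):

```python
import random

# Illustrative Calvo-style stickiness: each period a firm keeps its old
# price with probability theta, and may reset it with probability
# 1 - theta -- a lottery unrelated to anything the firm wants or knows.
# All parameter values are hypothetical.

def simulate_calvo(n_firms=10_000, periods=12, theta=0.75, seed=0):
    """Return the fraction of firms whose price never changes at all."""
    rng = random.Random(seed)
    never_changed = 0
    for _ in range(n_firms):
        # The firm's price survives every period's lottery
        if all(rng.random() < theta for _ in range(periods)):
            never_changed += 1
    return never_changed / n_firms

# With theta = 0.75, a firm keeps its price a full year with probability
# 0.75**12, about 3%, purely by chance.
print(simulate_calvo())
```

Nothing in the firm’s situation drives the stickiness here; it is imposed by a coin flip, which is exactly the kind of non-microfounded mechanism the paragraph above objects to.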

Macroeconomists may well defend their approach as a ‘storytelling’ approach, from which they can draw lessons but which isn’t meant to hold in the same manner as engineering theory. Perhaps this is defensible in itself, but (a) personally, I’d hope for better and (b) in practice, this seems to mean each economist can pick and choose whichever story they want to tell based on their prior political beliefs. If macroeconomists are content conversing in mathematical fables, they should keep these conversations to themselves and refrain from forecasting or using them to inform policy. Until then, I’ll rely on macroeconomic frameworks which are less mathematically ‘sophisticated’, but which generate ex ante predictions that cover a wide range of observations, and which do not rely on the invocation of special frictions to explain persistent deviations from these predictions.


  1. #1 by J. Edgar Mihelic on July 24, 2014 - 6:20 pm

Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is long past the ocean is flat again.

    • #2 by Unlearningecon on July 24, 2014 - 6:45 pm

      Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us the season is tempestuous.

      • #3 by yorksranter on July 25, 2014 - 8:16 pm

        That said, a cow that sits down when it starts raining has actually achieved what it set out to – sitting down while the grass is dry.

        Similarly, in tempestuous seasons, it is neither too useless nor too easy a task to observe that a tempest has begun and shorten the sails.

      • #4 by Unlearningecon on July 25, 2014 - 8:51 pm

        Yeah, my response was tongue in cheek – obviously the analogy has its limits. Responding once a crisis happens is useful, but theoretically, attributing each crisis to frictions largely specific to that crisis does not tell us much.

  2. #5 by Bill Janeway on July 24, 2014 - 11:13 pm

These comprise an elegant summary of the challenges posed by the GFC and the GR to EMH, RMH and much of Modern Macro. It is why “2008 is the gift that keeps on giving”, as many economists do seek to “work on it,” often deploying tools such as global games and network theory and even agent-based models that were available prior thereto but off the main sequence. Most importantly, 2008 is motivating the critically needed re-integration of finance and economics at every level of analysis.

  3. #6 by Min on July 25, 2014 - 2:36 pm

Shouldn’t frictions be revealed by looking at the things that are affected by them? Take wages, for example. Occasionally in economics blogs I have seen graphs of year-over-year changes in nominal wages, covering years both before and after the recent financial crisis. They all look similar, and there is some evidence of friction. Using ASCII characters, here is roughly how they look. (I am using periods to try to get the spacing right after wordpress processes this.)

    ……….|
    ……….|
    ……….|/.\
    ………/….\
    ……../……\

    I don’t know how that turned out, but the graphs show what looks like a normal distribution with a positive mean and a high spike at zero. It is this spike at zero that seems to be evidence for friction. The normal distribution shows no other sign that wages are being held down or pulled up.

    The thing is, these graphs all look similar, regardless of macroeconomic conditions, both before and after the crisis. If there are frictions holding wages down under some conditions or boosting them under others, why do these graphs look alike?

    What accounts for the spike at zero? My guess is multi-year contracts plus minimum wages (and near minimum wages). And my guess is also that without the spike, we would get a normal looking distribution with a lower mean. My guess is also that wages are being held down via the zero spike because of the market power of employers, with general economic conditions having little effect.

    • #7 by Unlearningecon on August 15, 2014 - 7:24 pm

      Perhaps this will help answer your question. It uses extensive survey evidence to argue that the major reason wages are sticky is due to considerations of fairness, loyalty and morale inside firms.

  4. #8 by Boatwright on July 26, 2014 - 3:08 pm

    “Two states differing by imperceptible amounts may eventually evolve into two considerably different states … If, then, there is any error whatever in observing the present state — and in any real system such errors seem inevitable — an acceptable prediction of an instantaneous state in the distant future may well be impossible….” Edward N Lorenz, MIT

    This is the fundamental principle of mathematical analysis of complex systems — now accepted as an axiom in all physical science.

    As long as economics is stuck in a fundamentalist, Newtonian determinism it will remain stunted and tautological.

  5. #9 by Boatwright on July 26, 2014 - 3:18 pm

I should add that Lorenz and his many colleagues did not say that complex, chaotic systems are immune to analysis. There is now a large body of work called “Chaos Theory” that addresses this problem.

  1. Serial bloggage | Alex's Archives
  2. The Crisis & Economics, Part 6: “Oh, You Just Mean Macro” | Unlearning Economics
  3. The Crisis & Economics, Part 7: A Case of Myopia? | Unlearning Economics