Saturday 18 January 2014

some great tune

Here's a piece by Charlie Haden called "Nightfall". The piano solo is really beautiful (and so simple!).

Friday 17 January 2014

macro models taken at face value

I've been thinking a bit more about complete markets. Funnily enough, it was the first economic concept I ever encountered, back in my maths undergrad, although all I learnt about it then was that it meant surjectivity of the mapping from portfolios onto the space of state-contingent payoffs, which was good for throwing some measure theory at it.

Basically, in an economy with complete markets it is possible to costlessly write perfectly enforceable financial contracts on any current or future event and to trade them on a competitive market. It's hard to explain in plain language, because it's pretty far from reality. If the world were a roulette game, then complete markets would mean the existence of a competitive exchange on which a set of 37 assets are traded before each spin of the wheel, each paying off a fixed sum in the event that the roulette ball falls into the corresponding pocket.
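To put the roulette example slightly more formally (a standard textbook condition, not specific to any one source): with $S$ states of the world and $J$ traded assets whose state-contingent payoffs are stacked in a matrix $A$, markets are complete when the portfolio-to-payoff mapping is onto, i.e.

$$\text{markets complete} \iff \operatorname{rank}(A) = S.$$

In the roulette economy, the 37 assets are Arrow securities, so $A$ is the identity matrix and any payoff profile whatsoever can be replicated by holding each security in the amount it is supposed to pay.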

The concept is applied quite pervasively in macroeconomics, international economics and finance, and forms the starting point for almost every analysis (at least that's what I've been taught). Imposing it has some very strong consequences. One is that, assuming standard preferences, consumption growth equalises among all agents. So consumption comoves perfectly between any two agents inside a country, and consumption per capita comoves perfectly across countries. In other words, the ratio of consumption per capita never changes across countries and within countries.
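Here is the one-line argument behind that claim (textbook, assuming CRRA utility $u(c) = c^{1-\gamma}/(1-\gamma)$): a complete-markets equilibrium allocation solves a planner's problem with fixed Pareto weights $\lambda_i$, whose first-order conditions equate weighted marginal utilities across any two agents $i$ and $j$ in every date and state:

$$\lambda_i\, u'(c_{i,t}) = \lambda_j\, u'(c_{j,t}) \quad\Longrightarrow\quad \frac{c_{i,t}}{c_{j,t}} = \left(\frac{\lambda_i}{\lambda_j}\right)^{1/\gamma}.$$

The right-hand side is constant over time and across states, so consumption ratios never move and everyone's consumption growth rate is the same.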

Of course, such a model prediction is just nuts. You needn't even look at the data to know it won't fit. Just for fun, I made a few graphs anyway showing consumption per capita across countries. If two countries participate in complete markets, the prediction is that the ratio of their consumption per capita is constant. So here are the relevant ratios for some countries relative to Germany (sorry for the home bias in the choice of reference point). The data are in PPP terms, straight out of the Penn World Tables.
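In case anyone wants to replicate the graphs, here is roughly how the series can be constructed (a minimal sketch: the file name, the country selection, and the PWT column names `ccon` for real consumption and `pop` for population are assumptions to check against the release you download):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load the Penn World Table (file name and sheet are assumptions; adjust
# to the release you downloaded from the Groningen website).
pwt = pd.read_excel("pwt.xlsx", sheet_name="Data")

# Real consumption per capita in PPP terms. The variable names follow
# recent PWT releases, but check your version's documentation.
pwt["cons_pc"] = pwt["ccon"] / pwt["pop"]

# One column per country, then divide everything by the German series.
cons = pwt.pivot(index="year", columns="countrycode", values="cons_pc")
ratios = cons.div(cons["DEU"], axis=0)

# Under complete markets, each of these lines should be perfectly flat.
ratios[["FRA", "GBR", "USA", "SGP"]].plot()
plt.ylabel("consumption per capita relative to Germany")
plt.show()
```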

Here are some Western economies:

[Figure: consumption per capita relative to Germany, selected Western economies]

And some emerging markets:

[Figure: consumption per capita relative to Germany, selected emerging markets]

Of course, consumption per capita relative to Germany is not constant, neither within the Eurozone nor outside of it. A country like Singapore has caught up enormously, while the gap between France and Germany has narrowed (if you believe the PWT, of course).

Clearly, complete markets are not a good way to describe long-run consumption growth across countries. I think that's mostly due to the implicit assumption that agents know, observe and can contract on the exogenous shocks, whereas in reality we don't even know what those shocks are (see also my earlier post).

Unfortunately, complete markets lead to bad predictions at business cycle frequency, too (see Heathcote & Perri). And in closed-economy macro they don't depict reality well either, since they essentially move a model towards a representative-consumer environment, which has been criticised countless times (Noah Smith's latest post on the Euler equation is a case in point).

And yet, complete markets are the assumption you start with in macro and international macro. Why is that? The easiest answer is that it simplifies things in a model. It essentially allows you to have a representative consumer, which is more convenient for anyone who i) doesn't want to spend much time on getting a model started, ii) cares only about the supply side, or iii) wants to draw easy policy conclusions without distributional issues. In any case, you can focus on other things you want to analyse. That's a reasonable point of view: As economists, we produce models as stories to explain certain aspects of the economic sphere, and some models fit some purposes and others fit others. After all, we economists are not trying to have a unified theory that is consistent with all aspects of our data at once. Of course physics, the science we are said to envy, is aiming at exactly that, but that's different.

What I want to get at is that this scientific leap of faith is not particular to complete markets. As economists, we often build models to shed light on some aspect of the economy, complete of course with testing, calibrating, estimating against selected aspects of economic data. But the building blocks of these models, if taken at face value, provide many more predictions about the data than those we analyse, and a lot of them are completely at odds with the data! Calvo pricing, the exogenous TFP process, Modigliani-Miller, the Euler equation, default rates with credit frictions, the representative agent, the CRS production function, Nash equilibrium, even expected utility maximisation, you name it.

Now here's something odd: Sometimes, people use counterfactual model predictions to strike down a paper, and sometimes they are just okay with them. I cannot discern a pattern in this! It makes me feel I'm completely missing something in my field.

Take the attacks on New Keynesian models. Many people say these models aren't valid because they imply that individual firms change prices very infrequently, which is not true in the data. On the other hand, I have never seen anybody criticise a model because it has an Euler equation, although that also implies counterfactual model predictions on consumption growth. Is there a rule that justifies which type of counterfactual model predictions are acceptable and which aren't? Or maybe it's just okay as long as the paper is "convincing"?
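For reference, the Euler equation in question is the standard consumption first-order condition; with CRRA utility (and, approximately, lognormality) it implies a stark prediction for consumption growth:

$$u'(c_t) = \beta\, \mathbb{E}_t\big[(1+r_{t+1})\, u'(c_{t+1})\big] \quad\Longrightarrow\quad \mathbb{E}_t\, \Delta \ln c_{t+1} \approx \tfrac{1}{\gamma}\,(r_t - \rho),$$

where $\rho = -\ln\beta$. Taken at face value, expected consumption growth should depend on the real interest rate and nothing else; in particular, predictable income changes should not help forecast it, a prediction that excess-sensitivity tests have rejected again and again since Hall (1978).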

One candidate for a rule could be that counterfactual predictions are okay as long as they're "microfounded". I'm not sure exactly what microfoundations are, but I take the term to mean that a model outcome is the result of agents' choices in some constrained optimisation problem, with no a priori restriction on choices, only a weighing of costs and benefits. Is this the right way to go? There has been a formidable exchange of views on this position recently (e.g. here, here and here); in any event, the complete markets predictions would pass this test.

Another rule could be that models delivering counterfactual predictions are acceptable if the predictions would actually be accurate under some ideal conditions. That is a very powerful argument, and part of the bread and butter of the natural sciences. One standard comparison is to the concept of gravity in elementary physics. The theory states that the gravitational force pulls any object towards the ground with constant acceleration. In particular, a feather and a cannonball thrown horizontally from the same height are predicted to reach the ground at the same time. That is of course completely at odds with what we usually observe, but only because of air friction (and, as I just learned from Wikipedia, buoyancy). In a vacuum, the "model prediction" fits the data perfectly. So it makes sense to start from the constant-acceleration model and then factor in other effects afterwards, even if that involves more complex calculations.
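In equations, the idealised prediction is mass-free: with constant acceleration $g$ and no drag, an object released from height $h$ reaches the ground after

$$h = \tfrac{1}{2} g t^2 \quad\Longrightarrow\quad t = \sqrt{2h/g},$$

regardless of whether it is a feather or a cannonball. Air friction adds a velocity-dependent drag term to the equation of motion, and it is that term, not a failure of the constant-acceleration "model", that makes the feather land late.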

In the same way, we can argue that consumption behaviour under complete markets, or the representative-agent Euler equation etc., would be accurate depictions of reality in an idealised economic setting. Then, if we want to make predictions for the world we live in, we start from there and incorporate all kinds of frictions. Such is indeed the agenda of a good part of macroeconomics, where entire research programmes study how to extend the standard neoclassical models with heterogeneous agents, occasionally binding borrowing constraints, private information, endogenously incomplete markets, rational inattention and much more.

But it's not so clear whether this is a good strategy in economics, even if it is successful in physics. There, we can actually create the idealised conditions and do experiments - we can generate a vacuum and test constant gravitational acceleration - or, in their absence, run repeated experiments to measure the deviation from the prediction and look for regularities in it. This is difficult in economics, and particularly in macroeconomics, where we can rarely do experiments. As a consequence, the idealised conditions and the predicted behaviour are often not the product of experimental research but of some axiomatic approach, like expected utility maximisation or Nash bargaining. To my taste, the justification that the data don't fit because of some missing "friction" then becomes a bit weak.

But there is still a very pragmatic merit to thinking in these terms: they provide a common, simple framework from which to start thinking about things. We economists might not understand every paper with its particular setup, but we all understand complete markets, and so we can exchange ideas starting from there. And when we want to analyse a new problem, we have a baseline from which to start exploring. In this sense, we function a bit like an expedition into unknown land: we build a basecamp where we are comfortable and safe, and then explore the territory from there. When we get stuck and the terrain becomes too difficult, we can always return to the basecamp to devise a new strategy and try again the next day. That way, when we want to think about inflation inertia, we can start with Rotemberg or Calvo pricing and develop our ideas from there, even if we don't think it's a good description of the world - at least it's a starting point.

I can see nothing wrong with this approach. The only caveat is that we have to be ready to move our basecamp as we explore the territory further. There is no guarantee that the spot we initially picked will remain ideal forever (it might only be a local optimum, so to say). I guess that is what behavioural economists have been trying to convince the rest of the profession of for a while now. They have yet to show that their "basecamp" is the more useful one, but usefulness should really be the main criterion. To use a really heavy-handed analogy: you can perfectly describe the motions of the planets in a geocentric system by putting in lots and lots of "frictions" (epicycles, in that case). But what people eventually discovered was that it is much easier to start from a heliocentric system, which needs far fewer "frictions" to get rid of counterfactual predictions. Of course, moving that basecamp wasn't particularly easy either.

In any case, criticising fundamental building blocks is easy, but coming up with alternatives is hard. The alternatives to the neoclassical DSGE "basecamp" that are out there are obscure and/or unwieldy (naturally, since the status quo has received far more work). Certainly, I don't have the competence to plant a new basecamp. Nor would it be wise to start working on that: first I have to finish my PhD...

Wednesday 15 January 2014

"Grüner Wirtschaftsliberalismus"? Loske in der FAZ

Today the FAZ published an article by Prof. Reinhard Loske (U Witten/Herdecke) that is not only innovative and unconventional, but also excellently written.

In it, Loske sketches a vision of how liberalism in Germany, which has had no political home since (at the latest) the FDP's ejection from the Bundestag, could find new life within the Green party. The combination of ecological goals and liberal core values, he argues, could produce a new model of society, and incidentally form a contrast to the grand coalition with its "penchant for statism, corporatism and 'big solutions'". According to Loske, this idea is currently being discussed more and more within the Green party itself.

For me, the article came as something of a revelation, because liberalism is really not something I had ever associated with the Greens. In my mind, the Greens have always been a party of prohibitions, made up of humanities graduates with no feel for the economy, ageing sixty-eighters and sugar-free mothers. That probably doesn't match objective reality, but it does match my subjective experience. Just one example: the B31 is an important traffic artery in southern Germany and runs right through Freiburg im Breisgau, a stronghold of the Greens. Plans to relieve the city of through traffic have existed since the 1970s. But only in 2002 did the eastern part of Freiburg get a tunnel, and the western part is still in the planning stage. Tight public finances are not the only culprit; resistance from the Greens is another. They refused to accept the encroachment on nature that the tunnel construction entailed - the tree climbers who tried to stop construction at the last minute remain unforgotten - and argued, moreover, that the tunnel would create more incentives to drive, when the right solution was supposedly a shift to more ecological modes of transport such as rail. They also pushed through a 30 km/h speed limit from 10 pm to 6 am on the section that still runs above ground. That may make the B31 the only four-lane road in Germany with a 30 km/h speed limit. Far-sighted urban planning looks different.

This example joins a whole series of cases in which the Greens have appeared above all as the party of standstill. Among them: the disastrous protests against Stuttgart 21 on grounds of tree preservation and bark beetle protection, similar protests against the Airbus plant in Hamburg, and a fundamentally hostile attitude towards most high technologies, such as nuclear power, genetic engineering, the automobile and mobile telephony. Nor is free trade exactly a cherished ideal of the Greens and their supporters. Their means of achieving political goals are often blockade, prohibition or regulation; the latter shows up, for example, in the Greens' fight for a women's quota. I am by no means an opponent of green goals such as a more cautious approach to nuclear power or more equal opportunities for women. But the picture I have had of the Greens so far has about as much to do with liberalism as a G8 protest has with a G8 summit.

All the more promising, then, that Reinhard Loske now calls for a synthesis of ecology and liberalism. Can it be achieved with this party? I don't know. But the tension between the two strikes me as one of the great ones of our time, and its synthesis would have the potential to become a major political current.

Loske states the tension pointedly:
Could it not also be that some ecological insights and necessities simply cannot be reconciled with political liberalism? Must not anyone who argues for a regionalisation of economic activity view global free-trade regimes critically per se? Must not anyone who has identified the overconsumption of the rich industrialised countries as one of the main causes of the environmental crisis reject the glittering world of goods and its constant expansion as a matter of principle? In short: must not anyone who has recognised planetary boundaries as a reality spell out "freedom in responsibility" quite differently from the liberal, for whom individual self-fulfilment is the highest good?
The synthesis he proposes is, befitting the size of the tension, only fragmentary. But I did like the passage on the energy transition:
Although it is clear to everyone that the largest and cheapest potential for avoiding carbon dioxide lies in saving energy, vast amounts of money flow into the uncoordinated expansion of renewables, which are moreover increasingly proving to be a blight on the landscape. [...] A policy that combines ecological and liberal goals will start here. On the one hand, it will reliably fix clear climate targets over a long horizon, so that all actors know where they stand. On the other, it will end the subsidies for fossil fuels and scale back those for renewables faster and more decisively; it will give the ecological tax reform a second chance and turn emissions trading back into a sharp sword of climate policy. In short: it will do everything to make prices tell the ecological truth and thereby create incentives for more intelligent energy use.
I would sign up to this approach immediately. Liberalism has increasingly been pushed into a corner by reality. The problem of environmental pollution, but also the financial crisis of recent years, expose ever sharper limits of the vision of the "invisible hand". At the same time, the collapse of the Soviet Union demonstrated that market-based and liberal principles are indispensable for a modern society to function. There is no question that liberalism must remain part of the foundation of our society.


Economists express the limits of liberalism through the concepts of "incomplete markets" and "externalities". These are what make Adam Smith's invisible hand, in the guise of the "first welfare theorem", break down. In my interpretation of an ecological liberalism, this insight is the starting point. The right response, however, is not to rein in citizens' self-interest with prohibitions, but to set the right incentives through cleverly designed social mechanisms. In the case of the energy transition, that means letting energy prices reflect ecological costs, and otherwise leaving it to citizens and their markets to work out how energy efficiency is best achieved. It would mean taking CO2 emissions trading seriously (which is currently not the case, see here), increasing competition between energy suppliers and, in return, not hiding rising energy prices from consumers. This ecological liberalism fully embraces Friedrich Hayek's central message: that state bureaucratic planning can never process the information that markets coordinate in a decentralised way. Where the markets of free citizens fail, they must be carefully restructured, not replaced. Citizens' freedom must not be curtailed; instead, making the free choice that serves society must be made attractive to them. And if that happens even without state intervention, all the better.

Wednesday 8 January 2014

business cycle insurance

I am trying to write a paper on the economic impact of an EU-wide unemployment insurance mechanism (joint with Stephane Moyen and Nikolai Staehler from the Bundesbank). Such a mechanism has been suggested by, among others, European Council President Van Rompuy and IMF staff. The basic idea is that unemployment insurance might be a simple and implementable way of sharing cross-country risk: country-specific booms and recessions could be mitigated through transfers among EU member states, taking place through differences in contributions paid into and benefits drawn from a common unemployment insurance scheme.

But after a few discussions, I have had to learn that cross-country insurance is a hard sell. This could be down to my meagre sales skills, but I suspect there is something deeper behind the resistance to this type of business cycle insurance.

That insurance in general is desirable is the same as saying that people are risk averse, which is not all too disputable. If you know that your house can be destroyed in a fire, then you will probably want to take out a fire insurance contract. Most of the time, you pay the insurance company money without getting anything in return, but in the event of a fire you get a lot of money to compensate for the damage to your house. This prevents you from having to make big cuts in your consumption and run down your lifetime savings when you need to rebuild. On the other side of the contract, the insurance company can offer you this deal because it pools risk: it receives small payments from a large number of people whose houses don't burn and makes large payments to the few whose houses do.
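The welfare logic is easy to make concrete (a toy calculation with made-up numbers, just to illustrate why a risk-averse agent takes actuarially fair insurance):

```python
# Toy fire-insurance example with CRRA utility (all numbers made up).
wealth, loss, p = 100.0, 80.0, 0.01   # initial wealth, fire damage, fire probability
gamma = 2.0                           # coefficient of relative risk aversion

def u(c):
    return c ** (1 - gamma) / (1 - gamma)

premium = p * loss                    # actuarially fair premium

eu_uninsured = p * u(wealth - loss) + (1 - p) * u(wealth)
eu_insured = u(wealth - premium)      # full insurance: consumption is certain

print(eu_insured > eu_uninsured)      # True: the risk-averse agent prefers insuring
```

The insurer breaks even on average, while the homeowner trades a small certain premium for protection against a large loss; concavity of the utility function is all it takes for the trade to be mutually acceptable.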

That fire insurance story is so commonplace it's almost a waste of space. But now replace "house" with "country" and "fire" with "recession", and it becomes a highly controversial issue. Why? What's wrong with building a cross-country insurance mechanism in the EU, for example through a common unemployment insurance system, which every year pays out transfers to countries in recession and collects premia from countries with healthy economies?

There are numerous attacks on the proposal: the difficulty of distinguishing recessions from structural change, political viability, the size of the welfare gains, and the correlation of business cycles across countries, to name just some of the most frequent ones. But the one argument that dominates any debate about a European "transfer union" is moral hazard: Germans don't want to pay for Spaniards and Greeks because they suspect that doing so prevents reforms in those countries and perpetuates the alleged laziness of their workers and politicians.

Moral hazard in general is a powerful argument against insurance. It can also be applied to fire insurance, where it amounts to saying that taking out insurance will make homeowners set fires on purpose, or make them less willing to take precautionary measures against fires. The first problem is not very large: few people want to commit insurance fraud if it means setting their own house on fire. The second problem is more relevant, and insurers write clauses that preclude payment when obvious fire prevention measures haven't been taken. But nobody questions the usefulness of fire insurance in general. This is because fire is regarded as an exogenous event: we believe it generally happens to people unexpectedly and without them being able to affect its occurrence.

Modern macroeconomic modelling practice sees a recession in much the same way as a fire: it is caused by an exogenous shock, outside the control of agents in the model. Unexpectedly, productivity drops, credit tightens, people become thrifty or lazy. Yet many people, including economists I have spoken to about our paper, are worried that introducing insurance against business cycle risk would somehow lead to more, longer or deeper recessions, as if this were something in people's control.

So do economists really believe that business cycles are caused by exogenous shocks? I don't think so. It's simply too much to stand up and say that a recession is just "bad luck". It's also not what economic advisors are paid for. Instead, we economists spend most of our time explaining the latest business cycle episode as an endogenous build-up: now, the financial crisis was caused by reckless overborrowing and the accumulation of systemic financial risk, whereas before the crisis, the preceding boom was said to be the endogenous product of successful financial deregulation, "anchored expectations" and lower trade costs, among other things. To put this in the European context: the fact that Germany is not in a recession while Spain is gets attributed, most often, to differences in labour market, industrial and general fiscal policies. It is not attributed to Spain drawing a random bad shock and Germany a random good one. And if business cycles are the product of the vice and virtue of governments and citizens, then obviously you shouldn't insure them against bad outcomes!

The problem is that any narrative about endogenous causes of business cycles runs into problems with modern macroeconomic and macroeconometric methods. There, we model business cycles not as "cycles" but as the outcome of a small set of random shocks. Econometrically, business cycles are hard to predict, so estimations attribute them to random shocks. Theoretically, endogenous business cycles are really hard to produce as well: if cycles are endogenous, their existence is due to agents' choices. But standard theories are built on rational expectations, optimising agents, perfect information and efficient markets, which means that agents who dislike large fluctuations will not make such choices.
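To see what "a small set of random shocks" means in practice, here is a minimal sketch of the standard device, an exogenous AR(1) productivity process (parameter values are purely illustrative, not any particular paper's calibration):

```python
import numpy as np

# Exogenous TFP process: log z_t = rho * log z_{t-1} + eps_t. This is the
# canonical driving force of an RBC-style model.
rho, sigma, T = 0.95, 0.007, 200
rng = np.random.default_rng(0)

log_z = np.zeros(T)
for t in range(1, T):
    log_z[t] = rho * log_z[t - 1] + rng.normal(0.0, sigma)

# Persistence turns i.i.d. innovations into long, recession-like swings,
# yet a "recession" here is nothing but a run of bad draws - there is no
# endogenous build-up anywhere in the process.
print(f"share of periods below trend: {np.mean(log_z < 0):.2f}")
```

Everything cyclical in such a model is inherited from this process; the agents merely respond to it optimally.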

In principle, the same urge for endogenisation applies to houses on fire. If your house burns down because of a leaking gas pipe, even if this gas pipe was properly and regularly checked, you will probably blame it on the gas engineer, or on your decision to have a house with gas heating. But at some point, at least from a social perspective, we accept that fires just sometimes happen, that it is bad luck, and that people should be able to insure against it. The only place where we think otherwise is insurance fraud. But as mentioned, that is a rare thing. Likewise, it would probably be strange to argue that the main moral hazard issue in cross-country insurance is that it would induce the governments of Southern Europe to trigger recessions on purpose in order to claim transfers from the North. After all, how many people would want to set their own house on fire even if they get some insurance money for it?

But of course, it's not only about what caused the fire; it's also about what you did to prevent it, or to mitigate its spread through your house. In the same spirit, one can argue that recessions are indeed caused by exogenous shocks, but that how a country's economy reacts to those shocks is endogenous. For example, Spain could make itself more resistant to business cycles by reforming its labour market towards more flexibility; a European insurance mechanism might destroy the incentives to do so. That is a valid point.

There is a great Econometrica paper by Persson and Tabellini that makes this point theoretically, using a stylised, static political economy model. Unfortunately, though, it is hard to make the point within the context of business cycle macroeconomics, which hardly ever considers endogenous government policies. Bridging this gap between political economy and business cycle theory seems a daunting task. And how would one go about calibrating the moral hazard of national governments to data? The deeper problem here, I think, is that economics offers no good framework for how governments actually make choices; and the only framework for how they should make choices, namely as benevolent, rational, optimising social planners, is inadequate to the problem.

Monday 6 January 2014

Mann ohne Eigenschaften 28

Unfortunately nothing is so difficult to represent by literary means as a man thinking. A great scientist, when he was once asked how he managed to hit upon so much that was new, replied: “By keeping on thinking about it.” And indeed it may safely be said that unexpected inspirations are produced by no other means than by the expectation of them. To no small extent they are a success due to character, permanent inclinations, unflagging ambition and persistent work. How boring such persistence must be! And then again, from another aspect, the solution of an intellectual problem comes about in a way not very different from what happens when a dog carrying a stick in its mouth tries to get through a narrow door: it will go on turning its head left and right until the stick slips through. We do pretty much the same, only with the difference that we do not go at it quite indiscriminately, but from experience know more or less how it should be done. And although of course a head with brains in it has far more skill and experience in these turnings and twistings than an empty one, yet even for it the slipping through comes as a surprise, it is something that just suddenly happens; and one can quite distinctly perceive in oneself a faintly nonplussed feeling that one’s thoughts have created themselves instead of waiting for their originator. This nonplussed feeling refers to something that many people nowadays call intuition, whereas formerly it used to be called inspiration, and they think they must see something supra-personal in it; but it is only something nonpersonal, namely the affinity and kinship of the things themselves that meet inside one’s head.