And the 2017 Economics Nobel Prize goes to ...


Today’s announcement of a Nobel Prize for Richard Thaler is richly deserved, and I congratulate the Nobel committee for recognising the growing influence of behavioural economics, a field that Richard helped to create. The award is a significant ‘nudge’ towards recognising beliefs as fundamental, an idea that I use in my own work in a macroeconomic context. In July, I co-organised a conference at the Bank of England on the connection between behavioural economics and macroeconomics, so I am pleased that the connection of psychology to economics will be more widely perceived as significant with the award of this year’s prize.

Richard Thaler’s work is widely cited as recognising that human beings are not rational and, in a very narrow sense, that is true. On hearing that he had won the Nobel Prize, Richard is quoted as saying that the most important impact of his work is the recognition that “economic agents are humans” and that “money decisions are not made strictly rationally”.

Rationality means many things to many people, and there are both broad and narrow definitions of the term. Under the broad definition, one that I have always liked, it is an organising principle that categorises human action. Rationality means that we always choose our preferred action. What is our preferred action? It is the one we choose. This idea is captured in Samuelson’s discussion of revealed preference in Foundations of Economic Analysis. Although rationality by this definition is a tautology, it is a useful tautology that plays the same role in economics as the Newtonian concept of “action at a distance”.

There is another, much narrower, definition of rationality that is formalised in a set of axioms introduced by John von Neumann and Oskar Morgenstern in their magisterial tome, the Theory of Games and Economic Behavior. Those axioms make a great deal of sense when applied to choice over monetary outcomes. They make much less sense when applied to complex decisions that involve sequential choices and payoffs in different commodities at different points in time. It is this second definition of rationality that has been shown to be violated in experimental situations, and it is the take-off point for Thaler’s work on how best to present choices to people in ways that help them make ‘good’ decisions.

If you want to know more about Richard’s work, I highly recommend the book Nudge, where you will learn about these ideas in the words of Richard and his co-author Cass Sunstein. That work has already found its way into public policy decisions and, in the U.K., led to the creation of the ‘Nudge’ unit, an arm of the U.K. government that uses Thaler’s work to influence public decisions.

Keynesian Economics Without the Phillips Curve

Policy makers at central banks have been puzzled by the fact that inflation is weak even though the unemployment rate is low and the economy is operating at or close to capacity. Their puzzlement arises from the fact that they are looking at data through the lens of the New Keynesian (NK) model in which the connection between the unemployment rate and the inflation rate is driven by the Phillips curve.


In a recent paper, written jointly with Giovanni Nicolò, we compared two models of the interest rate, the unemployment rate and the inflation rate. One, the NK model, consists of a demand equation, a policy rule and a Phillips curve. The other, the Farmer Monetary (FM) model, replaces the Phillips curve with a new equation: the belief function. We show that the FM model outperforms the NK model by a large margin when used to explain United States data.

To make this case, we ran a horse race in which we assigned equal prior probability to the two models. The FM model shares the demand curve and the policy rule with the NK model, but replaces the Phillips curve with the belief function.

The belief function captures the idea that psychology, aka animal spirits, drives aggregate demand. It is a fundamental equation with the same methodological status as preferences and technology. To operationalise the belief function, we assumed that people forecast future nominal income growth from observations of current nominal income growth. If x is the percentage growth rate of nominal GDP this year and E[x’] is the expected growth rate of nominal GDP next year, we assumed that E[x’] = x.
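Here is a minimal sketch, in Python, of what that assumption implies. The additive demand shock is purely illustrative (it is not the structure of the estimated model), but it shows why a belief function of this form leaves nominal income growth with no anchor pulling it back to a fixed mean:

    import random

    random.seed(1)
    x = 5.0                    # nominal GDP growth this year, in percent
    path = [round(x, 2)]
    for year in range(20):
        expected = x                       # belief function: E[x'] = x
        x = expected + random.gauss(0, 1)  # illustrative demand shock
        path.append(round(x, 2))

    print(path)  # growth wanders; shocks are permanent, not mean-reverting

Because this year’s outcome becomes next year’s belief, every shock is built into the baseline from which the economy moves on.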

We estimated both models using Bayesian statistics and we compared their posterior probabilities. Our findings are summarised in Table 2, reproduced from our paper.  The table reports what statisticians call the posterior odds ratio. As is common in this literature, we compared the models over two separate sub-samples; one for the period from 1954 to 1979 and the other from 1983 to 2007. Our findings show that an agnostic economist who placed equal prior weight on both theories would conclude that the FM model knocks the NK model out of the ball park. The data overwhelmingly favours the FM model.
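For readers unfamiliar with the term: by Bayes’ theorem, the posterior odds ratio of two models equals the prior odds multiplied by the ratio of the models’ marginal data densities, the Bayes factor. Since we assigned equal prior weights, the prior odds are one and the posterior odds ratio reduces to

posterior odds = p(data | FM) / p(data | NK)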

[Table 2: posterior odds ratios, reproduced from the paper]

We explain our findings in the paper by appealing to a property that mathematicians call hysteresis.

Conventional dynamical systems have a stable steady state that acts as an attractor. The economy will converge to that steady state, no matter where it starts. The FM model does not share that property. Although the economy follows a unique path from any initial condition, the FM model has a continuum of possible steady states and which one the economy ends up at depends on initial conditions.
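The distinction is easy to see in a pair of toy difference equations. These are illustrative only, not the FM model’s equations:

    # A unique attracting steady state versus a continuum of steady states.

    def mean_reverting(x):
        return 0.9 * x   # unique steady state at 0: converges from anywhere

    def hysteretic(x):
        return x         # every point is a steady state: you stay where you start

    for start in (2.0, -3.0):
        a = b = start
        for _ in range(100):
            a, b = mean_reverting(a), hysteretic(b)
        print(f"start {start}: mean-reverting -> {a:.4f}, hysteretic -> {b:.1f}")

The first system forgets its history; the second remembers it forever. That memory is hysteresis.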

The FM model explains the data better than the NK model because the unemployment rate in US data does not return to any single point. In some decades the average unemployment rate is 6%; in others it is 3%. And in the Great Depression it did not fall below 15% for a decade. The unemployment rate, the inflation rate and the interest rate are so persistent in US data that they are better explained as co-integrated random walks than as mean-reverting processes. The FM model captures that fact. The NK model does not.

What does it mean for two series to be co-integrated? I have explained that idea elsewhere by offering the metaphor of two drunks walking down the street, tied together with a rope. The drunks can end up anywhere, but they will never end up too far apart. The same is true of the inflation rate, the unemployment rate and the interest rate in the US data.
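The metaphor is easy to simulate. The shock sizes and the strength of the rope below are made-up numbers, chosen only for illustration:

    import random

    random.seed(2)
    a = b = 0.0
    for _ in range(1000):
        gap = a - b                            # how far apart the drunks are
        a += random.gauss(0, 1) - 0.1 * gap    # pulled back when too far ahead
        b += random.gauss(0, 1) + 0.1 * gap    # pulled forward when behind

    print(round(a, 1), round(b, 1), "gap:", round(a - b, 1))
    # each series wanders far from zero; the gap between them stays bounded

Each series on its own is (approximately) a random walk; the error-correction term, the rope, keeps the difference between them stationary.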

As I have argued on many occasions, the NK model is wrong: there has been no stable Phillips curve in the data of any country I am aware of, ever since Phillips wrote his eponymous article in 1958. My paper with Giovanni provides further empirical evidence for the Farmer Monetary model, an alternative paradigm that I have written about in a series of books and papers. Most recently, in Prosperity for All, I make the case for active central bank intervention in the asset markets as a complementary approach to interest rate control.

In a separate paper, Animal Spirits in a Monetary Model, Konstantin Platonov and I have explored the theory that underlies the empirical work in my joint work with Giovanni. The research programme we are engaged in should be of interest to policy makers in central banks and treasuries throughout the world who are increasingly realising that the Phillips curve is broken. In Keynesian Economics Without the Phillips Curve, we have shown how to replace the Phillips curve with the belief function, an alternative theory of the connection between unemployment and inflation that better explains the facts.  

On Refereeing: Do we have Confidence in our Economic Institutions?


Like most academics, I spend much of my time asking for money from research councils. So it is a welcome change for me to sit on the other side of the table in my role on the management team of Rebuilding Macroeconomics. This is an initiative located at the National Institute of Economic and Social Research in the UK and funded by the Economic and Social Research Council. Our remit is to act as gatekeepers, distributing approximately £2.4 million over the next four years to projects that have the potential to transform macroeconomics back into a truly policy-relevant social science. We are seeking risky projects, combining insights from different disciplines, that would not normally be funded, and we expect that not all of them will succeed. It is our hope that one or more of the projects we fund will lead to academic advances and new solutions to the pressing policy issues of our time.

In addition to my role on the management team of Rebuilding Macroeconomics, I am Research Director at NIESR. I have already learnt a great deal from discussions with the other members of the team. The committee consists of myself, Angus Armstrong of NIESR, Laura Bear of LSE, Doyne Farmer of Oxford University, and David Tuckett of UCL. Laura is a Professor of Anthropology who has worked extensively on the anthropology of the urban economy, and she brings a refreshing perspective to the sometimes-insular world of economics. Doyne Farmer, no relation, is a complexity theorist who runs the INET Complexity Economics Centre at the Oxford Martin School. Doyne was trained as a physicist and he has a long association with the Santa Fe Institute in New Mexico. And last, but by no means least, David Tuckett is a psychologist at University College London, where he directs the UCL Centre for the Study of Decision Making Under Uncertainty. As you might imagine, conversations among this diverse group have been eye-opening for all of us.

We have chosen to allocate funds by identifying a number of ‘hubs’ that are loosely based around a set of pressing public issues. So far, we have identified three: 1) Can globalisation benefit all? 2) Why are economies unstable? and 3) Do we have confidence in economic institutions? In this post, I want to focus on the third of these questions, which evolved from conversations among the management team and which Laura and I have spent quite a bit of time refining.

We can break institutions into two broad groups: academic institutions that shape the culture of economists, and government and policy institutions that transmit this culture to the wider public sphere. Research on academic institutions involves the organisation of economic education in universities, the journal structure, the rules for promotion and tenure in academic departments, and the socialisation and seminar culture of the tribe of the Econ. Research on policy-making institutions like the Bank of England, the Treasury and the IMF involves the way that insular thinking, learned in graduate schools, is transmitted to society at large.

Insular thinking is reflected, for example, in economic journal publishing, a process that is highly centralised around five leading journals: the American Economic Review, the Quarterly Journal of Economics, the Review of Economic Studies, Econometrica and the Journal of Political Economy. For a young, newly appointed lecturer, publishing a paper in one of these top five journals is a prerequisite for promotion in a leading economics department in the United States, the United Kingdom and many of the top Continental European departments. The process is more often than not depressingly slow. Even for a well-established leading economist, publication in a top five journal is never guaranteed. And when a paper is finally published, it is often after rejection from three or more other journals and the collective efforts of a coterie of referees. This experience, as I learned from Doyne, is not characteristic of the natural sciences.

Normal
0




false
false
false

EN-US
X-NONE
X-NONE

 
 
 
 
 
 
 
 
 
 


 
 
 
 
 
 
 
 
 
 
 


 <w:LatentStyles DefLockedState="false" DefUnhideWhenUsed="false"
DefSemiHidden="false" DefQFormat="false" DefPriority="99"
LatentStyleCount=…

                         From Redpen/Blackpen twitter feed

In economics, the expected time from writing a working paper to publication in a journal is around four years. That assumes that the researcher is shooting for a top journal and is prepared to accept several rejections along the way. When a paper is finally accepted it must, more often than not, be extensively rewritten to meet the proclivities of the referees. In my experience, not all of the referee reports lead to improvements. Sometimes, the input of dedicated referees can improve the final product. At other times, referee comments lead to monstrous additions as the editor incorporates the inconsistent approaches of referees with conflicting views of what the paper is about. 

It is not like that in other disciplines. I will paraphrase from my memory of a conversation with Doyne, so if you are a physicist or a biologist with new information, please feel free to let me know in the comment section of this blog. In physics, a researcher is rightfully upset if she does not receive feedback within a month, and that feedback involves short comments and an up-or-down decision. There is far less of a hierarchy of journals: publication is swift, and many journals carry equal weight in promotion and tenure decisions.

I do not know why economics and physics are so different but I suspect that it is related to the fact that economics is not an experimental science. In macroeconomics, in particular, there are often many competing explanations for the same limited facts and it would be destructive to progress if every newly minted graduate student were to propose their own new theory to explain those facts. Instead, internal discipline is maintained by a priestly caste who monitor what can and cannot be published. 

The internal discipline of macroeconomics enables most of us to engage in what Thomas Kuhn called ‘normal science’. But occasionally there are large events, like the Great Depression of the 1930s, the Great Stagflation of the 1970s or the Great Recession of 2008, that cause us to re-evaluate our preconceived ideas. A journal culture that works well in normal times can, in periods of revolution, become deeply suppressive and destructive of creative thought. Now would be a very good time to re-evaluate our culture and perhaps, just perhaps, we can learn something from physics.

Where's the Inflation? Where's the Beef?

In a 1984 advertising campaign, Wendy’s Hamburgers featured the character actress Clara Peller. Clara peers disappointedly at a burger from a rival chain that, while well stocked with bread, has remarkably little meat. Her rallying cry, “Where’s the beef?”, was taken up as a political slogan by presidential candidate Walter Mondale, and it captured the imagination of a generation.


Today, as we stare at a Fed balance sheet of $4.5 trillion and rates of price change at or below 2%, one can envisage a millennial Clara Peller metaphorically peering at a bloated Fed balance sheet and pleading: Where’s the inflation?

Greg Hill has pointed out that, in my 2009 review of Akerlof and Shiller’s book ‘Animal Spirits’, I made the following claim: “History has taught that a massive expansion of liquidity will lead to inflation”. My review was designed to be critical of slavish applications of 1950s Keynesian remedies to twenty-first-century problems. I stand by that critique. There is a reason we rejected Keynesian economics in the 1970s: it didn’t work the way it was supposed to. In particular, Keynesian economics had nothing to say about the most important economic issue of the 1960s and 1970s, the simultaneous appearance of inflation and unemployment, for which the British politician Iain Macleod coined the term ‘stagflation’.

In the 1960s, the U.S. government borrowed to pay for the Vietnam War and, rather than raise politically unpopular taxes, it paid for new military expenditures by printing money. Milton Friedman pointed out, correctly, that printing money would eventually lead to inflation. If printing money leads to inflation, why has a more than fivefold expansion of the Fed balance sheet, from $800 billion in 2006 to $4.5 trillion in 2017, not been accompanied by an increase in prices?

Modern theories of inflation are based on Milton Friedman’s celebrated restatement of the quantity theory of money. (Aside: if you are a student of macroeconomics and you have not read Friedman’s essay, you are being short-changed by your professor.) Friedman was building on the earlier work of quantity theorists (see, for example, Hume’s essay Of Money) who built a theory of inflation around the definition of the velocity of circulation, v, as the ratio of nominal GDP to the stock of money:

(1)      v = (P × Y)/M

Here, P is a price index, Y is real GDP and M is the quantity of money. According to the Quantity Theory of Money, Y is equal to potential GDP, Y*:

(2)      Y = Y* 

and v is constant. If Y grows at the rate of potential GDP and v is a constant, then the rate of price inflation is, mechanically, equal to the rate of money creation minus the growth rate of potential GDP. It was that fact that led Friedman to proclaim that “inflation is always and everywhere a monetary phenomenon”.
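To see the arithmetic, take growth rates of equation (1). If μ is the growth rate of the money stock, g is the growth rate of potential GDP and π is the inflation rate, then equation (2) and a constant v imply

(3)      π = μ − g

so that, for example, 7% money growth against 3% potential growth delivers 4% inflation. But what if the velocity of circulation is not a constant?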

Friedman’s restatement of the quantity theory of money improved over earlier versions of the quantity theory by recognizing formally that the velocity of circulation is a function of a spectrum of interest rates on alternative assets. In its simplest form, Friedman’s restatement implies that money is like a hot potato that is passed from hand to hand more quickly when the interest rate increases.

[Figure 1: the velocity of circulation and the three-month T-bill rate]

Figure 1 plots the velocity of circulation on the horizontal axis against the interest rate on three-month Treasuries on the vertical axis. The graph is upward sloping as long as the interest rate is positive. It is horizontal when the interest rate is zero, a feature that Keynes referred to as ‘the liquidity trap’.

The graph of velocity against the interest rate flattens out as the interest rate approaches zero because, at zero rates, money and bonds become perfect substitutes. Like a glutton who has eaten so much he cannot stomach one more hamburger, at zero interest rates people are satiated with liquidity and have no further use for additional cash for day-to-day transactions. If the Fed buys T-bills and replaces them with dollar bills, people will be content to hold the extra cash rather than spend it. This observation leads me to remark that what I should have said in my 2009 review of Akerlof and Shiller was: “History has taught that a massive expansion of liquidity will lead to inflation: [except when the interest rate is zero]”.
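Here is a minimal sketch of a demand-for-money curve with this shape. The functional form and the parameter values are hypothetical, chosen only to mimic Figure 1, not taken from any estimated model:

    # Hypothetical mapping from velocity to the T-bill rate, mimicking Figure 1:
    # the rate rises with velocity above a satiation level, and the curve is
    # horizontal at zero below it: the liquidity trap, where money and bonds
    # are perfect substitutes.

    def tbill_rate(v, v_sat=1.5, slope=2.0):
        """Interest rate (percent) consistent with velocity v."""
        return max(0.0, slope * (v - v_sat))

    for v in (1.0, 1.4, 1.5, 1.8, 2.5):
        print(f"v = {v:.1f}: i = {tbill_rate(v):.1f}%")
    # any velocity at or below v_sat is consistent with a zero interest rate

The flat segment is what breaks the link between money creation and inflation at the zero lower bound: extra money moves the economy along the horizontal portion of the curve instead of raising prices.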

A final word of caution. When reserves of private banks at the Fed pay interest, as they do now, the opportunity cost of holding money is not the T-bill rate. It is the T-bill rate minus the reserve rate. If the Fed raises the interest rate and continues to pay interest on excess reserves, the connection between velocity and the interest rate will remain permanently broken. In that case, the graph that I plotted in Figure 1 will not continue to characterize future data, even if the T-bill rate increases above zero. I wrote about that issue here, where I pointed out that the impact of monetary tightening on inflation will depend very much on how central banks tighten. Stay tuned to this spot and don’t trust your favourite interpreter of the doctrine of Keynes. When the Keynesian prophets call for more of the same without explaining why their policies failed us in the Great Stagflation, take your cue from Clara Peller and ask them loudly: Where’s the beef?

Indeterminacy, the Belief Function and Reinventing IS-LM

This is my final post featuring research presented at the conference on Applications of Behavioural Economics and Multiple Equilibrium Models to Macroeconomic Policy, held at the Bank of England on July 3rd and 4th, 2017.

Today I will talk about the work of two of my graduate students and co-authors, Giovanni Nicolò and Konstantin Platonov. Both of them gave presentations at the conference.


Giovanni is in his final year of the Ph.D. programme at UCLA and he will be looking for a job this coming January at the annual ASSA meetings. This year they will be held in Philadelphia. He has already published one paper in the Journal of Economic Dynamics and Control, co-authored with Vadim Khramov and me. He has a paper co-authored with Francesco Bianchi that is under revision for Quantitative Economics, and a third paper, co-authored with me, Keynesian Economics Without the Phillips Curve, that we wrote for an upcoming conference in Gerzensee, Switzerland, in October. At the Bank of England conference, Giovanni presented a fourth paper, his single-authored job-market paper, “Monetary Policy, Expectations and Business Cycles in the U.S. Postwar Period”.


Giovanni’s research is on the empirics of models with multiple equilibria and sunspots. He began working on this topic when Vadim and I invited him to join us on the project “Solving and estimating indeterminate DSGE models” (Farmer, Khramov and Nicolò, FKN), which now appears in the JEDC. In that paper, we showed how to use standard software packages, such as Chris Sims’s MATLAB code GENSYS and the computational package DYNARE, to solve models in which the steady state is indeterminate. This has been a hot topic in empirical macroeconomics ever since Thomas Lubik and Frank Schorfheide showed, in 2004, that the Federal Reserve Board, prior to 1979, followed a policy under which the equilibrium of the economy was indeterminate and subject to non-fundamental belief shocks, aka sunspot fluctuations.

In order to estimate a model driven by sunspots, the researcher must make distributional assumptions about the nature of non-fundamental uncertainty and about how it co-varies with the fundamental shocks to demand and supply. The parameters of this distribution are part of what I call the belief function. Before we wrote our paper (FKN 2015), researchers who wanted to estimate an indeterminate model by applying the Lubik-Schorfheide method faced a complicated programming problem. We showed how to side-step this computational problem and instead estimate an indeterminate model using the widely used software package DYNARE.
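In outline, and compressing the paper into a paragraph, the idea is this. Let x^e(t) denote the forecast of x(t+1) made at date t and append to the model the identity

x(t) = x^e(t−1) + η(t)

where η(t) is the forecast error. In a determinate model, the solution pins down η(t) as a function of the fundamental shocks. In an indeterminate model it does not, and the FKN approach reclassifies η(t) as an additional fundamental shock whose variance, and covariance with the other shocks, become parameters to be estimated. The augmented model is determinate in form, so DYNARE can solve and estimate it with its standard tools.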

In his paper with Francesco Bianchi, Giovanni took this agenda one step further. To estimate an indeterminate DSGE model using the FKN method, the researcher first needs to know whether a particular parameterization of the model is determinate or indeterminate. For a simple model, such as the three-equation New Keynesian model, it is possible to partition the parameter space into determinate and indeterminate regions analytically. For more complicated models, no such analytic partition is possible. Bianchi and Nicolò (BN 2017) develop a computational method for which no analytic expression is needed. Their work allows researchers to estimate medium- to large-scale models without imposing the assumption, a priori, that all of the shocks to the model are fundamental.
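To see what an analytic partition looks like in the simplest case, consider the textbook three-equation model with the policy rule i(t) = φ_π π(t) + φ_y y(t). A standard determinacy condition, the generalised Taylor principle (this is the textbook result, not a result of the BN paper), is

φ_π + ((1 − β)/κ) φ_y > 1

where β is the discount factor and κ is the slope of the Phillips curve. When the inequality is reversed, the equilibrium is indeterminate and sunspot shocks can matter. No comparably clean inequality exists for medium-scale models, and that is the gap the Bianchi-Nicolò method fills.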

Models of indeterminacy are identified in data by the fact that they have richer propagation mechanisms than models with a unique determinate steady state. Giovanni’s independent work takes off from the observation (Beyer and Farmer 2004) that it may in practice be difficult to tell the difference between models with an indeterminate steady state and models with a unique steady state but richer internal propagation mechanisms. To put his method through its paces, he estimates a complete medium scale DSGE model of the type constructed by Frank Smets and Raf Wouters. He finds that the Lubik-Schorfheide result carries over to the complete Smets-Wouters model, a result that could not have been discovered without the method that Giovanni developed with Francesco. This is a very nice piece of work and if you are looking to hire an exceptionally smart young macroeconomist with strong theoretical and empirical skills, Giovanni comes highly recommended!  You can hear him discuss his research in the attached video clip.


The final conference paper that I will discuss in this series, “Animal Spirits in a Monetary Economy”, was co-authored by myself and Konstantin Platonov. Konstantin presented our paper at the conference and we wrote about our work for VOX here.

I have been critical of the IS-LM model in several of my posts. My paper with Konstantin fixes some of the more salient problems of IS-LM by reintroducing two key ideas from Keynes: 1. The confidence fairy is real. 2. If confidence remains depressed, high unemployment can exist forever. Our Vox piece presents the key findings of the paper in simple language. Here are some excerpts...

“Larry Summers has argued that market economies may get stuck in permanently inefficient equilibria. He calls this 'secular stagnation' (Summers 2014). In this equilibrium, unemployment may be permanently ‘too high’ and output may remain permanently below potential, because private investors are pessimistic about the prospects for future growth. Our most recent research attempts to explain why secular stagnation occurs and how economic policy may be used to escape it (Farmer and Platonov 2016).
In the wake of the Great Recession, macroeconomic orthodoxy is under attack. Paul Krugman (2011) has called for a return to the IS-LM model, an approach that was developed by Sir John Hicks (1937). We are sympathetic to that call but we believe that the IS-LM model needs to be redesigned. We suggest a different way of thinking about the effect of monetary policy that we call the 'IS-LM-NAC' model. It is part of a broader research agenda (Farmer 2010, 2012, 2014, 2016a, 2016b) that studies models in which beliefs independently influence outcomes...” continue reading

Konstantin has another year at UCLA but he will be on the market in January of 2019. He is an exceptionally talented young economist who also comes highly recommended. In addition to his co-authored paper with me, he has some exciting new work of his own that extends Farmer’s Keynesian search model to an international framework with two or more countries. Konstantin presented his single-authored paper at the European Economic Association meetings in Lisbon last summer.

This brings me full circle and ties together my own research with the other pieces you have heard in the series of linked video clips. Stay tuned to this spot; the home for new approaches to macroeconomics!