Pages

Monday, December 11, 2017

Yes, Occupational Licensing is Making the U.S. Economy Less of an OCA

From a new working paper by Janna E. Johnson, Morris M. Kleiner:
Occupational licensure, one of the most significant labor market regulations in the United States, may restrict the interstate movement of workers. We analyze the interstate migration of 22 licensed occupations. Using an empirical strategy that controls for unobservable characteristics that drive long-distance moves, we find that the between-state migration rate for individuals in occupations with state-specific licensing exam requirements is 36 percent lower relative to members of other occupations. Members of licensed occupations with national licensing exams show no evidence of limited interstate migration.
Not only does this development have implications for workers, it also has macroeconomic implications: the decline in interstate labor mobility, caused in part by occupational licensing, is making the U.S. economy less of an optimal currency area (OCA). From an earlier post:
So why does the decline in labor mobility matter for the U.S. economy? To answer this question, recall that the Fed is doing a one-size-fits-all monetary policy for fifty different state economies. That is, the Fed is applying the same monetary conditions to states that often have very different economies, both structurally and cyclically. For example, Michigan and Texas have had very different trajectories for their economies. Does it really make sense for them both to get the same monetary policy?  
According to OCA theory, the answer is yes under certain circumstances. The theory says it makes sense for regional economies to share a common monetary policy if they (1) share similar business cycles or (2) have in place economic shock absorbers such as fiscal transfers, labor mobility, and flexible prices. If (1) is true then a one-size-fits-all monetary policy is obviously reasonable. If (2) is true a regional economy can be on a different business cycle than the rest of the currency union and still do okay inside it. The shock absorbers ease the pain of a central bank applying the wrong monetary policy to the regional economy.  
For example, assume Michigan is in a slump and the Fed tightens because the rest of the U.S. economy is overheating. Michigan can cope with the tightening via fiscal transfers (e.g. unemployment insurance), labor mobility (e.g. people leave Michigan for Texas), and flexible prices (workers take a pay cut and are rehired).  
To be clear, a regional economy is not making a discrete choice between (1) and (2); there is more of a trade-off between them. Michigan, for example, can afford to have its economy a little less correlated with the U.S. economy if its shock absorbers are growing, and vice versa. This continuum of trade-offs defines a threshold beyond which it makes sense for a regional economy to be part of a currency union. That threshold is the OCA frontier in the figure below: 


Circling back to the original OCA question, the decline in labor mobility documented above matters because it means that certain regions in the United States are becoming less resilient to shocks. This is especially poignant given the findings in Blanchard and Katz (1992) that interstate labor mobility has been the main shock absorber for regional shocks. Consequently, monetary policy shocks may prove to be more painful than before for some states. Unless increased fiscal transfers and price flexibility make up for the decline in labor mobility, the implication is clear: the U.S. is gradually moving away from being an OCA.
Johnson and Kleiner provide evidence suggesting that more licensing reciprocity agreements among states could increase interstate mobility. That, in turn, would help push the U.S. economy back in the direction of an OCA. 
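The OCA trade-off discussed above can be made concrete with a toy rule. The linear frontier and the threshold value below are my assumptions, purely for illustration:

```python
def fits_in_currency_union(cycle_correlation, shock_absorber_capacity,
                           threshold=1.0):
    """Toy OCA frontier check (illustrative only).

    cycle_correlation: how closely the region's business cycle tracks
        the union-wide cycle (0 to 1).
    shock_absorber_capacity: combined strength of fiscal transfers,
        labor mobility, and price flexibility (0 to 1 scale).
    A region sits inside the OCA frontier when the two together clear
    a threshold; the linear trade-off is an assumption, not a result.
    """
    return cycle_correlation + shock_absorber_capacity >= threshold

# A state highly synchronized with the union needs few shock absorbers;
# a weakly synchronized state with weak absorbers falls outside the frontier.
```

The point of the sketch is only that a decline in one absorber (say, labor mobility) has to be made up by higher cycle correlation or stronger absorbers elsewhere for union membership to keep making sense.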

HT Tyler Cowen

PS Here is my interview with David Schleicher on declining labor mobility. We discuss its implications, including the gradual retreat of the U.S. economy from the OCA criteria as noted above. Our conversation was based on his paper "Stuck! The Law and Economics of Residential Stability".

Why You Should Care about Divisia Monetary Aggregates

I recently had Bill Barnett on my podcast to discuss his work on Divisia monetary aggregates. Below the fold is a tweetstorm by Josh Hendrickson on why we should care about this work.
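As background, a Divisia aggregate differs from a simple-sum aggregate by weighting each monetary asset's growth by its expenditure share, where the "price" of holding an asset is its user cost. Here is a minimal sketch of the Törnqvist-Theil approximation that this literature builds on; the function names and all the numbers are hypothetical:

```python
import math

def user_cost(R, r):
    # User cost of holding a monetary asset: interest forgone relative
    # to a benchmark rate R, discounted back one period.
    return (R - r) / (1 + R)

def divisia_growth(q0, q1, r0, r1, R0, R1):
    """Tornqvist-Theil approximation to Divisia quantity-index growth.

    q0, q1: asset quantities in two periods; r0, r1: own rates of
    return; R0, R1: benchmark rates. Data here are hypothetical.
    """
    pi0 = [user_cost(R0, r) for r in r0]
    pi1 = [user_cost(R1, r) for r in r1]
    e0 = [p * q for p, q in zip(pi0, q0)]
    e1 = [p * q for p, q in zip(pi1, q1)]
    s0 = [e / sum(e0) for e in e0]
    s1 = [e / sum(e1) for e in e1]
    # Weight each asset's log growth by its average expenditure share.
    return sum(0.5 * (a + b) * math.log(x1 / x0)
               for a, b, x0, x1 in zip(s0, s1, q0, q1))
```

A sanity check: if every component grows at the same rate, the Divisia index grows at that rate too; the weighting only matters when components grow at different rates, which is exactly when simple-sum aggregates mislead.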

Thursday, December 7, 2017

Clashing Over Commerce


Doug Irwin's new book on the history of U.S. trade policy, Clashing Over Commerce, is now available for purchase. You may recall that I interviewed him about the book in this recent podcast. The podcast is embedded below. My colleague Dan Griswold has a nice review of the book over at National Review.  I learned a lot from the book and my conversation with Doug. I highly recommend it.

Wednesday, November 29, 2017

Hypothermia, Inflation, and the Fed's Epistemological Jam

Imagine you fall into a freezing lake and get hypothermia. You are rushed to the ER and receive good care initially, but your body temperature remains below 98.6°F. The doctor says he is not sure why you are so cold. It is a puzzle to him, and everything he thought he knew about body temperatures seems to be wrong. He says not to worry, though, as he turns on the air conditioner. All should be well soon, he thinks, once the room starts to cool down. 

The doctor leaves your room and comes back to check on you after 15 minutes. He finds that your body temperature has dropped even more and that you are shivering. He concludes the room was not cool enough so he dials up the air conditioner even more to really get the cold air blowing. 

The doctor leaves and returns after another 15 minutes have passed. You are now unconscious, turning blue, and barely clinging to life. The doctor is now even more baffled about body temperature. Oh well, he concludes, there must be some transitory one-off factors affecting your body temperature. Not much he can do about them, he decides, as he heads out of the room and dials up the AC a bit more. Eventually you die.

This story is an analogy for how the Fed has been handling inflation over the past decade. Just as falling into a freezing lake is a shock to your body temperature, the Great Recession was a shock to the inflation rate. And just as you were stabilized in the ER, the economy was initially stabilized by the Fed. After stabilization, though, your body temperature never fully recovered, just as the inflation rate never returned on a consistent basis to 2%. And just as the doctor seems to have forgotten the basics of body temperature, the Fed seems to have forgotten the basics of inflation. The doctor adds to his own confusion by turning the air conditioner ever cooler, just as the Fed grows increasingly perplexed about low inflation even as it pushes up interest rates. 

If the hypothermia story seems absurd to you, then the Fed's recent behavior toward inflation should seem absurd too. FOMC members are increasingly puzzled by the stubbornly low inflation rate and yet continue to talk up rate hikes on top of the ones they have already done. 

Caroline Baum has a piece at MarketWatch on this tension. First, she notes the FOMC's inflation confusion:
[T]he most significant take away — the new news, if you will — was Yellen’s response to an audience question on why inflation remained so low at a time when the unemployment rate was hovering just above 4%. 
After running through a “whole range of idiosyncratic kind of factors, most of which may be temporary/transitory things that affect inflation,” Yellen admitted she was “no longer certain” about inflation’s eventual rise. “My colleagues and I are not certain that it is transitory,” she said, referring to the chronic undershoot of the 2% inflation target. 
Not transitory? Will this turn out to be another “conundrum” for the Fed? At her Sept. 20 press conference, Yellen elevated the chronic inflation undershoot to a “mystery,” a term Powell invoked at his confirmation hearing. 
She then explains what the Fed is doing in response to this inflation mystery:
So what’s the Fed’s approach to dealing with the chronic inflation undershoot? Why, raise interest rates and pare the balance sheet. 
If this seems counterintuitive, it is. I have written that the Fed should either put up — run a more expansionary monetary policy to boost inflation — or shut up. Policy makers can’t continue to fret over low, stable inflation, on the one hand, and, on the other, implement policies that, all things equal, will slow economic growth and depress inflation further.
David Harrison makes a similar point over at the Wall Street Journal 
[Fed] officials remain perplexed by the past year’s surprising weakness in inflation. And yet there is something truly strange about that. How can the Fed continue to expect rate increases when it has no idea what’s going on with inflation? How can you know the economy will behave in a way that justifies rate increases while simultaneously admitting you don’t know how the economy is behaving? 
The central bank appears to have put itself in an epistemological jam.
I like the epistemological jam framing a lot. The Fed is speaking out of both sides of its mouth. It claims it does not understand the persistently low inflation, and yet Fed officials make statements like this to justify the rate hikes:


Call me crazy, but if the Fed feels it needs to raise interest rates because it is "worried about trends that could push inflation above [its] 2% objective," then maybe, just maybe, its past rate hikes and signaling of future rate hikes might explain the low inflation over the past decade. Who knows, maybe monetary policy matters for long-run inflation trends after all. Or as Aaron Klein says:

Monday, November 27, 2017

Abenomics Update

So a quick update on that grand monetary experiment in Japan known as Abenomics. 

Prime Minister Shinzo Abe and his party were returned to power in a decisive October election. This means the Bank of Japan will continue to expand the monetary base, peg the 10-year government bond at 0%, and strive for 2% inflation. 

I was an early fan of Abenomics, but have become a bit more skeptical over time. Others, like Noah Smith, are convinced it is working and are glad to see it continue. Mike Bird of the Wall Street Journal is also a fan. They make a reasonable argument that the real side of the economy has benefited from the Bank of Japan's policies. 

Maybe so, but what about the nominal side of the economy? Yes, we ultimately care about the real side, but the central bank can only directly affect the nominal economy; its influence on the real economy is a by-product. Moreover, a rapidly expanding nominal economy is needed to offset the real burden of the growing stock of nominal debt. 

So how is the nominal side of the economy doing? Okay, but not great. Inflation is above zero but nowhere near its 2% target. This is true even if we look at core measures of inflation that account for the 2014 changes in the consumption tax. Below is a chart from the Bank of Japan:


Nominal GDP (NGDP) in Japan--a measure of total nominal demand--does show more progress under Abenomics than with the original QE of 2001-2006:


This is an improvement, but if we step back and look at it from a broader historical perspective it is actually underwhelming. All Abenomics has done is return NGDP to a flat trend growth path. Nominal demand growth in Japan is still far below what it was before the 1990s. This has big implications for Japan's debt burden and suggests its real growth could be higher.


So why is Abenomics failing to pack a big punch? There is both an economic and a political answer. The former is a technical one that can be summarized in the chart below. It shows the actual monetary base and its permanent portion. (The permanent portion is proxied by currency and coins in circulation since they tend to drive the long-run path of the monetary base.)


The expected path of the permanent part of the monetary base and by implication the expected path of the price level is what drives current inflation. If the expansion of the monetary base under Abenomics is expected to be unwound in the future then it should have little effect on the price level today. The permanent portion of the base suggests it will be unwound. (I have a forthcoming paper that explains in more detail why this permanent-temporary distinction matters so much.)
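The permanent-versus-temporary logic can be put in simple quantity-theory terms. The sketch below is my illustration, not the model in the forthcoming paper: only the portion of the base expected to be permanent moves the expected long-run price level.

```python
def expected_price_level(monetary_base, permanent_share, velocity, real_output):
    # Toy quantity theory: P = (permanent part of M) * V / Y.
    # A base expansion that markets expect to be unwound does not
    # raise the expected long-run price level.
    return monetary_base * permanent_share * velocity / real_output

# Doubling the base while markets expect the expansion to be unwound
# (so the permanent share halves) leaves the expected price level flat;
# the same doubling seen as permanent doubles it. Numbers are hypothetical.
p_before = expected_price_level(100.0, 1.0, 2.0, 200.0)
p_after_temporary = expected_price_level(200.0, 0.5, 2.0, 200.0)
p_after_permanent = expected_price_level(200.0, 1.0, 2.0, 200.0)
```

This is the sense in which a large but temporary balance-sheet expansion can leave current inflation nearly untouched.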

Another way of saying this is that market participants expect the Bank of Japan to do what it did after the initial QE program--reverse it. Michael Woodford, in his 2012 Jackson Hole speech, commented on the 2001-2006 episode:
The Japanese monetary base resumed a path that was close to a continuation of its trend prior to the QE period; hence, market participants who had continued to hold expectations about the long-run Japanese monetary base that were unchanged as a result of the QE policy would not have been that far off in their prediction (p. 241).
Woodford also notes that this experience comes “fairly close to providing an illustration of the kind of policy to which the irrelevance results of Krugman (1998) and Eggertsson and Woodford (2003) should apply.” This is the economic answer and the reason I have become more skeptical of Abenomics.

The political answer, in my view, is that the Bank of Japan will not make its monetary expansions permanent and significantly raise the inflation rate because politically it cannot do so. Japan has an aging population that holds a lot of government debt and lives off of fixed income. Raising the inflation rate would harm them and create a political firestorm. I believe this is what ultimately keeps Japan from getting robust nominal demand growth. 

Wednesday, November 1, 2017

Monetary Regime Change Update

I recently made the case that we got a monetary regime change in 2008 that explains the stubbornly low inflation since that time:
A monetary regime change has occurred that has lowered the growth rate and growth path of nominal demand. Since the recovery started in 2009Q3, NGDP growth has averaged 3.4 percent. This is below the 5.4 percent of the 1990-2007 period (blue line in the figure below) or the 5.7 percent for the entire Great Moderation period of 1985-2007. Macroeconomic policy has dialed back the trend growth of nominal spending by 2 percentage points. That is a relatively large decline. This first development can be seen in the figure below.


The figure above also speaks to the second part of this regime change: aggregate demand growth was not allowed to bounce back at a higher growth rate during the recovery like it has in past recessions. Historically, Fed policy allowed aggregate demand to run a bit hot after a recession before settling it back down to its trend growth rate. This kept the growth path of NGDP stable. You can see this in the figure above by noting how the growth rate (black line) typically would temporarily go above the trend (red line) after a recession.  
Had macroeconomic policy allowed aggregate demand growth to follow its typical bounce-back pattern after a recession, we would have seen something like the blue line in the figure. This line is a dynamic forecast from a simple autoregressive model based on the Great Moderation period. This naive forecast shows one would have expected NGDP growth to have reached as much as 8 percent during the recovery before settling back down to its average. Instead we barely got over 3 percent growth. This is why NGDP has never caught back up to its pre-crisis trend path.  
Again, these two developments are, in my view, the real story behind the drop in trend inflation. And to be clear, I think both the Fed's unwillingness to allow temporary overshooting and the safe asset shortage problem have contributed to it. 
I went on to say this is the monetary regime change no one asked for. It is also one that many observers seem to miss in their analysis of Fed policy since the crisis.  
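The "dynamic forecast from a simple autoregressive model" mentioned in the quote can be sketched in a few lines. This is a generic AR(1) fit and iteration, not the exact specification behind the chart:

```python
def fit_ar1(series):
    """OLS fit of x_t = c + phi * x_{t-1} on a list of observations."""
    y, z = series[1:], series[:-1]
    n = len(y)
    zbar, ybar = sum(z) / n, sum(y) / n
    phi = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
           / sum((zi - zbar) ** 2 for zi in z))
    c = ybar - phi * zbar
    return c, phi

def dynamic_forecast(c, phi, last_value, steps):
    """Iterate the fitted equation forward, feeding each forecast back in."""
    path, x = [], last_value
    for _ in range(steps):
        x = c + phi * x
        path.append(x)
    return path
```

Fit on pre-crisis NGDP growth, the forecast overshoots after a deep slump and then converges back to the sample mean c/(1 - phi), which is the bounce-back pattern the blue line in the figure captures.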

Well, I was on Twitter discussing long-term Treasury yields and monetary policy with Tim Duy and Joel Wertheimer. I decided to whip up some charts comparing 1-year-ahead NGDP forecasts from the Philadelphia Fed's Survey of Professional Forecasters against 10-year Treasury yields. The results, in my view, are consistent with the claim that there was a monetary regime change in 2008. 

First, here is the chart for the 1980:Q1-2007:Q4 period. It shows a fairly strong relationship between expected NGDP growth and long-term treasury yields:


The next figure shows the relationship for the period since 2008:Q1. Say goodbye to that relationship:

Just to be robust, I took out the outliers in the above chart (which correspond to 2008:Q4-2009:Q2) and got the following chart:


So something has changed in the relationship between expected NGDP growth and long-term treasury yields. The monetary regime change story outlined above coincides closely with this breakdown. Here is one way to connect these two developments. Before 2008 the Fed allowed its tightening to follow the pace of recovery, whereas afterwards the Fed has tended to get ahead of the recovery in its desired and actual rate hikes. If so, the Fed's tightening post-2008 would have slowed down the recovery and lowered the expected future path of short-term interest rates. This overreacting by the Fed would have kept long-term interest rates from rising with expected rises in NGDP growth. This type of behavior would also be consistent with a monetary regime change where only low rates of NGDP growth and inflation are tolerated. This is what appears to have happened in 2008. 
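The breakdown is, at bottom, a statement about sample correlation in the two subsamples. Here is a sketch of that check, with hypothetical numbers standing in for the SPF forecasts and 10-year yields:

```python
import math

def pearson_corr(x, y):
    """Sample Pearson correlation between two equal-length series."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    cov = sum((a - xbar) * (b - ybar) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - xbar) ** 2 for a in x))
    sy = math.sqrt(sum((b - ybar) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical subsamples: a tight pre-2008 link between expected NGDP
# growth and yields, and essentially none afterwards.
pre_ngdp, pre_yield = [4.0, 5.0, 6.0, 7.0], [5.1, 6.0, 7.2, 7.9]
post_ngdp, post_yield = [3.0, 3.5, 4.0, 4.5], [2.5, 2.4, 2.6, 2.4]
```

Running the same computation on the actual SPF and Treasury data for the 1980-2007 and post-2008 samples is what the two scatter charts above summarize visually.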

Saturday, October 21, 2017

The Financial Regulatory Laffer Curve

Lawrence J. White has an interesting article in which he considers the optimal size of our financial regulatory structure. He acknowledges that the structure is "maddeningly complex" and that it is "easy to make a case for drastic simplification." Larry also notes, however, that there are benefits to having some regulatory diversity. We need to recognize this tradeoff, he contends, when considering the simplification of our financial regulatory system. 

To help us better understand this tradeoff, Larry lays out the case for reducing the number of financial regulators:
Regulatory decisions could be made faster, especially in a crisis, when policymakers need timely access to sensitive, proprietary information, and must coordinate actions both domestically and internationally. There would be fewer “turf wars” that can delay decisions. There would be less duplication and redundancy and less need for coordination among separate regulatory agencies... Regulatory costs would decrease, both for government (and thus for taxpayers) and for regulated firms. And there would be fewer opportunities for a race to the bottom, whereby a financial services firm tries to avoid (or reduce the burden of) regulation by “forum shopping” among regulators with parallel responsibilities who must compete for regulatees (their fee-paying clients). There also would be fewer incentives for one regulator to impede competition from financial firms under a different regulator.
He then discusses the costs of streamlining the number of financial regulators:
But there are also potential downsides. To see this, let’s really go to the limit:  Suppose that there were only a single regulator for all of the financial system. And suppose someone has a new idea for the kinds of financial services that could be made available, or for how certain services can be more effectively delivered to users. 
With a single regulator, there is an obvious risk: If that regulator has the authority to reject the idea and does so before it is implemented, the game is over. There is no place else for the innovator to turn (except, perhaps, to regulators in another country).  But with multiple regulators, there is an increased chance that—if the idea is worthwhile—one or more of the regulators will see the merit in the idea. 
In essence, an important assumption that underlies the potential benefits to simplification is that regulators will “get it right”: that they won’t make mistakes. By contrast, the argument for multiple regulators is an argument for diversity: that in a world where mistakes can be made, having some diversity can reduce the costs of error and increase the likelihood that worthwhile ideas will be able to take root.
This is good economic analysis. It recognizes the tradeoff involved in simplifying the U.S. financial regulatory structure. Mark Calabria, Norbert Michel, and Hester Peirce similarly note this tradeoff in their call to reform U.S. financial regulation. They see the need for reform, but also want to avoid going to a single 'super' regulator for the reasons outlined above. 

It hit me when reading these two pieces that this financial regulatory tradeoff could be summarized with a Laffer curve-type framework. I sketched it out below with the rate of financial innovation on the vertical axis and the number of financial regulators on the horizontal axis. The number of financial regulators ranges from 1 (a single 'super' regulator) to N. The optimal number of financial regulators is the peak of this curve.

The framework itself should be uncontroversial. What is controversial is where we are on the financial regulatory Laffer curve. Are we closer to point A or point B? I suspect we are closer to point B.
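The curve's shape can be illustrated with a toy functional form: diversity benefits that grow with the log of the number of regulators set against coordination costs that grow linearly. Both the functional form and the parameter values are my assumptions, chosen only to produce an interior peak:

```python
import math

def innovation_rate(n_regulators, diversity_gain=1.0, coordination_cost=0.15):
    """Toy financial-regulatory Laffer curve (illustrative only).

    Diversity benefits rise with log(n); coordination, duplication,
    and turf-war costs rise roughly linearly in the regulator count.
    """
    return (diversity_gain * math.log(n_regulators)
            - coordination_cost * (n_regulators - 1))

# The peak of the curve is the 'optimal' number of regulators under
# these assumed parameters.
optimum = max(range(1, 21), key=innovation_rate)
```

Under these made-up parameters the peak falls at a small handful of regulators: more than one (so ideas have somewhere else to turn), but far fewer than the current maze.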