GDP data last week were slightly disappointing, with the headline measure of national activity coming in at just 1.4%, half as strong as expected. This suggests that Q4 labor productivity data will not show a massive quarterly jump in output per worker. We care about productivity because much of the discussion of artificial intelligence's impact on efficiency has touted its revolutionary potential. This belief has given central bank doves comfort that higher productivity from the AI boom will allow equilibrium policy rates to fall. We're not as convinced. We argue that higher productivity from AI adoption and innovation, once it finally shows up in the numbers (which we do expect, though not immediately), could instead lead to higher rates.
We begin by looking at a five-year rolling average of quarterly data on annualized U.S. total factor productivity, seeking to identify periods over the past several decades in which it was elevated, as evidence of a productivity boom. We prefer total factor productivity (TFP) over the more common measure of simple labor productivity. TFP strips out the contributions of capital and labor inputs to output (both the quantity of capital and compositional changes that improve its productivity), leaving the residual attributable to technological change. Workers can be made more productive on the margin without any technological improvement merely by increasing the existing capital stock. Adjusting output per unit of input for changes in the capital stock accounts for this process and provides a cleaner picture of productivity improvements from technological change.
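The distinction above can be sketched with the standard growth-accounting decomposition, in which TFP growth is the residual of output growth after subtracting the share-weighted contributions of capital and labor. The 0.35 capital share and the growth rates below are illustrative placeholders, not values from the SF Fed series:

```python
def tfp_growth(output_g, capital_g, labor_g, capital_share=0.35):
    """Solow residual (all arguments in percent growth):
    dA/A = dY/Y - a * dK/K - (1 - a) * dL/L
    """
    return output_g - capital_share * capital_g - (1 - capital_share) * labor_g

# Pure capital deepening: output (and output per worker) rises faster than
# labor input, yet the TFP residual is ~0, i.e. no technological change.
residual = tfp_growth(output_g=1.7, capital_g=3.0, labor_g=1.0)
print(round(residual, 6))  # ~0: labor productivity rose without any TFP gain
```

In this example output grows 1.7% against labor growth of 1.0%, so simple labor productivity rises, but every bit of that gain is explained by the faster-growing capital stock.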
We use a widely accepted historical data series from the San Francisco Fed and plot its five-year moving average against the New York Fed's estimate of the real neutral policy rate over time since 1967 (nearly 60 years). Two episodes stand out. The first is in the second half of the 1980s, when TFP growth rose markedly from its earlier pace. This period was accompanied by relatively steady and somewhat high estimated neutral rates. This may be a misleading result, given the decade still witnessed relatively high real rates, a hangover of the inflationary 1970s and the Volcker Fed's reaction to that episode.
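The five-year smoothing described above is simply a trailing 20-quarter average. A minimal sketch, using synthetic placeholder values rather than the actual SF Fed (Fernald) data, also illustrates why a productivity boom takes time to show up in the smoothed series:

```python
def rolling_mean(series, window=20):
    """Trailing moving average over `window` observations.
    Returns None until a full window of data is available."""
    out = []
    for i in range(len(series)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(series[i + 1 - window : i + 1]) / window)
    return out

# Synthetic quarterly annualized TFP growth: a step up from 1% to 2%.
quarterly_tfp = [1.0] * 20 + [2.0] * 20
smoothed = rolling_mean(quarterly_tfp)
print(smoothed[19], smoothed[-1])  # 1.0 2.0
```

Even with an abrupt step up in underlying TFP growth, the five-year window takes 20 quarters to fully reflect it, consistent with our view that any AI-driven boom will appear in these measures only with a lag.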
More recently, the late 1990s were characterized by very strong and sustained TFP growth. This period is often cited as an analogy to the current AI boom. Note that until the recession of 2001, estimates of the real neutral rate r* were similarly elevated. It's true that then-Fed Chairman Greenspan saw that rapid U.S. economic growth could coexist with subdued inflation, thanks to productivity gains. Even so, nominal rates between 1995 and 1999 were still between 5 and 5.5%, and real rates were rarely below 2% and reached 4% or more during the period.
We think the expectation of lower policy rates due to rising productivity could be off-base. Higher productivity likely increases the demand for capital, pushing up interest rates as AI (and other technology) investment competes for funding and crowds out other borrowers. Furthermore, in equilibrium, higher productivity should raise real wages, as workers are compensated for the greater output they produce with the same hours worked. This can raise consumption and lower savings, requiring higher equilibrium rates. Finally, higher growth expectations from increased technological adoption should raise asset values, making them more resilient to higher interest rates.
In short, we're not sure that higher productivity axiomatically leads to lower policy rates. We show empirically that this wasn't clearly the case in the late 1990s, and we don't think it will be going forward. This debate should prove lively as we consider what a new Fed Chair and the productivity boom likely to be unleashed will mean for the real economy and financial markets.