Fed’s Shift to Data Dependence Foreshadowed in 2010 Projections
New York Fed’s ‘Blackbook’ projections show wide misses on GDP
Ex-research chief says lessons from forecasts were humbling
Federal Reserve Bank of New York economists struggled to offer accurate guidance to senior policy makers on the direction of the U.S. economy in the wake of the financial crisis and Great Recession, according to newly obtained records.
The documents from 2010 also help to illustrate, with the benefit of hindsight, why the Fed has increasingly emphasized “data dependence” in its recent policy strategy. That’s another way of saying central bankers have little confidence in their ability to predict economic conditions more than a few months into the future, and so are less willing to commit to a longer-term policy path.
The New York Fed’s so-called “Blackbook” briefings, prepared by staff economists for President William Dudley ahead of meetings of the policy-setting Federal Open Market Committee -- where he serves as vice chairman -- include forecasts for fundamental economic measures that, even in the near term, frequently missed their mark by a wide margin. Growth over the two years covered by the December 2010 projections came in at less than half the forecast pace; advisers also failed to foresee important underlying trends such as the looming collapse in productivity growth.
“There were consistent ‘misses’ in our forecast, which was very humbling and led us to question the precision of our models, which is scary for people who deliver advice,” said James McAndrews, who was co-director of research at the New York Fed in 2010 and has since left the bank.
The records were released to Bloomberg News under a Freedom of Information Act request.
What emerges from the documents is a picture of some of the world’s ablest economists flummoxed by the most disruptive series of financial and economic events in their lifetimes. That disruption continues to bedevil many economists today, making some aspects of forecasting historically difficult. By extension that makes monetary policy mistakes more likely, causing the Fed to delay decisions until more supporting data have piled up.
“It’s like driving on the edge of a dark cliff on a foggy night,” said Kermit Schoenholtz, director of the Center for Global Economy and Business at New York University’s Stern School of Business. “The typical advice is to go slowly.”
The 2010 Blackbooks illuminate a period when forecasts were crumbling. Over the course of 2010, the New York Fed went from predicting the liftoff of the benchmark borrowing rate away from zero in the first half of 2011, to predicting a move at the end of 2012. In the end, it remained frozen near zero until a quarter-point increase in December 2015, and hasn’t been raised since.
In December 2010, the New York staff forecast the economy would expand by 4 percent in 2011 and by 4.1 percent in 2012. Actual growth for those years ended up at 1.7 percent and 1.3 percent. The New York staff also got it badly wrong on inflation and unemployment.
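The scale of those misses is easy to quantify from the figures above. A minimal sketch in Python (the language choice is ours, not the article’s), using only the growth numbers reported here:

```python
# December 2010 Blackbook GDP growth forecasts vs. actual outcomes,
# as reported in the article (annual percent growth).
forecasts = {"2011": 4.0, "2012": 4.1}
actuals = {"2011": 1.7, "2012": 1.3}

for year in forecasts:
    miss = forecasts[year] - actuals[year]
    share_of_forecast = miss / forecasts[year] * 100
    print(f"{year}: forecast {forecasts[year]}%, actual {actuals[year]}%, "
          f"missed by {miss:.1f} points ({share_of_forecast:.0f}% of the forecast)")
# The 2011 forecast overstated growth by 2.3 points (58% of the forecast),
# the 2012 forecast by 2.8 points (68%).
```

Both errors exceed half the forecast value, which is what the article means by growth coming in at less than half the projected pace.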
Just as striking was the group’s exercise in presenting Dudley with alternative scenarios to their overall forecast, a way of advising him on plausible ways the economy might perform better or worse than predicted. Throughout 2010, the Blackbooks identified about a 30 percent to 35 percent chance -- the single most likely alternate scenario -- of an approaching “productivity boom.”
The importance of productivity -- a measure of output per hour worked -- is hard to overstate. Fed Chair Janet Yellen has called it “the key determinant of improvements in living standards.” It hasn’t boomed, but cratered, averaging gains of 0.5 percent annually since 2010, compared with 2.5 percent in the previous decade.
Significant forecast errors can merit criticism, but some economists caution that the events of 2008-09 have made the exercise more difficult.
David Stockton, who spent 20 years at the Fed, including nine as director of research and statistics, said it’s important to understand that when economists make a forecast, they aren’t claiming to see the future, but rather declaring which version of the past is most likely to repeat. They may use increasingly sophisticated models, but it’s all about identifying what Stockton called “past empirical regularities.”
“The art of using those models is in understanding how you think those past empirical regularities will still hold,” said Stockton, now chief economist at LH Meyer Inc., a consulting firm in Washington.
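Stockton’s point -- that a forecasting model extrapolates “past empirical regularities” -- can be illustrated with the simplest such model, a first-order autoregression fitted by least squares and rolled one step forward. The growth series below is hypothetical, purely to show the mechanics:

```python
# Forecasting as extrapolation of a past regularity: fit an AR(1) model
# y[t] = a + b * y[t-1] to a short (hypothetical) series of quarterly
# growth rates, then extrapolate one step ahead. If the underlying
# regularity breaks -- as it did after 2008-09 -- so does the forecast.
history = [3.1, 2.8, 3.0, 2.6, 2.9, 2.7]  # hypothetical growth rates

x = history[:-1]  # lagged values
y = history[1:]   # current values
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Ordinary least-squares slope and intercept.
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

forecast = a + b * history[-1]  # one-step-ahead extrapolation
print(f"next-period forecast: {forecast:.2f}%")
```

The model can only project forward whatever pattern the sample contains; a structural change in the economy leaves the fitted coefficients describing a world that no longer exists, which is exactly the failure mode the 2010 Blackbooks ran into.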
An economist can fumble a forecast by missing important factors, mismeasuring others or mistaking correlation for causation. But even when he or she gets everything right, the world might simply change in unexpected ways, as it did during the Great Recession, altering the way households, businesses, investors and governments behave.
“The shock we went through was the largest since the Great Depression, and it was worldwide,” NYU’s Schoenholtz said. “The reality is even short-run forecasts became difficult.”
Data from the Philadelphia Fed’s quarterly Survey of Professional Forecasters, which began in 1968, partly, but not completely, support that point. When projecting unemployment one year out, survey participants have done measurably worse since the Great Recession, compared with non-recessionary periods from 1985 to 2007.
For inflation and gross domestic product forecasts, however, error rates have not been meaningfully different since the crisis, according to analysis by Tom Stark, who compiles the survey for the Philadelphia Fed.
Stockton said he believes forecasting has, in general, been more difficult since the crisis and will only improve once new, predictable patterns emerge from the post-recession data, a process that can take years.
“That’s hard, obviously, trying to pick up on structural changes in the economy in real time,” Stockton said. “The data are noisy. It takes a fair amount of time to see in the data these new emerging trends.”