This research project is supported by a generous grant from the Alfred P. Sloan Foundation. The winter 2015 meeting was hosted by the Volatility Institute at New York University’s Stern School of Business.

Agenda

Friday, March 13, 2015

Session 1: Leverage Restrictions in a Business Cycle Model

Can “frothy” financial markets with elevated asset valuations have implications for future economic performance? If so, which markets are significant, and what are the measures of froth? What are the channels of transmission of volatility from markets to the broader economy?

Jeremy Stein presented work with Federal Reserve colleagues David López-Salido and Egon Zakrajšek that explored these broad questions.

Taking a behavioral finance approach, Stein defined sentiment as attitudes toward the future, as reflected in subsequent returns on assets bearing credit risk. Sentiment can be measured by the level of credit spreads and by the share of newly issued bonds that are high-yield rather than investment grade.

“The story we’re trying to tell—but maybe not prove—is that elevated credit market sentiment today leads to a reversal and inward shift in credit supply, and tighter credit leads to an economic contraction,” Stein said.

“When we say sentiment is high, it means spreads will widen in the future and the share of high-yield bonds is high,” Stein said.  “Our novel finding is that when spreads widen, we also start to see a fairly significant decline in economic activity; we can start to predict a downturn.”

In their model, a swing in sentiment from the 25th to the 75th percentile of its distribution implies a cumulative decline in real GDP growth of about 4 percentage points two to four years later. High credit market sentiment today is also associated with higher unemployment later.
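As a rough illustration of this kind of predictive exercise (synthetic data, not the authors’ actual specification or variables), the sketch below regresses cumulative future GDP growth on today’s sentiment and converts the slope into the effect of an interquartile swing.

```python
# Rough sketch (synthetic data, not the authors' specification) of a predictive
# regression of cumulative future real GDP growth on today's credit-market
# sentiment, translated into the effect of an interquartile swing in sentiment.
import numpy as np

rng = np.random.default_rng(0)
T = 250  # quarters of synthetic data

# Hypothetical standardized sentiment index (e.g., built from credit spreads
# and the high-yield share of issuance).
sentiment = rng.standard_normal(T)

# Synthetic cumulative GDP growth over a two-to-four-year horizon, constructed
# so that high sentiment today predicts weaker growth later (the sign reported).
future_growth = 5.0 - 2.5 * sentiment + rng.standard_normal(T)

# OLS of future growth on a constant and today's sentiment.
X = np.column_stack([np.ones(T), sentiment])
beta, *_ = np.linalg.lstsq(X, future_growth, rcond=None)

# Predicted effect of moving sentiment from its 25th to its 75th percentile.
p25, p75 = np.percentile(sentiment, [25, 75])
print(f"slope = {beta[1]:.2f}, interquartile swing effect = {beta[1] * (p75 - p25):.2f} pp")
```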

While the predictive effect appears strong, Stein cautioned that there is more work to do to show causality and understand the transmission mechanism.

One alternative story to explain their observations is that elevated sentiment reflects overly optimistic investment. “A lot of dumb stuff gets [financed], there is overinvestment, and later there’s an adjustment,” he said. In that case, however, there might be a slowdown in homebuilding but not necessarily an inward shift in mortgage lending.

If credit supply factors prove to be an indicator of future economic performance, they might also forecast firms’ future financing mix, and Stein showed that this turned out to be true. Following periods of elevated sentiment, equity issuance rose relative to debt issuance around the time of the subsequent economic contraction.

The relationship stood up over time; when the Great Depression and the recent recession were excluded from the data, the same effect was observed over the period from 1952 to 2007, and with a 40-year rolling window.

Exuberance in stock markets does not show the same predictive pattern; only credit market spreads seem to have influence on the economy, Stein said.

Audience members questioned whether the findings were simply reflecting volatility or just reversion to the mean.

Stein said there was more work to do in several areas. One is determining what drives the changes in credit market sentiment. There is some evidence that monetary policy plays a role via a reaching-for-yield mechanism. If true, that might shift central banks’ focus to a new and interesting tradeoff: not the traditional inflation vs. unemployment, but unemployment today vs. unemployment tomorrow.

“That’s totally speculative; we’d have to figure out a way to show that monetary policy is the instrument that moves these credit spreads around,” Stein said, adding that it’s an interesting avenue for exploration.

—Toni Shears

Lunch Remarks: Credit Market Sentiment and the Business Cycle

Session 2: Macroeconomic Models for Monetary Policy: A Critical Review from a Financial Perspective

The recent financial crisis and the Great Recession of 2007–2009 posed new challenges for the macroeconomic models designed to guide monetary policy. On the one hand, central banks have actively taken unconventional monetary measures, and the financial sector has become a crucial channel of monetary transmission. On the other hand, common macro models lack the analytical specificity needed to account for important financial sector influences on the aggregate economy.

Policymakers and researchers are demanding a new generation of enhanced models and advanced empirical and quantitative methodologies to better study the impact of shocks that are initially large or build endogenously over time through the financial sector.

In this session, Winston Wei Dou of MIT provided a critical survey of macro models for monetary policy from a finance perspective. Dou and coauthor Andrew Lo first review the history of monetary policy modeling. They then set out a simple canonical framework that incorporates key recent theoretical advances, which surfaces the challenges facing existing models and quantitative methodologies. Finally, they review the core monetary models currently employed by the major central banks.

The authors conducted a cross-sectional comparison among the models currently popular across major central banks. The comparison focuses on three classes of models: Large-scale macroeconometric (LSM) models, structural vector autoregressive (SVAR) models, and dynamic stochastic general equilibrium (DSGE) models.

While SVAR models can be viewed as linear econometric approximations of fully specified DSGE models, LSM models have several important disadvantages relative to DSGE models. First, LSM models depend on ad hoc short-run adjustment dynamics. As discussant Lars Peter Hansen pointed out, the short-run dynamics in DSGE models are also “ad hoc” to some extent, even though individual optimization is explicitly specified there (i.e., the models are “micro-founded”). But the explicit modeling of individual optimization is useful because it links the data to deep structural parameters.

Second, LSM models are more subject to the Lucas critique and the Sims critique, while offering similar flexibility, extensibility, and capacity for estimation and prediction compared to DSGE models.

Given the advantages of DSGE models, the authors study a canonical New Keynesian DSGE model. Using this model as a concrete example, they demonstrate the limitations of current DSGE models and show why the elements those models lack could be important.

Their proposed model starts from a traditional New Keynesian DSGE framework with several typical components: monopolistically competitive firms, Cobb-Douglas production, Calvo nominal rigidity in prices and wages, capital accumulation, and external habit formation in consumption. This follows the seminal work of Christiano, Eichenbaum, and Evans (2005) and Smets and Wouters (2003, 2007).

The authors then incorporate several new components, including a stylized financial sector, time-varying disaster risk, and a government credit policy. The model shows how an imperfect financial sector generates nonlinearity and skewness in risk premia and, in turn, dramatic consequences for the real economy. Log-linearization suppresses exactly these features of nonlinearity and skewness in quantitative analysis of the model.
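To see why log-linearization can mask these effects, consider a generic external-habit specification of the kind listed above; the notation here is illustrative rather than taken from Dou and Lo’s framework.

```latex
% Illustrative external-habit pricing kernel (notation is generic, not the paper's):
M_{t+1} \;=\; \beta \left( \frac{C_{t+1} - H_{t+1}}{C_t - H_t} \right)^{-\gamma},
\qquad
1 \;=\; \mathbb{E}_t\!\left[ M_{t+1} R_{t+1} \right].
```

When consumption is close to habit, the effective risk aversion $\gamma C_t/(C_t - H_t)$ is high, so risk premia become large, state dependent, and skewed. A first-order (log-linear) solution imposes certainty equivalence, making conditional second moments and hence risk premia constant, which is precisely the nonlinearity the authors argue should not be discarded.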

Key ingredients missing from this model include: 1) the government balance sheet and active fiscal policies; 2) heterogeneity and reallocation; 3) the macroeconomic impact of sizable and nonlinear risk premium dynamics; 4) time-varying uncertainty; 5) the financial sector and systemic risk; 6) imperfect product markets and markup dynamics; and 7) solution methods for dynamic quantitative structural models that allow for nonlinearity and skewness.

The discussants, Lars Peter Hansen and Harald Uhlig, offered several insightful comments for further improvement. First, they pointed out that it is valuable to distinguish between the price channel and the exposure channel of asset pricing in the model. Second, it is important to further clarify or demonstrate why the financial sector is relevant for monetary policy analysis when confronting financial market fluctuations, particularly for unconventional credit policies.

Discussants also noted that it would be useful to show within the model where the connection between key financial market phenomena and real quantities is not appropriately accounted for. Finally, they called for a serious understanding of the sources of uncertainty shocks and a more specific discussion of the redistribution channel.

Session 3A: MFM Young Scholars

Emil Siriwardane presented an empirical paper aimed at understanding how asset prices behave when capital cannot move freely among different investment opportunities. The work responds to the growing popularity of asset pricing models that feature limited capital, whose basic prediction is that risk premiums should rise when the aggregate capital available for risk taking falls. To accomplish this, Siriwardane used detailed data on the credit default swap (CDS) market provided by the Depository Trust & Clearing Corporation. The data contain the positions of market participants, which make it possible to compute the capital fluctuations experienced by different traders in the CDS market.

The first finding of the paper is that the market is highly concentrated, with very few sellers controlling 50 percent of the market and very few buyers controlling 20 percent of the market. Having documented the presence of these big players, Siriwardane investigated how their capital affects asset prices, in particular CDS spreads. The empirical challenge is that CDS spreads could be causing capital fluctuations rather than the other way around. To get around this problem, Siriwardane developed a measure of capital fluctuations that is unrelated to the industry of the entity on which the CDS is written.

The main finding is that capital fluctuations for the top five sellers of CDS can explain about one-ninth of the variation in CDS spreads. In addition, losses matter more for prices than gains, so the relationship between capital fluctuations and CDS spreads is nonlinear.

Another important finding is that the volatility of risk premiums increases as concentration increases. This suggests that not only does aggregate capital matter, but also how it is distributed. Siriwardane discussed some policy implications of his findings. First, it is possible that post-crisis regulation has had the unintended consequence of concentrating large CDS positions in some hedge funds, which are not regulated.

Second, the distribution of bailouts matters, since the effect on prices of giving capital to an institution depends on how big the institution is.
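Returning to the loss/gain asymmetry noted above, a minimal sketch (synthetic data, not Siriwardane’s actual specification) of how one might test it is to split capital fluctuations into their negative and positive parts and estimate separate coefficients.

```python
# Sketch (synthetic data, not Siriwardane's specification) of testing whether
# capital losses of large CDS sellers move spreads more than gains do.
import numpy as np

rng = np.random.default_rng(1)
n = 500
dcapital = rng.standard_normal(n)            # capital fluctuations of the big sellers
losses = np.minimum(dcapital, 0.0)           # negative part (capital losses)
gains = np.maximum(dcapital, 0.0)            # positive part (capital gains)

# Synthetic spread changes built so losses matter more than gains:
# a capital loss of 1 widens spreads by ~0.8, a gain of 1 narrows them by ~0.2.
dspread = -0.8 * losses - 0.2 * gains + rng.standard_normal(n)

# OLS with separate coefficients on the loss and gain components.
X = np.column_stack([np.ones(n), losses, gains])
beta, *_ = np.linalg.lstsq(X, dspread, rcond=None)
print(f"loss coefficient = {beta[1]:.2f}, gain coefficient = {beta[2]:.2f}")
```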
— Juan Ospina

Executive Session (Working Group Only)

Session 3B: MFM Young Scholars

Matteo Crosignani presented “Why Are Banks Not Recapitalized During Crises?”

Juliane Begenau presented “Capital Requirement, Risk Choice, and Liquidity Provision in a Business Cycle Model”

Session 4: Stress Tests to Promote Financial Stability: Assessing Progress and Looking to the Future

The papers presented in this session explored ways to improve bank stress testing, allowing regulators to better understand where the financial system is weak and how they might fix it.

Richard Bookstaber’s approach included three distinctive features. First, unlike past work where stress tests of individual banks took the status of the whole system as given, the model in this paper allowed the health of the wider financial system to be affected by the position of the distressed bank, potentially amplifying the effects of the shock.

Second, the model of the financial sector was not an abstract network, but instead attempted to faithfully replicate the structure of the US financial system. The cost of this added realism was that the behavioral rules of the model agents were heuristic—not necessarily the outcome of an optimization problem.

Third, the model featured two separate channels for shock transmission: a leverage channel that affects bank liabilities and a liquidity channel that affects bank assets.

The model has an arbitrary number of assets and three types of actors. Cash providers—representing money market funds, pension funds, and the like—provide short-term funding to the system but may also attempt to withdraw their funds when the system is under stress. Hedge funds take levered short and long positions on the various assets. Bank/dealers provide the financial services to allow hedge funds to operate, lending cash or securities to cover the hedge funds’ positions.

The central example presented in the session featured three assets, one cash provider, two hedge funds, and two bank/dealers. The network of assets and liabilities was quite modular; it could be split into two almost distinct halves. Each half contained a hedge fund and bank/dealer pair that dealt only with each other and an asset held uniquely by that pair. The two halves were connected only by loans from the cash provider and holdings of the third asset.

The point of this structure was to highlight how shocks to one part of the financial network can spread throughout the whole system even in the absence of direct links. This is a mechanism that would be missed by previous attempts to conduct stress tests, as they omit the feedback loop between the distressed bank and the system as a whole.

In this model, a shock to one of the assets in one of the halves of the system ends up propagating to the other half via removal of liquidity provision by cash providers and fire sales of the jointly-held asset. The policy implication of this model is that it can be very difficult to identify the source of a shock when the whole network is under stress. The model also clarified the role of liquidity in propagating shocks. As prices fall, the sluggish response of cash providers means that new orders are slow to come in, depressing the price further.
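A stripped-down sketch of this propagation mechanism is below. The numbers and behavioral rules are illustrative assumptions, not Bookstaber’s actual model: two leveraged funds hold mostly separate assets plus one shared asset, and a shock to the first asset triggers deleveraging sales whose price impact transmits stress to the otherwise unconnected half of the system.

```python
# Toy sketch of fire-sale propagation (illustrative numbers and rules,
# not the paper's model).
import numpy as np

prices = np.array([1.00, 1.00, 1.00])        # asset A, asset B, shared asset C
holdings = np.array([[100.0,   0.0, 50.0],   # fund 1: asset A plus the shared asset
                     [  0.0, 100.0, 50.0]])  # fund 2: asset B plus the shared asset
debt = np.array([120.0, 120.0])              # short-term funding from the cash provider
target_leverage = 5.0                        # assets / equity each fund tries to maintain
price_impact = 0.0015                        # proportional price drop per unit sold

prices[0] *= 0.80                            # shock: asset A loses 20% of its value

for _ in range(20):
    assets = holdings @ prices               # mark portfolios to market
    equity = assets - debt
    # Funds above target leverage sell a pro-rata slice of their portfolios.
    sales_value = np.maximum(assets - target_leverage * equity, 0.0)
    if sales_value.sum() < 1e-9:
        break
    shares_sold = (sales_value / assets)[:, None] * holdings
    holdings = holdings - shares_sold
    debt = debt - sales_value                # sale proceeds repay short-term funding
    # Selling pressure depresses prices, transmitting the shock via the shared asset.
    prices = prices * (1.0 - price_impact * shares_sold.sum(axis=0))

print("final prices:", prices.round(3))
print("final fund equity:", (holdings @ prices - debt).round(2))
```

Even though fund 2 holds none of the shocked asset, its equity falls once fire sales depress the price of the shared asset, forcing it to sell its own asset in turn.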

In the discussion, Ta-Chung Liu Distinguished Visitor Robert Engle set out a wish list of desirable features for the next round of stress tests. Foremost among these were that the model could be updated very quickly and that it be able to consider a very large number of scenarios. This would allow regulatory evaluation of stressed environments in real time and would give a superior measure of tail outcomes, both areas in which current practice is weak.

Engle also argued that the correct measure of the vulnerability of the financial system as a whole was the sum of the capital injections required to stabilize each of the individual banks if they failed alone. If all banks were recapitalized to this degree, then the system as a whole would be stable.
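A minimal illustration of this aggregation, in the spirit of SRISK-style capital shortfall measures, appears below; the 8 percent capital ratio, the bank names, and the stress losses are assumptions made for the example, not V-LAB’s actual inputs or methodology.

```python
# Illustrative systemwide vulnerability measure: the sum over banks of the
# capital injection each would need to stay adequately capitalized under
# stress. Numbers and names are hypothetical.
k = 0.08  # assumed prudential capital ratio (equity / total assets) after stress

# Hypothetical banks: market equity, debt, and the fraction of equity value
# assumed to be lost in the stress scenario (akin to an LRMES estimate).
banks = [
    {"name": "Bank A", "equity": 120.0, "debt": 1800.0, "stress_loss": 0.55},
    {"name": "Bank B", "equity": 300.0, "debt": 2500.0, "stress_loss": 0.40},
    {"name": "Bank C", "equity": 80.0,  "debt": 900.0,  "stress_loss": 0.60},
]

total_shortfall = 0.0
for b in banks:
    stressed_equity = b["equity"] * (1.0 - b["stress_loss"])
    # Required capital against stressed assets (debt plus remaining equity)
    # minus the equity actually left; floored at zero for well-capitalized banks.
    shortfall = max(k * (b["debt"] + stressed_equity) - stressed_equity, 0.0)
    total_shortfall += shortfall
    print(f'{b["name"]}: capital shortfall = {shortfall:.1f}')

print(f'systemwide vulnerability (sum of shortfalls) = {total_shortfall:.1f}')
```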

To illustrate this, Engle presented estimates of this measure based on the Volatility Institute’s V-LAB model, which can be updated weekly. These showed that the riskiness of the world financial system has fallen by around a quarter since the financial crisis but remains far above pre-crisis levels. The largest sources of this risk are China, Japan, and France.

Audience questions focused on the mechanics of the network model, covering leverage, liquidation rules for the hedge funds, and the basis of the heuristic decision rules. The questions broadly supported the need for faster regulatory assessments and for consideration of more scenarios; indeed, some suggested going further, toward nearly instantaneous assessment of the stability of the financial system.

—   Philip Barrett

Lunch Session 5: Monetary and Macroprudential Policy: Model Comparison and Policy Robustness

This session introduced tools that researchers will find useful for comparing different policy rules across different models. Professor Wieland presented his work in creating the Macroeconomic Model Database, a tool that takes a standard model description and, using Dynare to solve the model, allows users either to compare the effects of one policy rule across many models or to test many different policy rules in one model.

Wieland showed an example using this tool to compute the effects of a monetary policy shock under a common Smets-Wouters policy rule in nine different macro-financial models:

  • three variants of the Bernanke-Gertler-Gilchrist financial accelerator;
  • three models where housing is an important part of consumers’ collateral, as in Iacoviello (2005); and
  • three with agency problems in banking, as in Gertler-Karadi (2011).

Directly comparing each model with the same policy rule highlighted some of the key differences between the models. For example, the models produce different output responses because the specification of adjustment costs on either investment or capital has differing effects on the response of investment.

The second major example Wieland demonstrated evaluated macroprudential policy rules. Here, households borrow from the financial sector to finance purchases of housing. The interest rate on their borrowing is determined both by monetary policy (which sets the risk-free rate) and by macroprudential policy (which drives the spread between the risk-free rate and the mortgage rate). As a result, monetary policy and macroprudential policy are partial substitutes for stabilizing the macroeconomy.

The toolkit allowed a particularly clear illustration of this. In the model, a monetary policy rule that “leans against the wind,” raising interest rates during a lending boom, can stabilize the economy as well as or better than a monetary policy that does not react to lending quantities, even when the macroprudential policy tool does.

By providing a standard vocabulary for describing models in code, this project also ties into efforts to improve the standard of computational work in economic research more broadly. The standardized representation makes model documentation and replication much easier, as it gives individual researchers a “common language” for expressing their work.

Simon Gilchrist’s discussion extended the methodology presented in the session to consider how a financial shock might propagate through the economy and how the choice of policy rule might determine the effects of the shock. A third question explored whether any policy rules are particularly robust. The benchmark model for this exercise is that of Gilchrist and Zakrajšek (2014), which features a financial accelerator with a risk shock.

Professor Gilchrist then performed a “one model, many rules” exercise. He compared the impulse responses to a financial shock under a persistent Taylor rule targeting inflation and output growth to a rule augmented by a term which dictated interest rate cuts when spreads rose. In all dimensions, the augmented rule performed better, reducing the falls in output and prices, and lowering credit spreads.
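A schematic version of the two rules being compared is sketched below; the coefficients and functional form are illustrative assumptions, not those used in Gilchrist’s discussion.

```python
# Schematic comparison (illustrative coefficients, not the discussion's): a
# persistent Taylor rule responding to inflation and output growth, and the
# same rule augmented to cut the rate when credit spreads rise above normal.

def taylor_rate(i_prev, inflation, output_growth, rho=0.8, phi_pi=1.5, phi_dy=0.5):
    """Persistent Taylor rule in the baseline comparison."""
    return rho * i_prev + (1 - rho) * (phi_pi * inflation + phi_dy * output_growth)

def augmented_rate(i_prev, inflation, output_growth, spread,
                   normal_spread=0.02, phi_s=1.0, **kwargs):
    """Same rule plus a term that lowers the policy rate when spreads widen."""
    base = taylor_rate(i_prev, inflation, output_growth, **kwargs)
    return base - phi_s * (spread - normal_spread)

# Example: a financial shock pushes credit spreads from 2 percent to 4 percent.
print(taylor_rate(0.03, 0.02, 0.01))           # baseline rule: ~3.1%
print(augmented_rate(0.03, 0.02, 0.01, 0.04))  # augmented rule cuts by 2 pp: ~1.1%
```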

The intuition is that by committing to reduce interest rates if spreads rise, the central bank provides insurance to a leveraged financial sector, mitigating the rise in its borrowing costs and maintaining the supply of credit. In equilibrium, this promise alone does much of the work; the commitment to insure the financial sector inhibits the amplification of credit spreads, so the augmented rule requires a smaller reduction in equilibrium nominal interest rates.

This analysis raised two further interesting points. First, it implied that commitment to a monetary policy rule might be even more important in a world with financial shocks. Second, responding to output growth, which delivers reasonably good results in typical sticky-price models when the output gap is too badly mismeasured to be usable, performs worse than responding to output alone when there are financial frictions. This last result was not obvious ex ante and became clear only through the rule-comparison methodology.

Many of the questions in this session addressed the practicalities of the macro model comparison tool, such as how the database is updated and how users can write their own models.

—   Philip Barrett