
The 2008 worldwide financial crisis and the 2011 sovereign debt crisis in Europe highlighted the need to better understand the precarious interaction between the financial sector and the economy as a whole. At this joint event, leaders of the Macro Financial Modeling (MFM) initiative and organizers of the Macroeconomic Fragility conference brought together scholars focused on macroeconomic fragility, systemic risk, financial-macro interactions, and related topics. Presenters offered new theoretical insights as well as empirical work toward the measurement of systemic risk.

Macroprudential policy dominated much of the discussion during the first half of the October 2013 Macro Financial Modeling and Macroeconomic Fragility Conference. It is an important area for crafting policy tools that can mitigate systemic risk within the financial system before it spirals into a full-blown meltdown.

Attendees heard from representatives of the Bank of England, the European Central Bank, and the Federal Reserve, as each presented a proposed methodology for testing for and mitigating systemic risk within their respective economies.

Sponsors

We gratefully acknowledge generous support for this event from:

  • Alfred P. Sloan Foundation
  • CME Group Foundation
  • Institute for New Economic Thinking

Agenda

Perspectives from the Bank of England

The Bank of England’s Sujit Kapadia and Vasileios Madouros reviewed the macroprudential tools under consideration by Great Britain’s central bank, as well as the role that stress testing of the UK economy plays in how those tools are applied.

Kapadia explained that the Financial Policy Committee (FPC)—a counterpart to the Financial Stability Oversight Council in the US or the European Systemic Risk Board in Europe more broadly—was set up in the UK to “remove or reduce systemic risks with a view to enhancing and protecting the resilience of the UK financial system.” It aims to re-integrate the regulation of insurance, banking, financial conduct, and oversight of funds and smaller non-deposit-taking institutions under the regulatory umbrella of the Bank of England. “It was set up to take a system-wide perspective, plugging the gap between micro-prudential regulation and monetary policy,” said Kapadia.

Sujit Kapadia

The FPC has members from both within and outside the banking sector, as well as a non-voting Treasury member. Its charge is to slow the spread of risk without halting the financial system altogether.

The committee relies on a delicate balance of compulsory “directions” established through legislation, recommendations to micro-prudential regulators in the UK that must be either obeyed or addressed publicly, and general recommendations made to any financial entity. “One of the criticisms of financial stability functions before the crisis was that they report on and analyze risk, but they couldn’t actually do very much,” said Kapadia. “This framework attempts to address that problem.”

For example, the FPC would have the statutory authority to move capital requirements up and down, down to the level of specific sectors of the economy. Facing more individualized, one-time concerns, like the effects of disclosure of specific information or changes in market infrastructure, the FPC would use a “comply-or-explain” recommendation to force regulators to examine the issue. And if a new financial entity emerges as a major player and poses significant risk, the general recommendations allow the FPC to act on it as well.

“If you see risk migrating outside of the regulated system—regulatory arbitrage—this allows you to make a recommendation to the Treasury or a new institution that was not captured under the previous regulatory framework,” said Kapadia.

Establishing a legal basis for mandatory “directions,” such as capital requirements for specific sectors of the economy, requires making a clear case that a given element of the economy contributes to systemic risk. “What’s quite striking is that from 2003 to 2007, two-thirds of the growth of debt in the UK economy was explained by inter-financial system activity,” said Kapadia. That data provided evidence that capital requirements for financial system lending—along with mortgages and commercial real estate—should fall under the scope of the FPC’s authority.

Transparency is key. Discussion defining the legal basis of FPC authority—as well as the criteria its members use when deciding whether to apply their macro-prudential powers—is deliberately made public. The FPC publishes the indicators it examines, in simple, intuitive language, on a regular basis. Kapadia said that this makes the committee’s actions both understandable and predictable.

Kapadia said that the two-year-old committee still faces numerous open questions about how it should operate: For instance, how can macroprudential tools best address uncertainty? What are the costs of “activist” policy versus policy that only shifts every 20 years? How do you best balance a resilient economy against one with ample credit supply? He said he hoped continued conversations with MFM members might produce some answers.

On that topic, conference organizer Harald Uhlig raised the point that more intellectual transparency—that is, a sense of the academic work influencing their internal thought processes, not just the data and criteria for decision making—would help the academic world understand their inner workings even better. This can be difficult; a common theme among all central bankers at the conference was that many intellectual influences are blended together in the policymaking process. However, disclosing the intellectual underpinnings could help scholars consider and address challenges still faced by bodies like the FPC.

Macroprudential tools offer the Bank of England the means for stabilizing systemic risk. Stress testing of the UK economy, as explained by Vasileios Madouros, provides insight into precisely where those risks might pop up in the economy.

Madouros explained that the suite of models used by the Bank of England in stress testing provides both the FPC and the Prudential Regulation Authority with concrete guidelines essential for policymakers. “They can go to Parliament and explain the stresses that we expect banks to easily withstand, and Parliament can act on them,” said Madouros.

In order to integrate as many scenarios as possible, and thus avoid the risk of any one model being wrong, the Bank of England employs models spanning three basic approaches:

  • granular regulatory models that measure consistent shocks over time;
  • coarse, systemic models of flexible and sudden shocks; and
  • banks’ own internal stress testing models, which leverage banks’ familiarity and data related to their own business models, even if they have incentives against proving themselves vulnerable to given risks.

Madouros said that systemic stress testing provides rich background information on the banking sector that can help guide macroprudential policy, since it provides critical feedback on the effectiveness of policy decisions. “It’s a tool for calibrating the amount of capital that the system might need to withstand certain shocks, within given sectors,” he said.

Disclosure of those results remains a topic up for debate, according to Madouros. The correct balance between publicly disclosing banks’ risk assessments and keeping them internal to the Bank of England remains to be seen. “Which combination of financial forecasts will be best for the banking system as a whole?” he asked.

Discussing the Bank of England presentation, Lars Peter Hansen urged the audience not to fixate on the particular models being used by each central bank. “Let’s focus on what challenges this model presents moving forward.”

He pointed to their use of countercyclical capital buffers—requirements for banks to raise capital reserves in periods of growth to prepare for periods of decline—as the most compelling tool at their disposal. This approach is rooted in informal modeling and guided by “simple” indicators. “This didn’t look so simple, or obviously transparent for me,” said Hansen. “But I’m hopeful that there’s a push in that direction.”

Hansen did note a large gap in the thinking guiding the Bank of England’s macroprudential policy: the “macro” part. He said that further modeling was needed to understand the impact that tools like capital requirements could have on the macroeconomy.

As for stress testing, Hansen said that the problem of quantifying the effects of such a wide array of scenarios in a meaningful way provides a fascinating challenge. However, he cautioned against drawing overly specific conclusions from models that rest on a highly aggregated view of the financial sector.

Since people within those models may alter their behavior over time, Hansen proposed dynamic stochastic general equilibrium (DSGE) modeling as a potential solution, as the integration of microeconomic principles into the larger macroeconomic model might offer a more complete picture for drawing policy conclusions.

In the question and answer session that followed, Chicago Booth’s Anil Kashyap disputed the idea that a comprehensive DSGE approach could more effectively model the financial sector. He insisted instead that researchers should narrow down the effects of adjusting different aspects of the financial system, holding the surrounding elements “sticky.”

Kashyap said the only way forward is to avoid breaking too many assumptions at once. This launched a larger debate among attendees over whether it is more important to know the precise cause of financial distress, or whether the system as a whole is resilient enough to weather the resulting fallout.

—Mark Riechers

Sujit Kapadia, Bank of England
Vasileios Madouros, Bank of England
Discussant: Lars Peter Hansen
Tools for Macro Stress Testing and Macro Prudential Policy Assessment: The ECB Perspective

Representing the European Central Bank, Christoffer Kok described the challenges that the ECB faced in shifting roles from a monetary policy body to more of a banking union for all of Europe. In that role, the ECB is using macroprudential policy as one of the main instruments for taking a more active role in stabilizing the EU economy.

Like their counterparts at the Bank of England, ECB economists find themselves providing analytical support to the European Systemic Risk Board, the EU’s macroprudential policy body. But since 2012, the ECB has been working on creating a system of financial supervision via the financial authorities present in each respective country. The idea is for the ECB to eventually have more direct oversight over both macro- and microprudential policy in each country.

Christoffer Kok

Kok detailed the ECB’s principles of stress testing, which, like the Bank of England, involve a suite of testing models that aim to measure how systemic risks materialize over a given period of time. The tests double as a macroprudential policy tool, since they provide the ECB with a method of identifying banking sector vulnerabilities that they can then address.

In discussing how the U.S. and European financial crises have shaped the way the ECB considers macroprudential policy tools, Kok explained that the shift toward becoming a true banking union underscored the realization that monetary policy “was too blunt an instrument” to deal with complex systemic risk factors.

Moreover, the microprudential regulators in each country lacked the perspective from the top to see system-wide trends; they missed key externalities caused by institutions within their own countries.

While that makes a case for a body with broader macroprudential regulation powers, Kok said that the precise role of such a regulator remains up in the air, partly because the definition of systemic risk remains somewhat fuzzy. Regulators also have limited data on how attempts to manage risk factors affect the economy, especially in contrast to better-understood policy tools like fiscal policy and microprudential regulation.

Moving forward, Kok said that the ECB will be compiling data on how their macroprudential policy tools perform, in the hopes of refining their models to focus their stress tests on particular elements of the economy. More data—coupled with feedback from the academic community—could also help answer questions of when macroprudential tools should be activated and eased in given scenarios, as well as aid regulators in assessing the long-term impact of structural reform measures in the banking system.

In discussion, Simon Gilchrist of Boston University broke down the relative differences between the ECB’s bottom-up and top-down stress testing techniques.

The bottom-up approach, performed by bank regulators, measures the effect that macro factors might have on random samples of individual bank loans. The top-down approach, performed at a country or EU-wide scale, feeds public data into a dynamic model of a macro-scale scenario to test individual banks’ capital ratio responses.

Gilchrist pointed out that the latter approach has the huge benefit of applying a common yardstick against which all banks are measured, rather than one that varies by country. However, what is done with the results of that test must be considered carefully, since publishing the strengths and weaknesses of each bank has a demonstrable effect on the market. This was demonstrated in data released after the Comprehensive Capital Analysis and Review conducted by the Federal Reserve in 2012. “Clearly, the banks that had the lowest decline [in capital] under the stress test did much, much better in terms of market performance,” said Gilchrist.

What does this mean for macroprudential policy more broadly? Gilchrist suggested that finding ways to bridge micro- and macroprudential data with minimal statistical distortion would lend more nuance to stress test results. “For example, if I shut down the price effects, what would be happening here?”

He also noted a serious “elephant in the room” in the form of sovereign spreads—the extra interest that countries must pay to compensate investors for the risk that they might default on their debts. Integrating them into the macroeconomic scenarios being employed in stress tests will be critical to achieving the ambitious goals that the ECB has laid out for itself, according to Gilchrist.

—Mark Riechers

Christoffer Kok, European Central Bank
Discussant: Simon Gilchrist, Boston University
Perspectives from the Federal Reserve System

Kicking off the second day of panels, Tobias Adrian of the Federal Reserve Bank of New York succinctly summarized what he and his counterparts at the Bank of England and the European Central Bank shared on what drives macroprudential policy tools. “The aim of macroprudential policy is to reduce systemic risk,” said Adrian. “Microprudential policy is not enough, because it cannot account for externalities.”

To that end, Adrian said that macroprudential policy has two basic roles. Policy should both strengthen the resilience of the financial system against downturns and limit risks that occur as part of the normal financial cycle but could cause or escalate a financial bust when they accumulate.

Tobias Adrian

Adrian outlined three macroprudential instruments at the Fed’s disposal:

  • capital-based tools like counter-cyclical capital buffers and sectoral capital requirements;
  • liquidity-based tools like counter-cyclical liquidity requirements; and
  • asset-side tools like loan-to-value and debt-to-income ratio caps.

Each instrument is calibrated to address a specific source or set of sources of systemic risk, ranging from excessive leverage to over-reliance on the interconnectedness of financial entities.

A series of indicators would lead Fed regulators to activate these macroprudential tools: First, macroeconomic indicators specific to each instrument would need to be triggered. At that point, an assessment of the empirical robustness within that set of indicators would take place. Based on those results, regulators would decide whether there was a downswing in the financial cycle, or whether a crisis requiring intervention was taking place. When intervention is required, market-based indicators would guide regulators to the proper measure of response. “Based on what you see in these indicators, you can decide whether to tighten or loosen these macroprudential tools,” said Adrian.

Adrian said that the remaining work lies in more accurately predicting how these risks will crystallize into crises ahead of time. “These are all questions that, as economists, we like to have models to answer.” In particular, he pointed to uncertainty over the ways that different tools and indicators interact, and how the market might react to the action or expected action of Fed regulators. He said these are areas where the academic side might help regulators gain greater understanding through further empirical work.

Shifting from macroprudential to monetary policy instruments, James Clouse—representing the Division of Monetary Affairs at the Federal Reserve’s Board of Governors—explained the ways that attitudes about monetary policy at the Fed have evolved over the duration of the recent financial crisis. Clouse described an emerging view that balances concern for overly “activist” policy maneuvers against the potential of monetary policy as a powerful tool for addressing financial stability.

Clouse noted numerous viewpoints on the matter, including concerns:

  • that low-interest policies may be creating incentives leading toward long-term financial imbalances;
  • that a shift away from accommodative monetary policy could have dramatically negative consequences; and
  • that addressing risk by tightening monetary policy will inevitably require a risk-reward calculation based on delicate probability analysis.

That is why, Clouse said, the Fed, like many other central banks, has been working on numerous stress test scenarios aimed at shedding light on the different ways that monetary policy—among other policy tools—might affect systemic risk within the macroeconomy. But Clouse cautioned that stress testing remains useful only as long as too many aspects of the economy are not aggregated to the point of losing relevance. “Stress testing is helpful, but you have to be pretty careful about the elements you choose to test.”

The long-term goal, in Clouse’s estimation, is to enable policymakers to weigh the costs of shifting monetary policy with the most robust models possible. “Policymakers care about these issues, they just don’t currently integrate them into their working toolkit,” he said.

Discussant Christopher Sims’ response to the Fed presentation was blunt: “My first impression from a lot of these papers is just how much we don’t know,” said Sims. He said the lack of transparency in the data behind some of the policy tools presented made him uneasy.

Moreover, he added, his general takeaway from the underlying literature is that Fed policy is trying to fight the natural cycles of markets in order to mitigate some of their underlying risk. The problem, in Sims’ view, is that stabilizing any one aspect of the economy will always result in destabilization elsewhere. Sims said flattening the cyclical nature of the economy could limit the potential for innovation and entrepreneurship that takes place in the riskier corners of the economy. “There will be a cost if we try to dampen natural cycles of boom and bust,” he noted.

Criticisms aside, Sims said that providing policymakers with simple probabilities for how bad errors in policy judgment might be is a complex, perhaps impossible, task, given the number of uncertainties that can’t be resolved with data. “We need dynamic probability models to help us guide these decisions,” he said. “It’s not unlike simulating a nuclear reaction.”

As the morning of Federal Reserve panels continued, Fernando Duarte of the Federal Reserve Bank of New York delved into his work analyzing a specific source of systemic risk in the US economy: spillover effects resulting from fire sales. Examining fire sales documented in US bank holding company data offered a case study of the ways that risk in one market can cascade into other connected markets.

Looking at real-time market data as well as stress test results, Duarte concluded that fire sales can have a measurable impact on the economy, even in good times; vulnerabilities had been building up since 2005, but their effects were amplified two to three times during the crisis. Interestingly, the top ten largest banks in the study accounted for 80 percent of the externalities observed. The amount of leverage and connectedness of those larger banks likely contributed to their disproportionate share of the risk.
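The basic spillover arithmetic behind estimates like these can be sketched in a few lines (a stylized illustration of fire-sale externalities, not Duarte’s actual methodology; the function, parameter names, and numbers below are assumptions):

```python
# Stylized fire-sale spillover: a shocked bank sells assets to restore its
# target leverage; the sale moves the asset price, and the price move imposes
# mark-to-market losses on every other bank holding the same asset.
def spillover_losses(holdings, equity, shocked, shock,
                     target_leverage=10.0, price_impact=1e-4):
    """holdings: each bank's position in a common asset; equity: each bank's
    equity; shocked: index of the bank whose equity falls by `shock`."""
    new_equity = equity[shocked] - shock
    # Assets the shocked bank must sell so assets / equity returns to target.
    sales = max(holdings[shocked] - target_leverage * new_equity, 0.0)
    price_drop = price_impact * sales          # linear price impact (assumption)
    return {i: holdings[i] * price_drop        # losses imposed on other banks
            for i in range(len(holdings)) if i != shocked}

# Three banks holding the same asset; bank 0 loses 10 units of equity.
print(spillover_losses(holdings=[500.0, 300.0, 200.0],
                       equity=[50.0, 40.0, 30.0], shocked=0, shock=10.0))
```

In a sketch like this, the largest and most leveraged holders both generate and absorb the biggest spillovers, consistent with the concentration of externalities among the largest banks noted above.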

Duarte concluded that given the higher risks faced by those larger, highly leveraged and highly connected banks, the capital injections made under TARP came pretty close to optimal in terms of where they could be applied within the network of interconnected financial institutions. From those larger institutions, the capital had the best chance of being distributed throughout the system, he said.

Nellie Liang of the Federal Reserve Board of Governors turned to the issue of implementation, singling out three key policy issues for discussion: reducing runs in shadow banking, designating financial institutions of systemic importance, and installing countercyclical capital buffers to prevent feedback loops between adverse conditions within firms and the broader economy.

Nellie Liang

“We can’t eliminate fire sales, but we do want to reduce the costs,” said Liang. There is evidence that establishing a government backstop against runs on short-term wholesale debt could make fire sales less likely.

Following what happened to AIG during the financial crisis, Liang said that tools for addressing the systemic risk that arises from spiraling maturity and credit transformations have been developed under requirements set out by the Dodd-Frank Act, including margin requirements for secured funding and regulations on bank liquidity and capital.

Liang said that future microprudential tools would aim to ensure the solvency of individual banks and firms, through regulations on the minimum capital they must hold and through supervisory review of how banks handle stress testing under multiple dynamic scenarios. Macroprudential tools will aim to mitigate externalities by using surcharges on systemically important financial institutions and higher capital buffer requirements in growth periods, when capital is cheap. The idea is to build up resilience in the economy against the risks that accumulate behind the scenes of good economic conditions, said Liang.

But Liang acknowledged that crafting macroprudential policy tools that don’t introduce new sources of systemic risk into the system presents a challenge. Disclosure of policy guidelines and stress test results can enhance the discipline within firms, but can also encourage them to hold capital in anticipation of macroprudential action, said Liang.

The hope is that banks cannot easily hedge against short-term movements in macroprudential policy. This is just one dilemma, concerning the timing and degree to which each macroprudential tool is best applied, that regulators and economists will have to consider as such policies are implemented.

Nobuhiro Kiyotaki of Princeton University joked that his comments were just a “Mickey Mouse analysis” of the work, adding, “But don’t underestimate Mickey Mouse.”

Kiyotaki pointed out that, according to the literature, Dodd-Frank’s main weakness is its inability to stop a bank run. While the act established a regulatory framework for comprehensive stress testing at a macroeconomic level, it doesn’t address the issue of runs within the shadow banking sector. That poses a need for more comprehensive systemic risk management.

Institutions in the shadow banking system should be identified as systemically important financial institutions and subjected to stress testing, to improve data collection on and disclosure of their resilience to potential shocks, he said.

But how are these policies allocated to the tasks they can address? Kiyotaki said that depending on the business cycle, it might be more appropriate for the government to act as a lender of last resort, or strictly as a monetary authority, in response to a given crisis.

Honing the exact criteria for when a specific policy tool is set into motion will be critical for properly addressing future crises. Moreover, he noted a demonstrated need for both a forward-looking systemic risk monitor to better inform the use of tools at regulators’ disposal, as well as a better contingency plan for what regulators will do once a crisis is already in motion.

But at a deeper level, Kiyotaki pointed out the inherent tradeoff that must be considered with macroprudential policy tools: resilience, or growth—or, paraphrased from Carlos Diaz-Alejandro, financial repression or financial crash.

Higher capital requirements will make the economy more resilient, but requirements that are excessive relative to the liabilities on banks’ balance sheets could constrict the growth of businesses in the marketplace. Regulators have a tricky target to hit in that regard, and the target moves as the economy moves up and down.

Acknowledging these challenges, Kiyotaki advised regulators not to neglect less obvious but equally important issues that could arise as macroprudential policy is rolled out. How do you evaluate a tool’s effectiveness, and eliminate the waste of resources on policy that isn’t working as intended? How do you address the limited credit availability that results from the slowness with which banks recognize losses and recapitalize following a shock to their balance sheets?

Thinking through more basic issues may make the more complex dilemmas more tractable in the long run.

—Mark Riechers

Tobias Adrian, Federal Reserve Bank of New York
James Clouse, Federal Reserve Board
Fernando Duarte, Federal Reserve Bank of New York
Nellie Liang, Federal Reserve Board
Discussant: Nobuhiro Kiyotaki, Princeton University
Christopher Sims, Princeton University
MFM Fellowship Awardee Presentations

Intermediation and Voluntary Exposure to Counterparty Risk

Maryam Farboodi presented a stylized model of network formation in which banks form interconnections to mediate the flow of funds from lenders to investors with valuable but risky investment projects. She defines a reasonable notion of competitive equilibrium in this framework and shows that individual banks find it profitable to unnecessarily expose themselves to counterparty risk that makes the system less stable.

In her network model, connections take the form of loan agreements, which delineate the set of possible loans in the future when banks realize who has projects and who can merely intermediate funds. The model has two types of banks and three time periods.

Maryam Farboodi

Banks of type I (investors) have no depositors but a positive probability of receiving a risky but valuable project; banks of type NI (never investors) raise one unit of deposits and never get the investment project. At time t=0, each NI bank raises one unit of deposits and all banks make bilateral agreements for loans at time 1, contingent on the arrival of investment projects.

At time t=1, some of the I banks get the opportunity to invest, and a subset of interbank connections become realized loans. No loans can be made between banks that did not form a connection at t=0. At time t=2 the returns from the projects are realized and loans are either repaid or banks fail.

In the model, banks can earn rents by intermediating between borrowers and lenders. This means that an I bank can put itself between a bank with funds to loan (say, an NI bank) and a bank with a need for funds (an I bank with a project). In other words, if a bank doesn’t get a project at t=1, it can instead earn some of the surplus from the project by intermediating funds between the two other banks. The important friction in the model is that banks cannot negotiate these rents down after the project return is realized, so that they are “exposed” to the risk that the project fails and they can’t be repaid.

Such an “intermediating” I bank is earning ex ante rents but is unnecessarily exposed to the risk that the investing bank fails, in the sense that the investing bank could get the funds directly from the lending bank. In the event of project failure, all three banks in the chain fail; if the middle bank had not intermediated the funds it would have been safe.
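The private-gain-versus-systemic-cost tradeoff in that chain can be made concrete with a small sketch (an illustration of the mechanism described above, not the paper’s formal model; the probability and rent values are arbitrary):

```python
# Stylized three-bank chain: an NI bank lends one unit of deposits, an I bank
# with no project intermediates it for a rent, and an I bank invests it in a
# project that succeeds with probability p.
def compare_structures(p=0.9, rent=0.05):
    # The rent is collected only if the project succeeds, so intermediation
    # is privately attractive to the middle bank in expectation...
    middle_bank_expected_rent = p * rent
    # ...but it adds the middle bank to the set of institutions that fail
    # when the project fails.
    expected_failures = {
        "direct lending (NI -> investing I)":            (1 - p) * 2,
        "intermediated (NI -> middle I -> investing I)": (1 - p) * 3,
    }
    return middle_bank_expected_rent, expected_failures

gain, failures = compare_structures()
print(f"middle bank's expected rent: {gain:.3f}")
for structure, ef in failures.items():
    print(f"{structure}: expected bank failures = {ef:.2f}")
```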

Farboodi shows that this inefficient competitive equilibrium is a general feature of her model, and compares it to the ex ante efficient equilibrium in which a single NI bank intermediates funds from all other NI banks to each I bank. This structure is formally equivalent to a centralized clearing party.

Jonathan Parker asked how collateralized lending, such as the repurchase agreements that were at the heart of the recent financial crisis, would fit into such a model, since in principle it could mitigate the failure of intermediate banks. Farboodi agreed but noted that during the crisis, intermediating banks took heavy losses on their collateralized positions, possibly due to fire-sale effects such as those presented in other sessions at this conference. Because these positions turned out to be not nearly as safe as they initially appeared, effects like those in her model could still be important.

The Pass-through of Sovereign Risk

Luigi Bocola presented a model that provides a quantitative framework for understanding how sovereign credit risk can impact real economic activity, and how macroeconomic policy can limit these effects.

His model includes households, banks, productive firms, capital-goods producers, and a government that can default on its debt. In the model, households must save through banks, which are the marginal investors in all asset markets. At the same time, there is exogenous stochastic variation in the probability of a sovereign default.

Luigi Bocola

The possibility of a sovereign default impacts firms through two channels. The first operates through the model’s leverage constraints: a higher probability of sovereign default tightens the constraints faced by banks by lowering the market price of the assets that they use as collateral. The second is a risk channel: Bocola shows that sovereign default risk is a priced factor in his model—in other words, a risk factor for which investors in corporate securities demand a higher expected excess return. Capturing both effects requires a global solution method that allows for time-varying risk premia, in contrast to standard macroeconomic models, which are linearly approximated.
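A stylized way to see the first channel (the notation here is illustrative, not the paper’s exact equations) is a leverage constraint of the form

$$ Q_t s_t + q_t b_t \;\le\; \kappa\, n_t , $$

where $s_t$ are claims on firms priced at $Q_t$, $b_t$ are government bonds priced at $q_t$, and $n_t$ is bank net worth. News that raises the probability of a sovereign default lowers $q_t$, which shrinks net worth and forces constrained banks to cut their lending to firms; the risk channel then adds to this by raising the compensation banks demand for holding corporate claims.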

Bocola solves the model numerically using a global solution method, which is especially challenging because the model features seven state variables. To estimate the model, he employs a two-step procedure: in the first step he generates posterior distributions for most model parameters using Italian data from 1999 to 2009, when sovereign default was not a major concern. In the second step he estimates the sovereign-risk process using Italian CDS spreads from the euro-area crisis period.

Bocola then uses the estimated model to extract the component of expected excess returns on corporate securities from 2010–2011 arising from exposure to sovereign risk. He finds that returns are 47 basis points higher on average because of the increased risk of sovereign default, and that 74 percent of this effect comes from binding leverage constraints on banks.

He also applies the model to the European Central Bank’s longer-term refinancing operations, noting that two of the mechanisms through which these operations might have worked—changes in the price level and reducing sovereign default risk—are not present in the model. He finds that the effects of such policies were weak in 2011, because high corporate credit risk rather than tight borrowing constraints seemed to be driving the lack of investment.

Andrew Lo was curious what the model has to say about the feedback channel present when banks are guaranteed by the government at the same time that they hold government securities. Although sovereign risk in the model is exogenous, it might be possible to endogenize it by adding this feature.

Financial Crises and Policy

Andrea Prestipino presented a model that generates some features of financial crises but is also useful for analyzing the welfare consequences of policy interventions. In particular, this model can examine effects of “bailout” policies in which the government forces the transfer of resources from households to banks. Such a policy is not unlike some of the unconventional actions undertaken by the central bank during the last financial crisis.

Prestipino constructed a dynamic stochastic general equilibrium (DSGE) model with incomplete markets and heterogeneous agents, in which an agency constraint occasionally limits the ability of banks to raise funds from depositors.

Andrea Prestipino

The agency problem in the model is that bankers can “steal” funds from depositors, so depositors must be careful to lend no more than the value to the banker of staying in business and not stealing. This means that financial shocks that lower the net worth of bankers can have real effects on the economy, because depositors in the model can only invest in risky but valuable projects through banks. When banks’ net worth drops low enough, valuable projects remain unfunded because the banks cannot credibly promise not to abscond with the funds.
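A minimal way to write down this friction, in the spirit of standard banker-agency models (the notation is illustrative rather than the paper’s own), is an incentive constraint of the form

$$ V_t \;\ge\; \theta\, Q_t s_t , $$

where $V_t$ is the banker’s continuation value from staying in business, $Q_t s_t$ is the market value of the assets under management, and $\theta$ is the fraction of those assets the banker could divert. Depositors supply funds only while the inequality holds, so a shock that erodes bank net worth lowers $V_t$ and shrinks the funding banks can raise.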

Prestipino noted that current macroeconomic models are usually “solved” by applying a local approximation around the model’s implied nonstochastic steady state, initially assuming (and later verifying) that the constraint is binding at that point. This technique implies that the economy behaves symmetrically to positive and negative shocks, which seems to be at odds with the data.

By contrast, Prestipino uses a global solution concept in which the constraint is binding only in “bad times.” Such a framework allows the economy to respond differently in different states. He finds that the welfare costs of bailout policies are much lower in crisis periods, and that in fact, for severe crises, both banks and households can benefit from bailouts.

Nellie Liang questioned why Prestipino only considered a policy of bank bailouts; why not bail out the households directly? Prestipino replied that the model has a “brute force” assumption that banks are necessary, and that households cannot invest without them. Because there is no friction on the household side, there is nothing to bail out. Indeed, taking resources from the banks and giving them to households would have the exact opposite effect as the bailout policy in his model.

Fernando Duarte noted that a model like this one is important because it has the potential to help us understand the quantitative impact of financial frictions. Towards that end, he asked how well the model did in matching moments in the data, and how such a metric could be used to assess the credibility of his welfare analysis. Prestipino agreed that this issue is important and central to his analysis, but acknowledged that he was still working on a close fit to the data.

Financial Crises and Systemic Bank Runs in a Dynamic Model of Banking

Roberto Robatto presented an infinite-horizon model of banking panics and flight to quality in which the demand for money is endogenous, allowing him to study the potential for central bank liquidity injections to avert bad equilibria.

He compared the Friedman & Schwartz (1963) hypothesis— that the Federal Reserve turned a “normal” recession in the 1930s into the Great Depression by failing to increase the money supply enough— to the Federal Reserve’s aggressive and unconventional monetary policies in the recent crisis, which may have averted a much worse crisis than the one we saw.

However, standard models of banking panics either do not include money, or have an exogenous demand for money. In Robatto’s model, by contrast, panics and the ensuing flight to more-liquid assets are driven endogenously by agents’ beliefs, which may be manipulable by the central bank.

Roberto Robatto

The key friction in the model is asymmetric information: depositors are unable to immediately distinguish which banks have been hit with a negative shock. In the model’s “good” equilibrium, even the banks that are hit can survive (are solvent), but the model can generate self-fulfilling panics in which depositors “run” from the banks and the weak banks fail. Monetary injections are helpful because they improve conditions for the bad banks, although (depending on parameters) they can also amplify a “flight to quality” effect which mitigates their success.

A key mechanism in the model driving the multiplicity of equilibria is that debt contracts are nominal. This means that when the price of assets is high, even the weak banks that suffer a shock to their assets still have positive net worth and can repay their depositors. But if asset prices drop too low and the assets of weak banks are worth less than the face value of their debt, the banks’ net worth falls below zero. Depositors, expecting this and not knowing which banks are which, can trigger a run.
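In stylized terms (illustrative notation, not necessarily the paper’s), a bank that holds $k$ units of capital priced at $P$ and owes nominal deposits $D$ has net worth

$$ N \;=\; P k - D . $$

When agents coordinate on the high-price equilibrium, $N > 0$ even for the banks hit by the shock and all depositors are repaid; if depositors instead expect a low $P$, the weak banks’ net worth turns negative, running becomes individually rational, and the ensuing flight to money helps validate the low asset price.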

Monetary policy works in this model by increasing the price of capital. Robatto ensures that this mechanism is different from other nominal models of bank runs, in which the central bank “inflates away” the crisis, by assuming that the central bank cannot raise the price of capital to the good-equilibrium level.

With this restriction in mind, he considers two types of injections: direct asset purchases, and loans to banks that have the same seniority as deposits. The latter assumption is important, as it implies that the central bank is exposed to the risk that the banks fail, which encourages the private sector to use financial intermediaries. A key result of the model is that the central bank cannot avert self-fulfilling panics solely through asset purchases; it must lend money directly to banks.

Tobias Adrian observed that the model is really about insolvency, rather than illiquidity. Given that, he questioned why Robatto had not studied deposit insurance instead of monetary injections. Robatto answered that the banks in his model are engaged in significant maturity transformation and are weakly regulated. In this sense they are more like the “shadow banking system” at the heart of the last financial crisis, than the commercial banking system where deposit insurance has been shown to avert self-fulfilling bank runs.

Guido Lorenzoni asked why the model rules out the obvious “inflate away the crisis” solution by decree. Typically bank-run models add a fire-sale externality to avoid this outcome. Robatto noted that he thinks his restriction on the central bank is realistic, because in the last recession the central bank intervened and prices did not move very much, whereas during the Great Depression, the Federal Reserve failed to act and prices dropped dramatically.

It therefore seems unlikely that the best central bank policy is a drastic increase in prices. On the other hand, adding an explicit fire-sale mechanism to the model would probably only strengthen his results.

—Aaron Pancost

Luigi Bocola, Northwestern University and Federal Reserve Bank of Minneapolis
Andrea Prestipino, Federal Reserve Board
Roberto Robatto, University of Wisconsin-Madison
Banks' Risk Exposures
Monika Piazzesi, Professor of Economics, Stanford University
Discussant: Jonathan A. Parker
Structural GARCH: The Volatility-Leverage Connection

Robert Engle presented work with Emil Siriwardane that helps explain the empirical connection between a company’s leverage and the volatility of its stock price.

The authors estimate a structural model of credit risk. The framework is an extension of the classic Merton (1974) distance-to-default approach, in which equity is modeled as a call option on the underlying assets of the company, with strike price equal to the face value of the debt. This model allows them to back out the underlying asset value of a company from the prices of its observed securities and accounting data.

Robert Engle

Engle began by showing how the Merton model, when the asset value follows a geometric Brownian motion, implies a time-varying “leverage multiplier” that amplifies equity volatility relative to the underlying asset volatility. He then argued that a flexible generalization of this effect captures in a parsimonious way several extensions to the standard GARCH framework, including fat-tailed distributions and asymmetric volatility models.

With a single parameter, the framework nests both the standard Merton model and a model in which leverage has no effect on volatility.
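A minimal sketch of the leverage-multiplier idea (not the authors’ code; the exponent name phi, the parameter values, and the Black-Scholes inputs are illustrative assumptions):

```python
# In the Merton model, equity is a call option on the firm's assets, so equity
# volatility equals asset volatility amplified by LM = delta * A / E. A single
# exponent phi then nests "no leverage effect" (phi = 0) and full Merton
# amplification (phi = 1).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_leverage_multiplier(A, D, sigma_A, r=0.02, tau=1.0):
    """A: asset value, D: face value of debt, sigma_A: asset volatility."""
    d1 = (log(A / D) + (r + 0.5 * sigma_A ** 2) * tau) / (sigma_A * sqrt(tau))
    d2 = d1 - sigma_A * sqrt(tau)
    equity = A * norm_cdf(d1) - D * exp(-r * tau) * norm_cdf(d2)  # call value
    delta = norm_cdf(d1)                                          # dE/dA
    return delta * A / equity

lm = merton_leverage_multiplier(A=100.0, D=90.0, sigma_A=0.05)
for phi in (0.0, 0.5, 1.0):
    print(f"phi={phi:.1f}: equity volatility = {lm ** phi * 0.05:.3f}")
```

For a highly levered firm the multiplier is large, so small movements in asset value translate into much larger swings in equity value.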

The authors then assume an asymmetric GARCH model for the underlying asset value and use quasi-maximum-likelihood methods to estimate the model’s parameters on the equity returns of 88 separate financial corporations. They find that the median firm accords closely with the Merton model, so that leverage is indeed important for understanding equity return volatility.

The results can also be used to understand whether asymmetric volatility in equity returns comes solely from leverage, or from other sources. The authors find that after accounting for leverage, there is still substantial asymmetry left in equity return volatility, possibly coming from risk aversion.

Digging into the asymmetry estimates further, the authors show that firms with more leverage have a higher “asymmetry gap”—that is, a larger difference between the volatility asymmetry of their equity returns and the underlying asset return. Furthermore, firms whose returns have higher market betas also display more asymmetric underlying-asset returns, suggesting that perhaps these firms reflect more systematic risk.

Engle noted that the main purpose of the structural estimation is to aid in systemic risk measurement. To further this goal, he and his co-author apply the structural GARCH model to SRISK, Engle’s earlier analysis of systemic risk measurement (presented at a previous MFM conference).

That analysis estimates the correlation of firm equity returns to broad market shocks to answer how much each firm’s equity value would suffer in the event of a 40 percent market decline over a period of six months. Previous SRISK calculations assume that leverage stays constant, even though in the event of a collapse in its equity price, a firm’s leverage rises dramatically. To analyze the impact of this effect, including the indirect effect of the change in the leverage multiplier itself as leverage increases, the authors add the structural GARCH model as a component to the SRISK calculation.
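For reference, the kind of capital-shortfall calculation SRISK is built on can be sketched as follows (a minimal illustration with made-up numbers and an assumed 8 percent prudential capital ratio, not necessarily the exact specification used in this work):

```python
# Sketch of a crisis capital-shortfall calculation in the spirit of SRISK.
def srisk(debt, equity, lrmes, k=0.08):
    """Expected capital shortfall in a crisis in which equity falls by LRMES.
    k is a prudential capital ratio (8% here, an assumption)."""
    crisis_equity = (1 - lrmes) * equity      # equity after the stress decline
    required = k * (debt + crisis_equity)     # capital required against crisis assets
    return required - crisis_equity           # shortfall (negative = surplus)

# A firm with $900bn of debt, $100bn of equity, and a 60% long-run marginal
# expected shortfall would need roughly this much extra capital in a crisis:
print(f"SRISK = ${srisk(debt=900, equity=100, lrmes=0.60):.1f}bn")
```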

They find that even as aggregate equity volatility began to rise in late 2007, the aggregate underlying asset volatility didn’t spike until the Lehman event in the third quarter of 2008. The reason can be traced to the behavior of the estimated leverage multiplier.

Turning to individual-firm estimates, Engle showed that the size of the long-run marginal expected shortfall (LRMES) estimated using the structural GARCH model—which takes into account the direct effect of leverage and the indirect effect of changes in the leverage multiplier—is significantly higher than their previous non-structural estimates, except at the height of the crisis.

Discussion

Andrew Lo

Andrew Lo began his discussion by remarking that financial and macro economists seem to approach the same problem from different angles, but that he liked this paper because it uses elements of both. To illustrate his point, he quickly re-derived their model using the second proposition of Modigliani and Miller’s famous 1958 paper, at the same time showing that by using Ito’s Lemma they could avoid some of the approximations they employ. The added advantage of this derivation is that it clearly shows how the leverage multiplier is really an elasticity.

Lo went on to question whether the model should really be called “structural,” because the formula for the leverage multiplier relies on the assumed stochastic process for the exogenous underlying-asset value. He showed this by re-deriving the model again in closed form, assuming that the asset follows a constant-elasticity of variance (CEV) process instead of a geometric Brownian motion.

Although the resulting formula for the leverage multiplier contains gamma functions and convolutions of non-central chi-squared distributions, among other things, Lo quipped that “it looks ugly, but if you stare at it long enough, it has a beauty of its own.”

The key point, Lo went on, is that the Merton model is designed to be specific, whereas he thinks that Engle and Siriwardane are more interested in a flexible model. If specificity were the goal, then he urged them to consider whether the GARCH processes they are using are in fact consistent with the Merton framework.

If instead their goal is a general framework for understanding how leverage and volatility are connected, it may make more sense to use nonparametric methods either for the leverage multiplier or on returns themselves. In addition, they could also use implied volatilities from options markets as a gauge of each firm’s equity volatility.

Lo then showed a picture of today’s implied volatility from the volatility index (VIX), noting that although it can be difficult to tell plausible stories about the dynamics of the VIX, he doubts that a lot of it comes from changes in leverage.

Lo concluded with a few warnings about the use of the Merton model. First, financial firms have much more complicated capital structures than a single zero-coupon bond. Even just recognizing that they have multiple bonds that pay coupons involves pricing a so-called “compound option.” Bond indenture provisions and the like only make the matter worse.

Second, the Merton model prices the option by exactly replicating its payoff; it seems likely that this kind of replication is difficult in corporate asset markets, especially during financial panics when they are particularly illiquid.

Third, Lo mentioned some of his own work showing that the “leverage effect,” which is a prime motivator of this paper, is just as strong for firms that have never had any debt in their capital structure as it is for debt-financed firms. This suggests that leverage is not the only mechanism amplifying equity volatility relative to asset volatility.

Tobias Adrian

Tobias Adrian noted that book equity and market equity are conceptually very different: book equity is the residual value of assets minus liabilities, while market equity is the future discounted value of profits. The latter includes all manner of off-balance-sheet liabilities and other exposures, as Lo mentioned, but also (for example) the franchise value of the firm.

The Merton model does not account for any of this, though perhaps the authors can answer this by arguing that the underlying asset value they estimate is some kind of “shadow” asset value that incorporates these issues, rather than an accounting number.

—Aaron Pancost

Robert Engle, Michael Armellino Professor in the Management of Financial Services, New York University, Stern School of Business
Discussant: Andrew W. Lo
Systemic Risk and Stability in Financial Networks

Systemic risk is characterized by the cascading failure of the entire financial system, triggered by some idiosyncratic event and exacerbated by conditions in financial intermediaries. It has been widely recognized that the architecture of the financial system is related to the overall level of systemic risk, yet the mechanisms by which risk originates and spreads throughout financial networks remain, at best, imperfectly understood.

Two polar views on the relationship between the structure of the financial network and systemic risk have been suggested. The first is based on the wisdom of risk sharing. For example, Allen and Gale argue that the “incompleteness” of the financial network can be a source of instability, as individual banks are overly exposed to the liabilities of other banks. A more “complete” financial network, with a more equal distribution of interbank liabilities, implies that the burden of any potential losses is shared among more banks, creating a more robust system.

The opposite view, based on the wisdom of “too interconnected to fail,” hypothesizes that it is precisely the highly interconnected nature of the financial system that contributes to its fragility, as interbank liability linkages actually facilitate the contagion of financial distress and solvency problems.

In view of the conflicting perspectives, Haldane conjectured that highly interconnected financial networks may be “robust-yet-fragile” and that they “exhibit a knife-edge or tipping point property.” Haldane argues that “within a certain range, connections serve as shock-absorbers [and] connectivity engenders robustness.” Beyond that range, interconnections start to facilitate the propagation of shocks.

Daron Acemoglu

Daron Acemoglu of MIT presented work that provides a tractable theoretical framework for formalizing Haldane’s conjecture of the robust-yet-fragile property of interconnected financial networks. He showed that the same features that make a financial network structure more stable under certain conditions may function as significant sources of systemic risk and instability under others. In other words, financial networks manifest a “phase transition” property from one condition to another.

The authors compare a ring-shaped network, where financial institutions are linked to the creditors immediately adjacent to them in the network, and “complete” networks, where financial interconnections are more broadly shared.

As long as the size or the number of negative shocks is below a critical threshold, the ring-shaped financial structure is the most prone to financial contagion and the most fragile, while the complete financial network is the least fragile. The intuition aligns with the risk-sharing idea argued by Allen and Gale, among others.

In the complete financial network, the losses of a distressed institution are divided among as many creditors as possible. This risk-sharing mechanism guarantees that the excess liquidity in the financial system can fully absorb the transmitted losses.

In the ring-shaped financial network, however, the losses of the distressed institution are fully absorbed by its immediate creditor, leading to a high likelihood of default of that creditor. That creditor’s distress, in turn, is strongly transmitted to its own creditor through the same mechanism. Thus a domino effect can be triggered, leading to the failure of a large portion of the financial network.

However, most interestingly, when the size of the negative shock is sufficiently large, the complete financial network exhibits a form of phase transition: it flips from being the most to the least stable and resilient network.

Given that all institutions in the complete network are creditors of the distressed bank, the adverse effects of the negative shock are directly transmitted to them. Thus, when the size of the negative shock is large enough, not even the originally non-distressed institutions are capable of paying their debts in full, leading to default of all institutions.
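A toy simulation conveys the flavor of this phase transition (a deliberately stylized sketch, not the paper’s formal model; the buffer size, liability size, and loss-passing rule are all assumptions):

```python
# Toy contagion exercise: each bank has a loss-absorbing buffer e and owes
# total interbank liabilities L, split equally among its creditors. A shock
# hits bank 0; losses a bank cannot absorb are passed to its creditors, capped
# at what it owes them. (Losses arriving after a bank has already defaulted
# are not re-propagated in this simple sketch.)
def cascade(n, creditors, e, L, shock):
    losses = [0.0] * n
    losses[0] = shock
    defaulted, frontier = set(), [0]
    while frontier:
        nxt = []
        for i in frontier:
            if i in defaulted:
                continue
            excess = losses[i] - e
            if excess > 0:
                defaulted.add(i)
                share = min(excess, L) / len(creditors[i])
                for j in creditors[i]:
                    losses[j] += share
                    nxt.append(j)
        frontier = nxt
    return len(defaulted)

n, e, L = 10, 0.5, 6.0
ring = {i: [(i + 1) % n] for i in range(n)}       # each bank owes one neighbor
complete = {i: [j for j in range(n) if j != i] for i in range(n)}

for shock in (2.0, 10.0):   # moderate shock vs. one exceeding the system's buffers
    print(f"shock={shock}: ring defaults={cascade(n, ring, e, L, shock)}, "
          f"complete defaults={cascade(n, complete, e, L, shock)}")
```

With the moderate shock, spreading the loss across many creditors keeps it below each bank’s buffer, so the complete network contains the damage while the ring passes it along; once the shock is large enough, the same interconnections transmit failure to every bank in the complete network.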

The authors highlighted the presence of a financial network externality that cannot be internalized via simple bilateral contractual relations. This provides an important insight: financial stability serves as a “public good” that is likely to be under-provided in equilibrium, so equilibrium financial networks may be excessively prone to the risk of financial contagion.

This work inspires a sequence of future research questions. First, discussant Jennifer La’O of Columbia University pointed out that a more complete and dynamic analysis of network formation in the presence of counterparty risk is a key next step.

Second, Jonathan Parker of MIT suggested that relaxing the assumptions that agents know the full structure of the network and that a “central clearing house” exists would have a significant impact on the equilibrium outcomes. Third, Robert Engle of New York University pointed out that it is important to balance the welfare of the financial institutions within the financial network against the welfare of other major players in the real economy outside the network, instead of focusing on welfare within the financial network alone. Fourth, Sujit Kapadia of the Bank of England emphasized that the skewed size distribution in financial network data is pronounced—a feature that is absent from the analysis in this paper.

Investigating the interaction between financial linkage structure and the size distribution of institutions in the network is clearly needed. Empirical studies, informed by the theoretical results in this paper, would enable us to measure the key components that play a role in financial stability.

—Winston Dou

Daron Acemoglu, MIT
Discussant: Jennifer La'O, Columbia University