Liquidity Stress Model

What went wrong with Liquidity Stress Models in recently failed banks and how does it reflect on Stress Exercises going forward?

The recent string of bank failures raises serious questions about the effectiveness of the Liquidity Stress modeling process within these financial institutions. Even if the failures were primarily caused by errors in judgment by Senior Management while the models performed flawlessly, the collapses offer an opportunity to examine the Liquidity (and ultimately Capital) Stress models used within these banks and to reflect on how the models and the process surrounding them can be improved to help avoid bad outcomes in the future. Notably, while SVB and Signature were ‘large’ banks, they both fell below the revised 2018 limits for CCAR and LCR style stress testing, even though they still would have had to undergo ILST style liquidity stress exercises. The latter exercises were among the primary regulatory tools/processes introduced post-2008 to prevent a banking crisis, so if the current failures spread to CCAR/LCR regulated banks, scrutiny of their stress process will be a major focus (and they will probably be in regulatory focus anyway). Hence, looking forward, it is reasonable to ask what changes may be on the regulatory horizon for these models and exercises.

1 Note that the regulators can change the results and outcomes of these exercises substantially just by changing the scenario sets in focus, even if they do not change the substance of the regulatory guidance.

Modeling a bank’s depositor base

One of the questions which arises regarding Liquidity Stress modeling in Silicon Valley Bank (SVB) relates to its sophistication when modeling the bank’s deposit base under stress. Various sources point out that the SVB deposit base may have been unique in deleterious ways including:
• It was an outlier in having relatively few low-risk retail deposits, which are generally FDIC insured 2.
• It had a few large deposits relative to its overall deposit base.
• The deposit base was not very diverse in terms of industry (and possibly location).
• Its deposit base was more prone to leave promptly than that of an average bank, for other reasons.

Ideally, these differences should have been reflected in the bank’s Liquidity Stress modeling, i.e., a good stressed deposit model distinguishes between a diverse deposit base with many relatively small depositors and a deposit base with a few large, potentially ‘fast money’ depositors. It is not yet clear whether the failed banks in question made this distinction in their internal modeling and captured the resulting consequences if a few large depositors pulled their funds. In the SVB case, for instance, it will be interesting to discern what it viewed as an extreme liquidity stress relative to the size of some of its key deposit accounts and whether that view would have withstood more in-depth scrutiny.

2 See, e.g.,


One of SVB’s main issues was that its deposit base consisted of very large depositors (well over the $250K FDIC threshold). Many were tech startups depositing their short-term cash, payrolls, and loan payments. When these large depositors started to get concerned and pulled their deposits, SVB’s short-term lending investments were not sufficient to offset the loss of funds, and SVB’s long-term HTM investments in US Treasuries experienced large MTM losses as they had to be unwound or moved to AFS accounting. SVB’s Liquidity Stress modeling either failed to reasonably predict the speed with which the bank’s deposit base might be depleted under stress, or its contingency plans suggested it could raise new capital in these situations. As a result, deposit run-off during stress was projected to be gradual over time but came much faster than anticipated, with direct consequences. In the aftermath, we expect regulators (and banks!) to scrutinize more closely their stressed deposit model assumptions and perhaps even the underlying models themselves. This will come either in the form of more refined models and/or more strenuous scenarios/criteria for judging depositor liquidity risk under stress.
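To make the distinction between deposit bases concrete, the following is a minimal, hypothetical sketch (all balances and withdrawal probabilities are invented for illustration) of why a concentrated deposit base produces much fatter-tailed stressed outflows than a diversified one with the same expected run-off:

```python
import random

def simulate_outflows(balances, p_withdraw, n_sims=5000, seed=0):
    """Monte Carlo of one-period stressed outflows: each depositor
    independently withdraws its full balance with probability p_withdraw.
    Returns sorted outflows as fractions of the total deposit base."""
    rng = random.Random(seed)
    total = sum(balances)
    return sorted(
        sum(b for b in balances if rng.random() < p_withdraw) / total
        for _ in range(n_sims)
    )

def tail(outflows, q=0.99):
    """q-th percentile of the simulated outflow distribution."""
    return outflows[int(q * len(outflows)) - 1]

# Two deposit bases with identical totals: 1,000 small depositors
# versus 10 large ones (think 'fast money' concentrated accounts).
diverse      = [1.0] * 1000
concentrated = [100.0] * 10

d = simulate_outflows(diverse, p_withdraw=0.30)
c = simulate_outflows(concentrated, p_withdraw=0.30)

# Both bases lose ~30% of deposits on average, but the 99th-percentile
# outflow for the concentrated base is dramatically worse.
print(f"diverse:      mean={sum(d)/len(d):.2f}  p99={tail(d):.2f}")
print(f"concentrated: mean={sum(c)/len(c):.2f}  p99={tail(c):.2f}")
```

A liquidity buffer sized to the diversified base’s 99th-percentile outflow (roughly a third of deposits here) would be far too small for the concentrated base, where the same percentile can approach double the expected run-off.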

3 For instance, the following article suggests that SVB would have passed LCR style stress tests, and that the LCR may even have encouraged it to do the wrong thing ( valley bank would have passed the liquidity coverage ratio requirement/). If these arguments are true, then regulators may consider ways to refine the definitions used to calculate the LCR going forward.
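This argument can be made concrete with a stylized LCR-style computation (all balances and run-off rates below are invented for illustration; the ‘standard’ rates are loosely patterned on Basel III LCR categories): a bank whose deposit mix is skewed toward large uninsured accounts can pass under the prescribed run-off assumptions yet fail badly under the run-off speeds actually observed in a fast run.

```python
def lcr(hqla, deposits, runoff):
    """Stylized 30-day LCR: high-quality liquid assets divided by
    stressed outflows, where outflows = balance * assumed run-off rate."""
    outflows = sum(bal * runoff[cat] for cat, bal in deposits.items())
    return hqla / outflows

# Hypothetical deposit mix skewed toward large uninsured corporate accounts.
mix = {"insured_retail": 10.0, "uninsured_corporate": 90.0}

# Prescribed-style run-off rates vs. rates reflecting an actual fast run.
standard = {"insured_retail": 0.03, "uninsured_corporate": 0.40}
observed = {"insured_retail": 0.03, "uninsured_corporate": 0.85}

hqla = 40.0
print(f"LCR, standard run-off: {lcr(hqla, mix, standard):.2f}")  # > 1: passes
print(f"LCR, observed run-off: {lcr(hqla, mix, observed):.2f}")  # < 1: fails
```

If the prescribed rates understate realistic run speeds for a concentrated, uninsured base, the reported ratio can signal comfort precisely where the risk is largest, which is the sense in which the LCR definitions themselves may be refined.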

Taking shortcuts in ALM or Liquidity Stress Modeling

Another relevant question relates to the banks’ implementation of their ALM stress calculations. ALM management clearly contributed to SVB’s failure in the sense that the bank was running perhaps excessive duration risk between its short-term liabilities and its longer-dated assets relative to its potentially volatile deposit base. Was the extent of the duration risk sufficiently reflected in the bank’s liquidity (and/or capital) stress exercises? While projecting the stress losses experienced in longer-dated US Treasuries is a straightforward exercise (assuming reasonable stress scenario inputs; more on that below), predicting further unrealized losses in other assets under large stress moves can be challenging. For instance, some banks choose NOT to re-generate cash flow projections when calculating stress values for some asset-backed securities (for pragmatic reasons). While this practice may not make a large difference under more moderate movements in interest rates, large stress moves (like the recent Fed tightening) will exacerbate the errors. In addition, some banks focus on easier-to-manage-and-calculate parallel shift scenarios in their reporting (e.g., this is what is often reported for Net Interest Margin in financial reporting) even when large yield curve change scenarios are more significant.
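As a stylized illustration of how approximation error grows with shock size (a purely hypothetical bond; a linear duration-only loss estimate stands in here for the broader class of ‘no-repricing’ shortcuts), compare a full bond revaluation to a duration-only estimate under increasingly large parallel rate shocks:

```python
def bond_price(face, coupon_rate, ytm, years, freq=2):
    """Full revaluation: discount every coupon and the principal."""
    c = face * coupon_rate / freq
    n = years * freq
    y = ytm / freq
    return sum(c / (1 + y) ** t for t in range(1, n + 1)) + face / (1 + y) ** n

def modified_duration(face, coupon_rate, ytm, years, freq=2, bump=1e-4):
    """Numerical modified duration via a central difference."""
    p0 = bond_price(face, coupon_rate, ytm, years, freq)
    up = bond_price(face, coupon_rate, ytm + bump, years, freq)
    dn = bond_price(face, coupon_rate, ytm - bump, years, freq)
    return -(up - dn) / (2 * bump * p0)

# Hypothetical 10-year 2% Treasury-like bond, priced at par at a 2% yield.
face, cpn, y0, yrs = 100.0, 0.02, 0.02, 10
p0 = bond_price(face, cpn, y0, yrs)
dur = modified_duration(face, cpn, y0, yrs)

for shock in (0.005, 0.02, 0.04):  # 50bp, 200bp, 400bp parallel shifts
    full = bond_price(face, cpn, y0 + shock, yrs) - p0
    approx = -dur * shock * p0
    print(f"shock={shock:.2%}  full={full:7.2f}  duration-only={approx:7.2f}  "
          f"error={approx - full:6.2f}")
```

The error is negligible for a 50bp move but grows to several points of par for a 400bp move. Shortcuts that skip full cash-flow regeneration behave similarly, and for negatively convex assets such as many ABS the error runs in the more dangerous direction, understating losses.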

After the dust settles, it will be interesting to see whether model implementation choices like the ones just mentioned contributed materially to an SVB under-assessment of its ALM duration risk or whether Senior Management simply chose to accept that risk. In any case, we anticipate that regulators will look more closely at the ALM model and implementation assumptions made in stress testing exercises going forward.

Choice of scenarios

As noted above, a core question is whether the stress scenarios applied (for either liquidity or capital stress exercises) were stressful enough. All aspects of the stress modeling might have been operating properly, yet the whole exercise might have been foiled by stress scenarios that did not sufficiently reflect the current economic environment (or the bank’s idiosyncratic risk situation). Choosing the appropriate stress scenarios for formal liquidity (or capital) modeling has always been a difficult question. One can almost always find some scenario that will break any bank; the question then becomes whether that scenario is sufficiently plausible to warrant bank (and regulatory) attention in costly formal stress exercises. To force both a certain uniformity to the process and a sufficient level of stress, regulators now publish a standard suite of scenarios that must be used for CCAR/LCR or related formal capital and liquidity stress test exercises performed annually at banks. Assuming SVB’s (or related banks’) stress scenarios were not stressful enough, questions about the cause bifurcate into two sub-questions: if used by the banks 4, were the regulatory stress scenarios stressful enough, and if not, did the impacted banks examine additional stress scenarios of their own that might have better helped Senior Management foresee and act on the crisis?

4 As noted above, while SVB and Signature were relatively big banks, both fell under the 2018 threshold for formal CCAR/LCR treatment; hence, they had more discretion about the scenarios they chose to use in their stress testing.


Let’s first examine the regulatory scenarios in comparison to what has actually happened at the banks in question. In a recent WSJ opinion article, Joseph Mason and Kris James Michener claim that CCAR scrutiny would NOT have caught the SVB problems because the 2022 Fed scenarios were nowhere near harsh enough to capture the interest rate environment that has since transpired (i.e., their argument is focused on the ALM side of SVB’s problem). Looking at the 2022 CCAR stress scenarios, the Mason/Michener claim seems to be borne out: the Fed-published scenarios did not force banks to directly evaluate their capital under what is now the ‘base case’ scenario for most banks. This will no doubt be rectified in the future, given the sources of the bank failures and the pressure on regulators to respond to them (if only because it has now become the ‘Base Case Scenario’ more concretely). Note that while many banks combine their CCAR capital stress exercises with their liquidity stress evaluations and reporting (and there have been many strong advocates of this 5), many banks still perform the two exercises separately (and the regulators often examine them independently, utilizing different scenarios). So while the formal CCAR scenarios may be open to the Mason/Michener criticism above, the regulatory scenarios considered for liquidity may be different 6. In either case, however, we anticipate increased regulatory scrutiny both of the scenarios used and of the integrated nature of the exercises (i.e., does the liquidity stress clearly reflect the potential losses or other actions resulting from capital stress scenarios?).

Regulatory stress scenarios and requirements aside, the expectation has always been that banks will generate their own scenarios that particularly test their own idiosyncratic risks. Is there much evidence from their statements and financial reporting that SVB (or related banks) did this? The short answer appears to be ‘no’, though it is worth asking whether any banks below the CCAR/DFAST threshold report stress testing results for comparison. We anticipate that the push on banks to test their own idiosyncratic situations more thoroughly through internal scenarios will also grow as part of the fallout from the recent bank failures.

5 See, for example, Beverly Hirtle, “The Past and Future of Supervisory Stress Testing Design”.
6 But note the argument above that LCR style analysis, in isolation, would not have clearly flagged a problem at SVB either.

Counter-Performativity and Contingency Planning

One final note reiterated from the SVB crisis: plans to raise new capital during a crisis do NOT work well when embedded in contingency planning. The very act of signaling to the market that you need new capital while under stress often becomes the death knell of a bank. Hence, arrangements for raising new capital should be put in place well before a crisis transpires, and we expect regulators to take a closer look at the contingency plans utilized within stress exercises going forward.

Conclusions for Liquidity Stress modeling

Each large bank failure or related financial event brings with it lessons for stress testing, if only because it reveals new ways in which stress testing can be enhanced or needs to be refined. Time will tell to what degree the exercises failed for the banks in question 7, but the recent string of failures highlights some key issues that all banks must emphasize going forward (for internal and probably regulatory reasons), including:
• More sophisticated modeling of depositor behavior under stress when a depositor base is concentrated or otherwise not diverse 8.
• Accurate evaluation of the stress exposures arising from their ALM practices (e.g., duration risk), including cash flow re-assessment for large-move or curve-shift scenarios.
• Renewed calls for integrated Liquidity and Capital Stress testing for smaller banks.
• The need for banks to design their own stress scenarios, particularly focused on their idiosyncratic risks.
• Potential re-evaluation of contingency planning that involves raising capital under stress.

7 If indeed they did, or Senior Management simply chose to take risks that were clearly revealed to them.
8 We have not discussed the ‘cryptocurrency’ angle in detail because information on it has not clearly emerged, but perhaps encouraging depositors to move quickly back and forth between cryptocurrency and deposit accounts changes the dynamics of deposit liquidity stress testing going forward as well.


Greg Brozak, Ph.D. Senior Quantitative Analyst

Greg Brozak is a quantitative finance professional with 25+ years’ experience in developing and validating financial and risk management models across a wide range of product types and risk factors. He has broad experience in model risk management and model governance and has led Model Development and Model Risk teams at large financial organizations. Greg’s areas of model expertise include Mortgage and Credit models, Interest Rate/ALM modelling, Stress Testing, and Operational and Climate Risk models. Greg also has extensive experience setting up and overseeing model governance frameworks. Greg has a PhD from Northeastern University, an MA from the University at Buffalo, and a BS from Queens College, CUNY.

Leonard Mills, Ph.D. Senior Quantitative Analyst

Leonard Mills is a Senior Quantitative Analyst at KDOA. He has over 30 years’ experience in model development and validation. Prior to joining KDOA, he worked at the Federal Reserve, Fannie Mae, and Wells Fargo, as well as consulting with a variety of financial institutions. He has a BS in Mathematics from Hampden-Sydney College and a Ph.D. in Econometrics from Tulane University.
Len has extensive experience with Credit Risk/CECL, ALM, mortgage analytics, and mortgage hedging validations, as well as stress testing framework validations, including familiarity with ADCO, MCT, QRM, and Moody’s Analytics.

Thomas Connor Chief Operating Officer, Partner

Tom Connor is partner and COO for KDOA. He has created, developed, integrated, and managed multiple risk departments, systems, and programs during his 30+ years in the financial industry. While his early career centered on Market Risk platforms and models, his primary engagements over the past 10 years have focused on trans-risk projects such as overseeing the build of an integrated capital stress program and of a large, cross-departmental credit system. Tom joined KDOA in 2019 with a focus on fostering the growth of the joint venture with RMA on its model validation consortium.