DEFENSESTORM BLOG

What Bank Examiners Are Prioritizing in 2026: Cyber Edition

Monday, May 11th, 2026



The banks walking out of their 2026 exams with the shortest exit meetings usually have one thing in common: a risk management program that can be defended in the room. The focus of the exam has become more about risk and governance than technical controls, and the examiners are more capable of putting risk management into technical context than they were two years ago. 

Four things worth knowing before your next exam: 

  • Examiners are catching up. Federal Reserve training investments will bring cyber exam rigor in line with what BSA/AML and lending teams already face. Expect better questions, not easier ones. 
  • Third-party and fourth-party risk is a governance question, not a diligence question. Examiners are asking whether the risks are being managed and governed, not whether due diligence files are in order. AI governance lives inside that same question. 
  • Framework adoption is an exam topic now. With the FFIEC CAT sunset in August 2025, examiners will expect to see where you are in adopting a replacement framework and how it’s being used to shape your program. 
  • Board accountability is evidenced, not asserted. Committee minutes, board minutes, and packages are the record examiners will read. They should reflect quality data that supports meaningful decision making and effective challenge. 

The Supervisory Posture Is Sharpening, Not Easing 

Start with what the agencies are signaling. The most recent OCC Semiannual Risk Perspective is pointing examiners at technology architecture and end-of-life assets, and the question isn’t whether a strategy exists. It’s whether that strategy is appropriate for the size and complexity of the products, services, and operations it supports. Be prepared to show your work: how the risk was assessed, how the board’s risk appetite framed the decision, and the documentation supporting the risk-based strategy. On top of that, effective January 2026, the OCC moved away from policy-mandated examination activities for community banks to fully risk-based, tailored exams. When examiners have that kind of discretion, they go where the risk is. Right now, that’s cyber. 

The Federal Reserve is sending a parallel signal. Vice Chair for Supervision Bowman announced a comprehensive MRA review and described the direction as fewer, more substantive supervisory actions. To some, that may seem like softening. It isn’t. That’s risk-based supervision. Bowman also confirmed that cyber “remains a top priority.” 

Behind that sits a commitment that will change what the exam actually feels like in the room. Driven by a May 2025 OIG report on community bank cyber exams, the Fed will publish updated IT examination guidance by September 30, 2026 and will have formalized IT and cybersecurity training for generalist examiners by December 31, 2026. That’s significant. Historically, examiners weren’t always equipped to dig deep into cyber and IT; the bank’s bar was higher than the examiner’s. With this training initiative, the examiner’s bar will rise to something commensurate with what BSA/AML and lending teams already face: experts asking educated questions that expose program deficiencies and come with exam recommendations. 

Taken together, the picture is clear. A risk-based exam, conducted by better-trained examiners, with a cyber focus. 

Third-Party and Fourth-Party Risk: Not A Check-the-Box 

The vendor program is the first place that sharper focus lands, and it’s been there for a while. The Interagency Third-Party Risk Management Guide for Community Banks is the scaffolding examiners are using, and their focus won’t be on whether due diligence was performed. Expect questions targeted at whether the risk is fully understood and adequately managed. Do you understand the risk profile of each vendor? Are the controls applied commensurate with the risk? How is that assessed, monitored, and reported? What governance processes are in place? That’s a different conversation than walking through a due diligence checklist for a completion grade. 

That conversation gets harder the further out it goes, and fourth-party risk is where a lot of the AI exposure lives. Many community banks aren’t using AI directly at scale yet. They’re using AI because the vendors they rely on have embedded it in fraud models, core platforms, and communication tools. The Cyber Risk Institute’s Financial Services AI Risk Management Framework (FS AI RMF), released in February 2026, operationalizes the NIST AI Risk Management Framework and provides a structured approach for FIs to think through governance of both direct and embedded AI. Now that a banking-specific framework exists, examiner expectations around AI governance will evolve. They’ll expect banks to know about it, to understand what adoption stage they’re currently in, and to have a plan for building a program that aligns with its standards. 

Lastly, there’s vendor consolidation and concentration risk. A common theme among financial institutions has been to consolidate vendors, and that’s a reasonable strategy if the institution can show how it has thought through the concentration implications of pulling multiple services into fewer providers. The answer to that question lives in the risk framework, the vendor management program, and board reporting. Make sure concentration risk with vendors has been considered and communicated to governing bodies, including how it ties into resilience. That consideration should be well documented, measured, and backed by mitigating controls that are provable. The same applies to fourth-party risk, especially AI. The vendors embedding AI may be documented and inventoried, but do you know what the underlying model is? Is there concentration within one model powering a significant number of critical systems? What happens if that model goes down? Be prepared to answer fourth-party risk questions related to AI. 

Choosing a Framework in a Post-CAT World 

The FFIEC sunset the Cybersecurity Assessment Tool in August 2025, which means every institution heading into a 2026 exam needs to have landed on a replacement. The two most widely adopted are NIST CSF 2.0 and the Cyber Risk Institute’s CRI Profile. A 2025 industry survey of more than 420 financial institutions found 73% chose NIST CSF 2.0. Personally, that surprises me. The CRI Profile was built on NIST specifically for financial institutions. It’s NIST with banking context baked in. As a bank operating under specific regulatory expectations, CRI seems like the more natural fit. 

What’s worth paying attention to in NIST CSF 2.0 is the addition of a sixth core function: Govern. It covers board-level oversight, risk appetite documentation, and supply chain risk management. For banks, none of that is new. Governance was a regulatory expectation in banking long before NIST added it to the framework. For those who prefer NIST’s broader applicability, though, the addition aligns it more closely with banking than its predecessor did. The CRI Profile maps to NIST CSF 2.0 and layers on banking-specific regulatory context. It also adds a function of its own, Extend, covering third-party risk management, something NIST doesn’t address at the function level. The CRI’s newer FS AI RMF applies that same banking-specific governance lens to artificial intelligence, and can be used in tandem with other frameworks. 

The exam question won’t be which framework you picked. It’ll be whether you can show the framework is being used to shape your program. Expect questions around how you’ve self-assessed against it, where your deficiencies are, and what short-term and long-term plans are in place to mature as an organization. 
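That self-assessment expectation can be tracked with something as simple as current-versus-target scoring per function. A minimal sketch in Python, assuming a hypothetical 1–4 maturity scale and illustrative scores against the NIST CSF 2.0 core functions:

```python
# Hypothetical self-assessment: current vs. target maturity (1-4 scale,
# illustrative) for each NIST CSF 2.0 core function.
assessment = {
    "Govern":   {"current": 2, "target": 3},
    "Identify": {"current": 3, "target": 3},
    "Protect":  {"current": 3, "target": 4},
    "Detect":   {"current": 2, "target": 3},
    "Respond":  {"current": 2, "target": 3},
    "Recover":  {"current": 1, "target": 3},
}

def gaps(scores):
    """Return the functions below target maturity, largest gap first."""
    below = {f: s["target"] - s["current"]
             for f, s in scores.items() if s["current"] < s["target"]}
    return sorted(below.items(), key=lambda kv: -kv[1])

for function, gap in gaps(assessment):
    print(f"{function}: {gap} level(s) below target")
```

The ranked output is the raw material for the short-term and long-term maturity plans examiners will ask about; the deepest gap is the obvious place the roadmap should start.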

The Metrics Gap at the Board Level 

Examiners don’t typically sit with board members during an exam. They read the minutes and review the board packages. They review the cyber risk reports the board has been receiving, and they look at the minutes to get a sense of whether the board has been asking substantive questions in return. Their goal is to ascertain whether the board understands at a level that enables proper governance. The minutes will show whether there’s effective challenge in the room. They’ll show whether the board understands enough to ask the right questions. They’ll show whether the program is being appropriately governed at the top.  

Meaningful information is often the gap in cyber risk reporting. The board doesn’t need a list of unpatched vulnerabilities or raw log data. The board needs metrics tied to the institution’s risk appetite, presented in a format that supports trend analysis and informed decisions. Many key metrics at community banks are standardized: capital ratios, delinquency rates, commercial credit concentration limits. Any board member can contextualize them. Cyber risk reporting is still largely incident counts, vulnerability details, and color-coded status slides without well-defined, repeatable, meaningful metrics behind them. That’s the gap examiners are probing. They’re looking for defined cyber metrics tied to risk appetite, a monitoring cadence that supports trend analysis, and a reporting structure the board can understand and question. Be ready to answer for the quality of board-level engagement and to show you have been providing meaningful information that supports quality oversight. 
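One way to make a cyber metric read like the standardized metrics boards already know is to report it against an appetite threshold with a trend. A minimal sketch in Python; the metric, the threshold, and the numbers are all illustrative assumptions, not a prescribed reporting standard:

```python
# Hypothetical board metric series: monthly "mean time to patch critical
# vulnerabilities" in days, compared against a board-approved appetite
# threshold (all names and numbers are illustrative).
APPETITE_DAYS = 14            # board-approved maximum, illustrative
history = [18, 16, 15, 13, 12]  # last five months, most recent last

def board_metric_row(series, threshold):
    """Summarize a metric the way a board package might: current value,
    appetite status, and trend versus the prior period."""
    current, prior = series[-1], series[-2]
    return {
        "current": current,
        "within_appetite": current <= threshold,
        "trend": "improving" if current < prior
                 else "worsening" if current > prior else "flat",
    }

row = board_metric_row(history, APPETITE_DAYS)
print(row)  # prints {'current': 12, 'within_appetite': True, 'trend': 'improving'}
```

A row like that gives a director the same footing a capital ratio does: a number, a limit, and a direction, which is what supports the effective challenge examiners are reading the minutes for.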

A lot of this comes down to one shift. Cyber risk is being evaluated as a risk management domain, and examiners are better equipped to evaluate it that way. Programs with many controls but weak governance are facing longer, more tedious exams than programs where the risk governance framework, by design, proves the controls are the right ones and that they’re effective. 

DefenseStorm’s Governance and Monitoring capability automates evidence collection and maps controls to the frameworks examiners reference, including CRI Profile v2.1, NIST CSF 2.0, FedLine Security Controls, and exam procedures. If it would help to see what that looks like for an institution your size, the DefenseStorm team is happy to walk through it. 

The checklist behind this post. Everything above translates into specific actions your team can take before the next exam. We’ve distilled them into a structured checklist covering governance documentation, third-party and fourth-party risk, framework self-assessment, and board reporting quality.
[Get the Checklist →]
Want to see what automated evidence collection and framework-mapped reporting look like for an institution your size? [Talk to the DefenseStorm team →] 

Jessica Caballero

Vice President, Banking Strategy

Jessica Caballero, CERP, CRCM, serves as the Vice President of Banking Strategy at DefenseStorm. She is dedicated to developing solutions specifically designed to meet the unique needs of banks and credit unions, enhancing their risk management capabilities while improving efficiencies for community financial institutions. Jessica oversees the alignment of the GRID Active platform strategy with our customers’ needs and regulatory expectations.

Jessica began her career as an examiner with the Office of the Comptroller of the Currency (OCC). Her extensive experience spans various roles including examiner, banker, and consultant. Since 2014, she has focused on banking technology innovation, initially at Abrigo (formerly Banker’s Toolbox). Before joining DefenseStorm, Jessica was a subject matter expert in BSA/AML and enterprise risk management. She now leverages her expertise in risk management to assist information security and cybersecurity teams within community financial institutions.