As a part of the supervisory process conducted at these institutions, examiners seek to validate the filed balance in a given report’s line item by tracing its data back to discrete transactions, such as an individual trade or a customer account. For example, examiners may review loan documents to validate that data has been slotted into the correct category based on the loan’s stated purpose, or inspect trade confirmations to verify certain trading activity. Examiners will also trace transactions through different reports to ensure consistency at both the parent and subsidiary levels. In addition, examiners will review all worksheets used to prepare the reports in scope, which requires banks to clearly document all processes, including explanations for manual adjustments, to avoid unwanted criticism.
Accordingly, preparation for these reviews is a substantial undertaking, as the data requests require end-to-end mapping that often results in tens of thousands of pages of documentation that must be prepared in advance of the examination start date. For most institutions, these reports require significant coordination between the lines of business, which are the data users and providers, and the regulatory reporting filers. This coordination is critically important and an ongoing area of concern, as the report filers do not necessarily have a clear line of sight into the source data, and the data providers often do not fully understand the reporting parameters and definitions. This frequently results in misinterpretations of what exactly needs to be provided to the regulators for a given line item.
In conducting exams, the Federal Reserve Board (the Fed Board) and the respective Federal Reserve Banks will typically assess an institution’s policies and procedures, processes, systems, data, and governance as part of the supervisory review of accuracy. While most institutions rely on significant manual processes and resultant reconciliations in their report preparation process, the Fed and regulators more generally have become less tolerant of an over-reliance on manual solutions and “work-arounds,” especially in instances that lack sufficient oversight and documentation. In addition, materiality is often not factored into the examination process for various reports, resulting in regulatory findings for errors that may be immaterial when compared to the size of the institution’s overall balance sheet. Management must also prepare for the possibility that the regulatory reporting examination will expand to cover more traditional safety-and-soundness control issues identified in the course of the original regulatory reporting exam.
The stress testing regimes, such as the Fed Board’s Comprehensive Capital Analysis and Review (CCAR), and the resultant reports that need to be filed, create an additional layer of complexity for institutions. Specifically, the scope, volume, and granularity of data that banks are now required to submit to the regulatory authorities seem to represent a sea change in the way that financial institutions are regulated. Indeed, at the Fed Board’s Third Annual Stress Test Modeling Symposium, Governor Daniel K. Tarullo stated that, “supervisory stress testing and the associated review of capital planning processes have provided a platform for building out a regulatory framework that is more dynamic, more macro-prudential, and more data-driven than pre-crisis practice.”
Where to start when tackling the data dilemma
As enhanced data quality, reporting, and management requirements gain more prominence and their role in capital management intensifies, the industry would be wise to pay more attention to this key area. In 2009, the Basel Committee on Banking Supervision (BCBS) issued guidance designed to enhance banks’ ability to identify and manage firm-wide risks, and in 2013, the BCBS published a set of principles aimed at strengthening risk data aggregation (RDA) capabilities and internal risk reporting practices at banks, along with guidance on the principles’ implementation. Although these principles, which apply both at the group level and to all material business units and entities within the group, are initially addressed to the largest, most systemically important and globally interconnected banks, national supervisors have already signaled that they plan to apply the principles to a wider range of financial institutions in the future.
In January 2014, the Office of the Comptroller of the Currency (OCC) released a proposal setting forth new standards, based on the agency’s heightened expectations program, for large national banks and federal savings associations. In the proposed guidelines, the OCC stated that it expects the global systemically important banks (G-SIBs) it supervises to be “largely compliant” with the BCBS principles by the beginning of 2016, and that other banks under the OCC’s purview, while not expected to comply with the principles by the same deadline, should nevertheless view the principles as leading practices and make an effort to bring their practices into alignment with the principles wherever possible.
Similarly, the Financial Industry Regulatory Authority (FINRA) recently issued a “concept proposal” to develop a new, rule-based program called the Comprehensive Automated Risk Data System (CARDS). This system would impose new investor account reporting requirements on brokerage firms and allow FINRA to collect account information on a standardized, automated, and routine basis. Once implemented, FINRA envisions analyzing CARDS data before examining these firms on-site, thereby potentially identifying risks earlier and shifting work away from FINRA’s traditional on-site exam process toward off-site continual monitoring. This trend will likely continue, as regulatory authorities build their proficiency in collecting and analyzing these data and move toward a real-time, “bird’s-eye view surveillance” model.
Reporting requirements for the Securities and Exchange Commission’s (SEC) Consolidated Audit Trail are also on the horizon for these firms. Adopted by the SEC in 2012 in response to the 2010 “Flash Crash,” this system will require broker-dealers to accurately identify and report every order, cancellation, modification, and trade execution for all exchange-listed equities and equity options across all US markets in a uniform manner, thereby allowing the SEC to use these data to conduct cross-market supervision of firm trading activities.
Focus of supervision has changed dramatically
Since the introduction of the Supervisory Capital Assessment Program (SCAP) in the spring of 2009 helped to stabilize the US financial system, supervisory stress testing has become a cornerstone of the Fed Board’s approach to the regulation and supervision of the largest financial institutions. The SCAP and the subsequent CCAR also centralized the supervision of these institutions at the Fed Board, and CCAR’s incorporation of macroeconomic scenarios broadened the role of the Fed Board economists, who were previously involved solely in monetary policy, by incorporating their analysis and viewpoints into the supervisory process. As a result, the CCAR process has since been integrated into ongoing regulatory supervision, with the Fed Board signaling that CCAR will become an integral part of its year-round supervision, rather than a discrete, annual exercise.
Banks struggling to meet the data challenge
In December 2013, the BCBS published a review of banks’ progress toward implementing the principles, which included a self-assessment questionnaire completed by G-SIBs. The results of the self-assessment showed that, broadly speaking, the principles related to risk reporting practices had higher reported levels of compliance than the principles related to overarching governance and infrastructure and RDA capabilities. Nearly half of the G-SIBs reported material noncompliance with their data architecture/information-technology infrastructure’s adaptability, accuracy, and integrity, with many banks reporting that they were facing difficulties in establishing strong data aggregation governance, architecture, and processes. To compensate, banks reported that they are resorting to extensive manual work-arounds that are likely to impair their RDA and reporting capabilities.
The anomaly of the risk data reporting principles rating higher than those principles related to governance/infrastructure was cited as “difficult to interpret” by the BCBS, as these principles are considered foundational to ensuring compliance with the other principles. Similarly, the BCBS noted that a few banks rated themselves as fully compliant on the comprehensiveness principle, but rated themselves materially noncompliant on one or more of the data aggregation principles, raising concerns about the reliability and usefulness of their risk reports when the underlying data informing them and the processes to produce them have such significant shortcomings.
This self-assessed lack of progress against most of the principles is telling, and with the compliance deadline of the beginning of 2016 fast approaching, the situation is not likely to improve dramatically.