By Matt Trager, ex-Managing Director at Silicon Valley Bank and current Product Management Director for Finance Transformation at Wells Fargo
Previously published in The Digital CFO magazine (Summer 2023)
How can smaller institutions meet the regulatory challenges of tomorrow?
In the ever-changing world of financial services, one thing remains constant: the evolution of the regulatory landscape and the increasing sophistication and investment needed to meet those developing standards.
Today, regulatory reporting and data management are seen as essential strategic functions that provide vital services to the business and its management of risk. The maturity and robustness of these capabilities have become a core differentiator across the financial services landscape. Firms have invested tremendous resources and capital in technology and business transformation to build out these capabilities and meet evolving requirements. Given the costs of insufficient compliance, financial institutions are thinking through how they will deliver timely and robust regulatory frameworks.
The most meaningful question debated since the global financial crisis is how to right-size regulation to financial risk, and how to ensure regulations can adapt to address changing macro risks.
What does a great architecture provide?
The burning question on the minds of small and mid-sized financial institutions is: How can we afford to meet the same standards as the world’s largest financial institutions? Global systemic banks have made substantial investments to develop data and reporting architectures capable of meeting regulatory requirements, but these solutions come at tremendous costs. Building an architecture that meets the evolving standards involves various crucial elements, including data sourcing, quality controls, report management, and change management based on regulatory instructions. An architecture which can meet these standards must deliver the following:
- A data sourcing solution which provides/enables:
  - Sourcing and standardization of data attributes in a consistent, auditable way
  - Proactive data quality controls aligned to BCBS data principles
  - Ability to trace the flow of data from capture to use, with ready access to underlying business and/or reporting logic
  - Management and control of manual data sources and attributes
- A reporting architecture which provides/enables:
  - Report process management, easily tracking reports across their delivery lifecycle to ensure timely and accurate reporting
  - Multiple concurrent users across reports, with tight version controls to maintain data integrity and streamline collaboration
  - Intra/inter-report controls, enabling precise reconciliations, such as linking FR Y-14M/Q to FR Y-9C line items effortlessly
  - Lineage from data set to report position/column, providing a transparent trail that instills confidence in the accuracy of your reports
  - Controlled change management for adapting to changing regulator instructions and requirements while ensuring compliance
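To make the inter-report control point above concrete, the sketch below shows the shape of an automated reconciliation check between two reports. This is an illustrative example only: the line-item names, the mapping, and the balances are hypothetical, not actual FR Y-14 or FR Y-9C schedule fields.

```python
# Minimal sketch of an inter-report reconciliation control.
# All report data and line-item names here are hypothetical.

def reconcile(report_a: dict, report_b: dict, mapping: dict,
              tolerance: float = 0.01) -> list:
    """Compare mapped line items between two reports and flag breaks.

    mapping: {line_item_in_report_a: line_item_in_report_b}
    Returns a list of (item_a, item_b, difference) tuples for any
    pair whose difference exceeds the tolerance.
    """
    breaks = []
    for item_a, item_b in mapping.items():
        diff = report_a[item_a] - report_b[item_b]
        if abs(diff) > tolerance:
            breaks.append((item_a, item_b, diff))
    return breaks

# Hypothetical balances (in $ millions) from two reports
report_x = {"total_loans": 1250.0, "total_deposits": 2100.5}
report_y = {"sched_C_total": 1250.0, "sched_E_total": 2100.0}

mapping = {"total_loans": "sched_C_total",
           "total_deposits": "sched_E_total"}

print(reconcile(report_x, report_y, mapping))
```

In a production architecture, a control like this would run automatically as reports move through their delivery lifecycle, with breaks routed to the reporting team for resolution before submission.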
What are the real drivers of implementation cost?
The largest institutions not only made significant investments over multiple years of effort to deliver these solutions, but also incur ongoing operational costs to run and manage their architecture over time. Aside from technology and platform operations staff, institutions require data management and governance professionals at the enterprise, product domain, and functional domain (Credit, Risk, Finance, etc.) levels to establish and maintain compliance over time.
Underlying these significant delivery costs are a set of cost drivers which go beyond the direct expense of the software itself. Some of these drivers are:
- Volume, complexity and age of domain source system platforms requiring data ingestion and standardization.
- Complexity and customization of the regulatory reporting data model.
- Quality of source data landscape documentation (metadata) and availability of data dictionaries.
- Availability and expertise of subject matter experts (SMEs) in both regulatory reporting and source systems.
- Maturity of the existing data landscape, including automated data quality controls, agile development practices and modern ETL (Extract, Transform, Load) components.
- Potential costs associated with data remediation efforts to ensure data accuracy, completeness, and consistency across multiple systems and reports.
- Costs associated with regulatory and data platform development/configuration, including the involvement of consultants and vendor subject matter experts.
What do loyalty programs and Regulatory Reporting have in common?
Both large institutions that have grown through acquisition and smaller institutions that have yet to make the necessary investments to scale their environment are vulnerable to these cost drivers. The cost of data and reporting platform configuration is by far one of the largest drivers of delivery cost: the current suite of platforms requires a tremendous amount of effort to configure for the breadth of requirements. It is this challenge where I believe innovation is both needed and, with the right ecosystem, coming within our reach.
In our daily business and personal lives, we have become accustomed to the concept of tiered service without giving it much thought. Whether it’s airlines, hotels, or even cloud hosting, we expect different levels of service that align with our specific needs. In the realm of application support, we understand that real-time transaction processing applications require a different level of technical support and disaster recovery measures compared to HR systems. So why don’t we apply the same mindset when it comes to building and delivering mission-critical data and regulatory applications?
The explosion in large language models is the appetizer for a meal we can't even dream of yet. This wave of change, coupled with the maturation of cloud technology, SaaS, and natural language processing, is creating an opportunity to develop a radically new approach to how software can be configured and delivered. Let's take a moment to envision a bank with $50 billion in assets. This bank likely operates with a modest Regulatory Reporting team of 3-5 individuals, and its annual budget may reach at most $1-2 million to support funded finance initiatives. How is this organization going to fund the transformation programs necessary to upgrade its landscape to meet the new requirements? Moreover, in light of the volatility of the macro environment and the unprecedented behaviors exhibited by the markets, how can this bank ensure command and control over its data to effectively address its most critical regulatory questions?
I firmly believe the solution lies in integrating cutting-edge technologies into a tier-based service model. This model will revolutionize the way regulatory reporting is approached. Picture a plug-and-play, template-based, self-service platform that is easily accessible to small internal teams. Gone are the days of relying on costly consultants and vendor solutions tailored to the needs of larger, more complex institutions. This model will readily adapt to the requirements of retail and community banking, simplifying their operations. What's more, it has the flexibility to scale and cater to the needs of global systemically important institutions. These institutions, with their complex balance sheets, diverse product offerings, and extensive client base, will benefit from a heightened level of customer service, exceptional design, and comprehensive configuration and implementation support.
Irrespective of the future landscape, one thing remains certain: the demand for information from regulators and other supervisory stakeholders, necessary for agile supervision in our ever-changing world, will continue to escalate.
What is needed to realize the vision?
With more than 4,000 retail/commercial banks in the US, of which only the top 35 have more than $100B in assets, the potential for this tiered service model is enormous. Furthermore, to preserve the diversity and vibrancy of regional and community banking, these institutions need the best tools to tackle the challenges ahead. However, realizing this vision requires a few critical enablers:
- A vibrant Financial and Regulatory Technology landscape incubating and driving disrupters and innovations (must have)
- Rationalization of reporting requirements across US regulators to ensure minimal duplication of data across reports and schedules (must have)
- Agreement on a financial and prudential data point model defining the data requirements necessary to meet this rationalized set of reporting requirements (must have)
- Transition from reports to data sets as the foundations for regulatory supervision (nice to have)
As that demand for information escalates, it becomes imperative for financial institutions of all sizes to find avenues through which these strategic capabilities can be seamlessly integrated, eliminating cost barriers to implementation. The current Financial and Regulatory Technology space is home to more than 20 established and startup vendors, each contributing to the innovation ecosystem. With their expertise and the support of this dynamic ecosystem, I firmly believe the next evolution in this realm is imminent, poised to revolutionize the way we navigate regulatory changes and challenges.
Author’s note: Opinions are my own and do not necessarily reflect the opinions of Silicon Valley Bank or First Citizens Bank.