RegTech Forum 2023 - Highlights

This week, we proudly hosted our annual RegTech Forum at the historic Old Library of Guildhall, in collaboration with the City of London Corporation. The forum presented invaluable insights from a distinguished panel of experts representing major banks and building societies, as well as figures from the technology and regulatory sectors. The conference explored various topics, highlighting recent shifts in finance and banking, including the rollout of Basel 3.1, RegTech innovations, proportionality rules, and the emphasis on data standardisation. Alderman William Russell and William Coen, former Secretary General of the BCBS, each delivered a compelling keynote. Additionally, two focused roundtables were conducted, addressing the nuances of open data and the resource challenges faced by smaller banks.

Alderman William Russell of the City of London Corporation delivered a keynote in which he underscored the UK's prominence in the fintech sector, which contributes £6.6 billion to the economy and saw a 217% surge in investment in 2021. London, home to one-fifth of Europe's fintech unicorns, stands as a testament to this growth. The City Corporation is committed to bolstering London's position as a global fintech and RegTech hub. Russell emphasized that RegTech, often seen as a fintech subset, has evolved into a vital sector in its own right. It offers solutions to regulatory challenges, ensuring cost-effective compliance. The success of firms like Suade exemplifies RegTech's potential as a strategic enabler for the UK's financial sector. Russell concluded by stressing the importance of collaboration and innovation to stay ahead in this dynamic landscape.

 

The inaugural session, titled "The Evolution of RegTech", then delved into the current trends in RegTech adoption, the hurdles faced by regulators and financial institutions, and the prospective influence of RegTech on reshaping compliance. Several panellists concurred that RegTech is now essential in enhancing the competitiveness of the UK's financial sector. Given the growing intricacy of regulations, coupled with emerging requirements like ESG, technology has become crucial in maintaining efficiency and adherence, tackling both old and new challenges. The potential of technology in mitigating crises was underscored, emphasizing its efficacy in data analysis. 

A few years ago, the significance of RegTech was relatively subdued, with some major regulators still unfamiliar with concepts like blockchain. However, the landscape has transformed. Regulatory bodies now universally incorporate fintech teams, a testament to the UK's progressive stance as the world's second-largest RegTech market. While the sector has matured, leading to fewer emerging startups due to investment challenges, there remains untapped potential in technology adoption. The emphasis now is on demonstrating the tangible benefits of technology in enhancing compliance, accuracy, and efficiency. Financial crime emerged as a promising domain to gauge RegTech's impact.  

The sector, however, isn't without its challenges, such as KYC regulations, cross-jurisdictional compliance, and integration issues. An industry expert underscored the importance of aligning compliance with genuine customer benefits. The ubiquitous presence of AI was acknowledged, highlighting a knowledge gap that often translates into apprehension. There's a pressing need for upskilling professionals to bridge this gap and for regulators to encourage the phasing out of outdated systems. Potential solutions include incentivizing tech adoption through tax benefits and introducing certification systems for RegTech. 

From a regulatory perspective, while the EU AI Act poses challenges, especially for companies in high-risk categories, the UK's flexible approach is commendable. A panellist suggested introducing standards that are prescriptive but not obligatory. The call for agile regulation, complemented by feedback mechanisms, was loud and clear. In the post-COVID era, there's renewed vigour for bold initiatives, emphasizing the importance of collaborative efforts between the public and private sectors. The overarching sentiment was the need for patience in the face of change, recognizing the pivotal role of visionary individuals driving innovation. The journey of transformation demands time, energy, and conviction, but it's a journey worth embarking upon, for change is the only true constant.

 

We then engaged in a thought-provoking session on "The State of Global Data Standardisation." This discussion explored the crucial aspects of data standards, current worldwide initiatives, and the collective drive towards a harmonized future. The financial sector has historically mishandled data, with the repercussions of such negligence becoming evident in the 2008 crisis. Minor oversights can escalate rapidly in today's fast-paced environment. The events that led to the 2008 crisis weren’t visible in the stress tests because the available data was aggregated, obscuring the underlying triggers. For accurate modelling, granular, standardized data is essential, ensuring everyone operates from a unified understanding. If we don't manage our data correctly, we can't harness technology effectively. The hope is to establish a standard proactively, rather than in response to another financial crisis. 
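The point about aggregation obscuring risk can be made concrete with a toy sketch (all figures below are made up for illustration): an aggregate exposure total can look healthy while granular data reveals a dangerous concentration.

```python
# Hypothetical loan book: the aggregate total masks a concentration
# in a single weak counterparty that granular data exposes immediately.
loans = [
    {"counterparty": "A", "exposure": 10, "rating": "AAA"},
    {"counterparty": "B", "exposure": 10, "rating": "AAA"},
    {"counterparty": "C", "exposure": 80, "rating": "CCC"},
]

# Aggregated view: a single number, telling us nothing about quality.
total = sum(loan["exposure"] for loan in loans)
print(total)  # 100

# Granular view: share of exposure held against sub-investment-grade names.
risky = sum(loan["exposure"] for loan in loans if loan["rating"] == "CCC")
print(risky / total)  # 0.8 -> 80% of the book sits with one weak counterparty
```

A stress test run on the aggregate alone would treat this book like any other £100 portfolio; only the granular, standardised records reveal the trigger.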

While an imperfect standard is better than none, it's imperative that there's a single standard. Given the interconnectedness of the financial system, regulators must adopt a global perspective rather than persisting with isolated national approaches. Consider the universal standardization of electrical sockets; we don't give it a second thought now. If financial data could be seamlessly integrated into regulatory systems in a similar manner, it would be a game-changer. Encouragingly, collaborative efforts are yielding tangible results. While much has been achieved, there's still a long road ahead. Enhanced cooperation across jurisdictions is essential to prevent conflicting regulatory regimes. Financial institutions, with their multifaceted nature, further complicate the landscape. The UK is pioneering data standardisation, and initiatives like the Bank of England's Transforming Data Collection Programme - where industry and regulatory representatives collaborate - are commendable. However, the collaborative nature can sometimes slow progress, and the outcomes are often recommendations rather than binding regulations. 

The momentum for data standardisation is building, driven by its potential for widespread benefit. Demonstrating the advantages through case studies is crucial. Institutions seek clarity from regulators about where to allocate resources. Even minor costs can be burdensome for smaller institutions, emphasizing their desire for simplicity and certainty. The public sector's apprehension about being overly directive needs to be addressed. With the increasing demand for real-time data accuracy, there's a pressing need to leverage technology for enhancing data quality, fostering innovation, and promoting competition. Active participation and engagement in these projects are essential for all stakeholders. 

 

The third panel discussion was on "The Finalisation of Basel III", a burning issue in the regulatory landscape. This session delved into the implications of the new rules and the role of technology in facilitating their implementation. The Basel 3.1 regulations primarily target internationally active banks, mandating them to uphold substantial capital reserves and rigorous risk management protocols. The criteria for their application hinge on a bank's size, international reach, and intricacy, all aimed at fortifying global financial stability and mitigating systemic risks.  

A key point of discussion was the "output floor" from the Basel III Endgame reforms. A former regulator elucidated that this "output floor" stipulates that risk-weighted assets (RWAs) from a bank's internal models shouldn't fall below 72.5% of RWAs ascertained using the standardized method. This provision, a compromise among regulators, was designed to limit banks from overly reducing their capital requirements via optimistic internal models. This threshold has garnered attention, as it could elevate capital requirements for numerous banks, ensuring they sustain adequate capital reserves and fostering uniformity in the banking domain. 
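The mechanics of the 72.5% output floor described above can be sketched in a few lines (the bank figures are hypothetical, chosen only to show when the floor binds):

```python
def floored_rwa(internal_model_rwa: float, standardised_rwa: float,
                floor: float = 0.725) -> float:
    """Apply the Basel III output floor: RWAs used for capital purposes
    may not fall below 72.5% of RWAs under the standardised approach."""
    return max(internal_model_rwa, floor * standardised_rwa)

# Hypothetical bank: internal models yield £60bn of RWAs, while the
# standardised approach would yield £100bn.
print(floored_rwa(60.0, 100.0))  # 72.5 -> the floor binds, adding £12.5bn

# If internal models already sit above the floor, they are used unchanged.
print(floored_rwa(80.0, 100.0))  # 80.0
```

In the first case the floor raises effective RWAs, and with them the bank's capital requirement, exactly the constraint on optimistic internal models the panel described.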

Another focal point was the significance of credit risk models in gauging and managing lending-associated risks. While these models are pivotal in setting capital requirements, the U.S. has notably excluded their use specifically for regulatory capital determinations. This stems from historical apprehensions and regulatory viewpoints, underscoring the preference for standardized methods. However, a panellist emphasized that U.S. banks aren't exempt from using these models; they remain integral for internal risk assessments and stress tests, crucial for assessing a bank's robustness against economic downturns. 

With such transformative regulatory shifts, banks confront numerous challenges, particularly concerning data and operations. An industry expert stressed the importance of data consistency, cautioning against siloed operations. The need for comprehensive data traceability is evident, ensuring data can be traced back to its source. As regulations evolve, banks must recalibrate their models and strategize for smooth transitions. Financial constraints, especially IT-related costs, can be significant impediments. Operationally, the emphasis should shift from data cleansing to data analysis, which requires data to be accurate from the outset.

The PRA's Banking Data Review initiative seeks to refine banks' data collection methods, aiding their adjustment to regulatory changes like Basel 3.1. Given the extensive regulatory reporting and overlapping data requisitions, it's vital to reevaluate the daily data needs of regulators. Streamlining reporting and standardizing processes can bolster data checks. Data's timeliness, accuracy, and precision are crucial competitive factors. A standout feature of the BDR is granularity, underscoring the importance of swift decision-making based on dependable data. For the implementation of new regulations, it's essential to comprehend the regulation, utilize consultation periods, and gauge the magnitude of the change. This involves assessing IT system overhauls and prioritizing certain activities based on dependencies and parallel execution possibilities, such as data processes and control frameworks. 

 

Continuing that discussion, William Coen shared insights from his tenure with the Basel Committee, tracing his journey from joining in 1999 during the Basel II era to his pivotal roles as Deputy Secretary General in 2007 and later as Secretary General in 2014. In the latter role he oversaw the finalization of Basel III, addressing intricate elements of the capital framework in 2017 and the FRTB in 2019. He emphasized that the Basel Committee's scope extends beyond just capital, encompassing areas like BCBS 239/RDAR, Corporate Governance, and Core Principles for Effective Banking Supervision.

He shared an anecdote about the South African Reserve Bank's data transformation journey, underscoring the importance of timelines and deliverables. Drawing from his experiences, he emphasized the significance of regulatory sandboxes and the need for collaboration to ensure success.

Concluding his address, Coen spoke about the inevitability of change. He stressed the importance of managing change effectively through planning, setting milestones, and ensuring deliverables. Touching upon innovation, he cited SVB as an example of technology's transformative impact on the financial sector. He warned that many banks, having delayed tech adoption, now find themselves lagging in a competitive global economy. For banks to thrive, agility is paramount. 

 

Then, the panel discussion on "Strong and Simple - The Impact on Proportionality Rules" delved into the pressing need for tailored regulatory measures in the financial sector, especially for small, non-systemic domestic banks and building societies. The primary challenge identified was the complexity arising from a one-size-fits-all approach. Such an approach, while designed to ensure uniformity, often places undue burdens on smaller institutions, making it difficult for them to navigate the regulatory maze. The panellists emphasized the goal of establishing a regulatory framework that is both robust in its oversight and straightforward in its application. This would ideally cater to domestic financial institutions, particularly those with assets of £20bn or lower, ensuring that they can operate within a regulatory environment that recognizes their unique challenges and capacities. 

Historical context also played a significant role in the panel's deliberations. The panellists reflected on the regulatory changes that have taken place over the past decade and a half, especially in the wake of the global financial crisis of 2007-09. These changes were primarily aimed at preventing future crises and minimizing their impact on the global economy. However, the panellists noted that the impact of a financial institution's failure varies based on its size, as evidenced by cases like Northern Rock and HBOS. 

The discussion further delved into the potential risks and benefits of streamlining regulations. While simplifying rules can make them more accessible and understandable, there's also a risk of creating loopholes that can be exploited. The panellists also touched upon the importance of collaboration and engagement between regulatory bodies and industry stakeholders. They argued for a more nuanced form of engagement, where both parties, despite their differing perspectives and objectives, come together to shape regulations that are both effective and practical. 

One of the standout points of the discussion was the pursuit of smarter, not necessarily more, regulation. The panellists advocated for regulations that are not just stringent but are also relevant, easily comprehensible, and adaptable to the changing dynamics of the financial world. They emphasized the need for regulations that can be easily internalized and implemented by financial institutions, regardless of their size. 

The discussion rounded off with insights on potential system abuses and the critical role of data in regulatory decision-making. The panellists underscored the importance of vigilance in ensuring that institutions don't exploit regulatory grey areas. They also highlighted the pivotal role of data, with one panellist noting, "The bank looks at data," emphasizing its centrality in shaping informed regulatory decisions. 

 

The second-to-last panel delved into the topic of "Recent Banking Failures," shedding light on pivotal aspects such as the influence of feedback mechanisms, the ripple effect of social media on contagion risks, the core of regulatory norms, ethical governance, and the metamorphosis of risk management strategies. A representative from the banking sector characterized the SVB incident as a traditional bank run, albeit amplified exponentially by social media platforms, notably Twitter. The rapid dissemination of information via social media, often likened to the "wild west," necessitates regulatory intervention. The emphasis was on fostering a transparent rapport with regulators, as misinformation serves no one's interest. While the rapidity of reactions to information isn't novel, the anonymity offered by social media platforms introduces an element of chaos. Regulators, despite their influence, aren't omnipotent. The onus is on institutions to devise strategies to manage crises, avert panic, and champion transparency. Rebuilding eroded trust is a formidable challenge. 

 

Another panellist highlighted the lag between evolving market risks and regulatory adaptations. In SVB's context, the oversight wasn't in gauging market fluctuations but in neglecting the illiquid risk amassing on their balance sheets. The bank's aggressive risk-taking in pursuit of profits was evident. The discourse then shifted to future protective measures. While bolstering capital reserves enhances systemic resilience, lapses in both internal governance and external oversight are equally culpable. The need for transparency was underscored, especially in sharing internal projections with regulators. Proactive identification of vulnerabilities, albeit costly, is preferable to institutional failure. Auditing mechanisms, like section 166 reviews in the UK, offer insights into rectifying reporting inaccuracies and enhancing regulatory oversight. Such tools guide banks on improvement trajectories. The emphasis was on proactive governance reviews and the indispensability of investments in this domain.

 

The panellists also stressed the importance of clarity, both internally and in interactions with regulators. Regulatory reporting's precision and transparency hinge on well-defined roles, a controlled environment, and structured reporting management. The overarching sentiment was the urgency to develop and implement a harmonized framework across different jurisdictions and to dismantle internal siloes. Collaboration emerged as a recurrent theme, underscoring its criticality. 

 

The banking landscape is currently undergoing a seismic shift due to a plethora of regulatory changes. The panellists identified the introduction of global standards, such as standardized definitions, as an immediate priority. Surprisingly, even a decade and a half after the financial crisis, such standardization remains elusive. With the volume of reporting on an upward trajectory, banks must harness technology and scalable solutions. Standardization across the industry can significantly aid this endeavour. The quality of data is paramount; inaccuracies can have catastrophic repercussions. Collaborative efforts across institutions can distribute the cost of change. Investments in technology and data are non-negotiable for banks, especially given the substantial initial costs of establishing utilities. Regulatory encouragement can catalyse such investments. The panel concluded with a call for ingraining accountability into the very DNA of financial institutions and upholding governance principles.

 

The concluding discussion delved deep into the theme of "How to Power Through Finance Transformation?" In an era marked by swift technological advancements, evolving regulatory landscapes, and dynamic market shifts, the imperative for finance transformation has never been more pronounced. The panellists underscored the significance of this transformation as a linchpin for ensuring organizational resilience in these turbulent times. 

Central to the discourse was the idea of re-envisioning finance processes, harnessing the potential of digital innovations, and tapping into the power of data analytics. The trajectory of this transformation, as highlighted by the panellists, is intrinsically linked to an organization's risk appetite. This appetite delineates whether institutions position themselves as risk takers or makers in the transformative journey. A thought-provoking projection was shared: in the near future, automation could potentially subsume 80% of jobs, rendering the banking landscape virtually unrecognizable. 

Machine Learning (ML) and Artificial Intelligence (AI) were spotlighted for their undeniable advantages. Observing successful implementations of these technologies in other firms can serve as a catalyst for others to take the transformative leap. However, costs remain a formidable barrier for many institutions. 

To navigate the complexities of internal change, the panellists emphasized the indispensability of robust governance and effective communication. Setting realistic expectations, ensuring clarity on timelines, and fostering a comprehensive understanding of the transformational roadmap are paramount. One panellist astutely remarked on the inherent risk of failure in technological endeavours. Yet, this risk isn't necessarily detrimental. Failures can offer invaluable feedback, paving the way for product enhancements. 

The ripple effects of Facebook's Libra were discussed, highlighting its role as a catalyst for innovation in both the public and private sectors. On the regulatory front, initiatives like MiCA exemplify how innovation is steering regulatory transformations. However, a recurring sentiment was that regulation often lags behind the curve of innovation. The dual role of regulators was emphasized: not just as framers and enforcers of rules but also as entities that incentivize firms towards innovation.

The UK's supportive ecosystem for technological advancements was lauded. The existence of regulatory sandboxes allows firms to experiment, derive insights, and disseminate their findings. In this transformative era, the ability to swiftly acquire knowledge on novel subjects and upskill is crucial. A robust data strategy was underscored as the bedrock for firms. While AI promises transformative potential, its efficacy is contingent on organized and streamlined data. 

The term "transformation" might evoke notions of vast, intimidating change. However, the panellists advocated for a more nuanced perspective. Transformation is a continuous journey, and when broken down into its constituent parts, it becomes intuitive and manageable. The concluding note was one of encouragement: institutions should be primed for experimentation and potential failures. Embracing change, rather than fearing it, is the way forward. 

 

Furthermore, we organized two intimate roundtables, providing a platform for select attendees to engage in focused discussions. The first one centred on the unique challenges faced by small banks and building societies as they navigate the complexities of regulatory transformation projects, especially given their resource constraints. 

One of the primary concerns raised was the issue of proportionality. Small banks often rely on consultants who, at times, might push them towards standards that might not be necessary for their scale. There's a pressing need to educate regulators about the true essence of proportionality to ensure that these institutions aren't held to unrealistic standards. 

Another significant challenge is the balancing act between regulatory reporting and creating value. While compliance is crucial, there's a growing realization that merely focusing on returns isn't enough. A deeper understanding of both the business and the data is essential. Misunderstandings or misinterpretations of data can lead to significant issues. However, smaller institutions often have an edge over larger banks in this aspect, as they tend to have a more intimate understanding of their data due to less departmental segregation. 

Yet, the resource constraints in smaller banks often mean that there's a rush to complete tasks, leading to a superficial understanding of data. This hurried approach can sometimes compromise the depth and quality of insights derived from the data. 

Discussing the financial implications of regulatory reporting, attendees noted that smaller banks often grapple with the challenge of finding the right consultancy. Many have found that advice from consultants can often be off the mark. Thus, there's a growing emphasis on fostering a direct and clear understanding with regulators. This clarity, they believe, will lead to better compliance outcomes. 

Expertise, especially in data knowledge and seamless integration, is another area where small banks often find themselves at a disadvantage compared to their larger counterparts. While big banks might have dedicated roles like Chief Data Officers, smaller institutions often lack such specialized roles. The challenge then is to build systems that are efficient without overwhelming the organization. 

When discussing the importance of regulatory reporting within organizations, attendees stressed that it's not just about compliance; it's about understanding the risks of non-compliance. The "Dear CEO" letter, for instance, served as a wake-up call for many, including Sainsbury's Bank. Another key takeaway was the value of proportionality in technology deployment. Instead of scattering resources across various tools, the focus should be on versatile technologies that can handle multiple tasks.

From a vendor's perspective, the question arose: how can they support these banks better? One suggestion was to take on the testing component and provide valuable feedback, thus easing some of the resource burdens on these smaller institutions. 

 

Finally, we facilitated another roundtable titled "Leveraging Open Source Data Standards in Financial Regulation." The discussion underscored the importance of accessibility in standards, emphasizing that they should cater to a broad spectrum of stakeholders. By adopting an open-source approach, standards become freely available, eliminating financial barriers and promoting extensive adoption. Such an inclusive approach nurtures collaboration, knowledge exchange, and ongoing refinement, culminating in a more comprehensive and resilient standard. 

The myriad benefits of open-source communities to standardization endeavours were explored. Open-sourcing projects not only ensures greater accountability through external scrutiny but also enhances quality via meticulous error rectification. Moreover, it paves the way for collaboration from a wider community, invaluable for intricate tasks like categorizing the entire financial sector. Yet, it's imperative to juxtapose the merits of open-source collaboration with regulatory compliance. By cultivating partnerships, ensuring project team consistency, and judiciously leveraging open-source prospects, standardization projects can craft and implement data standards that resonate with the entire industry, fostering interoperability and innovation. 

Addressing resource constraints, the open-source paradigm emerges as a potent solution. By harnessing open-source tools, frameworks, and collaborative ecosystems, financial entities and regulatory bodies can access a vast reservoir of expertise, curtail expenses, and accelerate the evolution and acceptance of data standardization methodologies. This synergy will inevitably bolster the successful rollout of data standardization initiatives, heralding enhanced interoperability and data precision in the financial domain. 

Data mavens and banking professionals accentuated the pivotal role of collaboration and open-source tenets in the success trajectory of a data standard. The open-source ethos ensures expansive participation and uniformity, while machine-readable directives amplify accessibility and automation. 

The choice of hosting platform for the standard warrants meticulous deliberation. Although GitHub is a favored option, it's crucial to recognize its status as a private entity with distinct goals. Potential organizational shifts or alterations in domains/reference links necessitate a backup plan. Contemplating a dedicated project website offers more autonomy and ensures neutrality. Alternatively, affiliating with a fundamentally neutral entity, like the W3C, can bolster the standard's credibility. Such strategic decisions ensure the standard's enduring accessibility, reliability, and relevance.