The State of Global Data Standardisation: A Deep Dive at the London RegTech Forum
Panel:
- Francis Gross, Senior Economist at the ECB's Directorate General of Statistics
- Andrew Douglas, Chairman of the Data Standards Committee of the Bank of England
- Andrew Smith, Head of Prudential Reporting at ClearBank
- Andrew Turvey, Director of Prudential Risk at Belmont Green
Moderator:
- Murat Abur, CTO and Co-founder of Suade
The Old Library at the iconic Guildhall recently hosted our annual RegTech Forum, held in partnership with the City of London Corporation. This prestigious gathering convened a select panel of experts from leading banks, building societies, the technology industry, and regulatory authorities, with the aim of illuminating recent transformations in the finance and banking sectors. A central theme of the discussion was the state of global data standardisation in regulatory reporting.
In an era where data is often referred to as the "new oil", the importance of standardising this invaluable resource cannot be overstated. The session delved into the crucial aspects of data standards, current worldwide initiatives, and the collective drive towards a harmonised future.
Historically, the financial sector has had its share of challenges with data management, and the repercussions became glaringly evident in the 2008 crisis. Stress tests in the run-up to that meltdown relied on aggregated data, which masked the underlying triggers. Accurate modelling and forecasting require granular, standardised data, which ensures that all stakeholders operate from a shared understanding. The overarching message was clear: if we don't manage our data correctly, we can't harness technology effectively. The goal is to establish a standard proactively, rather than in reaction to another financial crisis.
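To make the point about granularity concrete, here is a minimal, purely illustrative sketch; the bank names, sector labels, and exposure figures are hypothetical and not drawn from the panel. It shows how aggregate exposures can look comparable across institutions while loan-level data reveals a concentration that a sector-specific stress scenario would expose.

```python
# Illustrative sketch (hypothetical figures): how aggregation can mask risk
# that granular, loan-level data would reveal.
from collections import defaultdict

# Hypothetical loan-level records: (bank, sector, exposure in GBP millions)
loans = [
    ("Bank A", "residential_mortgages", 40),
    ("Bank A", "commercial_property",   35),
    ("Bank A", "commercial_property",   45),
    ("Bank B", "residential_mortgages", 70),
    ("Bank B", "sme_lending",           30),
]

# Aggregate view: total exposure per bank looks comparable and unremarkable.
totals = defaultdict(float)
for bank, _, exposure in loans:
    totals[bank] += exposure
print("Aggregate exposure per bank:", dict(totals))
# e.g. {'Bank A': 120.0, 'Bank B': 100.0}

# Granular view: a stress scenario hitting commercial property shows that
# Bank A is heavily concentrated in the stressed sector.
STRESSED_SECTOR = "commercial_property"
stressed = defaultdict(float)
for bank, sector, exposure in loans:
    if sector == STRESSED_SECTOR:
        stressed[bank] += exposure

for bank in totals:
    share = stressed.get(bank, 0.0) / totals[bank]
    print(f"{bank}: {share:.0%} of exposure in the stressed sector")
# Bank A: 67% of exposure in the stressed sector -- invisible in the aggregate.
```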
While having any standard is better than none, the session emphasised the need for a single, unified standard. Given the interconnectedness of the financial system, regulators must adopt a global perspective, moving away from isolated national approaches. Drawing a parallel to standardised electrical sockets, the discussion highlighted the potentially game-changing impact of financial data that plugs seamlessly into regulatory systems.
Encouragingly, collaborative efforts in data standardisation are yielding tangible results, and the UK is at the forefront of these initiatives. Programmes such as the Bank of England's Transforming Data Collection Programme, where industry and regulatory representatives work side by side, are commendable. However, this collaborative approach can sometimes slow progress, and the outcomes are often recommendations rather than binding regulations.
The momentum for data standardisation is undoubtedly building, driven by its potential for widespread benefit. Demonstrating the advantages through case studies is crucial, as institutions seek clarity from regulators about where to allocate resources. Even minor costs can be burdensome for smaller institutions, underlining their desire for simplicity and certainty, and the public sector's apprehension about being overly directive needs to be addressed. With the growing demand for real-time data accuracy, there is a pressing need to leverage technology to enhance data quality, foster innovation, and promote competition. Active participation and engagement in these projects are essential for all stakeholders.
The session on "The State of Global Data Standardisation" at the annual RegTech Forum was a testament to the industry's commitment to harnessing the power of data effectively. As the financial landscape continues to evolve, standardising data will play a pivotal role in ensuring transparency, accuracy, and efficiency.