How banks and financial services companies can use their data management to reduce the cost of change


By Martijn Groot, VP Strategy, Alveo

In response to a report from the European Banking Authority (EBA) analyzing the current RegTech landscape in the EU, Nick Maynard, chief analyst at Juniper Research, commented: “Ultimately, the EBA report underscores that regulatory frameworks can be harmonized to spur greater [RegTech] use, but identified issues such as data quality, integration with existing systems, and awareness as the biggest issues.”

Banks and financial institutions generally still have a long way to go to ensure they are prepared for the current wave of regulatory activity impacting the industry. Post-trade regulatory reporting requirements have increased following a series of regulations including Dodd-Frank, EMIR, MiFID II and SFTR. This poses new challenges in reporting, but also creates complications in trade enablement, with new requirements around the onboarding of instruments and of counterparties or clients.

Facing multiple challenges

Banks and financial services companies face a multitude of challenges here. Regulatory requirements state that banks must be able to identify and standardize the source and provenance of all data.

Demonstrating this data lineage can be very complex: a lack of controls and transparency in data flows can mean companies are unable to explain the origin of specific data points. It also masks the actual business use and value of the data, leading to inefficient demand and usage management and poor control over the overall cost base, content licensing, and the legal and regulatory outlook. There is also increased regulatory scrutiny of the data entering models, with new data lineage requirements coming from the ECB’s TRIM initiative, for example.[1]
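To make this concrete, below is a purely illustrative sketch of what a lineage record for a single data point could look like; the field names, the vendor feed and the transformation step are hypothetical and stand in for whatever an institution’s own schema defines.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    """Illustrative provenance trail for a single data point."""
    attribute: str                  # e.g. the coupon rate on a bond
    value: object
    source: str                     # originating feed or internal system
    received_at: datetime
    transformations: list = field(default_factory=list)  # ordered audit trail

    def record_step(self, description: str) -> None:
        # Append a timestamped note whenever the value is touched, so its
        # full history can be replayed for an auditor or model validator.
        self.transformations.append(
            (datetime.now(timezone.utc).isoformat(), description))

# A coupon sourced from a hypothetical vendor feed, then adjusted internally.
coupon = LineageRecord("coupon_rate", 4.25, "vendor_feed_A",
                       datetime.now(timezone.utc))
coupon.value = 0.0425
coupon.record_step("converted from percentage to decimal")
print(coupon.source, coupon.transformations)
```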

More generally, the number of data points to be reported on transactions has continued to increase. In addition to this increased granularity, regulators have, especially since MiFID II, become more prescriptive about the identifiers to be used for instruments, legal entities and places of execution.

The quality of the data is also of crucial importance here. Businesses need full visibility into their data, including how it was processed and where it came from, so they can immediately trace its origin and assess whether it is fit for purpose. They must be able to drill into the data behind each process and see where quality is insufficient and where corrective action is needed. Poor data quality will hamper efforts to respond to the growing array of financial services regulations in the market today.
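As a sketch only, that kind of check might look like the following, with the rule set and record layout invented for illustration:

```python
# Illustrative data-quality rules: each returns True when the field is fit for purpose.
RULES = {
    "isin": lambda v: isinstance(v, str) and len(v) == 12,
    "currency": lambda v: v in {"EUR", "USD", "GBP"},
    "notional": lambda v: isinstance(v, (int, float)) and v > 0,
}

def quality_exceptions(record: dict) -> list:
    """Return the fields that fail their rule, i.e. where corrective action is needed."""
    failures = []
    for name, rule in RULES.items():
        value = record.get(name)
        if value is None or not rule(value):
            failures.append(name)
    return failures

trade = {"isin": "XS0000000000", "currency": "JPY", "notional": -5}
print(quality_exceptions(trade))   # -> ['currency', 'notional']
```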

When auditors or regulators ask questions or conduct in-depth investigations as part of their oversight role, companies must provide answers that are not only accurate but also credible and convincing.

This is especially the case as regulators increasingly scrutinize the quality of data, and in particular the quality of the data that feeds into models. Financial services firms will often need to explain results using not only the mathematical and economic logic of the model itself, but also the data that was fed into it: what the quality issues were, what the sources were, and who touched the data along the way.

Finding a solution

First, banks should look to implement best practices in data collection, cross-referencing and integration, before moving on to data quality workflows such as the monitoring and tracking of proxies. Data aggregation is perhaps the most important common goal of regulatory regimes, but it has yet to be adequately addressed.

A standard business domain model can help keep track of the market requirements and benchmarks that businesses face. Data integration should cover all major data providers and identification standards, including those of the main service providers.
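By way of illustration only, cross-referencing against such a domain model might map each provider’s proprietary identifier onto one master record per instrument; the providers, identifiers and instrument below are placeholders, not real data.

```python
from dataclasses import dataclass, field

@dataclass
class InstrumentMaster:
    """One 'golden' record per instrument, keyed by ISIN, with provider cross-references."""
    isin: str
    name: str
    xrefs: dict = field(default_factory=dict)   # provider -> proprietary identifier

master: dict = {}

def integrate(provider: str, provider_id: str, isin: str, name: str) -> None:
    # Create or enrich the master record and remember how this provider refers
    # to it, so downstream reports can be traced back to the source identifier.
    record = master.setdefault(isin, InstrumentMaster(isin, name))
    record.xrefs[provider] = provider_id

# Two hypothetical feeds describing the same (fictional) bond under different IDs.
integrate("provider_A", "A-123", "DE000TEST001", "Example 2% 2033 bond")
integrate("provider_B", "B-987", "DE000TEST001", "Example 2% 2033 bond")
print(master["DE000TEST001"].xrefs)   # {'provider_A': 'A-123', 'provider_B': 'B-987'}
```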

Regardless of the specific regulations, organizations will need to source data from different channels, but they will also need to document how they arrived at the information they hold. This can also include estimating missing data fields using specific proxies or industry benchmarks.
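A minimal sketch of that kind of proxying is shown below; the sector benchmark values and field names are invented, and the key point is that the estimate is labelled as such rather than passed off as a reported fact.

```python
# Hypothetical sector-level benchmark spreads (basis points) used as proxies.
SECTOR_BENCHMARK_SPREAD_BPS = {"utilities": 85, "financials": 110, "technology": 95}

def fill_missing_spread(record: dict) -> dict:
    """Fill a missing spread from the sector benchmark and document the estimate."""
    if record.get("spread_bps") is None:
        proxy = SECTOR_BENCHMARK_SPREAD_BPS.get(record.get("sector"))
        if proxy is not None:
            record["spread_bps"] = proxy
            # Keep the lineage honest: this value is an internal estimate,
            # not a fact reported by the issuer or a vendor.
            record["spread_source"] = "proxy:sector_benchmark:" + record["sector"]
    return record

bond = {"isin": "DE000TEST001", "sector": "utilities", "spread_bps": None}
print(fill_missing_spread(bond))
```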

Data normalization includes mapping to a common format, interpolation, and converting reporting bases and units of measure to a standard. Data lineage tracking is necessary for full transparency, distinguishing between data sources and also between company-reported facts, third-party opinions and internal estimates.
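As an illustrative sketch, normalizing a quoted rate might look like the following, assuming an internal convention of decimal rates on an ACT/365 basis; the conversion factors and conventions are assumptions made for the example.

```python
# Convert incoming quotes to one internal convention: decimal rate, ACT/365 basis.
UNIT_FACTORS = {"percent": 0.01, "basis_points": 0.0001, "decimal": 1.0}

def normalize_rate(value: float, unit: str, day_count: str) -> float:
    """Map a quoted rate to a decimal on an ACT/365 basis (simple approximation)."""
    rate = value * UNIT_FACTORS[unit]
    if day_count == "ACT/360":
        # Re-base a money-market style quote from a 360-day to a 365-day year.
        rate *= 365.0 / 360.0
    return rate

# Two sources quoting the same level in different units and bases become comparable.
print(normalize_rate(4.25, "percent", "ACT/365"))        # 0.0425
print(normalize_rate(425.0, "basis_points", "ACT/360"))  # ~0.0431
```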

A data management function includes integration with common data sets and reporting systems as well as an enterprise-friendly process for integrating, inspecting and supplementing data sets. Data quality metrics act as a feedback loop to optimize procurement and improve overall data quality.
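A simple sketch of such a feedback loop is shown below: completeness scores per source, computed against an assumed list of mandatory fields, which could then feed back into sourcing decisions.

```python
from collections import defaultdict

REQUIRED_FIELDS = ["isin", "currency", "price"]   # assumed mandatory attributes

def completeness_by_source(records: list) -> dict:
    """Share of mandatory fields populated, grouped by source: a basic quality metric."""
    filled, total = defaultdict(int), defaultdict(int)
    for rec in records:
        for name in REQUIRED_FIELDS:
            total[rec["source"]] += 1
            if rec.get(name) not in (None, ""):
                filled[rec["source"]] += 1
    return {src: filled[src] / total[src] for src in total}

sample = [
    {"source": "feed_A", "isin": "DE000TEST001", "currency": "EUR", "price": 101.2},
    {"source": "feed_B", "isin": "DE000TEST001", "currency": None, "price": None},
]
print(completeness_by_source(sample))   # feed_A scores 1.0, feed_B roughly 0.33
# Scores below an agreed threshold would trigger a review of that source.
```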

Going forward, complying with ever-changing regulations means that businesses must increasingly respond to demands for real-time and ad-hoc data. As a result, financial services companies will need to deliver the required data to business users and consuming systems and update it quickly when needed. At the same time, they should reduce costs and prevent unnecessary or duplicate data acquisition. The cost of change, whether it is just adding an instrument, refreshing fields on a security, or changing a source, should be significantly lower. The required solution must be able to handle large volumes of requests while delivering new information with precision.

Ultimately, however, ensuring compliance with modern financial services regulation depends on the continued application of sound financial data management principles. It cannot be treated as a box-ticking exercise, but if companies source and integrate data well, document how they do it, and maintain high levels of data quality and data lineage, they will be in a good position to comply with regulatory reporting requirements while at the same time being agile enough to quickly adapt to the changing needs of tomorrow.

