Is your firm capturing quality, meaningful data?

Following his blog for UK Finance on confidence and ownership amongst data consumers, Neil Plant, Jaywing’s Lead Data Architect, expands on how firms can ensure they are capturing quality, meaningful data.

Financial firms hold an abundance of data. Failing to maintain the quality of that data, and of the systems that hold it, presents real risks to operations and reputation, since data users can only make informed business decisions if they have access to meaningful, reliable data.

Whenever firms use data, they rely heavily on the data’s infrastructure and plumbing as well as on the quality of the data itself. These are often taken for granted, on the assumption that technical teams will continuously develop, maintain and support the data supply chain. That approach may keep things running, but it does not encourage progress. Assessing the quality of data and systems allows us to identify problems or potential concerns and determine where improvements are needed.

Fortunately, there are industry standards that we can use as a benchmark. The main ones are the Basel Committee on Banking Supervision’s BCBS 239, the European Banking Authority (EBA)’s Regulatory Technical Standards (RTS) and, more recently, elements of the Prudential Regulation Authority (PRA)'s CP6/22. Although these standards are regulatory, and primarily aimed at risk data and systems, firms can reap real business benefits by conforming to them. For most firms, full adherence would be disproportionate, but those not bound by the standards can apply their principles selectively, safe in the knowledge that they are improving data quality and governance.

A review of data and systems is best done when implementing new solutions, as this is the most suitable time for making changes with limited disruption. Reviews of data and systems with a specific purpose (e.g. IFRS 9, campaign management) can quickly establish a standard for implementation and can be used as a template for future activity. This allows improvements to be rolled out as part of ‘business-as-usual’ rather than being the product of a ‘big bang’ change.

Unsure what to assess in your review? The data and systems should exhibit certain characteristics, including, but not restricted to, the following:

Data should be:

  • subject to governance
  • driven by business needs

Data and systems should also be supported by:

  • Source-to-target lineage – knowing where data comes from and how it is derived is a key component in trusting the data.
  • Metadata – a common metadata store is an efficient way of retaining the lineage information described above, as well as business descriptions and definitions.
  • A data dictionary – this will prove beneficial to users when preparing data for analysis and reporting.
  • Quality monitoring and incident management – for best practice, firms should continuously assess issues related to data and work to resolve weaknesses (a minimal sketch of such a check follows this list).
  • Appropriate technologies and automated processing – teams should assess what tools and software could drive efficiency through automation.
  • Flexibility – data processes should be able to respond to change quickly, so that downstream users benefit swiftly.
  • Regular review – inevitably, data will change, so data management needs to be ready to adapt.
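
To illustrate the quality monitoring point above, the sketch below shows the kind of simple, automated completeness and validity checks a team might run against a data feed. It uses Python and pandas; the dataset, field names and checks are illustrative assumptions rather than anything prescribed by the standards discussed here.

```python
# A minimal sketch of automated data quality monitoring.
# The 'customer_accounts' dataset and its field names are illustrative assumptions.
import pandas as pd

# Example data standing in for a feed from the data supply chain.
customer_accounts = pd.DataFrame({
    "account_id": ["A001", "A002", "A003", None],
    "open_date": ["2021-03-01", "2022-07-15", "not a date", "2023-01-20"],
    "balance": [1250.50, -40.00, 300.25, None],
})

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Run simple completeness and validity checks and return a results table."""
    checks = []

    # Completeness: key identifiers should never be missing.
    missing_ids = df["account_id"].isna().sum()
    checks.append(("account_id completeness", missing_ids == 0,
                   f"{missing_ids} missing account_id values"))

    # Validity: dates should parse; anything that doesn't is flagged.
    parsed = pd.to_datetime(df["open_date"], errors="coerce")
    bad_dates = parsed.isna().sum()
    checks.append(("open_date validity", bad_dates == 0,
                   f"{bad_dates} unparseable open_date values"))

    # Completeness: balances should be populated for every account.
    missing_balances = df["balance"].isna().sum()
    checks.append(("balance completeness", missing_balances == 0,
                   f"{missing_balances} missing balance values"))

    return pd.DataFrame(checks, columns=["check", "passed", "detail"])

if __name__ == "__main__":
    results = run_quality_checks(customer_accounts)
    print(results)
    # Failed checks would feed the incident-management process for investigation.
```

In practice, checks like these would run on a schedule as part of the automated processing mentioned above, with failures logged and routed to the team responsible for resolving data weaknesses.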

Increasingly, data may need to be shared externally, whether for regulatory purposes, customer requests or as part of open banking. External data users will have high expectations that what you share is accurate, so using this checklist to foster confidence in your data and systems will make the sharing process far easier.

To stay up to date with Neil’s data management insights, visit Jaywing Risk.