Data quality: improving to survive

The increasing demand for better data quality is attracting more attention throughout the industry. Wipro's Sukant Paikray explores the reasons for this development and offers advice on what measures firms should take

Sukant Paikray, Wipro

Regulatory compliance requirements, the increasing role of enterprise risk management, and climbing operational costs are all positioning data quality as a top priority for financial institutions. Improving data quality is no longer a secondary activity, but rather a systematic, enterprise-wide agenda aimed at achieving maturity through effective governance and the integration of quality measures.

The immediate focus is on tracing the data lineage and implementing proper controls over critical data shared outside the organisation, as well as continuously measuring and improving the quality of data, based on key business requirements.

Financial institutions are beginning to recognise the benefits of proactive data quality management, which decreases human dependency, limits errors, tracks quality and provides continuous advancements. Organisations are redesigning their infrastructure by introducing focused data quality technology themes, such as workflow and rules management, exception reporting, data entry control, auto-corrections, and reconciliation tools throughout the data processing cycle.
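To make the rules-management and exception-reporting themes concrete, the sketch below shows one common pattern: data quality rules expressed as small predicate functions applied to every record, with failures collected into an exception report. The rule names, fields and allowed values are purely illustrative assumptions, not any specific vendor's implementation.

```python
# Illustrative sketch of rules-based data quality checking with
# exception reporting. Field names and rules are hypothetical.

def not_blank(field):
    """Rule: the given field must be present and non-empty."""
    return lambda record: bool(record.get(field))

def in_set(field, allowed):
    """Rule: the field's value must come from a controlled vocabulary."""
    return lambda record: record.get(field) in allowed

RULES = {
    "missing_lei": not_blank("lei"),
    "bad_country": in_set("country", {"GB", "US", "DE", "IN"}),
}

def run_checks(records):
    """Apply every rule to every record; return the exception report."""
    exceptions = []
    for record in records:
        for name, rule in RULES.items():
            if not rule(record):
                exceptions.append({"id": record["id"], "rule": name})
    return exceptions
```

In practice the exception report would feed a workflow or stewardship queue rather than being consumed directly, but the rule-as-predicate structure is what allows rules to be added without touching the processing cycle.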

Looking to the future, there are several issues and next steps that financial institutions should look to address and implement:

Clean-up of legacy data: Among all reference data domains, the clean-up of client data is a top priority due to its complex nature. Cleaning up involves proper data-matching between systems, linking to the right hierarchy and redefining the lineage. With the proper selection of resources and effective technology, clean-up efforts can be significantly reduced.
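The data-matching step described above can be sketched in a few lines: pair client records from two systems by normalised name similarity and accept pairs above a threshold. The threshold, field names and normalisation rules here are assumptions for illustration; production matching would typically combine several attributes and identifiers.

```python
# Hedged sketch of cross-system client record matching using a simple
# string-similarity score. Fields and threshold are illustrative.
from difflib import SequenceMatcher

def normalise(name):
    """Lower-case, strip punctuation variants and collapse whitespace."""
    return " ".join(name.lower().replace(",", " ").split())

def match_clients(system_a, system_b, threshold=0.85):
    """Pair each record in system A with its best candidate in system B."""
    matches = []
    for a in system_a:
        best, best_score = None, 0.0
        for b in system_b:
            score = SequenceMatcher(None, normalise(a["name"]),
                                    normalise(b["name"])).ratio()
            if score > best_score:
                best, best_score = b, score
        if best is not None and best_score >= threshold:
            matches.append((a["id"], best["id"], round(best_score, 2)))
    return matches
```

Matched pairs below the threshold would normally be routed to a steward for manual review rather than discarded.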

Maintaining a single reference data hub: Financial institutions are drawing multi-year roadmaps to create common reference data repositories and to simplify the data distribution process. More recent technologies are used to dynamically identify the data conversion rules that reside in multiple sources, and to then unify them into the golden copy hub to reduce downstream systems changes. Clearly, there is a case for a single-hub programme that reduces maintenance and licensing costs while avoiding data irregularities and increasing user confidence.
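One simple way to picture golden-copy consolidation is attribute-level survivorship by source priority: for each attribute of an entity, take the value from the highest-ranked source that supplies one. The source names and ranking below are hypothetical assumptions, chosen only to illustrate the idea.

```python
# Hedged sketch of golden-copy survivorship by source priority.
# Source names and the ranking are illustrative assumptions.

SOURCE_PRIORITY = ["vendor_feed", "crm", "legacy_db"]  # highest first

def golden_record(candidates):
    """candidates: {source_name: {attribute: value}} for one entity.
    For each attribute, keep the value from the highest-priority
    source that supplies a non-empty value."""
    golden = {}
    attributes = {attr for rec in candidates.values() for attr in rec}
    for attr in attributes:
        for source in SOURCE_PRIORITY:
            value = candidates.get(source, {}).get(attr)
            if value not in (None, ""):
                golden[attr] = value
                break
    return golden
```

Real hubs layer far richer rules on top (recency, trust scores per attribute, manual overrides), but per-attribute survivorship is the core mechanism that lets one hub serve many downstream systems.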

Increased integration of operations and IT: Activities such as application/data testing, helpdesk support, release and minor works, and 'keeping the lights on' can be performed by an integrated team. This progression is essential as analysis, remediation and stewardship require stronger relationships between both operations and IT. The concept of IT vendors offering 'data quality as a service', from a utility construct, is evolving fast.

Downstream control points: Even if a single hub is established as the strategic reference data golden source, it is difficult to control clean data downstream due to a wide range of integration factors, such as inaccurate mapping of data attributes, issues with the transformation layer and/or flawed manual adjustments. To curtail this risk, organisations deploy similar data quality measures for both upstream and downstream systems. Constant reconciliation of data with upstream systems and returning the derived/adjusted data to the source are critical components in the overall process.
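The constant reconciliation described above amounts to comparing each downstream copy against the upstream golden source, keyed by entity, and reporting every break. The sketch below shows that comparison under simplifying assumptions (flat records, exact-match comparison); attribute names are illustrative.

```python
# Minimal sketch of reconciling a downstream data copy against the
# upstream golden source. Keys and fields are illustrative.

def reconcile(upstream, downstream):
    """Both inputs map entity_id -> {attribute: value}.
    Returns breaks as (entity_id, attribute, upstream_value,
    downstream_value); a missing downstream entity is one break."""
    breaks = []
    for entity_id, up_rec in upstream.items():
        down_rec = downstream.get(entity_id)
        if down_rec is None:
            breaks.append((entity_id, "<missing>", up_rec, None))
            continue
        for attr, up_val in up_rec.items():
            if down_rec.get(attr) != up_val:
                breaks.append((entity_id, attr, up_val, down_rec.get(attr)))
    return breaks
```

Breaks caused by legitimate downstream derivations or adjustments would, as the article notes, be returned to the source rather than simply suppressed.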

Obtaining minimum required data: With the BCBS 239 compliance deadline approaching, organisations are rushing to comply. They aim to obtain the minimum required data components in order to meet the regulatory deadline. Though this is a short-term remedy, it is a solid first step towards achieving the long-term goal.

Reduced costs, fewer errors

The advantages of data quality management are both evident and measurable. In our own experience, legacy data clean-up has reduced organisations' operational costs and risk, opening up new opportunities to cross-leverage data between investment banking, wealth management and private banking groups.

The automation of data processing will create fewer errors while increasing data quality and potentially reducing operational costs. A single data hub that unifies processes and enables enterprise-wide management will provide significant opportunities to cut overhead costs.

The development of a data quality roadmap allows the industry to revisit and revise its data architecture for IT improvements. It will allow firms to adequately prepare for future issues and align with industry trends and initiatives while improving technology and adapting operational processes. An enterprise-wide data quality initiative is a significant opportunity to bring internal business groups together to discuss their data quality issues, and could ultimately result in a policy of using data from a single source.

Sukant Paikray is director of data management practice at Wipro

This article was originally published on sister website WatersTechnology.com.

