Tier 1 Global Bank selects Datactics RegMetrics™ to Improve Operational Efficiency and Meet Regulatory Data Compliance

Datactics Ltd, a leading provider of data quality and matching software, today announces that a Tier 1 global bank has signed a three-year licence agreement for its RegMetrics™ solution. RegMetrics enables continuous, enterprise-wide data quality reporting to address the regulatory data quality demands of BCBS 239 and other regulations.

Powered by the Datactics in-memory data quality and matching platform, now installed at 50+ sites globally, RegMetrics enables active monitoring, remediation and reporting of data quality for data at rest and in motion. It ensures that content delivered to business applications is accurate and fit for the purpose of regulatory reporting under BCBS 239, FATCA, MiFID and CCAR.

Following a rigorous assessment of leading vendors, the Tier 1 global bank invited Datactics to undertake a short proof of concept designed to address the requirements of reporting on large entity, instrument and risk data sets. The proof of concept led to a rapid installation which, combined with existing out-of-the-box rules, delivered a powerful CDO dashboard within eight weeks. The system went live last month and now monitors the bank’s entire data universe, processing hundreds of millions of data points at field level. As well as reporting on data quality across the bank, RegMetrics is also used to actively scrub and aggregate information for automatic submission to the SEC and the US Federal Reserve.

Financial regulations such as BCBS 239 require continuous data quality monitoring across all major data types, as opposed to the traditional sampling approach. The Tier 1 global bank will use Datactics to store historical quality information that can be reported for trend analysis and benchmarking.

This initial deployment makes use of the Enterprise Data Management Council’s Data Management Capability Assessment Model (DCAM)™ standard. Datactics has incorporated the five relevant data quality dimensions used in DCAM, allowing bank staff to drill down at department and desk level through metrics such as completeness, conformity, accuracy, duplication and consistency.
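To illustrate what scoring those dimensions can look like in practice, the sketch below computes completeness, conformity, duplication and consistency over a few hypothetical entity records. The field names, rules and sample data are illustrative assumptions, not Datactics’ actual rule set; accuracy is omitted because it typically requires comparison against a trusted golden source.

```python
import re
from collections import Counter

# Hypothetical entity/instrument records for illustration only.
records = [
    {"entity_id": "E001", "isin": "US0378331005", "country": "US"},
    {"entity_id": "E002", "isin": "US0378331005", "country": "US"},   # duplicate ISIN
    {"entity_id": "E003", "isin": "BAD-ISIN",     "country": "USA"},  # non-conforming values
    {"entity_id": "E004", "isin": "",             "country": "GB"},   # incomplete record
]

ISIN_PATTERN = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")  # conformity rule
VALID_COUNTRIES = {"US", "GB", "DE", "FR"}                # consistency reference set

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(1 for r in rows if r[field]) / len(rows)

def conformity(rows, field, pattern):
    """Share of populated values matching the expected format."""
    vals = [r[field] for r in rows if r[field]]
    return sum(1 for v in vals if pattern.match(v)) / len(vals)

def duplication(rows, field):
    """Share of populated values that repeat an earlier value."""
    vals = [r[field] for r in rows if r[field]]
    return sum(c - 1 for c in Counter(vals).values()) / len(vals)

def consistency(rows, field, reference):
    """Share of rows whose value appears in an agreed reference set."""
    return sum(1 for r in rows if r[field] in reference) / len(rows)

print(f"completeness(isin):   {completeness(records, 'isin'):.0%}")
print(f"conformity(isin):     {conformity(records, 'isin', ISIN_PATTERN):.0%}")
print(f"duplication(isin):    {duplication(records, 'isin'):.0%}")
print(f"consistency(country): {consistency(records, 'country', VALID_COUNTRIES):.0%}")
```

A dashboard drill-down amounts to running the same scoring functions over rows filtered to a given department or desk.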

RegMetrics’ comprehensive measurement and monitoring capability over the client’s production data assets equips its data management team to improve data quality in near real time. It also improves the data supply across the business while simultaneously addressing the bank’s regulatory commitments.

Luca Rovesti, Datactics Lead Data Consultant, comments on the project: “RegMetrics automates corrections when there is a clear breach of the test conditions assigned to the data, for instance when maturity dates are in the wrong format or when counterparty data is found in the wrong location. A small portion of failing data requires human intervention – in these instances, RegMetrics alerts the business owners about their failing records via ServiceNow® for investigation and fix.”
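The split Rovesti describes can be sketched as a rule that repairs clear breaches automatically and escalates anything it cannot fix safely. The date formats and the ticketing stub below are hypothetical assumptions for illustration, not the product’s actual logic or the real ServiceNow API.

```python
from datetime import datetime

# Hypothetical source formats a maturity-date field might arrive in.
KNOWN_DATE_FORMATS = ["%d/%m/%Y", "%Y%m%d", "%d-%b-%Y"]
TARGET_FORMAT = "%Y-%m-%d"

def normalise_maturity_date(value):
    """Auto-correct a maturity date into ISO format, or return None."""
    for fmt in KNOWN_DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).strftime(TARGET_FORMAT)
        except ValueError:
            continue
    return None  # no safe correction: route to a human owner

def raise_ticket(record_id, field, value):
    """Stand-in for alerting the owning desk via a ticketing system."""
    print(f"ALERT: {record_id}.{field} = {value!r} needs investigation")

for record_id, raw in [("T001", "31/12/2027"), ("T002", "20271231"), ("T003", "end of 2027")]:
    fixed = normalise_maturity_date(raw)
    if fixed:
        print(f"{record_id}: auto-corrected {raw!r} -> {fixed}")
    else:
        raise_ticket(record_id, "maturity_date", raw)
```

The design point is that automated correction only fires when a rule matches unambiguously; everything else becomes an alert rather than a silent guess.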

The analytics that RegMetrics allows the firm to create open up new opportunities to significantly improve data quality with a truly agile and adaptive approach. Now armed with a database populated with near real-time and historical data quality metrics, the client is investigating the use of statistical predictive analytics for financial data quality.

According to Stuart Harvey, Datactics CEO, RegMetrics is a compelling proposition for banks and asset managers seeking a regulation-ready and easily integrated system that puts power in the hands of the data content specialists in the firm and does not require expensive programming resource.

“We are delighted to be working with this Tier 1 global banking client on this project. They set us a very demanding requirement in terms of timescales, integration and reporting. I’m pleased that our team, working alongside the client’s data management team, has delivered a genuinely innovative solution in the area of regulation technology and compliance.”

“As more firms accurately and reliably measure the data they receive from third-party vendors, the pressure on those vendors to improve the quality of the data they sell will grow, which should drive up data quality standards, benefiting everyone.”