Banks Urged to Maintain Data Integrity

Banks are faced with more complex reporting requirements, more scrutiny of the data that they use to fulfil them and more severe consequences when things go wrong, according to Gresham Technologies.

Gresham’s white paper, Data integrity: Your key to confidence in a complex regulatory environment, published today, argues that data integrity is all too often compromised – by poor data management practices, by manual processes, or by a lack of control.

“In many instances, data is stored across multiple repositories, making it easy to miss key elements. Different jurisdictions require different data fields, creating complexity that leads to duplication of effort and unnecessary costs.”

When it comes to meeting regulatory expectations, financial institutions that cannot have total confidence in their data are setting themselves up to fail from the start. Yet ensuring that firms report high-quality, accurate data is far from straightforward.

A 2020 report on the state of capital markets by PwC summarises the situation: “For capital markets participants and users, the regulatory landscape is ever more complex and more difficult to navigate … Our survey shows that 90% of industry executives expect it to take between one and five years to execute on [regulations such as Basel III, Dodd-Frank, etc.]”. 

This report also points out that “not only are the rules much more complex, but … Regulators are increasingly less flexible in their demands to improve compliance, reporting, risk controls and the underlying business processes and data.”

So firms are saddled with the twin challenges of greater expectations and greater scrutiny, according to Gresham’s paper.

Despite the recent challenges of COVID-19, regulators have made it clear they will not accept reporting errors. 

“In fact, they have indicated that they will be enforcing existing regulations more forcefully.”

Moreover, financial institutions face regular regulatory updates that add complexity, increase the number of data fields, and require more frequent reporting.

“Visibility of where your data has come from, how it has changed and where it is being used – this is the key to data integrity and control,” said Philip Flood, Business Development Director, Regulatory and STP Services, Gresham Technologies.

According to the paper, with much of their IT infrastructure based on legacy technologies, many firms lack the flexibility to handle new data sources and formats.

In addition, many firms are still “not automating to their full potential”.

Automation removes the risk of human error inherent in manual processes, which is why many firms are now looking to automate their data integrity and control processes.

“Automation can deliver some truly game-changing results for customers – one global clearing firm we worked with reduced the time taken to onboard new ETD controls by 97%,” commented Neil Vernon, Chief Technology Officer, Gresham Technologies.
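
As a rough illustration of the kind of control that tends to be automated – not drawn from Gresham’s paper or product, and with invented trade IDs, fields and break rules – the Python sketch below reconciles two hypothetical trade feeds and flags breaks that would otherwise be hunted down manually in spreadsheets.

```python
# Minimal sketch (hypothetical data): automated reconciliation of two trade
# feeds. Records are matched on trade ID and any missing or mismatched
# entries are flagged as breaks, rather than being checked line by line.

internal_feed = {
    "T1001": {"quantity": 100, "price": 101.25},
    "T1002": {"quantity": 250, "price": 99.80},
    "T1003": {"quantity": 50,  "price": 100.10},
}
clearing_feed = {
    "T1001": {"quantity": 100, "price": 101.25},
    "T1002": {"quantity": 250, "price": 99.75},   # price mismatch
    # T1003 is missing from the clearing feed
}

def reconcile(side_a, side_b):
    """Return a list of breaks between two keyed record sets."""
    breaks = []
    for trade_id in side_a.keys() | side_b.keys():
        a, b = side_a.get(trade_id), side_b.get(trade_id)
        if a is None or b is None:
            breaks.append((trade_id, "missing on one side"))
        elif a != b:
            breaks.append((trade_id, f"mismatch: {a} vs {b}"))
    return breaks

for trade_id, reason in reconcile(internal_feed, clearing_feed):
    print(trade_id, reason)
```

In practice the matching keys, tolerances and exception workflow would come from a firm’s own control framework; the point here is only that the checking itself is mechanical and well suited to automation.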

According to the paper, with many vendors in the data integrity space, it’s important to find the right partner. 

“The Proof of Concept approach will certainly weed out those who say from those who actually do – it’s easy to produce a great demo with theoretically perfect data, but the real test of a solution comes with multiple different feeds and formats that have built up over time.”

“Data comes from multiple systems and in multiple formats, and it’s this complexity which can make guaranteeing data integrity so difficult. Working with a data-agnostic solution is the only way to navigate this,” said Christian Schiebl, Group Business Development Director, Gresham Technologies.
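
To make the data-agnostic point concrete, here is a minimal, hypothetical sketch – not Gresham’s solution, and with feed layouts and field names assumed for the example – showing feeds that arrive in different formats being normalised into one common record shape before any integrity checks run.

```python
# Minimal sketch (illustrative only): normalising feeds delivered in
# different formats into a single common record shape, so downstream
# integrity checks do not care which system the data came from.
import csv
import io
import json

COMMON_FIELDS = ("trade_id", "quantity", "price")

def from_csv(text):
    """Parse a CSV feed into common records."""
    return [
        {"trade_id": row["id"], "quantity": int(row["qty"]), "price": float(row["px"])}
        for row in csv.DictReader(io.StringIO(text))
    ]

def from_json(text):
    """Parse a JSON feed into common records."""
    return [
        {"trade_id": item["tradeId"], "quantity": item["quantity"], "price": item["price"]}
        for item in json.loads(text)
    ]

csv_feed = "id,qty,px\nT1001,100,101.25\n"
json_feed = '[{"tradeId": "T1002", "quantity": 250, "price": 99.80}]'

records = from_csv(csv_feed) + from_json(json_feed)
assert all(set(r) == set(COMMON_FIELDS) for r in records)
print(records)
```

Once every feed is mapped into the same shape, the same integrity and control checks can be applied regardless of source or format.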