Is due diligence so important we can’t accept inaccuracies?

A member of a compliance WhatsApp group to which I belong posed the question, “What happens if doubts arise about the accuracy of due diligence data?” A good question without an easy answer. But I shall do my best.

For most of us, due diligence is something we do because there is a regulatory requirement. There are of course plenty of reasons to do due diligence without a regulatory impetus, but it is surprising how many markets there are where word of mouth suffices to manage risk. In other words, without regulatory pressure our risk tolerance probably goes up.

So-called “lead” regulators tend to be principles-based, so they will never prescribe total accuracy in your due diligence data, nor tell you what you must get right, and when. We have to work that out for ourselves. One key to satisfying such a regulator, however, is to have control measures proportionate to risks.

In due diligence terms, ‘proportionality’ looks different from organisation to organisation. In global financial institutions it very often means having departments within departments devoted to testing and checking that due diligence data is fit for purpose. A variety of metrics are sent to managers each month, informing them of due diligence data gaps, inaccuracies and other failings. Inaccuracy and gap tolerance levels are set and treated as KPIs. Inevitably all this effort is costly, and leads to painful remediation exercises, backlog management and downstream problems with systems (and data reservoirs) that leverage KYC data.

Given the potential complications, such organisations like their due diligence data to come from reliable sources, and more often than not employ small armies of people in-house to gather it. But even then their processes still rely upon a multitude of providers offering anything from their own data sets (e.g. Dun & Bradstreet or LexisNexis) to aggregated data from multiple sources, or data collected for specialist purposes such as screening (e.g. Acuity and Thomson Reuters). The client expectation is always infallibility, delivered quickly and cheaply.

Elsewhere there may be even more reliance on external providers for due diligence data collection, but that does not reduce the need for accuracy. Regulators may expect firms to say why they use a chosen vendor, and why they trust the data the vendor provides. Justification is not something that can be outsourced.

Such organisations, big and small, need some way of testing their KYC data’s accuracy, usually via sampling. They will also need a process to govern responses when they become aware of an inaccuracy, and a corresponding metric for justifying their faith (or otherwise) in a particular vendor. Lastly, they need to set tolerance levels, ensure that managers are notified when those levels are breached, and record what actions are taken with respect to each data source. A minimal sketch of such a sampling check follows.
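
Purely by way of illustration, the Python sketch below shows the shape such a check might take. The sample size, the 2% tolerance, and the idea of a `verify` callable that re-checks a record against an independent source are all assumptions, not prescriptions.

```python
import random

# Hypothetical tolerance: flag a source when more than 2% of sampled
# records disagree with independent re-verification.
TOLERANCE = 0.02
SAMPLE_SIZE = 200

def sample_error_rate(records, verify, sample_size=SAMPLE_SIZE):
    """Estimate a feed's error rate from a random sample.

    `records` is a list of KYC records; `verify` is any callable that
    re-checks one record against an independent source and returns
    True when the record proves accurate.
    """
    if not records:
        return 0.0
    sample = random.sample(records, min(sample_size, len(records)))
    errors = sum(1 for record in sample if not verify(record))
    return errors / len(sample)

def review_source(name, records, verify):
    """Measure a source's error rate and escalate on a breach."""
    rate = sample_error_rate(records, verify)
    if rate > TOLERANCE:
        # In a real programme this would raise a case or notify a
        # manager; printing stands in for that here.
        print(f"ALERT: {name} error rate {rate:.1%} exceeds {TOLERANCE:.0%}")
    return rate
```

In practice the alert would feed a case-management queue, and the measured rates would accumulate into exactly the vendor-confidence metric described above.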

All this is in addition to the expectation that due diligence data will be kept up to date. Collect-and-forget due diligence is no longer due diligence. Best practice expects automated, dynamic, real-time due diligence or, failing that, periodic updates (manual or otherwise).

As everyone knows, transparency is still far off. Not only are there competing imperatives (such as data privacy) pushing transparency sideways (some say backwards!), but many governments seem to take a perverse delight in making necessary due diligence sources (such as corporate ownership records) unreasonably inaccessible. Even where transparency exists, few governments do more than pay lip service to keeping their statutory registries up to date or accurate. The problem with many official registries is that the data acquires a veneer of authority simply by being lodged in an official database with penalties for inaccuracy, when in fact the information is self-reported and its accuracy has never been checked. In short, registries that are often thought of as arbiters of truth and accuracy are nothing of the sort.

That leaves the compliance officer with one practical measure of accuracy: corroboration. However, not all data points will have multiple sources, so understanding the efficacy of any single source has to be baked into any due diligence process. A minimal corroboration check is sketched below.
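
Again purely as a sketch: corroboration can be as simple as counting how many independent sources agree on a data point. The source names and the two-source threshold here are invented for illustration.

```python
from collections import Counter

def corroborate(values_by_source):
    """Given {source_name: reported_value}, return the majority value,
    the number of sources agreeing on it, and whether it counts as
    corroborated (here, hypothetically, two or more sources)."""
    counts = Counter(v for v in values_by_source.values() if v is not None)
    if not counts:
        return None, 0, False
    value, agreeing = counts.most_common(1)[0]
    return value, agreeing, agreeing >= 2

# e.g. a beneficial owner's date of birth as reported by three sources:
dob, agreeing, corroborated = corroborate({
    "statutory_registry": "1971-04-02",  # self-reported, never checked
    "vendor_feed": "1971-04-02",
    "passport_copy": "1971-04-02",
})
# corroborated is True: all three independent sources agree.
```

Where only one source exists, no threshold will help; the process must instead record an efficacy judgment about that source itself.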

Many concerns with inaccurate data are more than just regulatory issues. Organisations that leverage data across multiple systems and functions risk polluting an entire ecosystem with faux data. At its lowest level this is little more than a time-wasting irritant: a wrong date of birth may cause a salesman to try to sell the wrong financial product. At the more serious end it can have grave repercussions for the innocent, and backfire horribly on the institution. Wrongly ascribing an Iranian IDD code to a telephone number (a single-digit error) could result in an entire customer or supply-chain relationship being defenestrated, as the toy example below shows.
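
To make the single-digit point concrete, here is a deliberately toy screening check. The prefix table is a hand-picked subset (a production system would use a complete numbering-plan library), and the phone numbers are invented; the point is only that +91 (India) and +98 (Iran) sit one keystroke apart.

```python
# Toy IDD prefix table; a real system would use a complete
# numbering-plan library rather than this hand-picked subset.
IDD_PREFIXES = {"90": "Turkey", "91": "India", "92": "Pakistan", "98": "Iran"}
SANCTIONED = {"Iran"}

def screen_number(e164):
    """Return (country, flagged) for a '+<idd><number>' string."""
    digits = e164.lstrip("+")
    country = IDD_PREFIXES.get(digits[:2], "unknown")
    return country, country in SANCTIONED

# One mistyped digit moves a customer from India to Iran:
print(screen_number("+912212345678"))  # ('India', False)
print(screen_number("+982212345678"))  # ('Iran', True)
```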

There are also growing concerns about how inaccurate data (due diligence or otherwise) could skew AI. When AI fails to meet expectations, the usual fix is to throw more data at it, to enhance its learning and eliminate bias. That, of course, will not work if the data is the wrong sort of data in the first place, or if its integrity is only approximate.

One of the biggest drivers for outsourcing due diligence is cost. As regulatory expectations broaden and deepen, the price of doing due diligence has come under ever closer scrutiny. Errors cost. But cost-cutting can have perverse effects: organisations that use the cheapest vendors risk rising remediation costs. When a vendor’s own costs are under pressure, it is unreasonable to expect it to spend big on data assurance. One vendor made a virtue of necessity, saying it relied on feedback to identify and correct issues with its data. Whilst this is commendable and reasonable, it is hard to avoid the suspicion that such vendors are in effect telling clients they can outsource the collection of data, but not its verification or assurance, and certainly not at the price clients expect to pay for the service.

So where does this land us? Accurate data is necessary for effective due diligence; that is a given. However, verification is only possible if data is updated frequently (preferably dynamically) and a robust assurance programme is in place. Sitting back and waiting for something to go wrong is not managing risk.

Nevertheless, no amount of testing and updating will ever assure 100% accuracy. There will be failings, but these must be proportionate to risks. An organisation must ask itself three questions: (i) will inaccurate data infect the organisation’s downstream systems, leading to multiple failings; (ii) will the inaccuracy trigger wrong responses; and (iii) will the inaccuracy impact risk? If the answer to each question is “probably not”, then a certain level of inaccuracy may be something that has to be accepted.

The author leads TSG’s Advisory Services. He has spent many years in law enforcement and banking, specialising in financial crime risk and compliance. TSG is a Research (including due diligence) specialist, also offering Ethics, Compliance and Advisory services to its clients. TSG offers expertise in Eastern Europe, as well as East Asia.