Ramifications of errors are serious (PR, brand, legal, compliance)- The broader issue here is faulty back-end processes that are not fit for use. Back-end processes are the pillars that allow functions like PR, brand, customer management, product management, legal, and compliance to run smoothly. A faulty back-end process is a huge liability. A famous example is the “United Breaks Guitars” (*) episode: when it was exposed, United Airlines’ stock price fell 10%, costing stockholders about $180 million in value. That figure does not account for the loss of customer trust in the brand, which could be many times as much.
For businesses that depend on data, the back-end processes that move data around are critical not just for success but for survival. They have the potential to create a costly “United Breaks Guitars” episode with data. Data errors are Black Swan events (**): comforting in their infrequency but devastating in impact. The measurable and unmeasurable damages are enormous, yet they are easily avoidable with simple precautionary steps.
Why did the data get corrupted? Data errors creep in when data flows at high volume and high speed, in different formats, from multiple sources, and through multiple platforms (***). When data is important, its accuracy has to be double-checked as it travels through the firm’s IT systems. But validating data using conventional approaches has become a nightmare: conventional data-validation tools and approaches cannot handle large data volumes or meet processing-speed requirements, so IT teams can usually check less than 1% of the data. This means there is a very high chance that data errors will sneak into your system. The scary part, as one banker put it: “this is happening more frequently now than in the past.”
How to stop data errors from propagating- The best practice is to validate the integrity of 100% of the data moving through your system, not just 1%. The process has to be automated, not manual. Many companies assume that throwing more people at the problem will solve it; unfortunately, it does not. Specialized tools designed specifically for large data flows have to be used (****). Relying on the usual suspects among data-validation tools is a recipe for disaster.
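To make the idea concrete, here is a minimal sketch of what automated 100% validation can look like: fingerprint every record on the source side, fingerprint every record after it lands, and reconcile the two sets. All names here (`reconcile`, `record_fingerprint`, the sample records) are illustrative assumptions, not any particular vendor’s tool; a production pipeline would do the same comparison at scale rather than sampling.

```python
import hashlib

def record_fingerprint(record):
    # Hash a record's canonical form so any field-level change is detected.
    canonical = "|".join(str(v) for v in record)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_records, target_records, key_index=0):
    """Compare every record (not a sample) between source and target.

    Returns keys missing from the target and keys whose contents
    changed in transit.
    """
    source = {r[key_index]: record_fingerprint(r) for r in source_records}
    target = {r[key_index]: record_fingerprint(r) for r in target_records}

    missing = sorted(k for k in source if k not in target)
    corrupted = sorted(k for k in source
                       if k in target and source[k] != target[k])
    return missing, corrupted

# Hypothetical example: one record dropped, one altered in transit.
src = [("A1", "ACME", 100.00), ("A2", "BETA", 250.50), ("A3", "GAMA", 75.25)]
tgt = [("A1", "ACME", 100.00), ("A2", "BETA", 250.05)]  # A3 lost, A2 corrupted

missing, corrupted = reconcile(src, tgt)
print(missing)    # ['A3']
print(corrupted)  # ['A2']
```

Because every record is hashed rather than inspected field by field, the cost of checking 100% of the data grows only linearly with volume, which is what makes the approach feasible where manual sampling is not.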
Data errors are known culprits with known solutions. Failing to validate 100% of your critical data is unacceptable, given the serious consequences. Banks and insurance companies, especially, must validate every piece of data flowing through their systems.
Source: Chicago Tribune