Data Errors Cost Us All – Data Integration Is Part of the Solution


 


Data Accuracy – My first blog article was about the cleanliness and accuracy of data. Its title: Does your company have a Chief Data Officer? In that article I proposed that every company that trades with electronic data appoint a Chief Data Officer to ensure that its electronic data is clean and accurate. It is amazing how much we have billed clients over the years to correct poor data or to recover from its downstream effects.

I have never seen the true cost of poor data actually quantified, although I am sure academics have looked at the issue.

Today I stumbled across an article at Logisticsmanager.com detailing recent work by two British researchers: Professor Alan Braithwaite, Chairman of LCP Consulting and Visiting Professor at Cranfield School of Management, and Professor Richard Wilding of the Centre for Logistics and Supply Chain Management at Cranfield School of Management.

Their research expanded upon earlier work by GS1 in the UK, which indicated that poor data accuracy was costing the UK retail sector £200 million a year.

The results of the reanalysis are astounding. Taking the original GS1 numbers and applying a more rigorous Six Sigma methodology, the researchers determined that data inaccuracy was costing the big five UK retailers and their suppliers £1.4 billion a year. This is a staggering figure. I wonder what the North American number might be.

I quote from the article on Logisticsmanager.com:

“Braithwaite said: “From our experience of working with many companies, data accuracy is poor with errors in physical dimensions, pricings and operational parameters such as shelf fill, replenishment quantities and order quantities. As this report shows there is a big opportunity cost hidden behind this problem. Companies need to take a fresh look at their master data management processes alongside their data identification and capture methods; the business cases from investing in both identification and processes may be bigger than they expect. This backroom stuff is crucial.”

“Wilding added: “The reported levels of inaccuracy and their associated costs are worrying. This is especially the case in the context of the enormous investments that all the big retailers have made in product identification, data capture and supply chain integration, and the focus that many companies have put into lean and six sigma methods.”

“The paper goes on to highlight another key challenge in the form of GTINs (Global Trade Item Numbers), which are forecast to increase to 250 with the addition of food safety data. As some of this data will bring liability implications for retailers and manufacturers, data accuracy may yet become an issue of corporate governance and social responsibility.”

This article is the first I have seen that actually quantifies the issues around data accuracy. On some of our sites we have put processes in place to validate data and to report on inaccuracies, but they have always been at a very rudimentary level. Why? Because most businesses view the cost of setting up a more rigorous data audit process as an unnecessary expense. These numbers clearly show that an investment in data accuracy can pay huge dividends, and that poor data may also have liability and corporate governance implications!
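To make that concrete, here is a minimal sketch (in Python) of what even a rudimentary product-data check might look like: it validates the GS1 check digit on a GTIN and flags obviously implausible dimensions and prices. The field names and rules are hypothetical illustrations, not taken from the article or from any VL implementation.

```python
# Minimal sketch of a rudimentary product-data audit (hypothetical field names).

def gtin_check_digit_ok(gtin: str) -> bool:
    """Validate the GS1 check digit of a GTIN-8, -12, -13 or -14."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    digits = [int(c) for c in reversed(gtin)]
    # Counting from the right, the check digit has weight 1 and the
    # remaining digits alternate weights 3, 1, 3, ...
    total = sum(d * (3 if i % 2 == 1 else 1) for i, d in enumerate(digits))
    return total % 10 == 0


def audit_record(record: dict) -> list:
    """Return a list of human-readable problems found in one product record."""
    problems = []
    if not gtin_check_digit_ok(record.get("gtin", "")):
        problems.append("invalid or missing GTIN")
    for field in ("height_cm", "width_cm", "depth_cm", "weight_kg"):
        value = record.get(field)
        if value is None or value <= 0:
            problems.append("implausible %s: %r" % (field, value))
    if record.get("unit_price", 0) <= 0:
        problems.append("unit price must be positive")
    return problems


if __name__ == "__main__":
    sample = {"gtin": "4006381333931", "height_cm": 12.0, "width_cm": 6.5,
              "depth_cm": 0, "weight_kg": 0.3, "unit_price": 4.99}
    for problem in audit_record(sample):
        print("DATA ERROR:", problem)   # e.g. DATA ERROR: implausible depth_cm: 0
```

Even checks this simple, run as data enters your systems, can catch many errors before they propagate downstream into orders, invoices and shipments.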

Something to ponder as we prepare to celebrate Canada Day on July 1st and our neighbours in the US celebrate the 4th of July. Great holidays to all.


Click here to see VL’s Integration Options!

Want more information?

Visit our Website

Categories: logistics
