

Taking The 'Fear' Factor Out Of Data Quality By Duane Smith

Photo Credit - Flickr, LWPrencipe


This is a Guest post by Duane Smith, Data Quality & Data Governance Practitioner, Australia

Selling your data quality initiative on fear may deliver a short-term payback, but I believe it will ultimately fail in the longer term.

Data Quality is a value proposition and as such needs to be sold on the basis of its positive benefits, not the risks associated with its failings.

This might be tough in the current economic climate, as data quality is easily brushed aside as a nice-to-have when times get tough, or when outsourced IT projects promise to deliver it for pennies on the dollar.

However, the purists know that data quality is only achievable with consistent effort over time and a commitment to quality that goes beyond the data to the core of business ethics and culture, expressed through the day-to-day professionalism and respect that businesses espouse.

It is all too easy, for the sake of the short-term dollar or under shareholder pressure, to throw away solid principles and goals worth striving for. Anyone who has been involved in a project commissioned for a short-term data quality gain will understand what I am referring to: when the project ceases, the quality regresses back towards where the project started.

Furthermore, almost every data migration project undertaken to deliver a new system, with promises of improved business processes or positive change outcomes, has embraced the benefits of an altered future state based on system functionality, rather than an improved way of managing existing business processes with valuable lessons learned from the legacy of the past.

The primary legacy being those data quality issues that have accumulated over time through a lack of definition, understanding or commitment to information capture and its principles.

I can assure you that when it came to the crunch, and the project ran over time, as projects inevitably do, data quality was the first casualty.

Because it was too time-consuming to remedy, and the issues too overwhelming, data quality became the scapegoat for the project delay, when in real terms it should have been the first alarm bell that the project had been rushed or not well thought through.

In order to keep an even keel and meet the business users' expectations, which had been reduced to timing rather than value, the decision was made to cut quality from the proposition. The business's historical pain points became 'GO' / 'NO-GO' decision points against the sound, solid principles of improving data quality and resolving the issues of the past.

Just as people say about so many things today, those who come after will look back and wonder, 'what were they thinking?'

The data quality issues that exist within your information sources are pain points and should represent the proverbial red flags commonly required by senior executives, middle managers and business intelligence professionals.

Instead they are removed, diminished and in some cases 'deleted' in order to sweep the dust under the carpet, so to speak.

However, just like the criminal who is never really free from his guilt, neither is the business, manager or developer who decides to forge ahead without considering the mistakes of the past and how they may impact the future system or business user.

For the data quality professional these issues are like 'gold nuggets' just waiting to be turned into realised value for the business, and signposts pointing the way forward, towards the light of realised information value as an asset.

Maybe I am an idealist, but I still believe there is 'value' in 'value', even though with each generation and technology improvement there is a tendency to discard the knowledge gained in order to embrace the next big thing, even if it means throwing the baby out with the bathwater.

In this case the next big thing is 'big' data, the baby is data quality, and it has been thrown out many times over on the back of quick decisions made without deep consideration. Technology is not the answer to the problem here; it is just a useful tool to manage our processes.

Until we take stock and value 'quality', we will reside in the realm of the also-rans and never become the brilliant information managers and leaders we were always meant to be.

Data Quality is not a project; it is a way of doing business, just as using the best ingredients is essential when preparing anything worth selling.

What do you think? I welcome your views.


Duane Smith


About Duane Smith

Duane is a data management practitioner specialising in the areas of Data Governance, Data Quality, Business Intelligence, Data Analytics, Data Driven Marketing & CRM.

Visit Duane on LinkedIn