Interview With David Loshin, Author of “Master Data Management”

David Loshin is one of the leading authorities on Data Quality, Knowledge Management and, in particular, Master Data Management. David recently released a book titled “Master Data Management” and has kindly agreed to answer a few questions on the subject.


DQP: What was the inspiration for writing the master data management book?

DL: I have been working in the area of data quality for about 15 years and have been involved in a number of customer data integration and cleansing initiatives that never seemed to be finished.

Master data management appears to be different because, in my perception, it hinges upon some critical concepts that have been part and parcel of Knowledge Integrity’s approach since I started the company: focusing on business drivers for exploiting information, identifying the quality criteria that are important to achieving business goals, and carefully integrating technology when it is necessary, but not to the exclusion of common sense.

In essence, that is the message I wanted to convey in the book.

DQP: Why has MDM become more popular relatively recently?

DL: Actually, the answer to this question is associated with the answer to your next question as well. Once good ideas have been acronymized (is that a word?) into the three-letter form, we are already on the slippery slope of overhype.

MDM is popular because it is presented as the cure-all solution to all data problems in the organization. To some extent, having your master data organized properly and driving all business applications from a unified view of master data subject areas would address many of the recognized shortcomings in many organizations’ enterprise information infrastructures.

A wise technologist, though, will review what the business’s true requirements are for high-quality master data and design a solution that best fits those needs without placing an overwhelming drain on available resources.

DQP: Are there any common pitfalls organisations typically fall into regarding MDM?

DL: The one that concerns me the most is the expectation that as soon as data has been extracted from data sources and consolidated into a single repository, you are good to go.

Considering that every organization has legacy applications, many purchased from external vendors with their own proprietary data stores, there is going to be a significant level of effort to synchronize *every* business application with that master data repository.

Even if you could recode 30-year-old JCL and COBOL to access master data through your newly minted SOA layer, there are bound to be many hidden shortcuts in the legacy code and the underlying data upon which many folks depend to make the business run.

Not only that: even if you did choose that approach, after a few years of reengineering, the legacy applications still would not do anything more than they did in the first place. It would be reasonable to consider the total cost of ownership when determining an approach to MDM, and to identify where to move forward and where there is limited upside potential.

DQP: In the book you provide a lot of advice on how to implement data governance for MDM – is this message really getting through or are organisations still taking too much of a techno-centric approach?

DL: I did do an interesting paper on Data Governance that was released at the end of 2008 – check it out at http://www.beyeresearch.com/study/9251.

In fact, I am doing a lot of thinking about data governance these days.

What does governance really mean, though?

At what point does the need to govern the horizontal view of corporate information (and *all* information is corporate information) conflict with the vertical data requirements associated with operational activities? This is an area that I’ll be exploring in a few different forums over the next year.

DQP: What advice would you give on the phasing of some of the supporting elements of MDM, such as Data Governance and Data Quality? A lot of companies have neither in place, so is there a preferred sequence of events before companies embark on, say, the technology aspect of MDM?

DL: The first thing overall is to assess the business requirements for data, both for operational and transaction processing and for reporting and analytics.

Ultimately, the same data drives both activities, just in different contexts and with different kinds of collection, aggregation, and presentation.

This drives both data governance and data quality – once data requirements are identified, metrics can be defined, followed by measurement procedures and minimum thresholds for data acceptability.
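
To make that sequence concrete, here is a minimal sketch (not from the book, and purely illustrative) of how an identified data requirement might be expressed as a metric, a measurement procedure, and a minimum acceptability threshold. The field names and the thresholds are assumptions for the example:

```python
def completeness(records, field):
    """Measurement procedure: fraction of records in which `field` is populated."""
    if not records:
        return 0.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

# Each data requirement becomes a metric plus a minimum threshold for
# acceptability. The fields and thresholds here are hypothetical.
QUALITY_RULES = [
    {"field": "customer_id", "metric": completeness, "min_threshold": 1.00},
    {"field": "postal_code", "metric": completeness, "min_threshold": 0.95},
]

def assess(records):
    """Apply every rule and report whether the data meets its threshold."""
    results = []
    for rule in QUALITY_RULES:
        score = rule["metric"](records, rule["field"])
        results.append({
            "field": rule["field"],
            "score": round(score, 3),
            "acceptable": score >= rule["min_threshold"],
        })
    return results

if __name__ == "__main__":
    sample = [
        {"customer_id": "C001", "postal_code": "21044"},
        {"customer_id": "C002", "postal_code": ""},
    ]
    for result in assess(sample):
        print(result)
```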

DQP: What advice would you give to those who are looking to convince senior management of the benefits of MDM?

DL: The following points are important:

  1. Understand senior management’s business objectives

  2. Identify ways that those business objectives are not being met because of issues that would be resolved by MDM

  3. Assess the financial “costs” related to the way those objectives are not met

  4. Identify the key components of MDM that address those issues

  5. Consider at least three different alternative approaches to addressing those issues

  6. Calculate the total cost of ownership for each of the alternatives, including MDM (see the sketch after this list)

  7. Convince yourself that MDM is the most cost-effective approach

If you get this far and MDM is the best solution, then you might have a good chance of managing around any potential objections. Working through these steps, though, might also lead you to determine that MDM is not the best solution.
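
As a small illustration of steps 5 through 7, here is a sketch comparing the total cost of ownership of a few alternatives. The alternative names, the cost figures, and the five-year horizon are hypothetical; a real TCO model would also account for discounting, risk, and transition costs:

```python
# Hypothetical alternatives: (initial cost, annual cost) in dollars.
ALTERNATIVES = {
    "full MDM repository":        (500_000, 100_000),
    "registry-style MDM":         (200_000,  60_000),
    "point-to-point integration": ( 50_000, 150_000),
}

def total_cost_of_ownership(initial_cost, annual_cost, years=5):
    """TCO over a fixed planning horizon, ignoring discounting for simplicity."""
    return initial_cost + annual_cost * years

costs = {
    name: total_cost_of_ownership(initial, annual)
    for name, (initial, annual) in ALTERNATIVES.items()
}

# Step 7: check whether MDM really is the most cost-effective approach.
for name, tco in sorted(costs.items(), key=lambda item: item[1]):
    print(f"{name:28s} 5-year TCO: ${tco:,}")
print(f"Most cost-effective alternative: {min(costs, key=costs.get)}")
```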

DQP: In terms of technology, do you feel the market has sufficient maturity or do you feel there are further innovations required?

DL: The tools have evolved nicely, but the real issue is that pushing the purchase of technology without providing competent advice on how that technology solves customer problems is not good for any industry.

At this point I would advise any organization considering MDM to engage some independent consultants (i.e. ones that are not reselling technology) to validate the need and help assess the best alternatives for solutions.


David Loshin

David Loshin, president of Knowledge Integrity, is globally recognized as an expert in business intelligence, data quality, and master data management, frequently contributing to Intelligent Enterprise, DM Review, and the Data Administration Newsletter TDAN (http://www.tdan.com).

David’s expert channel at the Business Intelligence Network provides thought leadership and advice, and can be accessed at http://www.b-eye-network.com/loshin.

