Master Data Management Interview with Charles Blyth of CPP

Charles Blyth, MDM expert

As Master Data Management (MDM) is a key focus for Data Quality Pro, we routinely interview leading practitioners and experts in the field. This interview discusses a range of MDM challenges and techniques with expert Charles Blyth of CPP.

CPP have engaged in a major MDM initiative and Charles kindly agreed to answer a range of questions surrounding the program.

Topics included how to create an MDM business case, which techniques to adopt for gaining executive buy-in, what strategy to adopt for MDM architecture, how to sustain data governance, and which data quality issues the programme addressed.


Data Quality Pro: For the benefit of our readers, please can you describe your current role and responsibility?

Charles Blyth: I am the Head of Business Intelligence for the CPP Group. I look after the BI (including Data Warehousing), Data Management (including MDM), Business Performance and Customer Analytics teams.

Data Quality Pro: Can you tell us more about CPP? What type of business do they operate?

Charles Blyth: The CPP Group Plc (CPP) is an international marketing services business offering end-to-end bespoke customer management solutions to multi-sector business partners, designed to enhance their customer revenue, engagement and loyalty, whilst at the same time reducing cost, to deliver significantly improved profitability. This is underpinned by the delivery of a portfolio of complementary Life Assistance products, designed to help customers cope with the anxieties associated with the challenges and opportunities of everyday life.

Data Quality Pro: What were the key issues your Master Data Management (MDM) programme set out to address?

Charles Blyth: CPP initiated its Master Data Management Program to address three key issues that directly impact on the effectiveness of service, cross-sales, retention, collections and management information activity:

  • Difficulty in resolving inconsistent data from multiple sources (two policy administration systems, no single customer view).
  • Limited control over the creation and standardisation of new core business data, in particular customer contact details, Business Partner and Campaign descriptions.
  • No Data Governance procedures or clearly identified ownership for the enforcement of data standards across the Business.

Data Quality Pro: What was the business case for the initiative?

Charles Blyth: There is a clear understanding at an executive level within CPP that data is, and must continue to be, a strategic asset to our organisation. This goal provided a definitive approach to our programme. Prior to embarking on this initiative, our Customer Data Integration services were outsourced at significant cost, with little benefit. The requirement to bring in-house the vast majority of our marketing campaign and Business Partner data cleansing activities provided a net saving to the Business of approximately £500k per annum.

Data Quality Pro: Expanding on the 3 key issues you cited earlier, how did these manifest as deliverables?

Charles Blyth: Our Program focus and deliverables fall into four key inter-related areas:

  • Single Customer View: The customer view was delivered into production within 10 months. This incorporates policyholder data from both our policy systems to produce a ‘golden record’ for each customer containing the best data available from both sources (a simplified illustration of this consolidation appears at the end of this answer). As part of this process, key contact information is improved using the latest QAS address quality application to maximise postal deliverability, and landline, mobile number and email address formats are validated. This data source is now being used to support retention campaigns and provide input into data management queries on core policy system data. The capability to score and maintain propensity model scores for a range of applications, such as cross-sell, attrition and claims, is now being developed.
  • Single Policy View: Core policy information from our disparate policy systems has now been integrated into a single hub that is refreshed weekly. Phase 2 of the program aims to deliver the daily updates necessary to support the majority of policy-related management information and this database will form the basis for the new Datamarts within our Data Warehouse, providing uniform and consistent data on all policies. This will improve the speed, accuracy and overall consistency of daily, weekly and monthly MI, and make the investigation and resolution of collection issues more efficient.
  • Data Governance: Alongside the collation of policy data from both systems, we have also developed DataFlux processes to extract, analyse and cleanse core descriptive information (such as Business Partner, Campaign, Product, Payment Method, Policy Status, Cancellation Reason) using agreed Business rules. These processes will again be migrated to daily processing, allowing the immediate assessment, and correction if necessary, of new product/campaign/commission parameters, thus fixing problems at source that impact servicing, commissions and MI throughout the lifetime of a policy. The corresponding business processes to review and sign off changes to Business rules will be developed and in place by Q4 2009.
  • Data Management: Over 50% of existing data management processes have now been migrated into automated linear processes run within DataFlux. This has removed a significant number of opportunities for error, due to hand-offs between analysts and manual interventions, and has improved turnaround times for key processes such as fixing load errors on application files and campaign file creation. The most significant development has been in improving the quality of prospect data received for campaigns, thus improving early-life communication to retain policy purchasers and the data services provided to our Business Partners.

The plan is to have completed the migration of all UK data management activity to DataFlux by the end of 2009. Best practices established in the UK are now being shared with our International territories, either through adopting the software, as in Spain, or through programmes supported by the Centre of Excellence in the UK.
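To make the ‘golden record’ consolidation described above a little more concrete, here is a minimal Python sketch of one way records for the same customer from two policy systems might be merged, preferring the more recently updated source and falling back to the other where a field is blank. The field names, example values and recency rule are illustrative assumptions only, not CPP’s actual DataFlux matching and survivorship configuration.

```python
from datetime import date

def build_golden_record(rec_a: dict, rec_b: dict, fields: list) -> dict:
    """Merge two source records for the same customer into one 'golden record'."""
    # Prefer the more recently updated source; fall back to the other when a field is blank.
    newer, older = (rec_a, rec_b) if rec_a["last_updated"] >= rec_b["last_updated"] else (rec_b, rec_a)
    golden = {"customer_id": newer["customer_id"]}
    for field in fields:
        golden[field] = newer.get(field) or older.get(field)
    return golden

if __name__ == "__main__":
    # Hypothetical records for the same policyholder held in the two policy systems.
    policy_system_1 = {"customer_id": "C001", "last_updated": date(2009, 3, 1),
                       "postcode": "YO1 7HD", "mobile": "", "email": "j.smith@example.com"}
    policy_system_2 = {"customer_id": "C001", "last_updated": date(2009, 1, 15),
                       "postcode": "YO1 7HD", "mobile": "07700 900123", "email": ""}
    print(build_golden_record(policy_system_1, policy_system_2, ["postcode", "mobile", "email"]))
    # -> best-available postcode, mobile and email combined into a single record
```

In practice the matching step that pairs records across systems (typically fuzzy matching on name and address) and the per-field survivorship rules are considerably more sophisticated than this; the sketch only shows the general shape of the consolidation.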

Data Quality Pro: Was there a need to integrate external reference data in order to augment or improve your existing information?

Charles Blyth: Yes, as mentioned previously, we have already integrated QAS into our Data Quality processes to maximise postal deliverability and to validate landline, mobile number and email address formats.

To add to this, we are currently delivering the capability to integrate additional 3rd-party demographic data to augment our customer data and drive an enhanced capability to score and maintain propensity model scores for a range of applications, such as cross-sell, attrition and claims.
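Charles mentions that landline, mobile number and email address formats are validated as part of these processes. As a rough illustration of that kind of rule-based format check, the Python sketch below applies simplified, UK-centric patterns; the patterns and field names are assumptions for illustration and are far less thorough than the rules QAS or CPP’s production processes would actually apply.

```python
import re

# Simplified format checks; illustrative assumptions only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
UK_MOBILE_RE = re.compile(r"^(?:\+44|0)7\d{9}$")
UK_LANDLINE_RE = re.compile(r"^(?:\+44|0)[12]\d{8,9}$")

def strip_formatting(number: str) -> str:
    """Remove spaces, hyphens and brackets before pattern matching."""
    return re.sub(r"[\s\-()]", "", number or "")

def validate_contact(record: dict) -> dict:
    """Return a flag per contact field indicating whether its format looks valid."""
    return {
        "email_ok": bool(EMAIL_RE.match(record.get("email", ""))),
        "mobile_ok": bool(UK_MOBILE_RE.match(strip_formatting(record.get("mobile", "")))),
        "landline_ok": bool(UK_LANDLINE_RE.match(strip_formatting(record.get("landline", "")))),
    }

print(validate_contact({"email": "j.smith@example.com",
                        "mobile": "07700 900123",
                        "landline": "01904 544500"}))
# -> {'email_ok': True, 'mobile_ok': True, 'landline_ok': True}
```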

Data Quality Pro: You are obviously streamlining a lot of data management processes within this programme; has the MDM strategy led to any additional cost savings by making legacy systems redundant?

Charles Blyth: As yet we have not applied a monetary value to the efficiency savings that we have achieved through improved Data Management and Data Quality processes. They are there, and it is something that we will review at the end of the current phase of the project.

Data Quality Pro: What were the data quality issues faced by the organisation prior to this programme commencing?

Charles Blyth: CPP is a very dynamic organisation, and to enable this our system capabilities need to be agile; as a consequence they are bespoke developments based on disparate data sources. With these disparate data sources, you can imagine that we get the traditional issues of:

  • Data inconsistencies across these multiple sources
  • Scaled-down control over the creation and standardisation of new Business Data
  • A dynamic data environment that is constantly in flux, with the need to respond to ever-challenging business needs

A pretty unique issue we have is the volume of Business Partner data that we deal with on a daily basis. With over 200 business partners, these data requirements and constructs vary dramatically, and all of this data needs to be standardised and provided for use by our CRM and various other applications.

Data Quality Pro: For the benefit of other organisations about to embark on a similar roadmap, how long has it taken you to deliver the various programme elements to date?

Charles Blyth: The initial phase of the programme was very aggressive; we needed to cancel our outsourcing contract in October 2008. Having completed the proof of concept at the end of July, we had three months to get to a point where we were confident that the capabilities were there and we were on the right track. We did not renew our outsourcing contract and had delivered our Customer Hub in full by the end of April 2009. I am now regularly quoted as saying ‘Zero to Customer Hub in 10 months can’t be beaten!’

Data Quality Pro: How has the business community responded to the programme? What benefits are they witnessing on the ground?

Charles Blyth: I remember thinking that, having been in the BI industry for over 12 years and having delivered my fair share of enterprise Data Warehouses and Data Quality solutions, surely this would just be another one, with the standard issues that you experience on those types of project. Data Warehousing projects are never easy; however, we experienced a whole new set of challenges with this project. I link the challenges we had more to those faced in deploying a front office application than to a back office Data Integration project. You need to look at MDM deployments as a hybrid, as taking Data Quality and Integration to a new level.

Nothing is fun unless it is challenging; this programme is fun!

Data Quality Pro: Can we understand some more about the Data Governance element of the project? How is that being delivered, given that governance obviously requires an ongoing solution?

Charles Blyth: Even with all the success we have had so far, true data governance is still in its infancy at CPP. We have a solid foundation built around a central Data Management process, with the green shoots of maturity sprouting up, in a controlled manner, as we start to push data governance back into the business.

The first iteration of our Data Governance processes has been delivered and has been ‘piloted’ through phase one of the programme. Phase 2 is bound to provide additional elements to the processes, with the galvanising of our support processes.

We are evangelising the need for Data Governance and understanding of Data Quality at the sharp end of our business, at every contact we have with a customer, prospect or business partner. The need to monitor and police this as we mature is clear and the Centre of Excellence in our Data Management team at Group Head Office is there to ensure this is done and all new processes and procedures are adhered to as they are rolled out on an iterative basis.

Phase 3 of the program will take us to ‘Utopia’ with the provision of MDM as a service into our SOA environment, further driving Data Governance back into the business.

Data Quality Pro: On the technical side, how have you implemented the MDM framework and associated hub/synchronisation strategy?

Charles Blyth: The technical aspects of the implementation could not be viewed on their own, resulting in us adopting an holistic approach, also taking into account people, business culture and business processes.

The initial implementation timescales were very aggressive, so making sure the relevant departments and stakeholders involved understood the project’s objectives was key. Interdepartmental teams worked well in advance of deadlines and kept stakeholders informed of issues to ensure that when problems did occur, impacts were understood and resolved immediately.

Through this the MDM framework was designed and implemented with business stakeholder input throughout the process; the same stakeholders were also involved in technical workshops so as to facilitate understanding of not just the final outputs, but also the core processes delivering those outputs. Existing IT solutions, such as Oracle Warehouse Builder, were integrated with the final solution to ensure our investment in Group technology and capability was leveraged.

Our strategy to integrate with our SOA environment resulted in us taking the Transaction Hub approach to our architecture. Hub synchronisation was a major challenge throughout. A uni-directional approach from source system, to hub, to downstream targets reduced complexity and meant bi-directional conflicts could be avoided. Batch processing, which consumes data from multiple source systems, required new and complex business rules to be created, where individual records and/or fields could be ‘cherry picked’ for population into the qMDM hub.

A major piece of the synchronisation jigsaw involved testing, testing and more testing; large volumes of data were analysed and revisions to the synchronisation rules applied. To quote Andy Hazelwood, Data Architect at CPP: ‘Our approach during the MDM implementation meant we had key IT and business stakeholders on board from the beginning, which enabled better and quicker decision making throughout the technical design and implementation process. Our ongoing implementation continues in this vein and builds on the existing infrastructure and processes to provide improved platform stability and downstream analytics capabilities.’

Without the buy-in from these key stakeholders we would not be where we are today, and our goal to reach ‘Utopia’ would not be feasible.
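As an illustration of the field-level ‘cherry picking’ Charles describes, the Python sketch below expresses hypothetical per-field source-precedence rules applied when batch-loading a hub, with data flowing one way from source systems to the hub and on to downstream targets. The source names, fields and precedence order are assumptions made up for this example, not the actual qMDM configuration.

```python
# Hypothetical per-field source-precedence ("cherry pick") rules for a
# uni-directional batch load: source systems -> hub -> downstream targets.
FIELD_PRECEDENCE = {
    "policy_status":  ["policy_system_1", "policy_system_2"],
    "payment_method": ["policy_system_2", "policy_system_1"],
    "campaign_code":  ["crm", "policy_system_1", "policy_system_2"],
}

def cherry_pick(batches: dict, policy_id: str) -> dict:
    """Populate one hub record, taking each field from the most trusted source that supplies it."""
    hub_record = {"policy_id": policy_id}
    for field, sources in FIELD_PRECEDENCE.items():
        for source in sources:
            value = batches.get(source, {}).get(field)
            if value not in (None, ""):
                hub_record[field] = value  # first trusted, non-empty value wins
                break
    return hub_record

batches = {
    "policy_system_1": {"policy_status": "ACTIVE", "campaign_code": ""},
    "policy_system_2": {"policy_status": "LAPSED", "payment_method": "DD"},
    "crm":             {"campaign_code": "CMP-2009-07"},
}
print(cherry_pick(batches, "P12345"))
# -> {'policy_id': 'P12345', 'policy_status': 'ACTIVE', 'payment_method': 'DD', 'campaign_code': 'CMP-2009-07'}
```

Because records only ever flow from the sources into the hub and onward, conflicting updates never need to be written back to the source systems, which is the complexity the uni-directional approach avoids.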

Data Quality Pro: Do you have further plans for wider MDM or data quality implementations within CPP?

Charles Blyth: We have started to roll out Data Quality capabilities to my team in Spain, which are already driving significant efficiencies, and through our central team we are working with our territories in Turkey and Germany on further Data Quality initiatives. The view is to focus on our core deliverables and data and work with the territories as required.