How to Create a Data Quality Competency Center: Expert Interview With John Schmidt


John Schmidt came to our attention through his considerable experience in launching competency centres - resources that are extremely useful for raising data quality awareness and driving improvement across the organisation.

In this interview we speak with John to learn techniques that will help your organisation develop a world-class data quality competency center.


Data Quality Pro: In your experience John, what steps are required to launch a competency centre?

John Schmidt: There are only two prerequisites for launching a competency center:

  • Support from a senior executive

  • Committed practice leader

Ideally the support should come from more than one executive, at a very senior level such as the CXO, and it should be “active” support. You might get by with less, but it gets harder and harder to drive the investments and necessary changes as you water down the top-level support.

The second prerequisite is a committed practice leader. The competency center leader doesn’t need to be an expert in all areas on day one, but he or she does need the capability to become an expert and the determination to learn through personal effort and commitment. Ideally the leader is also entrepreneurial, has a thick skin, is customer oriented, and is a change agent.

Once the two prerequisites are met, there are many paths the competency center can take. It can be launched quickly over a 3-4 month period as part of a strategic initiative, or it can evolve gradually over several years. Regardless of the path, my advice is to have a plan. In other words, have a clear idea of your target state from an organizational, process, policy, and technology perspective, and define a specific series of milestones to achieve the target on each of those dimensions.

Data Quality Pro: Where would a DQCC typically reside in terms of reporting hierarchy?

John Schmidt: There are some advantages if the DQCC reports to a business function rather than the IT function, but either can work. The reason a DQCC may report to the business is to emphasise business ownership of application data – that is, the business understands it has a major role to play in data governance and needs to partner with IT to deliver the service across the enterprise.

Another important consideration is the “level” of reporting relationship.

A DQCC is most effective when it supports multiple business functions. After all, many of the really big opportunities in a large organization come from bringing together data from multiple sources to realize new insights.

If the DQCC sits too low in the organization, it will be more challenging to integrate disparate groups. And if it is aligned with one business function – even if its charter is enterprise-wide – it may be perceived as having a bias or partisan interests aligned with its reporting structure. Ideally the end game, or larger vision, is to have data quality services enterprise-wide, with the DQCC owning all the data quality rules that govern data across end-to-end processes.

Data Quality Pro: As many organisations are still immature with regards to their data quality strategy, what advice can you give to help win over the sceptics of CCs?

John Schmidt: Build a business case. DQ business cases are not easy, since they often deal with “soft” issues and require cross-functional agreement on expected outcomes. Furthermore, company incentives sometimes work against data quality.

For example, the benefits of one department cleaning its data may accrue to another department – so what is the incentive for the group that owns the data to spend the effort, and take the risk, to do the cleanup when someone else will get the credit? Figuring out exactly where the benefits will come from and how to measure them is incredibly hard. But if you get over this hurdle, the hard work of building support and gaining agreement will be done, and executing a successful project will be relatively easy.

There are many ROI tools freely available on the web and from vendors. These tools track the hard and soft benefits and cost savings of data quality across both horizontal business processes and sector-specific processes. They can be used for ideas and to kick-start your own ROI thinking.
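The shape of such an ROI calculation can be sketched in a few lines. The sketch below is purely illustrative – the function, its parameters, and every figure in the example are hypothetical placeholders, not values from any real tool; substitute your own measured costs and error rates.

```python
# Minimal sketch of a first-year data quality ROI estimate.
# All names and numbers are hypothetical placeholders.

def dq_roi(records_affected, error_rate, cost_per_error,
           cleanup_cost, annual_maintenance):
    """Return first-year ROI of a cleanup initiative as a ratio."""
    # Hard benefit: cost of defects avoided once the data is cleaned.
    annual_benefit = records_affected * error_rate * cost_per_error
    total_cost = cleanup_cost + annual_maintenance
    return (annual_benefit - total_cost) / total_cost

# Example: 500,000 customer records, 4% carrying a costly defect,
# $25 average cost per defect (returned mail, rework, failed deliveries).
roi = dq_roi(records_affected=500_000, error_rate=0.04,
             cost_per_error=25.0, cleanup_cost=150_000,
             annual_maintenance=50_000)
print(f"First-year ROI: {roi:.0%}")  # prints "First-year ROI: 150%"
```

Even a toy model like this forces the cross-functional conversation Schmidt describes: every input must be agreed with the department that feels the cost or reaps the benefit.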

Data Quality Pro: Is there a difference between a Centre of Excellence and a Competency Centre?

John Schmidt: Yes and no.

The problem with both of these terms is that they are used relatively loosely, so depending on your definition they are either the same thing or something different. In fact, I’ve seen competency centers with all kinds of interesting names.

For me, what matters more than the name is the level of accountability the group assumes. In my view, competency centers have a clear idea of who their internal customers are and what value proposition they are offering, and they take their promises to customers very seriously. A good CC doesn’t wait for a customer complaint – it holds itself accountable for delivering quality every day and continuously improves its delivery capability.

The term COE is often used for teams of technical experts in a particular technology or subject area, with less emphasis on operating like an internal services business. But as I said, I’ve seen COEs with a charter and operating model similar to my description of competency centers.

Data Quality Pro: What is a typical organisational structure within a competency centre?

John Schmidt: There are as many variations as there are organizations. That said, there are a few common critical success factors.

First, you need someone who serves as the CC’s “salesperson”. This is a person with excellent interpersonal skills who can build relationships with different project teams and functional groups. It is also the person who plays the key role in setting expectations for a given project – for example, writing the statement of work for what the CC will do for a given client.

Another common variation is a matrix structure within the CC: one group of individuals or teams focused on technical expertise, and another group focused on business functional areas.

The functionally focused staff are the ones that develop deep business expertise, while the technically focused staff develop deep expertise in areas that can be leveraged across functions. A world class CC needs to be good at both dimensions.

Data Quality Pro: In a post some months ago you mentioned that you had yet to see a data-quality-specific CC. Why do you think data quality CCs are so rare?

John Schmidt: Data quality issues are some of the hardest challenges to tackle in a company. I’ve already mentioned several reasons, but here is another one. Quite often, data quality problems only appear at the enterprise level, not in the department or group that is responsible for the data.

For example, the data that the call center staff works with might look just fine to them as does the data that the field sales organization works with. But when you try to combine the two domains, you discover that both groups have developed their own separate, and incompatible, conventions for documenting relationships and hierarchies between customers.
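The call-centre-versus-field-sales example can be made concrete with a small sketch. Everything here is invented for illustration – the account numbers, company names, and field names are hypothetical, chosen only to show how two locally sensible conventions fail to join:

```python
# Illustrative sketch (all data hypothetical): two departments record
# the same customer relationships under incompatible conventions.

# Call centre: parent/child pointers keyed by account number.
call_center = {
    "ACCT-1001": {"parent": None,        "name": "Acme Corp"},
    "ACCT-1002": {"parent": "ACCT-1001", "name": "Acme West"},
}

# Field sales: a flat "ultimate parent" label keyed by company name.
field_sales = {
    "Acme Corporation":  {"ultimate_parent": "Acme Corporation"},
    "Acme West Region":  {"ultimate_parent": "Acme Corporation"},
}

# Each dataset looks fine on its own. Combining them fails on two
# counts: different identifiers (account numbers vs free-text names)
# and different hierarchy models (parent pointers vs parent labels).
call_center_names = {v["name"] for v in call_center.values()}
sales_names = set(field_sales)
print("Names matched:", call_center_names & sales_names)  # empty set
```

Neither department has "bad" data by its own standards; the defect only materialises at the enterprise level, exactly as Schmidt describes.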

Often DQCCs emerge initially from within an Integration Competency Center (ICC) or Business Intelligence Competency Center (BICC). Data stewards are identified to manage cross-enterprise, data-centric business processes, and are seen as the guardians of the data – responsible for data quality metrics, data quality reporting, data quality targets, and exception handling.
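The steward's routine of metrics, targets, and exception handling can be sketched as a simple loop. This is a hedged illustration only – the completeness rule, the 95% target, and the sample records are assumptions made up for the example, not part of any methodology named in the interview:

```python
# Minimal sketch of the steward's loop: measure a data quality metric,
# compare it to a target, and route failures for exception handling.
# Field names, records, and the 95% target are all hypothetical.

records = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": None,            "country": "US"},
    {"id": 3, "email": "c@example.com", "country": None},
]

def completeness(rows, field):
    """Share of rows where `field` is populated."""
    return sum(1 for r in rows if r[field]) / len(rows)

TARGET = 0.95
exceptions = []
for field in ("email", "country"):
    score = completeness(records, field)
    if score < TARGET:
        # Exception handling: flag the failing metric for a steward.
        exceptions.append((field, score))

print(exceptions)  # both fields fall below target at ~0.67
```

Real stewardship tooling adds scheduling, reporting, and workflow on top, but the core contract is this loop: a measurable rule, an agreed target, and a named owner for every exception.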

I wrote a paper about six years ago on the five laws of integration. These laws are facts or realities that you cannot fight or fix – rather, you need to accept them as facts of life and adopt methodologies that minimize their negative impacts and leverage their benefits. One of the laws is “information adapts to meet local needs”. This is the problem that DQCCs face every day.

That said, the challenges are not insurmountable. The practices that I have developed, now available through Informatica’s Velocity methodology, codify the lessons and provide a prescription for how to tackle the problems. Five years from now, DQCCs will be so common that no one will think to ask this question.

Data Quality Pro: Recently, you blogged about Avaya and the success of their DQ Centre of Excellence. What were some of the key lessons learnt from their experiences of launching a DQ CoE?

John Schmidt: Speak the language of your customer. In the case of the DQCC, the customers are business leaders, so it is critical to speak the language of their function and to translate technical activities into the universal business language – money. As IT professionals we need to spend less time talking about “how” we do our work and more time talking about “what” the outcomes are in business terms. In fact, Richard Trapp, the director of the DQCC at Avaya, put it this way:

Business people are not excited about good data quality. Business people are excited about good marketing, increased sales, supply chain efficiencies, and reduced order-to-cash cycles.

Richard Trapp, director of Data Quality Competency Center, Avaya

The reality in today’s world is that integration practitioners need to be equipped with business skills. They need to be able to perform financial analysis and translate technical solutions into the business language of dollars and cents.

Data Quality Pro: What are the pros and cons of a central structure as opposed to localised efforts?

John Schmidt: This is a simple question with a complex answer, since so much depends on the circumstances, company context, and organizational culture.

That said, the main difference is that centralized structures provide greater control, improved governance, and are able to implement changes more rapidly.

On the other hand, federated structures are often more acceptable politically and may be faster at implementing specific solutions, since the lines of communication in a distributed organization are shorter. Centralized structures offer economies of scale, but they can also introduce diseconomies if not implemented properly. So it depends.

Data Quality Pro: How is staffing typically managed in a CC, for example are resources dedicated to the CC or are staff seconded on a per-project basis?

John Schmidt: It depends once again on the structure of the CC. Regardless of how you do it, the main advice I have is to make the relationships and responsibilities explicit. If staff are seconded on a per-project basis, then the onus is on the CC to have a direct discussion with the supervisor of the assigned staff member to ensure that the meaning of the “dotted line” relationship is clear.

Data Quality Pro: If an organisation was undertaking a readiness assessment for a CC, what pointers would they look for?

John Schmidt: I partially answered this in your first question. The other element I would suggest is to assess the maturity of four groups in the organization:

  1. Enterprise architecture

  2. Program management

  3. Change management

  4. Financial management

These functions are, in addition to the competency center, the cornerstones of an integration strategy. At Informatica we have a detailed methodology for evaluating the maturity of each of these domains.

Data Quality Pro: Based on the type of Integration Competency Centre (ICC) concept that Informatica promotes, do you think a similar model could be viable for a DQCC?

John Schmidt: Absolutely, yes. 

The ICC concept is about management practices – not technology issues. The management practices are just as applicable to a DQCC as they are to an Integration Competency Center, a Business Intelligence CC, an SOA COE, a B2B Supply Chain CC, or any other integration function.

From a management perspective, the CC’s practices are very interchangeable across these different integration domains.


John Schmidt

John Schmidt is vice president of Global Integration Services at Informatica Corporation where he advises clients on the business potential of emerging technologies and directs the company’s Integration Competency Center Practice.

Previous employers include Wells Fargo Bank, Bank of America, Best Buy, American Management Systems, and Digital Equipment Corporation.

John has written hundreds of articles on Systems Integration, Enterprise Architecture, and Program Management, is a frequent speaker at industry conferences, and served as Director and Chairman of the Integration Consortium from 2002-2009.

He wrote the first book about ICCs, Integration Competency Center: An Implementation Methodology (2005), and followed it up in 2010 with Lean Integration: An Integration Factory Approach to Business Agility.

For further information visit: http://www.integrationfactory.com.
