Data Quality and Business Rules Explained: Expert Interview with Ronald G. Ross

What is a business rule? Why are they so important to data quality management? How can we use business rules to focus our data quality efforts?

In this interview, Ronald G. Ross, the “father of business rules”, provides the answers.

Ronald serves as Executive Editor of www.BRCommunity.com and its flagship publication, Business Rules Journal. He is also a Co-Chair of the Business Rules Forum Conference.

Ronald is also the author of “Business Rule Concepts: Getting to the Point of Knowledge”, which has recently been updated in a third edition and is widely regarded as the definitive handbook on the topic.

(Note: There is a summary of key points at the end of this interview)

Data Quality Pro: What do business rules have to do with data quality?

Ronald G. Ross: First, some background. It’s not simply the quality and timeliness of data that matters; it’s what you can do with it that really counts. It’s all about showing business people more value from having higher-quality, more up-to-date data. When you begin to look at it that way, you’ll want to go beyond the usual approaches people talk about.

Specifically, business rules relate to data quality in at least two fundamental ways. First, they can automate the decisions the company makes in its day-to-day operations. Second, they can be used to audit data produced by existing processes for compliance with external regulations as well as internal business policies and goals. In both cases, business rules and rule technologies offer exciting new ways to achieve breakthrough improvements in data quality.

Data Quality Pro: Can you provide some examples?

Ronald G. Ross: Sure. A few years back, the marketing department at a large, multi-channel retailer with credit and loyalty programs was about to bring an expensive new electronic gadget to market. Think iPhone. They wanted to do targeted marketing, so they did some data mining and singled out the under-25 age bracket as likely targets. Think college-age daughter, like mine. They assumed that persons under 25 were allowed to sign up for the various credit and marketing programs only along with high-income parents. They got the go-ahead on some pricey promotion. What they didn’t know was that the under-25 rule was frequently waived for a person who was head of household or in the military. Those people have less disposable income and are generally less inclined to buy expensive, trendy new stuff. In other words, the population represented by the data was not what they thought it was. The campaign didn’t do nearly as well as they had hoped. The lesson here is that what you don’t know about the business rules for your data can and will hurt you. What good is the data if you don’t know the rules?!

Here’s another example. A medium-sized, Midwestern insurer undertook a project to integrate several business channels, one of them recently acquired. A long-standing goal was to reduce costly manual interventions on claims, which at the time were running at a rate of 45%. They re-engineered and re-platformed their claims processing using a business rule management system (BRMS). On the first day of the launch, they were shocked to see the rate of manual interventions jump to 55%. Looking into the data, they quickly found that many company types were “other”. The company type “LLC” was also missing in one of the older systems. The good news was that they could quickly deploy new rules under the BRMS to handle company types appropriately. By the very next day, the rate had dropped to 36%. Now that had real value to the business people. Think of it as near real-time data quality.

Data Quality Pro: How do BRMS support such capabilities?

Ronald G. Ross: In traditional implementations, it’s very hard to get at the business rules. They’re hidden from view – really, the implementations are black-box with respect to the rules. BRMS-based solutions, in contrast, are white-box. It’s far easier to get your hands directly on the actual rules.

Data Quality Pro: You seem to be saying that in some ways you can think of a BRMS as a data quality tool, right?

Ronald G. Ross: That’s not far off the mark at all. White-box logic lets you identify and address data problems quickly.

Data Quality Pro: What is your definition of a business rule?

Ronald G. Ross: We define a business rule as a criterion used to make a decision in the day-to-day operation of the business. Some people think of business rules as loosely formed, very general requirements. That is not the case at all. Business rules have definite form and are very specific.

Data Quality Pro: Can you give some examples?

Ronald G. Ross: Sure, here are a few simple ones. A customer that has ordered a product must have an assigned agent. The sales tax for a purchase must be 6.25% if the purchase is made in Texas. A customer may be considered preferred only if the customer has placed more than $10,000 worth of orders during the most recent calendar year.

Each example gives well-formed guidance focused on making some specific decision. Each uses terms and facts about business things, which should be well defined. Each is declarative, rather than procedural. It’s these latter characteristics that make business rules very data-friendly.
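To make the declarative flavor concrete, rules like the examples above could be sketched as named predicates over a record. This is a minimal illustration with hypothetical field and rule names, not the syntax of any particular BRMS:

```python
# A minimal sketch (hypothetical names): each business rule is a named,
# declarative predicate over a customer record, kept as data rather than
# buried inside procedural code.

def violations(customer, rules):
    """Return the names of the rules this customer record fails."""
    return [name for name, check in rules.items() if not check(customer)]

rules = {
    # A customer that has ordered a product must have an assigned agent.
    "assigned-agent": lambda c: not c["has_order"] or c["agent"] is not None,
    # A customer may be considered preferred only if orders in the most
    # recent calendar year exceeded $10,000.
    "preferred-threshold": lambda c: not c["preferred"]
        or c["orders_last_year"] > 10_000,
}

customer = {"has_order": True, "agent": None,
            "preferred": True, "orders_last_year": 12_500}
print(violations(customer, rules))  # → ['assigned-agent']
```

Because each rule states *what* must hold rather than *how* to check it procedurally, the same rule set can drive automation, validation, or auditing without rewriting application code.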

You should think of your company’s business rules as a resource that needs to be managed, just as many professionals have come to believe that data is a resource and should be managed. So we encourage people to think in terms of rule management.

Data Quality Pro: Companies have lots of initiatives these days to manage, improve or exploit their data. How can additional initiatives be justified?

Ronald G. Ross: It’s true there’s a plethora of ideas out there. We hear about active data warehouses, business intelligence, enterprise information integration, customer data integration, master data management, canonical data models … the list goes on and on. But what we find is that many professionals are having trouble explaining the ROI of these initiatives to their management. Management is starting to ask some very hard questions along those lines. Some companies, frankly, don’t seem to have a whole lot to show for their efforts.

To be clear, yes, of course many of these kinds of initiatives do have potential and some do pay off handsomely. There is obviously tremendous value in data and the ability to analyze it adeptly. But the real trick is turning insight into action.

Data Quality Pro: Can you expand on that last point?

Ronald G. Ross: Let’s suppose you do have high-quality data, and you do have tools to analyze it. Suppose you discover that a 5% price reduction in the month of January drives up multi-year retention rates by 27%. That’s great, but what good does just knowing it do you? Or you discover that every 5% increase in prices in your least profitable line of business results in a 3% loss of best customers in the most profitable one. That’s important, but what good is it if you can’t act on it? To show real value, data professionals need to make a difference operationally.

Data Quality Pro: How do you suggest they do that?

Ronald G. Ross: I believe the most important step, and this is where we go beyond the usual approaches people talk about, is to view data quality as just one part of the decisioning dimension of the organization. In essence, you want to understand the value of data quality in the context of being able to make high-quality decisions in the day-to-day operations of the business.

Data Quality Pro: How would you go about evaluating the quality of the decisions that good data enables?

Ronald G. Ross: As suggested by James Taylor and others, there are five basic aspects of decisions that you would want to evaluate: precision, cost, speed, agility, and consistency. Agility, by the way, brings us back to the earlier point about white-box logic – how long it takes to change the rules.

Many companies focus simply on what’s called ‘capture latency’ – how long after a business event is data available for analysis – and perhaps on ‘analysis latency’ – the time it takes to analyze the data. But the most important factor these days is ‘decision latency’ – the time it takes to decide how to act in response to analysis and then do it. There have been a number of good articles written about this.

Data Quality Pro: Capture, analyze, act – that sounds like some life cycle is involved?

Ronald G. Ross: Exactly, the life cycle of decisions. Actually, I would describe the life cycle as having four major phases:

  1. Perform business activities

  2. Capture the data

  3. Analyze the data

  4. Deploy changes to the decision logic back into business activities

Many data professionals concentrate only on the capture and analyze phases. But the real value to the company lies in the life cycle as a whole. As I suggested earlier, what good does insight based on high-quality data do you if you can’t readily act on it? By ‘act’ I mean rolling out modified business rules into the day-to-day operations of the business.

By the way, this is where many professionals focusing on business process re-engineering miss the boat. They fail to understand that if you want true business agility, what you really need is high-quality data and effective decision management.

Data Quality Pro: How do you deploy changes to the decision logic quickly?

Ronald G. Ross: The business rule message is: don’t hard-code decision logic. Let a BRMS handle that. It’s plumbing – infrastructure. If your organization doesn’t have a BRMS already, it’s time to look into it. I chair the annual Business Rules Forum Conference in November, and every year we have great case studies illustrating what people have accomplished.

Now let me go a step farther. I believe you should be spending no less time and resources on this decision deployment infrastructure than you are spending on your data warehouse and BI infrastructure. These are actually pieces of the very same puzzle.
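As a toy illustration of the “don’t hard-code” point (hypothetical names, with a JSON document standing in for a BRMS rule repository), the Texas sales-tax rule mentioned earlier could live as data rather than code, so changing the rate is a data update rather than a redeployment:

```python
import json

# Decision logic kept as data; in a real deployment this document would
# come from the BRMS rule repository rather than an inline string.
rule_doc = json.loads('{"sales_tax": {"TX": 0.0625, "default": 0.0}}')

def sales_tax_rate(state, rules):
    """Look up the applicable rate for a state; fall back to the default."""
    table = rules["sales_tax"]
    return table.get(state, table["default"])

print(sales_tax_rate("TX", rule_doc))  # → 0.0625
print(sales_tax_rate("OH", rule_doc))  # → 0.0
```

The design point is the separation: the application code never embeds the 6.25% figure, so a rate change (or a new state) touches only the rule data.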

Data Quality Pro: Does decisioning offer more direct help to data quality professionals in planning where they should focus?

Ronald G. Ross: Yes. I encourage data quality professionals to identify specific, well-defined operational business decisions, then target the data that directly supports those decisions. It’s a great way to align squarely with the business.

Data Quality Pro: How do you identify high-potential candidate decisions?

Ronald G. Ross: You want ones meaningful to business people. They should be operational rather than strategic, high-volume rather than occasional, deterministic rather than fuzzy, and of low-to-moderate rather than high complexity.

Data Quality Pro: Can you give some examples?

Ronald G. Ross: Sure, here are some off the top of my head:

  • How do we price our product for this particular transaction?

  • What credit do we give to this customer at this point in time?

  • What resource do we assign to this task right now?

  • Do we suspect fraud on this particular transaction?

  • What’s the best cross-sell or up-sell offering for this sale?

  • Do we anticipate any delay on this shipment?

These all represent operational decisions in day-to-day business activities. The value of good decisions, and therefore the data on which they are based, will add up very quickly. 

By the way, I wrote a piece summarizing the kinds of ROI you can explore for decisions in my September 2007 column on http://www.brcommunity.com.

Data Quality Pro: At the start of the interview, you mentioned that business rules might be used to audit data produced by existing processes for compliance with regulatory criteria and internal business policies and goals. How would that work?

Ronald G. Ross: Suppose you are an insurance company and you have adjudicators handling claims. Whenever people are in the loop, the quality of decision-making may vary because of the complexity of the business rules, the rate at which the rules are revised, training, supervision, motivation, mood, memory, and many other factors. You might want to examine how compliant and correct adjudication decisions are over some timeframe.

To do that, you would pick some meaningful timeframe – say the past year – and some appropriate subset of all claims processed during that timeframe. You should focus on cases of only low to medium complexity. Then encode some of the most obvious or frequently misapplied business rules using a BRMS. You’ll get a pretty good assessment of current practices and irregularities that way. I’ve heard of organizations that have even used this approach for actual audits reviewed by external regulators.

One thing BRMS do extremely well is produce decision audit trails. That’s a capability most BI tools can’t begin to match. For each claim identified as irregular, you can readily determine which particular rules produced the given result. The good news is that those rules will be in readable, business-friendly form! Then, using BI tools, you can slice and dice any way you want – for example, by type of claim, region, category of customer, etc.

A step beyond that is to do a what-if delta. In this approach, you actually change some rules for the purpose of retroactive analysis. In an insurance business, for example, it often costs more to go through all the motions of processing simple claims than to simply approve them and pay out. We might raise the threshold for automatic claim approval from $300 to $500, then re-run claims from the past year and compare results. How much money could have been saved? The key is having the rules in an understandable form, easily changed and traced.
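The what-if delta could be sketched roughly as follows. The claim amounts and the per-claim manual handling cost are illustrative assumptions, not figures from the interview:

```python
# What-if delta sketch: raise the auto-approval threshold, re-run last
# year's claims, and compare how many would have skipped manual handling.

def auto_approved(claims, threshold):
    """Claims that would be approved automatically under a given threshold."""
    return [c for c in claims if c["amount"] <= threshold]

# Illustrative claim history; $40 is an assumed per-claim manual handling cost.
claims = [{"amount": a} for a in (120, 280, 350, 420, 480, 650, 900)]
MANUAL_COST = 40

before = auto_approved(claims, 300)  # current rule: approve up to $300
after = auto_approved(claims, 500)   # candidate rule: approve up to $500

extra = len(after) - len(before)
print(f"{extra} more claims auto-approved, "
      f"saving ${extra * MANUAL_COST} in manual handling")
```

A real delta analysis would also compare payout totals under each threshold, since approving more claims automatically changes what gets paid out, not just what it costs to process.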

Data Quality Pro: Is that how you see the technological future of data analysis shaping up?

Ronald G. Ross: Absolutely. From what I hear, it’s not exaggerating much to say that nano-fast data access on an unprecedented scale is right around the corner. You can use BI tools all you want, but unless you can do two things, the speed won’t really matter. First, you need to be able to trace which business rules were used in which simulations to make which decisions for specific cases, and to read what those business rules say in close-to-plain English. Then you want to go full-loop and deploy modified rules into actual business operations as quickly as common sense and good governance allow. After all, what good is the data, or any insight you derive from it, if you can’t manage the rules?

Summary of Key Points:

  • A business rule is defined as a criterion used to make a decision in the day-to-day operation of the business

  • Business rules relate to data quality in two major ways: they can automate operational decisions, and they can audit data for compliance with external regulations and internal policies

  • Traditional systems make it hard to discover business rules; a business rule management system (BRMS) is needed to manage and visualize them

  • A BRMS effectively becomes a data quality management tool in its own right

  • Business rules are declarative rather than procedural, which is what makes them data-friendly

  • Business rules are a resource that needs to be managed, just as data is a resource that should be managed

  • To show real value to the business, data professionals need to make a difference operationally by demonstrating the value of data quality in the context of being able to make high-quality decisions in the day-to-day operations of the business

  • There are five basic aspects of decisions that you would want to evaluate: precision, cost, speed, agility, and consistency

  • The most important factor is ‘decision latency’ – the time it takes to decide how to act in response to analysis and then do it

  • The life cycle of decisions is Perform business activities, Capture the data, Analyze the data, Deploy changes to the decision logic back into business activities

  • Real value to the company lies in the decision life cycle as a whole but many data professionals concentrate only on the capture and analyze phases

  • To act on high-quality data means to roll out modified business rules into the day-to-day operations of the business

  • If you want true business agility, what you really need is high-quality data AND effective decision management

  • Don’t hard-code decision logic; let a BRMS handle it

  • You should be spending no less time and resources on decision deployment infrastructure than on data warehouse and BI infrastructure; they are pieces of the same puzzle

  • Data quality professionals should identify specific, well-defined operational business decisions, then target the data that directly supports those decisions

  • These decisions are identified as meaningful to business people, operational rather than strategic, high-volume rather than occasional, deterministic rather than fuzzy, and low-to-moderate complexity rather than high complexity

  • A key feature of BRMS is decision audit trails, a capability most BI tools can’t begin to match

  • BRMS support what-if analysis to analyze areas for performance improvement

  • Business intelligence tools are of limited value if you cannot (a) trace which business rules were used in which simulations to make what decisions for specific cases and (b) go full-loop and rapidly deploy modified rules into business operations


Ronald G. Ross

At his company Business Rule Solutions, LLC, Mr. Ross engages in presentations, consulting services, publications, the Proteus methodology, and RuleSpeak (www.RuleSpeak.com). 

He gives popular public seminars through www.AttainingEdge.com and www.IRMUK.co.uk.

Ronald G. Ross (www.RonRoss.info) is recognized as the “father of business rules.” He serves as Executive Editor of www.BRCommunity.com and its flagship publication, Business Rules Journal, and as Co-Chair of the Business Rules Forum Conference.

Mr. Ross is the author of eight professional books, including the handbook Business Rule Concepts (2009, 1998) and Principles of the Business Rule Approach, Addison-Wesley (2003).

