
Guide: Improving your data quality with master data governance

Written by Ian Piddock | Apr 2, 2024 12:53 PM

When a chief data officer achieves excellence in master data governance, they’re transformed from someone who spends their time reactively fixing things when data gets messy or breaks a system, into a proactive, all-seeing business enabler. This series explores why master data governance is vital for data modelling, data quality, rich content, industry standards and data enrichment.


Series introduction

If employees can’t trust the data they’re given, they won’t have the confidence to make the right decisions, which limits how efficiently and productively they can work.   

And what if, say, customers aren’t getting the correct information about your products? Well, they won’t be getting a product that actually meets their needs. You’ll end up making them angry while losing sales.  

Those are just a couple of the problems you could face if your data quality isn’t up to scratch.  

We’ve written this guide to help you define what good data quality looks like, show you some of the things that could go wrong if your data quality is poor, and then put you on the path to data quality excellence.   

Explore how improving your data quality through master data governance can elevate your role.  

 

Back to basics: what is data quality? 

In simple terms, data quality refers to how good or reliable data is. It's about whether the data is accurate, complete, consistent and relevant for its intended use.  

High data quality means that the information is trustworthy and can be used effectively for making decisions, analysis or other purposes.  

Low data quality, on the other hand, means the data may contain errors, inconsistencies or missing pieces, which can lead to incorrect conclusions or unreliable outcomes. 

But it can get complicated. When your data gets used for lots of different purposes, each might have a distinct set of requirements.   

So, you can’t measure your data quality without knowing everything the data might be used for—in other words, its context.  

Because it’s about meeting particular needs, data could be high quality when used in one app, but low quality when used in an app that needs something different from the data.


A data quality assessment framework (DQAF) is really useful when evaluating the quality of a particular data set at any point in time.  

However, IT staff using a master data management (MDM) platform don’t necessarily know what good looks like from the point of view of different areas of the business.  

That’s why, when assessing data against a framework, you need to talk to subject matter experts across different domains. 

 

Signs your data quality has room for improvement 

Look, nobody’s perfect. There’s always room for improvement, and that’s just as true for data quality as it is for your need to eat less junk food. So, here are a few signs your data quality could do with a spruce up.  

Data quality is swept under the rug 

Data quality can be a bit of a taboo subject within businesses. People often know that their organization’s data quality is bad, but they don’t like to talk about it. They just assume it can’t be improved (spoiler alert: it can), so sweep it under the rug and go about their day.  

When people do talk about their data quality, it’s usually safe to assume that the vaguer the statements and descriptions, the poorer the quality. And because nobody’s talking about it, the problem is less likely to be acknowledged or mitigated by the business.  

Too much data, not enough time 

The thing about nobody being perfect is that we all make mistakes.  

People enter data according to their needs, as quickly as possible, to get onto more interesting work. They don’t have time (or the patience) to be considering the nine other business units their data entry might affect.  

So, the more room there is for free-form input, and incentives for speed over quality, the more likely the human factor will contribute to poor data quality.  

Repetitive, dull and repetitive data entry tasks 

Nobody does data entry for fun. And when the boredom creeps in, those human errors soon follow. Doing lots of data entry tasks, over and over again, means the motivation to do a good job erodes over time. And so, data quality also erodes over time.  

Duplication, duplication, duplication 

One of the biggest causes of data quality issues is the duplication of records in customer and supplier data. The same data could be repeated across various CRM, ERP and accounting systems.  

Without any validation, cross-checking or governance, the same record could be duplicated purely through inconsistent formatting or spelling, e.g. B.T., British Telecom, British Telecom Ltd or British Telecommunications.

You should stop this from happening at the point of entry wherever possible, e.g. by searching for matching records before creating new ones. It’s less costly than having to correct an error that’s already been made.  
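As a rough illustration of that point-of-entry check, the sketch below normalizes a company name before looking for an existing record, so spelling and formatting variants collapse onto one entity. The normalization rules, suffix list and function names are all assumptions for the example, not a production matcher.

```python
import re

# Common legal suffixes to strip when normalizing company names (illustrative list).
LEGAL_SUFFIXES = {"ltd", "limited", "plc", "inc", "llc", "gmbh"}

def normalize_name(name: str) -> str:
    """Lowercase, drop punctuation and legal suffixes, collapse whitespace."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower())
    tokens = [t for t in cleaned.split() if t not in LEGAL_SUFFIXES]
    return " ".join(tokens)

def find_or_create(name: str, records: dict) -> str:
    """Return the existing canonical name on a match, else register a new record."""
    key = normalize_name(name)
    if key in records:
        return records[key]   # existing record found: no duplicate created
    records[key] = name       # first time we see this entity
    return name

records = {}
find_or_create("British Telecom Ltd", records)
print(find_or_create("British Telecom", records))  # prints "British Telecom Ltd"
```

Here both names normalize to the same key, so the second entry is matched to the first instead of creating a duplicate. Aliases like “B.T.” would need an extra alias table on top of this.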

Misbehaving data transaction processing 

Bad data quality can also lead to broken transaction processing.  

Let’s say “System 1” is trying to push data through a process into “System 2”. But when the data is in a format System 2 wasn’t expecting, it’s not happy. So, it starts to stamp its feet, behaving erratically or even halting the process entirely, rejecting the data.  

This all means business and IT employees are spending a lot of time reworking processes or trying to correct errors in systems. 

The factors above can all lead to some serious trust issues between you and your data.  

As you’ll see in the following section, this trust erosion can cause all sorts of serious challenges for your business.  

 

The devastating consequences of poor data quality 

 

Key stakeholders need good quality data to make the right business decisions. But if poor data quality means they can’t trust the data they’re receiving, they’ll feel less informed, and hesitate to take action. 

Multiplied across your entire business ecosystem, this slowdown of confident actions can have a devastating effect on the profitability and future of your business. 

Want some examples? 

The effect of poor data quality on employee performance 

If employees don’t trust the data, they won’t act in the most efficient and productive ways. This causes more operational errors and waste, as people either make decisions informed by poor data or just resort to gut instinct or habit.  

And if they’re only relying on gut instinct, what’s even the point of the data? 

All the issues this causes create a domino effect that eventually reaches people outside of the organization—such as the wrong products being sent to the wrong places.  

The effect of poor data quality on products  

Customers buy products when enough of their questions about how well the product matches their needs have been answered (for now, let’s just ignore that time you impulse-bought a Rolex you definitely didn’t need).  

They’re spending their hard-earned money based on their trust in the answers.  

But if poor data quality means they’re getting the wrong answers, the following can happen: 

  • Fewer conversions on the website, with more abandoned carts 
  • More complaints, from customers who feel deceived 
  • A reputation for poor customer experience 
  • Disputes arise, which can take a lot of time and money to settle 

The effect of poor data quality on business performance 

Disagreements in board meetings are hardly unheard of—there’s nothing wrong with a bit of healthy debate. But it’s a big problem when the disagreements are the result of mismatched data.  

Like if two people have different revenue numbers, for example. Or different counts of unique Global Trade Item Numbers (GTINs), which tell the board how many products the business has. 

You’ll know you probably have a problem with data quality when there’s no clear, unified answer for questions like: 

  • How much money did we make last year? 
  • How many of product X did we sell? 
  • How much of that was profit? 
  • How many customers do we have? 
  • Who is our best customer? 

And without those answers, your organization won’t have an effective business strategy. 

Data leaders should look to their business glossary and matching strategy, identifying where people disagree over definitions of terms. From there, they can work out where data isn’t conforming to the required quality standard or being consolidated correctly. 

The effect of poor data quality on external communication 

Following from the above, if you can’t agree on your business performance data, it’s very difficult to confidently communicate it to the outside world. 

There’s a big chance your results will differ significantly from your forecasted numbers. Or that your ESG reporting commitments won’t be met, and your green claims can’t be substantiated by trustworthy data. 

This makes it more difficult to manage customer, regulator and shareholder expectations when reporting.  

Then your business no longer looks like such a sound investment, making people regret buying shares, buying your products or selecting you as a supplier.  

And falling foul of compliance mandates can lead to significant fines.

>>Start building your foundation for better data quality, with this checklist.<<

 

Getting it right with master data governance  

If the chief data officer’s mission is to foster trust in the organization’s data, then their main weapon is master data governance. Tighter data governance controls that cover all dimensions of the DQAF will ensure they can audit, diagnose and correct data quality issues.  

But strong data governance is also a people issue. The CDO needs the profile, mandate and resources to do the job effectively.  

They need to build a culture where people across the business are accountable for the quality of data. However, this is a two-way street. Employees need to be in a supportive environment, where it’s as easy as possible for them to meet their responsibilities.  

You could make extensive use of reference data, such as dropdown lists or data-entry recommendations, to offer compliant, pre-validated suggestions that show people what ‘good’ looks like. They’ll work faster, with less mental load, while still getting it right.  
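As a rough sketch of how reference data can back a dropdown, the snippet below filters a pre-validated list as the user types, so data entry picks from known-good values instead of free text. The country list and helper name are illustrative assumptions, not any particular platform’s API.

```python
# Reference data: the only country values data entry is allowed to pick from.
COUNTRIES = {"GB": "United Kingdom", "FR": "France", "DE": "Germany"}

def suggestions(typed: str) -> list[str]:
    """Return compliant, pre-validated dropdown options matching the user's input."""
    q = typed.strip().lower()
    return sorted(
        f"{code} - {name}"
        for code, name in COUNTRIES.items()
        if q in code.lower() or q in name.lower()
    )

print(suggestions("g"))  # prints "['DE - Germany', 'GB - United Kingdom']"
```

Because every option comes from the reference list, whatever the user picks is valid by construction; there is nothing to clean up downstream.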

 

Trust is everything 

If you want to use your data to drive maximum business value across your organization, trust is everything.  

When you put in high quality data—which good master data governance ensures will happen every time—you can trust your data analysis, which means people can make the right decisions, which lead to the right actions. 

But the data itself is only part of the equation. The way people treat the data is key.  

That’s why strong master data governance makes data entry foolproof. With the right validation and business rules in place, entering the right information into forms is super easy.  

If the formatting is off, an error message shows up, either providing a suggestion or a list of pre-formatted examples to choose from.  

That means employees can do more without the boredom setting in and taking away their focus.  

And the certainty that everyone has accurate information empowers knowledge workers, analysts and operational staff, which in turn empowers the board by ensuring forecasting and reporting are accurate and credible.   

The cycle of trust

After implementing Stibo Systems’ master data management platform, Manitou Group now has a single source of trusted customer data, enabling better experiences through quicker customer authentication.  

Read the full case study to discover how Manitou Group achieved:  

  • Significant reduction in data entry time 
  • Harmonization of processes around data 
  • Improved quality of reporting and insightful dashboards 

 

Knock it out of the park: three big wins to improve data quality 

Here are three ways you can win big by improving data quality across your organization, with strong master data governance:  

Conducting a data profiling exercise 

Remember the six-part DQAF we talked about? Well, now it’s time to put it into action.  

First, you need to determine the problem (or problems) behind your poor data quality, and just how big it is.   

The assessment should engage subject matter experts from all business domains to capture data quality issues according to business context. 

This will then provide you with a comprehensive traffic-light audit of the following factors across your estate: 

  • Completeness  
  • Timeliness 
  • Validity  
  • Integrity 
  • Uniqueness 
  • Consistency  
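To make the audit concrete, here is a minimal sketch of profiling two of those dimensions, completeness and uniqueness, over a handful of records. The sample data and scoring helpers are assumptions for illustration; a real DQAF assessment covers all six dimensions, in business context, across your whole estate.

```python
# Illustrative sample: one missing email, one duplicated identifier.
records = [
    {"id": "C001", "name": "Adam Smith", "email": "adam@example.com"},
    {"id": "C002", "name": "Eve Jones", "email": None},
    {"id": "C001", "name": "Adam Smyth", "email": "adam@example.com"},
]

def completeness(records, field):
    """Share of records where the field is populated."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def uniqueness(records, field):
    """Share of values in the field that are distinct."""
    values = [r[field] for r in records]
    return len(set(values)) / len(values)

print(f"email completeness: {completeness(records, 'email'):.0%}")  # 67%
print(f"id uniqueness:      {uniqueness(records, 'id'):.0%}")       # 67%
```

Scores like these are what feed the traffic-light view: a field at 100% goes green, while anything below your agreed threshold flags a data set for attention.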

Moving data quality control upstream 

With the problem nailed down, it’s time to reduce the risk of it happening again.  

You’ll want to implement data quality processes and controls as far upstream as possible, ideally at the point of entry. 

To see why, think of the game of ‘telephone’ or ‘rumors’, or whatever you called it where you grew up. The more unchecked voices that are allowed to repeat whatever they want, instead of the true and correct answer, the more likely and widespread errors become.  

By placing controls upstream, you’re getting the first person to shout the correct answer to everybody—there’s no chance of anyone further down the line mishearing. 

Consolidating and merging duplicate records 

As anybody who’s collected Pokémon or baseball cards knows, duplicates are annoying. It’s frustrating to get your fifth Pikachu when what you really want is a Charizard (for anyone who’s not into sports, they’re both famous baseball players—no need to look them up).  

And when it comes to duplicate records, it goes beyond just annoying. It makes maintaining data quality more difficult. 

Because you might fix the address of ‘Adam Smith,’ but not the addresses of ‘Adam Smyth,’ ‘Adam Smythe,’ or even ‘Adum Smith.’ Even though they’re actually all the same person, just spelled differently.   

And how will you know how much ‘Adam’ has spent with you? Or how much credit you’ve provided ‘them’? That’s why you need to consolidate the data, ensuring there’s a unique identifier that prevents duplicate entities from recurring.  
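A rough sketch of that consolidation step, using Python’s standard-library difflib for fuzzy matching: incoming names are compared against the “golden” records and assigned a shared identifier when they look like the same entity. The 0.8 similarity threshold, identifier format and sample names are assumptions for illustration.

```python
from difflib import SequenceMatcher

golden = {}   # canonical name -> unique identifier
next_id = 1

def consolidate(name: str) -> str:
    """Assign the existing identifier for a near-match, else mint a new one."""
    global next_id
    for canonical, uid in golden.items():
        if SequenceMatcher(None, name.lower(), canonical.lower()).ratio() >= 0.8:
            return uid                   # treated as a variant of an existing entity
    uid = f"CUST-{next_id:04d}"
    golden[name] = uid
    next_id += 1
    return uid

# All four spelling variants collapse onto one identifier.
for variant in ["Adam Smith", "Adam Smyth", "Adam Smythe", "Adum Smith"]:
    print(variant, "->", consolidate(variant))  # each prints "-> CUST-0001"
```

With one identifier per entity, spend and credit can be summed per customer rather than scattered across misspellings. Real MDM matching is more sophisticated (weighted rules across multiple fields, survivorship), but the principle is the same.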


What to look out for in a master data management platform 

When you have the right master data management platform, one that supports strong data quality, these are just some of the things you’ll be able to achieve: 

  • Uncover data quality errors in a data set, with a user interface that’s set up for data profiling. It will look for data conformity patterns and then spot outliers and duplicates, with automation assisting the process.  
  • Visualize progress towards data quality KPIs and track data quality over time using a data sufficiency KPI dashboard, based on the data sufficiency metrics you’ve determined.  

Get a head start on your data quality by downloading our handy checklist here.