
Master Data: Foundation and Ultimate Discipline in One

16 March 2020

Is all of your most important data, your Master Data, under control? It may not sound like the most exciting topic, but we're no longer just talking about a few mailings going astray. We're talking about laying the foundations for automation in all processes, and consequently for digitalisation. In this article we explain why you should never neglect your Master Data and show you how to achieve perfect data.

Mitchells & Butlers aims to become one of the biggest operators of bars and restaurants in Germany. It is already one of the largest such operators in the UK, where it runs 1,700 pubs. To achieve similar success in other countries, one key requirement must be met: potential customers have to know that an establishment exists in their area. And if they decide they really want to go there, it should ideally be open when they arrive.

It’s a scenario we’re all familiar with. All too often, the data provided by restaurant portals such as Yelp no longer corresponds to reality. And if you think that Yelp isn’t relevant for your market, you should know that it is the source of the majority of Google’s data on restaurants. Opening hours change, closing days are introduced or cancelled, and phone numbers can change.

Uberall conducted an analysis of the US market to determine the data quality of companies with a physical, brick-and-mortar presence. The results were alarming.

There’s no excuse for such errors nowadays. Firms that specialise in keeping the data of restaurant and bar operators up-to-date have existed for a long time. Each company maintains its data in a single location, and the service provider transfers it to the portals. US company Yext has reached a market capitalisation of a billion dollars with this business model.

Data management for machines 

There’s a good reason for that success. Correct and clean data should be at the core of your digitalisation project. When machines process data automatically, the quality of that data determines the quality of the end result. “Crap in, crap out” is a phrase often used by programmers: if you feed in poor-quality data, even the best algorithms will never produce a good result.

The rules that apply to basic data such as addresses, phone numbers and opening hours apply equally to the Master Data used in your company. If the address data for suppliers and customers isn’t correct, communication with both parties becomes difficult. Mailings are a good example. If you want to include a personal address in a letter, you need to be sure that the data is generated, saved and transferred in accordance with a defined schema, and that it has been checked for errors and cleaned up. Otherwise, you run the risk of addressing Lisa Miller as “Dear Mr. Miller”.
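To make this concrete, here is a minimal Python sketch. The Contact class and its gender field are purely illustrative, not a real CRM schema; the point is that a salutation routine should fall back to a neutral greeting whenever the underlying field is missing or invalid, rather than guess:

```python
from dataclasses import dataclass

# Hypothetical contact schema: field names are illustrative, not a real CRM layout.
@dataclass
class Contact:
    first_name: str
    last_name: str
    gender: str  # expected values: "f", "m", or "" if unknown

SALUTATIONS = {"f": "Dear Ms.", "m": "Dear Mr."}

def salutation(contact: Contact) -> str:
    """Build a letter salutation, falling back to a neutral form if the
    gender field is missing or holds an unexpected value."""
    prefix = SALUTATIONS.get(contact.gender.strip().lower())
    if prefix is None:
        # Incomplete master data: better a neutral greeting than a wrong one.
        return f"Dear {contact.first_name} {contact.last_name}"
    return f"{prefix} {contact.last_name}"

print(salutation(Contact("Lisa", "Miller", "f")))  # Dear Ms. Miller
print(salutation(Contact("Lisa", "Miller", "")))   # Dear Lisa Miller
```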

This is just one small example. Targeting works on the assumption that the data is correct. If an algorithm works on the hypothesis that most residents of Oberstrass in Zürich are wealthy and should therefore be targeted differently from the residents of Dietikon, the address data from the CRM system has to be precise enough to support that distinction. A postal code alone isn’t enough.

In this case, incorrect data results in Lidl adverts being displayed with Arabic banners on a German news site.

The quality of Master Data also plays a key role in risk management. A risk indicator can be added as a field in the database, for example a scoring value that provides information about a company’s solvency, drawing on the D-U-N-S® Number from Dun & Bradstreet. If you combine this type of parameter with regular analysis, your company identifies risks – such as payment default – on both the purchasing and the customer side at an early stage and can take countermeasures.
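As an illustration of what such a field might look like, here is a small Python sketch. The record layout, score values and threshold are hypothetical; real scoring models and cut-offs come from the data provider:

```python
from dataclasses import dataclass

@dataclass
class SupplierRecord:
    name: str
    duns: str          # D-U-N-S Number identifying the company
    risk_score: int    # illustrative solvency score delivered by the data provider

RISK_THRESHOLD = 70  # hypothetical cut-off; real thresholds depend on the scoring model

def flag_risks(records: list[SupplierRecord]) -> list[SupplierRecord]:
    """Return suppliers whose score indicates an elevated risk of payment
    default, so that purchasing can take countermeasures early."""
    return [r for r in records if r.risk_score >= RISK_THRESHOLD]

suppliers = [
    SupplierRecord("Alpha GmbH", "123456789", 35),
    SupplierRecord("Beta AG", "987654321", 82),
]
for risky in flag_risks(suppliers):
    print(f"Review supplier {risky.name} (D-U-N-S {risky.duns}), score {risky.risk_score}")
```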

Clean, enrich, maintain 

The process of adding supplementary data to an existing record, such as scoring values from external sources like Bisnode or Dun & Bradstreet, is referred to as “enriching” or “refining”.

Enriching is the second step in establishing good Master Data. Master Data is defined as the data that is essential for certain subprocesses involved in business activity and should therefore be available at all times. It is the opposite of variable data. When you order an input product, the item number and supplier are part of the Master Data, while the order quantity and price are variable. In the CRM system, GDPR consent is now part of the mandatory data, and therefore also of the Master Data, while a customer’s stored surfing behaviour on the company’s website is variable.
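A short Python sketch can make the distinction tangible. The field names and values are invented for illustration; the point is simply that the stable attributes (item number, supplier) live in the master record, while quantity and price belong to the individual transaction:

```python
from dataclasses import dataclass
from datetime import date

# Master data: stable attributes that must be available at all times.
@dataclass(frozen=True)
class ItemMaster:
    item_number: str
    supplier: str

# Variable (transactional) data: changes with every order.
@dataclass
class Order:
    item: ItemMaster
    quantity: int
    unit_price: float
    order_date: date

bolt = ItemMaster(item_number="A-4711", supplier="Beta AG")
order = Order(item=bolt, quantity=500, unit_price=0.12, order_date=date(2020, 3, 16))
print(f"{order.quantity} x {order.item.item_number} from {order.item.supplier}")
```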

It all starts with the Master Data. You’ve spent many years capturing and collecting it and storing it in different databases – unfortunately sometimes in Excel. Every data project begins with consolidation. Is the data correct, is it consistent, does it have a standard format? Duplicates are removed, typos corrected and known changes entered in the records. Your data is now clean. 
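The sketch below illustrates this consolidation step under simplifying assumptions: records are plain dictionaries, and the e-mail address serves as the duplicate key. Real matching logic is usually far more sophisticated:

```python
def normalise(record: dict) -> dict:
    """Bring a raw record into a standard format: trim whitespace,
    unify casing, and store the postal code as a plain string."""
    return {
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
        "postal_code": str(record.get("postal_code", "")).strip(),
    }

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first occurrence of each record, using the e-mail address
    as a simple (illustrative) duplicate key."""
    seen, clean = set(), []
    for raw in records:
        rec = normalise(raw)
        if rec["email"] and rec["email"] in seen:
            continue
        seen.add(rec["email"])
        clean.append(rec)
    return clean

raw_data = [
    {"name": " lisa miller ", "email": "Lisa.Miller@example.com", "postal_code": 8006},
    {"name": "Lisa Miller", "email": "lisa.miller@example.com ", "postal_code": "8006"},
]
print(deduplicate(raw_data))  # one consolidated record
```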

In the second step, the data is enriched. This is useful for targeting in marketing, but also for sales and support. While marketers want to know which target group segment a customer belongs to, sales teams are more interested in whether that customer is solvent. Support agents work better when they know which products a customer is actually using or how long it will take to send a service engineer.

Both of these steps represent the preliminary work, after which begins the infinite loop, namely data maintenance. This process should ideally be automated. If your e-mail program receives three error messages from the server, indicating that an address is no longer valid, it is deleted. Deleting Master Data is a painful process for many companies, but it contributes to a significantly better quality of result. Staying with our e-mail example: The Certified Senders Alliance, which is attempting to tackle the problem of spam, has agreed that mail distribution lists should be reviewed and cleaned up regularly. Companies that fail to do this end up on the blacklist and are classified as potential spammers. 
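Applied to the e-mail example, the automation could look roughly like this. The three-bounce rule comes from the paragraph above; everything else (the data structures, the idea of a bounce log) is an assumption made for the sketch:

```python
from collections import Counter

BOUNCE_LIMIT = 3  # per the rule above: three hard bounces and the address goes

def prune_mailing_list(addresses: set[str], bounce_log: list[str]) -> set[str]:
    """Remove addresses that the mail server has reported as undeliverable
    BOUNCE_LIMIT times or more."""
    bounces = Counter(bounce_log)
    return {a for a in addresses if bounces[a] < BOUNCE_LIMIT}

mailing_list = {"lisa.miller@example.com", "old.address@example.com"}
bounce_log = ["old.address@example.com"] * 3
print(prune_mailing_list(mailing_list, bounce_log))
# {'lisa.miller@example.com'}
```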

Much more important in the context of automation, however, are interfaces to data providers such as Bisnode or Dun & Bradstreet. They manage a B2B data universe that currently covers over 330 million companies in more than 200 countries. To keep this information up-to-date, they draw on more than 30,000 data sources and import millions of changes every day. Many companies already use an API to update their data with the latest information. Returned mail is thus a thing of the past, and there’s no longer any need to worry about accidentally addressing Lisa Miller as “Dear Mr. Miller”. But it goes much further than that: a sales representative on site with a customer can see that customer’s financials, marketing teams have a 360-degree view of prospects, Smart Data Analytics ensures a huge supply of qualified leads, and much more.
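The pattern of such an interface might look like the following Python sketch. The endpoint, field names and authentication are hypothetical; the real Bisnode and Dun & Bradstreet APIs have their own schemas and credentials, so this only shows the general update flow:

```python
import requests  # third-party HTTP client

# Hypothetical endpoint: real provider APIs have their own URLs and schemas.
PROVIDER_URL = "https://api.example-data-provider.com/companies/{duns}"

def refresh_company(record: dict, api_key: str) -> dict:
    """Overwrite locally stored master data with the provider's latest values."""
    response = requests.get(
        PROVIDER_URL.format(duns=record["duns"]),
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    latest = response.json()
    # Only update the fields the provider is authoritative for.
    for field in ("name", "address", "phone", "status"):
        if field in latest:
            record[field] = latest[field]
    return record
```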

When it comes to automation, the more communication there is between machines, the more important it is to have consistent data quality. Examples include the Internet of Things, voice output in navigation systems and smart speakers. An automatic system can, for example, only process location data if it is supplied in a format that can be read by the system. The previous practice of storing the company address as a JPG or GIF on websites in order to prevent spam would be extremely counterproductive nowadays. Not even Google Maps would be able to find your company in this case. 
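One widely used machine-readable format for exactly this kind of location data is schema.org markup published as JSON-LD. The snippet below builds a simplified, illustrative example in Python; the business details are of course invented:

```python
import json

# A simplified schema.org "LocalBusiness" description. Published as JSON-LD on a
# website, this is the kind of machine-readable location data that services such
# as Google Maps can actually parse, unlike an address hidden inside a JPG.
location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Pub",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "Bahnhofstrasse 1",
        "postalCode": "8001",
        "addressLocality": "Zürich",
        "addressCountry": "CH",
    },
    "telephone": "+41 44 000 00 00",
    "openingHours": "Mo-Sa 11:00-23:00",
}

print(json.dumps(location, ensure_ascii=False, indent=2))
```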

What exactly is Master Data? 

We’ve put together an infographic, which is available as a free download. In it we explain exactly what is meant by Master Data, why it is so important for digital transformation, the problems that can occur and how you can create and maintain perfect data in a Master Data project. 

Master Data explained

Everything you need to know about Master Data – summarised in an infographic.