Digital, big data and compliance turn spotlight on master data management

John Leonard
[Image: A place for everything and everything in its place]

Data-driven business needs clean and accurate data

Master data management is a Cinderella subject, essential in data-driven organisations yet woefully under-appreciated. Long the province of back office wonks, in recent years MDM has become more of a concern for the wider business as the imperatives of customer experience and compliance take centre stage.

MDM is a discipline rather than a problem to be 'solutionised'; nevertheless, there are plenty of tools on the market to help with the task of discovering, categorising, labelling and managing the organisation's core data.

Companies like SAP and Oracle, whose products centre on their enterprise databases, are well placed to offer their customers MDM tools that can round up master data from across those applications and maintain it in a central repository. They are joined by third-party solutions from the likes of Informatica, Tibco and Talend which have a background in middleware and integration, and a newer generation of cloud-first vendors.

A single source of the truth

Master data is the core data without which the organisation could not operate. Depending on the sector, this might include customer details, supplier information, patient records, part numbers, location data and the like - but not transactional and other transient data.

Master data might be slower moving than transactional data, but it may still need updating in real time. Most MDM solutions provide tools to automate this and to allow administrators to monitor and configure systems. As MDM has moved out of the server room and into the business, the tools have had to become easier to use for data stewards and chief data officers, who may be business people first and technologists second - if at all. Most now include interactive dashboards and other graphical elements.

MDM platforms typically revolve around a hub, with tools capable of importing data in multiple formats from various sources and sorting it into staging tables.

The staged data is then mapped to domain attributes (Customer, Product, Supplier, etc), after which it is cleaned, deduplicated and standardised according to business rules and then enriched, tagged with metadata and stored. The data may be versioned for audit - and to allow the user to track back through time to see what has been changed and by whom. This auditing capability is helpful for compliance and may also provide the basis for enterprise information management efforts.
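The hub workflow described above - staging, mapping to a domain, standardising and deduplicating - can be sketched in a few lines of Python. This is an illustrative toy, not a real MDM product: the field mappings, match key (email) and source systems are all hypothetical.

```python
# Illustrative MDM staging sketch: records from two hypothetical source
# systems are mapped onto a common Customer domain, standardised, then
# deduplicated on a simple match key (email).

def to_customer(record, mapping):
    """Map a source record's fields onto the Customer domain attributes."""
    return {attr: record.get(src_field, "").strip()
            for attr, src_field in mapping.items()}

def standardise(customer):
    """Apply simple standardisation rules (case, internal whitespace)."""
    customer["email"] = customer["email"].lower()
    customer["name"] = " ".join(customer["name"].split()).title()
    return customer

def deduplicate(customers):
    """Keep the first record seen for each match key (here: email)."""
    seen = {}
    for c in customers:
        seen.setdefault(c["email"], c)
    return list(seen.values())

# Two source systems with different field names feeding the staging area
crm_rows = [{"full_name": "Ada  Lovelace", "mail": "ADA@example.com"}]
erp_rows = [{"cust_name": "Ada Lovelace", "email_addr": "ada@example.com"}]

staged = (
    [to_customer(r, {"name": "full_name", "email": "mail"}) for r in crm_rows] +
    [to_customer(r, {"name": "cust_name", "email": "email_addr"}) for r in erp_rows]
)
golden = deduplicate([standardise(c) for c in staged])
print(golden)  # a single merged 'golden' Customer record
```

Real platforms add fuzzy matching, survivorship rules and audit versioning on top of this basic flow, but the shape - map, standardise, match, merge - is the same.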

The end result of these activities should be a repository of clean, accurate, current and complete core data which can reliably act as a single source of the truth for planners, strategists, sales and marketing, auditors and enterprise applications.

Features of a typical MDM system

Repository
The repository stores the master data, structured according to pre- and user-defined attributes.

Attributes
Attributes define the structure of a repository. They may be pre-defined or bespoke.

Attribute groups
These are logical groupings of similar attributes.

Business rules
Rules governing the validity of the data from a certain domain - for example, the format of an email address.
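A business rule of this kind is essentially a validation function. The sketch below checks the email-address example with a deliberately simple regular expression; real MDM rule engines are configurable and far stricter, so treat this as illustrative only.

```python
import re

# Deliberately simple pattern: something@something.something, no spaces.
# Production email validation is considerably more involved.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email(value: str) -> bool:
    """Return True if the value passes the email-format business rule."""
    return bool(EMAIL_RE.match(value))

print(validate_email("jane@example.com"))  # True
print(validate_email("not-an-email"))      # False
```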

Authentication and authorisation
As well as credential-based authentication, most MDM solutions provide role-based access controls scoped to particular business domains and use cases.

GUI
Data stewards are likely to be non-technical so an intuitive UI is increasingly important.

Data publish tools / Web services
Master data is made available to other applications via web services and/or data publishing tools. Web services are pull-based, whereas data publishing tools push data automatically to subscribers.
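The pull/push distinction can be shown with a minimal sketch: a web-service-style lookup that consumers call on demand, versus a publisher that pushes every change to registered subscribers. The class and method names here are hypothetical, not any vendor's API.

```python
# Hypothetical hub illustrating pull (on-demand lookup) vs push
# (automatic notification of subscribers on every update).

class MasterDataHub:
    def __init__(self):
        self._records = {}
        self._subscribers = []

    def get(self, key):
        """Pull: a consumer requests a record when it needs it."""
        return self._records.get(key)

    def subscribe(self, callback):
        """Register a subscriber to be pushed every future change."""
        self._subscribers.append(callback)

    def update(self, key, record):
        """Store a record and push it to all subscribers."""
        self._records[key] = record
        for notify in self._subscribers:
            notify(key, record)

hub = MasterDataHub()
received = []
hub.subscribe(lambda k, r: received.append((k, r)))
hub.update("C001", {"name": "Acme Ltd"})

print(hub.get("C001"))  # pull: fetched on demand
print(received)         # push: the same change, delivered automatically
```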

Data quality tools
Data quality tools profile, cleanse, deduplicate, validate, standardise and mask data and may include monitoring capabilities to check data quality over time. These tools are often built into MDM solutions, or they may be available as add-ons.
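Two of the operations listed above, profiling and masking, are simple enough to sketch. The field names and masking policy below are hypothetical; commercial quality tools apply many such checks across whole schemas.

```python
# Illustrative data-quality sketches: completeness profiling per field,
# and partial masking of a sensitive value before data is shared.

def completeness(records, field):
    """Fraction of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records) if records else 0.0

def mask_email(email):
    """Obscure an email address, keeping only its first character."""
    local, _, domain = email.partition("@")
    return (local[:1] + "***@" + domain) if local and domain else "***"

rows = [{"email": "jane@example.com"}, {"email": ""}, {"email": "bob@example.com"}]
print(completeness(rows, "email"))     # 0.666... (one of three is empty)
print(mask_email("jane@example.com"))  # j***@example.com
```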

A Computing research study for Delta found the top three triggers for implementing an MDM project were first, a need to increase efficiency; joint second, the demands of big data and a requirement to manage data complexity; and third, GDPR.

The business goals were standardising data, synchronising data across applications, and a drive to introduce better data governance policies and permissioned access.

An in-depth analysis of the MDM market, including the leading, tier-2 and up-and-coming vendors, is available on the Delta market intelligence service.


Delta is a new market intelligence service from Computing to help CIOs and other IT decision makers make smarter purchasing decisions - decisions informed by the knowledge and experience of other CIOs and IT decision makers. 

Delta is free from vendor sponsorship or influence of any kind, and is guided by a steering committee of well-known CIOs, such as Charles Ewen, Christina Scott, Steve Capper and Laura Meyer. 

Ten crucial technology areas are already covered at launch, with more data appearing and more areas being covered every week. Sign up here for your free trial of the Computing Delta website.
