Master Data Management – Why you’re doing it wrong and how to fix it

More than 2.5 quintillion bytes of data are generated worldwide every day and that number is set to grow rapidly, according to business intelligence provider Domo. The impending increase in data traffic will come from mobile and cloud computing sources, and from developments in AI, IoT and machine learning. Insightful business decisions derived from big data can give organisations a 23x uplift in customer acquisition compared with contemporaries that do not use complex data sets.

In fact, companies making insight-led business decisions from intelligent data are predicted to take $1.8 trillion annually from competitors who lack data-driven capabilities. It’s not surprising that businesses were estimated to spend $187 billion on big data and analytics in 2019 alone, finds Leftronic.

Of course, data comes in many forms and from many sources, and how you use it as an organisation depends on your department and your business goals. What is certain is that the movement toward SOA and SaaS is making master data management (MDM) a critical issue. It’s also irrefutable that the data must be clean and correct from the outset for any analysis to be relevant and purposeful. So, it’s imperative that firms build strong master data management practices into their data strategies to move forward with data analysis primed for meaningful results.

Alongside customer master data, supplier master data is one of the areas where procurement departments will find the greatest opportunity for insight-driven strategy and decision making. But there are challenges in managing master data so that it can accurately feed the data analysis machine.

Part of the problem

Many firms are moving onto the next stage of their digital transformation, but are grappling with supplier data buried in multiple system infrastructures. HR, marketing, logistics, finance, IT -- they may all be using their own versions of supplier data and maintaining individual records. In effect, they are each keeping their own version of the truth and using a narrow, departmental view as the basis for decisions.

The anomalies can stack up, especially if this scenario spans different locations, countries or entire regions. Errors and gaps in information are often duplicated across organisations, a problem that is only exacerbated when multiple ERP systems exist.

What should be a shared or common asset within an entire organisation is instead a source of inaccurate or obscured output. Master data ought to be accurate and provide useful results in spend analytics, sourcing optimisation and centralised contract management.

The fact is you can make use of the most recent AI or algorithm-based developments, but you will not reap the true benefits they offer unless the core data on which they depend is complete, correct and rationalised. If the judgements you make as a business are based on inaccurate, out-of-date or false information, then you are building extra risk into your decisions because you never have the full picture of the truth.

Risk is what you get when you make the wrong business decisions. So how do you make the right ones? How do we trust our core data so that we can take it into a meaningful business context?

Trusting your master data means moving beyond ERP …

Data has long been the bane of procurement organisations. Where do you keep your master data? Who is responsible for it? Which master data can you trust? How do you know if your forecasts, what-if scenarios, supplier measurements and spend analyses are correct when you submit them to your board?

Many organisations have struggled to get full value out of their spend analytics or optimisation investments because of a lack of high-quality, granular data. Firms have embraced 25 years of ERP systems to help them manage their internal operations, but now third-party relationships, including supplier management and contract management, need nurturing too.

If we are to fulfil today’s demands for CSR, sustainability and social impact while building balanced and trusted relationships with our supplier ecosystems, we need systems that are fit for purpose. But in today’s landscape of often globally segregated ERP systems, bolt-on tools and hybrid collections of solutions, there is no single full picture or validation of data. So, for trustworthy supplier master data, we are starting to look beyond our ERP systems towards AI-powered solutions that are more fully equipped and better designed to do that job.

... to a single source of truth  

Tools that help us master MDM are becoming more sophisticated, faster and richer in capabilities. There are now unified platforms that use algorithms and machine learning to normalise data through cleaning and standardisation, and to match, merge and consolidate the data from all sources. The most important feature of a good MDM system is highly accurate matching. It should be able to put all data into a common format, replace any missing values, standardise those values and remove duplicates. It should then be able to distil one valid record from the multiplicity of records, which can then be pushed back into the ERP system.
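To make the idea concrete, here is a minimal, illustrative sketch in Python of the normalise-match-merge steps such a platform automates. The field names, sample records and simple match key are hypothetical, and real MDM platforms rely on far richer, machine-learning-based matching.

    import re
    from collections import defaultdict

    def normalise(record):
        """Put a raw supplier record into a common format."""
        name = record.get("name", "").upper().strip()
        name = re.sub(r"[.,]", "", name)                                # drop punctuation
        name = re.sub(r"\b(LTD|LIMITED|INC|GMBH)\b", "", name).strip()  # drop legal suffixes
        return {
            "name": name,
            "vat": (record.get("vat") or "").replace(" ", "").upper(),
            "country": (record.get("country") or "").upper(),
            "source": record.get("source", "unknown"),
        }

    def match_key(record):
        """Deterministic match key: VAT number if present, otherwise name plus country."""
        return record["vat"] or record["name"] + "|" + record["country"]

    def merge(records):
        """Distil one 'golden' record from a group of matched duplicates,
        keeping the first non-empty value seen for each field."""
        golden = {}
        for rec in records:
            for field, value in rec.items():
                if field != "source" and value and not golden.get(field):
                    golden[field] = value
        golden["sources"] = sorted({r["source"] for r in records})
        return golden

    def consolidate(raw_records):
        """Normalise every record, group by match key and merge each group."""
        groups = defaultdict(list)
        for raw in raw_records:
            rec = normalise(raw)
            groups[match_key(rec)].append(rec)
        return [merge(group) for group in groups.values()]

    # Example: the same supplier held differently in two ERP instances
    erp_feeds = [
        {"name": "Acme Ltd.", "vat": "GB 123 456 789", "country": "gb", "source": "ERP-EU"},
        {"name": "ACME LIMITED", "vat": "GB123456789", "country": "GB", "source": "ERP-US"},
    ]
    print(consolidate(erp_feeds))  # one consolidated record, traced back to both sources

In a real deployment the single cleansed, consolidated record produced by steps like these would then be pushed back into the ERP systems, as described above.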

The best solutions also cope with scale. MDM doesn’t stop at creating a master data list. It is imperative that it also continues to nurture that list and scale with it appropriately as data sets grow. Investing time and money in clean, consistent master data will soon be wasted if the data is not kept clean and consistent as it is updated and expands over time.

More and more suites are offering MDM functionality because software makers recognise the challenges buyers face. But the depth and breadth of these offerings are not on a level playing field. Those built on a bespoke system with home-grown capabilities, historical knowledge and experience, rather than acquired or bolted on, have developed organically to address today’s challenges.

One trusted and long-standing provider, GEP, acknowledges that the best solution is one that has never changed platforms and retains a stable base and a strong data model. As Rese Cleaver, Senior Manager, PMG at GEP Worldwide, says:

“With the growing volume of data that organisations are dealing with, it just isn’t feasibly possible to continue to rely on the workforce to wrangle and reconcile it all. To get a valid source of truth, this has to be delegated to an AI-enabled system with proven history of reliable data models that produce demonstrative results.”

Conclusion

In a world of hybrid ecosystems spanning ERP, P2P and CRM, what data can you trust? There simply cannot be multiple master data files. And today, how do we make sure that whichever master data model we use is scalable for the future as volumes of transactional data expand? These issues have been weighing procurement down for a long time. Clean and accurate master data is the key to providing trusted information on a global scale. With it, procurement can enable the daily micro-decisions that inform the larger macro picture on which corporate decisions are made.

From negotiating to contracting, from onboarding to renewing and updating, the entire supplier-related process is carried out with little regard for what is already in the ERP system. Who else is managing this if not procurement? The onus is on procurement to make that data a safe source, so that whether you are the CFO, CMO or CIO, it is usable and trustworthy. It’s a growing challenge, requiring a holistic MDM strategy, the tools that can support it and a procurement team that can make it happen.

Disclaimer: this Brand Studio article was written in conjunction with GEP
