Have you mastered the procurement basics yet? AI won’t save you if you haven’t!


Recently I published this note about AI procurement in the Analysts' Corner in the Weekly Spend Matters e-mail (but there's more to say):

COVID-19 struck fast and hard and crippled the supply chains of those organizations that weren't prepared, which was the majority of global supply chains. This exemplified not only the need for better visibility into, and better management of, supply chains, but also the need for an organization to be able to source, and re-source, faster when new products were needed (PPE), new sources of supply were needed (as existing supply was cut off), and new transport options were needed (as airlines faltered and drivers in smaller operations took ill).

What better time for each and every vendor to hawk their wares, and hawk they did, with the loudest vendors being those that had built-in best practices and project management, open APIs, supplier networks, virtual collaboration technology, and, specifically, such technology powered by RPA (robotic process automation), ML (machine learning) and AI ("Artificial Intelligence").

Analytics became front and center as organizations tried to get a grip on their data; supplier discovery, once a nice-to-have, became vitally important; and supplier risk went from a nice-to-have to a must-have in Sourcing and Procurement platform selection (for those organizations that didn't have their budgets frozen). And, of course, start-ups with AI/RPA, or established suites that acquired or launched some capabilities here (especially around re-sourcing, more proactive risk identification and mitigation, and/or more intelligent analytics), started making all the short lists.

And I could get more specific, but rather than focus on what I'm seeing, it's more important to focus on what I should be seeing. Specifically, a focus on more SUM (Spend Under Management) and organizational MDM (Master Data Management), especially with respect to suppliers and products, risk identification and mitigation planning, and external risk tracking and analysis, rather than RPA, AI (which doesn't really exist), supplier networks, or even more advanced analytics (when organizations typically don't maximize the use of what they already have).

Spend Under Management allows an organization to quickly get a handle on what it is buying, from whom, where, and when. Master Data Management allows an organization to have a unified, accurate list of the suppliers it is currently using, the products it is sourcing, and the services it depends on. Without knowing what is being bought, from whom, and how critical it is, there is no way to identify risks and, more importantly, to identify the events that will cause those risks to materialize.

As natural disasters continue to increase, along with the risk of catastrophic pandemics, as global political tensions rise (thanks to the election of populist leaders), and as economic conditions become less certain, the ability to respond instantly as risks materialize becomes ever more critical. This is only possible if an organization has identified the risks for its strategic products/suppliers/services, identified triggers, and integrated with global risk monitoring software that constantly monitors for indicators that a risk is materializing.
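The identify-triggers-then-monitor loop described here can be sketched in a few lines. The supplier names, trigger conditions, and event feed below are purely hypothetical illustrations, not a real risk-monitoring API:

```python
from dataclasses import dataclass

@dataclass
class RiskTrigger:
    supplier: str      # strategic supplier this trigger protects
    event_type: str    # e.g. "port_closure", "carrier_insolvency"
    region: str        # geography the supplier depends on
    mitigation: str    # pre-planned response to execute

# Pre-identified triggers for strategic suppliers (illustrative data)
triggers = [
    RiskTrigger("Acme Components", "port_closure", "Shenzhen",
                "activate secondary supplier in Vietnam"),
    RiskTrigger("Beta Logistics", "carrier_insolvency", "EU",
                "re-tender lanes to backup carriers"),
]

def check_events(events, triggers):
    """Match incoming risk-monitor events against pre-defined triggers."""
    alerts = []
    for event in events:
        for t in triggers:
            if event["type"] == t.event_type and event["region"] == t.region:
                alerts.append(f"ALERT {t.supplier}: {t.mitigation}")
    return alerts

# Simulated feed from a global risk-monitoring service
events = [{"type": "port_closure", "region": "Shenzhen"}]
print(check_events(events, triggers))
```

The point of the sketch: the alert and the pre-planned mitigation can only fire if the supplier, its geography, and its criticality were captured up front, which is exactly the data foundation argued for below.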

On the flip side, all Robotic Process Automation (RPA) can do is automate tactical tasks against pre-defined rules; all machine learning (ML) can do is analyze historical data and identify decisions typically made in normal circumstances (to help RPA automate more); and all AI (which is not artificial intelligence, but augmented intelligence at best) can do is detect that something is not proceeding according to plan, not why it isn't (or what to do about it).

We're not downplaying the usefulness of RPA, ML, or AI (which together can automate and eliminate the majority of tactical work), of advanced analytics (which can identify trends, opportunities, or outliers), or of supplier networks (as improved supplier discovery is valuable in a crisis), but noting that none of these would have prevented the supply chain disasters and the associated sourcing and procurement challenges that resulted from the pandemic, and that organizations that don't put the proper foundations in place to manage spend, organizational data, and risks will be doomed to repeat their failures the next time around.

And this isn't the first time I've published words along these lines. Not long ago, one of my Coronavirus Response pieces was “AI Won't Save You.” Advanced tech doesn't work if you don't have good data, good processes and good visibility into both.

Even at today's level of AI, which is typically nothing more than augmented intelligence in select situations, you can't expect anything (and I mean anything) unless the system has learned behaviors, and that learning only takes place through the repeated application of supervised and semi-supervised learning over time on huge, clean, properly categorized, and properly vetted data sets. If your data is a mess, you can't pluck a solution off the shelf, install it, and hope to get anything good out of it. In fact, it will bring about a supply chain failure faster than the massive fiascos of the past, where big-bang ERP projects, SCP projects, and other endeavors have tanked billion-dollar companies. See some of the author's classic posts over on Sourcing Innovation, including this one on how ERP reliance could land you in the Supply Chain Disaster Record Books.

Similarly, if you don't have good processes, any reasonable insights and recommendations that advanced analytics, ML, or AI systems might be able to pull out of the data regarding next steps will be for nought, as you will have no reasonable, reliable way to act on them. Furthermore, those processes must be powered by solid software platforms. That platform could be an S2P suite, process orchestration software (such as ignio or Zapier), or specialized S2P project/program management (such as Per Angusta); it doesn't matter, as long as the process is well defined and the platform supports it.

However, as noted in the Analysts' Corner note quoted above, the key to getting good data is two-fold:

  • MDM (Master Data Management) — especially for supplier and product/service data
  • SUM (Spend Under Management) — regardless of whether the spend is "sourced" or not
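To make the MDM point concrete, here is a minimal sketch of why a supplier master matters: the same supplier typically appears under several name variants across ERP, P2P, and AP silos, and without canonicalization every downstream system treats them as different vendors. The supplier names and the (deliberately crude) normalization rules are illustrative assumptions, not a real MDM engine:

```python
import re

def normalize_supplier(name: str) -> str:
    """Crude canonicalization: lowercase, strip punctuation and common legal suffixes."""
    name = re.sub(r"[^\w\s]", "", name.lower())          # drop punctuation
    name = re.sub(r"\b(inc|llc|ltd|corp|co|gmbh)\b", "", name)  # drop legal forms
    return " ".join(name.split())                         # collapse whitespace

# The same supplier as it might appear in three different data silos
raw_records = ["ACME Components, Inc.", "Acme Components", "acme components inc"]
canonical = {normalize_supplier(r) for r in raw_records}
print(canonical)  # one canonical supplier, not three "different" ones
```

Real MDM tools go much further (fuzzy matching, address and tax-ID corroboration, survivorship rules), but even this toy version shows how three records collapse into one supplier of record.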

If you consider all of the risk management solutions your organization is desperate for, all of the advanced analytics solutions you want to use, and all of the automation you long for (as you're not getting more headcount any time soon), they all depend upon data, and the more data, the better. Let's take them one by one:

  • Supplier risk management solutions need to be aware of all suppliers and, preferably, the locations you are using and, even better, the products you are buying and the bills of materials (BoMs) that go into them
  • Analytics need a lot of cost/metric data over time to be predictive (and prescriptive)
  • Automation needs to not only integrate with all the software platforms you use, but have rules it can follow that are based on data values/buckets/trends, etc.
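The automation bullet, in particular, can be made concrete: "rules based on data values/buckets" is just conditional routing driven by the supplier master and spend thresholds. The supplier list, approval limit, and routing outcomes below are hypothetical illustrations of the pattern, not any specific platform's logic:

```python
# Illustrative data: an approved-supplier set drawn from the supplier master,
# and an auto-approval threshold (a "bucket" boundary) in dollars.
APPROVED_SUPPLIERS = {"acme components", "beta logistics"}
AUTO_APPROVE_LIMIT = 5_000

def route_requisition(supplier: str, amount: float) -> str:
    """Tactical rule: automate what the data supports, escalate the rest."""
    if supplier.lower() not in APPROVED_SUPPLIERS:
        return "escalate: unknown supplier (MDM gap)"
    if amount <= AUTO_APPROVE_LIMIT:
        return "auto-approve"
    return "route to buyer review"

print(route_requisition("Acme Components", 1200))  # auto-approve
print(route_requisition("Gamma Widgets", 300))     # escalate: unknown supplier
```

Note that the very first check fails whenever the supplier master is incomplete: without MDM, even this trivial rule cannot fire, which is the dependency the bullets above describe.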

And yet, most organizations still have dozens of data silos, and their idea of MDM is the quarterly dump to the data lake for last quarter's spend analysis. That doesn't tell you which suppliers you recently started using, which suppliers you're no longer using, which products are nearing end of life (and where shortages may not be a major problem), and which products are major growth products (and critical to success).

Furthermore, their Spend Under Management (SUM) is limited to the contracts Sourcing has negotiated. This is not enough. To truly take advantage of best-in-class Source to Pay (and related) technology, you need as close to 100% spend under management as possible — i.e. everything should go through Procurement systems except internal payroll, whether you negotiate the spend or not and whether you have any control over the spend or not.

If marketing, legal, or the CXO wants to control their own spending on pet projects, or if the spend is not significant enough that your involvement will save enough to make the effort worthwhile, so be it — but it should still go through approved Procurement systems so that the suppliers, products, services, and other key pieces of information can be tracked and the organization has full spend visibility. Only then will the risk monitoring insights give you the right alerts, the ML-backed automated analysis systems provide you (and other spend stakeholders) with meaningful system-generated insights, and the platforms correctly automate all of the tactical Procurement processes and minimize human effort.

So, if you want a chance at surviving the next disaster, get your Procurement house in order so that you can adopt the right technologies and actually realize the value they have to offer.

Are you signed up for the Spend Matters weekly email update? In it, you'll find exclusive, valuable insight from one of our analysts; must-read articles; an event awaiting your RSVP; and some self-promotion to help us keep the lights on. Maybe some occasional wit, too (we hope). Sign up now so you don't miss out on Spend Matters in your inbox each week.


First Voice

  1. Peter Smith:

    Good stuff Michael – I thought you were saying that only spend under procurement management mattered till I got towards the end of your article when I understood the point you were making! Looking at it as a shareholder, I don’t care too much WHO manages the spend (I’m not hung up on “professional procurement” or CatMan processes) – as long as spend is well-considered, managed by people who know what they are doing and the data is available to let me (or my surrogate in the business, the CPO or CFO) understand and analyse it.
