Spend Analysis in 3 Lessons: Insight to Action (Lesson 3)


This three-part series (see Part 1 and Part 2) covers three spend analysis lessons. Today I conclude with Lesson 3: the opportunities and challenges associated with the visual display of quantitative information.

Lesson 3: Spend “As You Like It” — Contextual Information and Driving Insight to Action

When I last left off with this series, I hinted at some of the needs of the "power user" compared with the business user when it comes to spend interrogation and visualization. My colleague Michael Lamoureux captured it best when he highlighted the requirements of the spend analyst, whose job is not to work at the dashboard level but to hunt for untapped opportunities.

According to Michael, the components of the power user's spend desktop include:

  • Unlimited Spend Cube Creation. One spend cube is never enough (an analyst needs to keep looking at data until she finds savings or inspiration)
  • Multischema Support. What works for accounts payable doesn’t always work well for procurement, and vice-versa
  • Public and Private Cubes. Users need to be able to update their cubes without stepping on others’ toes (note: a cube is not a dashboard — though a dashboard can lead to a cube)
  • Prioritized Overlay-Mapping Rules for Classification. So that analysts don't run into rule conflicts
  • Support for Derived and Ranged Dimensions. How else can one quickly dive into a specific timeframe or out-of-range spend?
  • Rule-based Filters. To eliminate only what the analyst doesn't want
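To make the last two requirements concrete, here is a minimal Python sketch of how prioritized classification rules and a rule-based filter might operate over raw transactions. All suppliers, GL codes, categories and figures are invented for illustration; a real spend analysis platform would implement this far more richly.

```python
# Minimal sketch of prioritized classification rules and rule-based
# filters over raw spend transactions. All names and values are illustrative.

transactions = [
    {"supplier": "Acme Office Co", "gl_code": "6100", "amount": 1200.0},
    {"supplier": "Acme Office Co", "gl_code": "6200", "amount": 300.0},
    {"supplier": "Global Steel Ltd", "gl_code": "5000", "amount": 98000.0},
]

# Rules are evaluated in priority order (lower number wins), so a
# supplier-specific override always beats a generic GL-code mapping.
rules = [
    {"priority": 1, "match": lambda t: t["supplier"] == "Global Steel Ltd",
     "category": "Direct > Raw Materials > Steel"},
    {"priority": 5, "match": lambda t: t["gl_code"].startswith("6"),
     "category": "Indirect > Office Supplies"},
]

def classify(txn, rules):
    """Return the category of the highest-priority rule that matches."""
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if rule["match"](txn):
            return rule["category"]
    return "Unclassified"

for t in transactions:
    t["category"] = classify(t, rules)

# Rule-based filter: keep only indirect spend for this analysis pass.
indirect = [t for t in transactions if t["category"].startswith("Indirect")]
print(sum(t["amount"] for t in indirect))  # 1500.0
```

Because the priority ordering resolves conflicts deterministically, an analyst can layer a personal override rule on top of the shared rule set without breaking anyone else's cube.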

For a spend analyst to drive insight to action, however, we come back to the notion of the effective visual display of quantitative information. And here again, I will reference the works of Gene Zelazny. To drive insight to action, we must be able to convince our peers of where to look and, in dashboards and reports, enable them to see for themselves what an opportunity looks like, and to drill into it to win their support and backing. Better yet, we can simply present drillable reports that guide users down a path to the conclusions we want them to reach, without handing them explicit takeaways.

At the end of the day, it is this type of data-driven decision guidance that can enable our business colleagues to get behind our ideas and support action and, occasionally, tough decisions. Similarly, we must consider how spend analysis can evolve into a system of measurement for procurement performance that all parties can sign up for, rather than remaining only a tool for identifying opportunities.

After all, it’s how we manage and book savings that matters to P&L owners, executives and shareholders. This is a much bigger topic, one that we don’t have time to explore in detail in this series. But I would challenge everyone using spend analysis tools to think of the ultimate deployment as finance and procurement’s joint system of record for identifying opportunities, as well as tracking savings and performance in a similar manner to accounting systems of record for managing and closing the books. And this, of course, requires agreement and “signing off” on booked savings.

Tricks of the Insight Trade

In this series, I’ve talked about a number of the tools of the spend analysis trade and, as I see it, a number of the components necessary to get started on the right foot. But there are additional considerations worth thinking about, too. Here are four to ponder as we wind down this series:

  • How should you best enrich spend information with third-party data? For example, should this data be provided by the procurement organization, by the spend analysis vendor (as a reseller of third-party data) or by network/P2P sources (single-organization or multi-organization in the aggregate)?
  • What are the best tactics for benchmarking relative performance and spending? For example, benchmarking cost for like indirect SKUs is very different from benchmarking performance for direct materials parts, components and semi-finished or finished products. One size does not fit all, and above all, don't expect your spend analysis vendor to be terribly effective on the benchmarking front for direct materials.
  • What are the best ways to incorporate commodity data into a spend analysis effort? For example, adding in Bureau of Labor Statistics-type commodity data might be useful in some cases, but if you’re buying resin or coil within China (or your suppliers are) you need local index data, not generic data, for it to be valuable beyond directional spend purposes.
  • Finally, how can we use spend and supplier data to fine-tune the sources of news — or even social media content — that we can deploy to improve our reaction times to externalities? For example, tracking movements in exchange rates, regulatory change and environmental disruption in the locations that most affect our key suppliers.
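On the commodity-index point above, the arithmetic is simple but easy to get wrong without the right (local) index. The sketch below deflates category spend by a local index to separate price movement from real volume change; the index values and spend figures are made up purely for illustration.

```python
# Illustrative sketch: deflating category spend by a local commodity
# index to separate price movement from real volume change.
# The index values and spend figures below are hypothetical.

local_resin_index = {"2015Q4": 100.0, "2016Q1": 110.0}   # e.g. a local China resin index
resin_spend       = {"2015Q4": 500000.0, "2016Q1": 560000.0}

def real_spend(period, base="2015Q4"):
    """Spend restated at base-period prices (index-deflated)."""
    deflator = local_resin_index[period] / local_resin_index[base]
    return resin_spend[period] / deflator

# Nominal spend rose 12% quarter over quarter, but at constant prices
# the increase is under 2%: most of the movement is price, not volume.
print(round(real_spend("2016Q1")))  # 509091
```

Run the same calculation against a generic national index instead of the local one and you can easily misread a price-driven increase as a volume problem, which is why generic data is only good for directional purposes.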

Looking Ahead

I’ve always believed the future starts with the past. This belief holds true when it comes to improving procurement performance, as well. Which reminds us, of course, of the importance of a strong data foundation in procurement. But if a solid data foundation is central to spend (and supply) analytics, and visualization and analytics hold the keys to interrogating data to drive intelligence and opportunity, what does the future hold?

This question brings me to the final thought that I want to end this series with: The future of spend analysis will rely on predictive intelligence that will recommend courses of action and drive a range of benefits, from better budgeting to forecasting commodity and supplier trends — and recommending associated strategies. Ideally, predictive information will be based on aggregate data sources to drive even stronger foundations. Supplier networks, EDI data exhaust and other sources are all promising forms of new data inputs to drive predictive analysis.

But let’s not get ahead of ourselves. Ultimately, what matters first is getting started and putting in place a spend analysis program that touches on all the foundational bases — data acquisition, cleansing, classification, enrichment, foundational/advanced drillable dashboards, power-user analytics and a flexible information architecture.

With this foundation, you can’t go wrong as you plot out a future strategy to leverage data in ways we likely can’t begin to imagine from our 2016 lens — at least not yet!
