The vendors considered
Exploring the actual market state of the top 5 capabilities
1. Data Modeling
The ability to build more sophisticated analyses on cost drivers, demand, contracts, working capital, risk, supply chain and overall procurement performance while maintaining data that is flexible and rich enough to fully support this type of spend analytics.
The average vendor can use its analytics schema to go beyond deriving reports from transactional purchasing and invoicing data to provide deeper quantitative insights into the data, including performance modeling (e.g., normalizing/enhancing spend data for benchmarking), causal factor analysis (i.e., what/who is driving the spending and by how much), trending/time-phased analysis and drill-downs through master data taxonomies, e.g., category and supplier taxonomies. Additionally, the schema should support extensible data models, e.g., adding lookup tables, and multi-schema support, e.g., allowing each business unit to tailor the models differently at initial implementation.
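To make the idea of an extensible schema concrete, here is a minimal sketch in Python (pandas), using invented table and column names, of transactional spend joined to a category taxonomy lookup table so totals can be drilled down from parent category to category to supplier. It is only an illustration of the pattern, not any vendor's actual data model.

```python
import pandas as pd

# Hypothetical transactional purchasing data (invoice lines).
transactions = pd.DataFrame({
    "invoice_id":  [1, 2, 3, 4],
    "supplier":    ["Acme Corp", "Acme Corp", "Globex", "Initech"],
    "category_id": ["C10", "C11", "C10", "C20"],
    "amount":      [12000.0, 4500.0, 8300.0, 2100.0],
})

# Extensible lookup table: a two-level category taxonomy that can be swapped
# or extended per business unit without touching the transactional data.
category_taxonomy = pd.DataFrame({
    "category_id":     ["C10", "C11", "C20"],
    "parent_category": ["IT", "IT", "Facilities"],
    "category":        ["Hardware", "Software", "Cleaning"],
})

# Join transactions to the taxonomy, then drill down:
# total spend by parent category -> category -> supplier.
spend = transactions.merge(category_taxonomy, on="category_id")
drilldown = (
    spend.groupby(["parent_category", "category", "supplier"])["amount"]
         .sum()
         .sort_values(ascending=False)
)
print(drilldown)
```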
The top vendors extend the analytics schema to also support predictive data analysis, AI/ML models or real-time visibility across many systems. Additionally, they offer multiple predefined data models to choose from, e.g., different industry/category models as a base from which to tailor. Some can use metadata for enhanced real-time observability of the same or similar records updated across multiple systems, which helps display spend data in real time. Others use AI/ML data models to generate predictive and even prescriptive analytics that provide broader and deeper insights into opportunities to pursue. Some of these high performers also support more advanced ‘outside-in’ analytics models that integrate external data, such as commodity indices, risk data and ESG/CSR data feeds.
The end result for buyers is not just a timely, accurate ledger of purchasing transactions but deeper insights into spend categories, demand/cost drivers, spending processes/controls and the top opportunities for improving spend/supply performance.
2. Data Pipeline
The ability to automatically extract, transform and load spend data, supply data and related master data into a platform’s aggregated global analytics system from numerous data sources while maximizing data quality and cleanliness.
The average vendor should be able to offer basic master data management (MDM) capabilities. These include cleansing, standardizing, matching, indexing, enriching and harmonizing the S2P data that is ETL’d into the centralized master data store against the underlying schema. The system should be able to accept loads from data sources in different formats and to remember those formats for future use.
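As a rough illustration of the matching and harmonization step, the sketch below uses Python's standard difflib to normalize hypothetical supplier names and match near-duplicates against a master record, standing in for the richer matching a real MDM layer would apply. The names, suffix rules and 0.8 similarity cutoff are all assumptions for the example.

```python
import difflib

# Hypothetical master supplier records already in the central store.
master_suppliers = ["Acme Corporation", "Globex Inc", "Initech LLC"]

# Incoming S2P records with inconsistent naming (raw data being ETL'd in).
incoming = ["ACME Corp.", "Globex, Inc", "Initech", "Umbrella Group"]

def normalize(name: str) -> str:
    """Standardize case and strip punctuation/legal-form noise before matching."""
    cleaned = name.lower().replace(".", "").replace(",", "")
    for suffix in (" corporation", " corp", " inc", " llc", " group"):
        cleaned = cleaned.replace(suffix, "")
    return cleaned.strip()

normalized_master = {normalize(m): m for m in master_suppliers}

# Match each incoming record to the closest master record above a threshold;
# anything below the threshold is flagged for manual review.
for raw in incoming:
    candidates = difflib.get_close_matches(
        normalize(raw), normalized_master.keys(), n=1, cutoff=0.8
    )
    if candidates:
        print(f"{raw!r} -> harmonized to {normalized_master[candidates[0]]!r}")
    else:
        print(f"{raw!r} -> no confident match, route to manual review")
```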
To manage knowledge, a vendor should be able to guide the processes through context-specific, rule-based workflows and to leverage external BPM systems and automated ‘outside-in’ knowledge served up within those workflows.
Top performers enhance basic MDM capabilities with features such as advanced harmonization through best-practice views/models, the application of AI, etc. In top performing systems, users are presented with an easy matching UI when loading datasets that enables them to quickly generate new formats, match up old fields with new fields and identify additional columns to be automatically added. Additionally, top performers advance knowledge management via AI/ML or advanced algorithms.
We have seen top performers who take MDM capabilities a step further. For example, some provide master data to other third-party software like contract management systems or SRM providers, which reduces the number of systems a customer needs to maintain and saves time. Additionally, top performers’ knowledge management capabilities integrate all steps of the data pipeline, including everything from data cleansing to KPIs and reporting, into patterns or insights that are easy for the user to understand. This may include an AI/ML capability for guided insights. We have also seen top performers in dataset loading and management who take the UI to the next level with a variety of standard templates that define which tables and fields to expect from source systems and multiple ways of interacting with the data for ease of use, e.g., in system or via punch-out to Excel.
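To illustrate the idea of remembered loading templates, here is a small hypothetical sketch in Python (pandas): a per-source-system column-mapping template is applied to each new extract, and any columns the template does not cover are flagged for the user. The source-system names, column names and load_source_file helper are invented for illustration only.

```python
import io
import pandas as pd

# Hypothetical mapping templates: one per source system, remembered after the
# first load so later extracts in the same format are mapped automatically.
MAPPING_TEMPLATES = {
    "erp_system_a": {"Vendor Name": "supplier", "Inv Amt": "amount", "Doc Date": "invoice_date"},
    "pcard_feed":   {"MERCHANT": "supplier", "TXN_AMOUNT": "amount", "TXN_DATE": "invoice_date"},
}

def load_source_file(file_or_path, source_system: str) -> pd.DataFrame:
    """Load a raw extract and rename its columns onto the analytics schema."""
    template = MAPPING_TEMPLATES[source_system]
    raw = pd.read_csv(file_or_path)
    mapped = raw.rename(columns=template)
    # Columns not covered by the template are kept but flagged, so a user can
    # decide whether to add them to the template for future loads.
    unmapped = [c for c in mapped.columns if c not in template.values()]
    if unmapped:
        print(f"Unmapped columns from {source_system}: {unmapped}")
    return mapped

# Example load from an in-memory CSV standing in for a real source extract.
sample = io.StringIO("Vendor Name,Inv Amt,Doc Date,Plant\nAcme Corp,1200.50,2024-01-15,Berlin\n")
print(load_source_file(sample, "erp_system_a"))
```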
3. Data Cleansing and Enrichment
The ability to cleanse and enrich data in cases where it is not possible to do so in the source systems, so the user can feel confident that the analyzed data is complete and accurate.
The average vendor should provide multi-taxonomy support with taxonomies that can be user-defined and cross-linked across the underlying schema. For cleansing, the average vendor should be able to use complex regular-expression-based rules to identify errors with unusual patterns and cross-field verification requirements across multiple types of identifiers. The system should also be able to group/family data into core taxonomies/hierarchies without losing the original entities. Data should be automatically validated against vendor-cultivated databases and all integrated (subscription) data sources, with all necessary alerts generated and any changes logged for audit purposes. Numeric and measurement data should be automatically subjected to statistical quality checks to verify the likelihood of correctness.
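A minimal sketch of what such rule-based cleansing might look like, assuming hypothetical invoice records and invented identifier patterns: regular-expression rules check identifier formats, a cross-field rule checks that the VAT number is consistent with the record's country, and a simple median-based check flags statistically implausible amounts.

```python
import re
import statistics

# Hypothetical invoice records to be validated before loading.
records = [
    {"invoice_id": "INV-000123", "country": "DE", "vat_id": "DE811234567", "amount": 1250.00},
    {"invoice_id": "INV-000124", "country": "FR", "vat_id": "DE811234567", "amount": 1310.00},
    {"invoice_id": "inv_125",    "country": "DE", "vat_id": "DE811234567", "amount": 980.00},
    {"invoice_id": "INV-000126", "country": "DE", "vat_id": "DE811234567", "amount": 99000.00},
]

# Regular-expression rules for identifier formats (illustrative patterns only).
INVOICE_ID_RULE = re.compile(r"^INV-\d{6}$")
VAT_BY_COUNTRY = {"DE": re.compile(r"^DE\d{9}$"), "FR": re.compile(r"^FR[A-Z0-9]{2}\d{9}$")}

issues = []
for rec in records:
    if not INVOICE_ID_RULE.match(rec["invoice_id"]):
        issues.append((rec["invoice_id"], "invoice id does not match expected pattern"))
    # Cross-field check: the VAT number must match the pattern for the record's country.
    country_rule = VAT_BY_COUNTRY.get(rec["country"])
    if country_rule and not country_rule.match(rec["vat_id"]):
        issues.append((rec["invoice_id"], "vat id inconsistent with country"))

# Simple statistical quality check on amounts: flag values far from the median.
amounts = [r["amount"] for r in records]
median = statistics.median(amounts)
for rec in records:
    if rec["amount"] > 10 * median:
        issues.append((rec["invoice_id"], f"amount {rec['amount']} is a statistical outlier"))

for invoice_id, problem in issues:
    print(invoice_id, "->", problem)
```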
Top vendors differentiate themselves by doing the above while allowing additional flexibility across taxonomies (e.g., supporting standardized ones like UNSPSC concurrently with internal proprietary taxonomies), such as allowing users to pick whether they follow best practices or apply their own rules. For data cleansing, top vendors have hybrid or fully AI-based capabilities that can train and evolve rules based on changing errors, learned pattern similarity and manual overrides in order to reduce user frustration while improving data quality. Additionally, data validation for top vendors can include supplier-data-specific capabilities that validate and cross-reference aggregated or curated ‘outside-in’ supplier data feeds, e.g., for risk and compliance.
Some top performing vendors can also apply their own knowledge bases (e.g., via ‘community intelligence’ and supplier information networks) through interactive supplier repositories and AI/ML techniques. These techniques can categorize incoming data based on similarities to already captured data, further cutting down the amount of time a buyer needs to spend on data management. Finally, some vendors have expanded their data cleansing and enrichment to improve data fidelity and quality beyond spend/supplier data into areas such as product/part data (which ties to information like inventory, assets and revenues/profits), contract data (to improve visibility into compliance and future spending), supplier systems, worker data, IT-related data (e.g., SaaS licenses) and supply chain planning systems. This is often done on a services basis and/or with partners, but it will eventually migrate into native solution capabilities.
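The similarity-based categorization described above can be sketched, under the assumption that scikit-learn is available, by matching new line-item descriptions to the most similar already categorized items via TF-IDF vectors and a nearest-neighbor lookup. The descriptions and categories here are invented; production systems would use far richer models and community data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

# Historical line items that have already been categorized (hypothetical data).
history_text = [
    "dell latitude laptop 14 inch",
    "adobe creative cloud annual license",
    "office cleaning services march",
    "hp laserjet printer toner cartridge",
]
history_category = ["IT Hardware", "Software", "Facilities", "Office Supplies"]

# New, uncategorized line items coming in from the data pipeline.
incoming = ["lenovo thinkpad laptop 15 inch", "window cleaning service april"]

# Vectorize descriptions and find the most similar historical item for each
# incoming record; its category is proposed as the enrichment suggestion.
vectorizer = TfidfVectorizer()
history_vectors = vectorizer.fit_transform(history_text)
model = NearestNeighbors(n_neighbors=1, metric="cosine").fit(history_vectors)

distances, indices = model.kneighbors(vectorizer.transform(incoming))
for text, dist, idx in zip(incoming, distances[:, 0], indices[:, 0]):
    print(f"{text!r} -> suggested category: {history_category[idx]} (distance {dist:.2f})")
```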
4. Spend Cube Analysis
The ability to allow procurement power users and business users to tailor spend insights to their unique needs with core analytics for spend cubes, e.g., role-based, category-tuned, drillable and filterable views.
The average vendor should support native cross-tabs/pivot tables, including Excel 2-D cross-tabs/pivot tables and some level of 3-D cross-tabs/pivot tables. For cubes, the average vendor should support multiple cubes on standard spend data, with both auto-derived and user-derived dimensions. Users should be able to derive their own measures, filters and views that they can apply to dashboards or individual views.
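As a concrete, simplified picture of a spend cube, the pandas sketch below builds a 2-D cross-tab of spend by category and business unit from hypothetical transactions and then derives a custom measure (each supplier's share of its category's spend) of the kind a user might save to a dashboard or view.

```python
import pandas as pd

# Hypothetical spend transactions with the dimensions a cube would expose.
spend = pd.DataFrame({
    "category":      ["IT", "IT", "IT", "Facilities", "Facilities", "Marketing"],
    "supplier":      ["Acme", "Acme", "Globex", "CleanCo", "CleanCo", "AdWorks"],
    "business_unit": ["EMEA", "AMER", "EMEA", "EMEA", "AMER", "AMER"],
    "amount":        [12000, 8000, 5000, 3000, 2500, 7000],
})

# A 2-D cross-tab: spend by category and business unit.
cube_2d = pd.pivot_table(spend, values="amount", index="category",
                         columns="business_unit", aggfunc="sum", fill_value=0)
print(cube_2d)

# A user-derived measure: each supplier's share of its category's spend,
# the kind of custom measure that can be applied to a dashboard or view.
by_supplier = spend.groupby(["category", "supplier"])["amount"].sum()
share_of_category = by_supplier / by_supplier.groupby(level="category").transform("sum")
print(share_of_category.round(2))
```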
Top vendors differentiate themselves by supporting these cross-tabs/pivot tables and k-d (‘k-dimensional’) cross-tabs/pivot tables with the ability to fix points for real-time drill down. Beyond just a standard spend cube, top vendors have workspaces that contain multiple cubes on multiple datasets that can be linked across derived dimensions to allow ‘drill around’ spend analysis. Users should also be able to derive and even share their own advanced measures, filters and views, such as cross-transaction derivation/linkages to enable tailoring and re-use of the analytics by end-to-end process, use case, role, KPI, etc.
Some vendors even offer highly customizable cross-tabs/pivot tables, such as those that allow individual data points to be fixed and filtered around or those with the capability to link and filter across multiple cross-tabs and in-cell visualizations. Essentially, these vendors offer the user multiple ways to build cross-tabs/pivot tables. Beyond the ability to link cubes across spend, supply, ESG and risk factors in a single workspace (which allows for analysis across all dimensions and negates the need to work across multiple reports), some vendors allow the analytics to dynamically link to external data (e.g., benchmarks and commodity indices) and internal data (e.g., KPI targets) to create enhanced spend dashboards that can be saved/shared.
5. Advanced Analytics
The ability to conduct predictive analytics (e.g., predictive cost/spend models) and prescriptive analytics (giving recommendations on opportunities/risk) — including using AI/ML technology — so that organizations can stop being reactive and begin seizing potential spend improvement opportunities proactively.
This capability focuses on using higher-impact analytics to create business value by analyzing spending data more broadly, deeply and proactively rather than just summarizing historical transaction data in the hopes that users will know what to do with it. The system should provide predictive analytics based on augmented models, such as time-phased demand/supply volume models for direct spend, with machine learning for parameter adjustment, e.g., predicting future spend based on internal/external model inputs.
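A deliberately simple stand-in for such a predictive model: fit a linear trend to hypothetical monthly spend and project the next three periods. Real offerings would use richer time-phased demand/supply inputs and learned parameters, but the sketch shows the basic shape of turning historical spend into a forward-looking estimate.

```python
import numpy as np

# Hypothetical monthly spend for one category (12 historical periods, in $k).
monthly_spend = np.array([100, 104, 103, 108, 112, 111, 115, 119, 118, 123, 126, 125], dtype=float)
months = np.arange(len(monthly_spend))

# Fit a simple linear trend; a production model would add demand/supply drivers
# and machine-learned parameter adjustment, as described above.
slope, intercept = np.polyfit(months, monthly_spend, deg=1)

# Predict the next three months of spend from the fitted trend.
future_months = np.arange(len(monthly_spend), len(monthly_spend) + 3)
forecast = slope * future_months + intercept
print("Forecast for next 3 months:", forecast.round(1))
```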
Additionally, users should have some control over the predictive analytics, such as changing thresholds or inserting calculations, so that the prediction/prescription models can be validated, tuned and maintained, i.e., to ensure the system is not a ‘black box.’ Similarly, what-if analysis should be available: built-in trend analyses for spend over time by supplier, commodity, geography, etc. that can generate comparative ‘what-if’ reports covering hypotheticals in which purchases increase/decrease by a user-defined percentage or range. Some vendors use predictive algorithms to identify the relevance, urgency and potential impact of outliers relative to upcoming purchases and sourcing events, allowing for immediate action. We have seen vendors allow the user to select the trend data they want to use in their dashboards, rather than simply generating a trend on the back end, which allows for more targeted, relevant trends for each customer. We have also seen vendors meld their risk tolerance analysis with predictive capabilities to test different business scenarios.
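For the what-if reports described above, a minimal sketch with hypothetical category spend and user-defined percentage changes might look like this; the category names and scenario values are invented.

```python
import pandas as pd

# Hypothetical spend by category for the trailing twelve months.
baseline = pd.Series({"IT": 250_000.0, "Facilities": 90_000.0, "Marketing": 140_000.0})

# User-defined what-if assumptions: percentage change applied per category.
scenario = {"IT": +0.10, "Facilities": -0.05, "Marketing": 0.0}

# Build a comparative report showing baseline vs. scenario spend and the delta.
adjusted = baseline * (1 + pd.Series(scenario))
report = pd.DataFrame({
    "baseline": baseline,
    "scenario": adjusted,
    "delta": adjusted - baseline,
})
print(report)
```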
Another type of analysis capability focuses on clustering and outlier detection. The average vendor should provide core support for advanced analytics, such as outlier identification via standard statistical and clustering algorithms that can flag not just data quality issues but also potential fraud, errors or operational events. Top vendors use advanced ML algorithms that adapt to changing data streams and patterns to detect outliers that standard algorithms would miss. For top vendors, the hybrid/AI system automatically defines various time windows for rolling integrity analysis. Beyond detecting trends of interest, the system may also detect potential issues by comparing against external benchmarks and anonymized performance across the provider’s client base, i.e., community intelligence.
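As an illustration of outlier detection on spend data, the sketch below, assuming scikit-learn is available, fits an Isolation Forest to hypothetical invoice amounts and flags the records it scores as anomalous; the data, contamination rate and parameter choices are assumptions for the example, and a production system would look at many more features than amount alone.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical invoice amounts for one supplier; most are routine, a few are not.
rng = np.random.default_rng(42)
routine = rng.normal(loc=1_000, scale=120, size=200)
suspicious = np.array([9_500.0, 14_200.0, 25.0])
amounts = np.concatenate([routine, suspicious]).reshape(-1, 1)

# Fit an Isolation Forest and flag the records it scores as anomalous;
# flagged items would be surfaced as potential errors, fraud or events to review.
detector = IsolationForest(contamination=0.02, random_state=0)
labels = detector.fit_predict(amounts)  # -1 = outlier, 1 = inlier

flagged = amounts[labels == -1].ravel()
print("Flagged amounts:", np.sort(flagged).round(2))
```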
Top vendors support predictive analytics with neural networks and related AI models for advanced prediction on real-time data and/or shallow learning and similar next-gen techniques to detect temporary/rapid changes. They also allow for more control over the predictive analytics, such as the ability to train/insert new AI/ML models, create variants of embedded algorithms or build new algorithms inside the tool from scratch. Top vendors also provide what-if analysis and advanced multi-factor models to generate ranges and banded trend charts for specific scenarios that can be integrated with other third-party data science tools.
What does a Spend Analytics solution do?
Spend Analytics solutions provide a single view into all historic, current and planned supplier spending in order to improve that spending through better sourcing, compliance and demand management.
Why buy a Spend Analytics solution
Spend analytics helps generate business-relevant spending insights across internal spenders and the supply base by leveraging historic and predictive datasets across the enterprise. By generating these insights, spend analytics solutions proactively help reduce costs in the form of input costs and unnecessary consumption, mitigate both internal and supplier risk/non-compliance and design better processes, policies and practices.
How ‘Market State’ is derived from the SolutionMap dataset
These “Top 5” (of 21) critical digital capabilities stem from the Spend Matters TechMatch workbench — derived from 108 requirements scored in the Spend Analytics Spring 2024 SolutionMap solution benchmark.
The Top 5 capabilities are the highest-weighted critical capabilities that are central to the displayed solution market benchmark. They have been developed by Spend Matters’ team of analysts and refined by procurement users in tech-selection projects using our market-proven SolutionMap benchmarking dataset and associated TechMatch decision-making tool.
Spend Matters® SolutionMap Procurement Technology Intelligence
Spend Matters built a better way to help companies, and their consultants, advance procurement practices via technology. SolutionMap compares technologies on two main factors: technical capability (gathered via a rigorous RFI process) and validated customer ratings. Data is refreshed every six months. Participation for vendors is completely non-commercial.
Spend Matters® SolutionMap Intelligence Process Overview
Featured vendors undergo a rigorous, RFI-based assessment process including functionality counter-scoring, supporting materials review, tech demos and submission of independent customer references. Expert analysts deeply vet solution capabilities prior to ranking inclusion. Only relevant industry players that meet the criteria determined by Spend Matters are invited in order to create a complete view and optimal intelligence for its members.
- To learn more about the Spend Matters methodology, click here.
- For deep Spend Matters insights informed by more than 10 years of independent, zero pay-to-play, brutally honest coverage of vendors, market developments, M&A activity and trends, become an Insider member.
- To compare product strengths and weaknesses, try Spend Matters TechMatch to directly assess the capabilities of digital procurement solutions.
- For more information on procurement tech, services and vendors, consult Spend Matters’ comprehensive free directory.