In our last installment of this series based on a Wall Street Journal interview with Tim Campos, I discussed some strategies for procurement to help IT, and help itself, by changing how it specifies and sources “solutions” for the internal and external stakeholders it supports. It’s not such a big leap to extend the notion of “solutions” to include physical goods if you again use the services metaphor (i.e., XaaS – Everything as a Service). For example, metal springs aren’t springs – they are motion control solutions that provide cushion as a service! When an original design manufacturer buys an electronic device to sell, it’s buying an electronics manufacturing service from an EMS solution provider.
So, procurement must not only manage itself as a world-class services provider (e.g., in a shared services or “Global Business Services” model), but also understand the “art of the possible” – how technology can be developed by a rich ecosystem of providers:
- Traditional goods and services suppliers (including distributors) using technology as part of their digital business strategy to differentiate themselves and be “stickier” in their customer relationships
- Traditional packaged applications technology suppliers (SaaS and/or on-premise)
- Custom development done in-house or through third parties such as BPO firms (I know one major buyer that used a BPO firm to custom develop a near-clone of a PLM-centric direct materials packaged application because it considered the capability so strategic)
- Industry disruptors (e.g., Amazon, Google, Uber, etc.) and more traditional electronic business networks/intermediaries
- Combinations of above. For example, consider large application suite vendors like SAP, Oracle, and SalesForce that are selling PaaS solutions to a developer/partner ecosystem – and to themselves – to allow a core-kernel application suite while allowing controlled bolt-on development instead of the application anarchy that exists at so many firms today
To get more specific, let’s get back to the interview with Facebook CIO Tim Campos and a few areas cited by the CIO for custom development and then consider the procurement implications…
Get Predictive and Productive
Campos told the WSJ: “We built a repair tool that says how many machines need to be repaired today. It connects with the supply chain, and identifies where we have parts inventory gaps. It does all of this automatically. The repair technician only needs to pick up the spare parts and put them in a cart, initiate the repair and push a few buttons.”
Spend Matters’ take: This is a classic predictive analytics use case (and hopefully it’s predictive based on machine learning rather than rote fixed replacement schedules) that procurement should try to apply anywhere major spend occurs. Predictive maintenance for MRO assets is the classic scenario, but the approach can be applied to any purchased asset, to any sold asset/product (to help align field service replenishment purchases) and to any set of processes that generate costs. For example, a large high-tech provider ran predictive cost forecasting models for its various customer events to help predict expenses and to provide those expense benchmarks to its employees. This allowed better financial planning and analysis accuracy for the CFO. It also led to better spend control when the big-spender sales/account managers (“outliers”) knew they were being monitored (i.e., the Hawthorne Effect provides good demand management) and had better be selling at levels commensurate with their spending. Bottom line: Cost forecasting and spend planning are not just for direct materials!
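To make the repair-tool idea concrete, here is a minimal sketch of the kind of logic Campos describes: flag machines whose predicted failure probability crosses a threshold, roll up the parts they need, and surface inventory gaps before a technician is dispatched. All data structures, names and thresholds below are illustrative assumptions, not Facebook’s actual implementation.

```python
# Hypothetical repair-planning check: flag machines predicted to fail
# and identify parts inventory gaps. Names/thresholds are assumptions.

REPAIR_THRESHOLD = 0.8  # predicted failure probability that triggers repair

def plan_repairs(predictions, bills_of_material, inventory):
    """predictions: {machine_id: failure_probability}
    bills_of_material: {machine_id: {part: qty_needed}}
    inventory: {part: qty_on_hand}
    Returns (machines to repair today, parts shortfalls to reorder)."""
    to_repair = [m for m, p in predictions.items() if p >= REPAIR_THRESHOLD]
    required = {}
    for m in to_repair:
        for part, qty in bills_of_material.get(m, {}).items():
            required[part] = required.get(part, 0) + qty
    gaps = {part: qty - inventory.get(part, 0)
            for part, qty in required.items()
            if qty > inventory.get(part, 0)}
    return to_repair, gaps

repairs, gaps = plan_repairs(
    {"srv-01": 0.92, "srv-02": 0.35, "srv-03": 0.88},
    {"srv-01": {"fan": 2, "dimm": 1}, "srv-03": {"fan": 1}},
    {"fan": 2, "dimm": 4},
)
print(repairs)  # machines flagged for repair today
print(gaps)     # parts shortfalls to close before the repair run
```

The same pattern generalizes beyond the data center: swap machines for any maintained asset and the bill of materials for any replenishment list, and the tool becomes a demand signal feeding procurement.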
Predictive analytics will also have an impact on talent management, although Facebook’s CIO cited a more straightforward automation scenario here…
Campos (to the WSJ): “It tells me all of the people I have to interview today. I submit my opinion on whether someone should be hired or not, and the report goes on its merry way. When recruiters schedule interviews, they have a pipeline of candidates, roles and interview panelists. All of that is in a database. All of the complexity of detail here is managed by the tools. The tool makes the scheduling decisions.”
SM take: There are two big lessons here. One is that technology-led efficiency improvements can have a big impact on effectiveness when the resources being freed up are strategic. In procurement, if you can free up 20-40% of a category manager’s time with good tools (e.g., spend analysis, market intelligence, performance reporting, etc.), you are freeing up the savings-generating capacity of a profit center, not reducing headcount of a transactional cost center. The other lesson is that technology is used to reduce complexity, not just automate workflow. In fact, machine learning is not just for AI-based spend analysis anymore (if you use this approach versus a rules-based approach). You do have a modern spend analysis tool, right? And machine learning is becoming more prevalent in many areas (e.g., fraud detection, invoice scanning/conversion, etc.) and can even be applied in the recruiting scenario cited by Mr. Campos.
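To illustrate the learning-based (versus rules-based) distinction, here is a toy classifier that categorizes invoice line descriptions by similarity to already-labeled examples – the nearest-neighbor idea behind ML-driven spend analysis, in which categories are learned from data rather than hand-written rules. The data, categories and token-overlap scoring are all invented for illustration; real tools use far richer trained models.

```python
# Toy ML-style spend classification: assign a spend category by
# similarity to labeled examples, rather than by hand-coded rules.
# Examples and categories are invented for illustration.

def tokens(text):
    return set(text.lower().split())

def classify(description, labeled_examples):
    """Pick the category of the example sharing the most tokens."""
    scores = {}
    for example, category in labeled_examples:
        overlap = len(tokens(description) & tokens(example))
        scores[category] = max(scores.get(category, 0), overlap)
    return max(scores, key=scores.get)

examples = [
    ("laptop docking station", "IT Hardware"),
    ("contract legal services retainer", "Professional Services"),
    ("server spare fan assembly", "MRO"),
]
print(classify("spare fan for rack server", examples))  # → MRO
```

The point of the sketch: adding a new category means adding labeled examples, not writing and maintaining new rules – which is why learning-based classification scales where rules-based spend mapping bogs down.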
Some talent management systems can “learn” from behavioral applicant screening and from employee performance evaluations to help predict which applicant behaviors (and underlying personality traits) will affect performance. I know a procurement organization that even used this technology to find staff outside of procurement who had the behavioral characteristics well suited to procurement. This capability can not only help procurement in its talent search; think also about the effect it might have as an embedded capability within a contingent labor platform. Think of this as eHarmony for B2B services procurement. Again, it’s not that far away from being mainstream.
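The matching idea behind that “eHarmony for B2B services procurement” quip can be sketched simply: score candidates against a trait profile derived from top performers and rank by similarity. The trait names, weights and cosine-similarity scoring below are hypothetical stand-ins for what a real behavioral screening platform would learn from data.

```python
# Hypothetical trait-based matching: rank candidates by cosine
# similarity to an "ideal" profile of high performers.
# Trait names and all numbers are invented for illustration.
import math

def cosine(a, b):
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Assumed average trait profile of high-performing category managers
ideal = {"negotiation": 0.9, "analytics": 0.8, "relationship": 0.7}

candidates = {
    "cand-A": {"negotiation": 0.85, "analytics": 0.75, "relationship": 0.6},
    "cand-B": {"negotiation": 0.2, "analytics": 0.9, "relationship": 0.3},
}
ranked = sorted(candidates, key=lambda c: cosine(ideal, candidates[c]),
                reverse=True)
print(ranked)  # best behavioral fit first
```

Embedded in a contingent labor platform, the same scoring could rank available contractors against a role’s behavioral profile rather than keywords alone.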