The evolution of product cost management tools and the state of the art (Part 2): The 2nd revolution

12/19/2019

Spend Matters welcomes this two-part guest post from Eric Hiller, Managing Partner of Hiller Associates, a business performance consultancy specializing in product cost management (PCM).

Our first article in this series looked at product cost management software and how it fit into the world of procurement software in the earlier years of development. This Part 2 focuses on the second wave of developments in PCM: feature-based automated 3-D CAD costing tools; advanced cost-accounting and control systems; the role of little and big data; and the future of product cost management.

To recap, it’s important to consider that purchasing has added a lot of new tools to what was mostly a relationship-focused discipline. These developments include:

  • Data-rich environments of spreadsheets, MRP and ERP systems
  • Supply chain management and supplier relationship management systems
  • Online auctions
  • Spend analytics tools/product cost management (PCM) software

And it’s important to recall our roadmap to this journey through time and technology, which is shown in Figure 1. Notice that the 3-D CAD development is the latest one. Let’s explore how this has developed and what it has to offer businesses today.

The 2nd PCM Industrial Revolution: Feature-based automated 3-D CAD costing tools

In the mid- to late 1990s, researchers within universities became interested in the idea of driving should-cost estimates for products more directly from product geometry. The impetus for this was the sobering result of a study done in the 1960s by DARPA, which produced the oft-quoted maxim that 70-80% of cost is determined and frozen in the first 20% of the product development cycle. The logical conclusion by researchers was that product development teams needed software that helped engineers understand what their products cost, as they were designing them.

What unlocked this possibility was the near-ubiquitous adoption of 3-D solid modeling by the engineering community in the late 1980s and early 1990s. Before the move to 3-D modeling, there was no easy source for product geometry. However, in the 3-D model, the engineer was spending hours creating a design artifact rich with information that could be tapped for many other analyses, including cost. In fact, product cost management was a latecomer to the table: the 3-D CAD model had already been exploited for finite element analysis, computational fluid dynamics, motion dynamics and many other advanced engineering analyses.

There had also been valiant attempts at “feature recognition,” translating the 3-D model into the set of “features” that a human manufacturing engineer would see. These efforts were not focused enough to yield useful results. However, when the use case was narrowed to a specific need (product costing), a problem that had been a science project in the past became a tractable reality.

In 1996, Michael Philpott of the University of Illinois and I began collaborating with John Deere on a methodology later dubbed “feature-based costing,” which led to a company first called FBC Systems and then aPriori.

Previously, process models were constrained by the amount of geometric input a user would reasonably take time to provide, but feature-based 3-D analysis could feed nearly unlimited geometry to the model. Furthermore, it could auto-generate a number of valid manufacturing process routings and calculate times, masses, tooling and cost in near real time. Until this point, costing was almost always done by purchasing specialists or, more often, cost engineers, after design freeze. Now it could be done concurrently with product design.
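To make the idea concrete, here is a minimal sketch of feature-based costing in Python. The feature types, times and rates below are hypothetical placeholders, not aPriori's actual models: each feature recognized from the CAD geometry maps to a processing time, which rolls up into a rough should-cost.

```python
# Toy sketch of feature-based costing (hypothetical feature types and rates):
# each geometric feature recognized from the CAD model maps to a machining
# time, which rolls up into a part-level should-cost estimate.

# Hypothetical machining times in minutes per feature occurrence
FEATURE_TIME_MIN = {"hole": 0.5, "pocket": 2.0, "face": 1.0, "slot": 1.5}

def estimate_cost(features, machine_rate_per_hr=60.0, material_cost=4.25):
    """Roll recognized features up into a rough should-cost estimate."""
    cycle_min = sum(FEATURE_TIME_MIN[f] for f in features)
    machining = (cycle_min / 60.0) * machine_rate_per_hr
    return round(material_cost + machining, 2)

# Features "recognized" from a hypothetical 3-D model of one part
part_features = ["hole", "hole", "hole", "pocket", "face"]
print(estimate_cost(part_features))  # → 8.75
```

In a real tool, the feature list comes from automated recognition on the solid model, and the time models are far richer, but the roll-up logic is the same shape.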

Until the mid-2010s, aPriori had a near-monopoly on a working product with automated CAD-driven costing. However, there is a new feature-based costing competitor, Hyperlean, which spun out of the University of Ancona in Italy and offers a competing product called LeanCost.

The PCM Knowledge Economy: Advanced cost-accounting and control systems

In the mid-2000s, another stream of PCM software was coming into the market. This software came out of the German tradition of corporate controlling. It focused, in a “Back-to-the-Future” way, on cost accounting, updated with a manufacturing cost flair. It took a more practical view of cost accounting than earlier American systems. The primary example of this sort of software today is Facton. Not only does Facton’s platform provide controlling-focused cost accounting, but it has some process cost modeling abilities as well. These “platform costing” tools, such as Facton, tend to prioritize flexibility and breadth of what can be costed in the enterprise over speed and/or depth of analysis.

Another variant of this breed of platform PCM software is Cleansheet. Like Tsetinis Perfect ProCalc and Seer by Galorath, Cleansheet has consulting roots. It was developed as an internal tool for the Design-to-Value practice of McKinsey & Co. It focuses on flexibility in a different way than Facton.

Facton focuses on modeling large chunks of a customer’s spend at a macro accounting/control level, whereas Cleansheet focuses on costing an individual part, assembly, product or service in a custom, in-depth way.

Little and Big Data: Stochastic, statistical or index-based spend analytics

The first three types of product cost management software in the Figure 1 graphic (above) all focus on bottom-up calculation of cost from basic cost inputs (materials, energy, labor time, machine inputs, tooling, etc.).
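As a toy illustration of that bottom-up approach (all rates, times and quantities below are made-up placeholders, not any vendor's data), a part's cost can be rolled up from its basic inputs like so:

```python
# Illustrative bottom-up cost roll-up: sum the basic cost inputs for one part.
# All figures are hypothetical placeholders.
def bottom_up_cost(mass_kg, material_per_kg, cycle_hr, labor_rate,
                   machine_rate, tooling_total, volume):
    material = mass_kg * material_per_kg        # raw material
    labor = cycle_hr * labor_rate               # direct labor
    machine = cycle_hr * machine_rate           # machine time
    tooling = tooling_total / volume            # tooling amortized over the run
    return material + labor + machine + tooling

cost = bottom_up_cost(mass_kg=1.2, material_per_kg=2.5, cycle_hr=0.05,
                      labor_rate=30.0, machine_rate=45.0,
                      tooling_total=50_000, volume=100_000)
print(round(cost, 2))  # → 7.25
```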

The last type of tool takes a top-down approach, using past data on costs to statistically forecast cost in the future. Price Systems was one of the first to formalize this method commercially as a service or software, starting in the mid-1970s. It began in the aerospace and defense verticals, focusing on high-level costing of weapons systems. Seer by Galorath also plays in this system-level space of PCM, and its software-development cost module uses this methodology as well.

In the mid-2000s, engineers at Caterpillar applied the same stochastic PCM methods to the commodity/part level of spend analytics. This technology spun out as a venture-funded company called akoya, which was eventually sold to i-Cubed, which was quickly acquired by KPIT. However, no mention of the technology could be found on KPIT’s website.

The two newest PCM players have extended these stochastic methods into the world of “Big (or bigger) Data.” One is easyKost, a French company that uses “random forest” methods to analyze large amounts of past data from ERP, PLM and other sources and determine what portion of cost is driven by each input in the data set. Using this knowledge, easyKost can cost future designs and identify outliers in spend.
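As a rough sketch of the idea (not easyKost's actual implementation), one can fit a random forest to historical part records, read off which inputs drive cost, and predict the cost of a new design. Here with synthetic data and scikit-learn:

```python
# Sketch of data-driven should-costing (illustrative only): fit a random
# forest on synthetic "historical" part records, inspect which inputs drive
# cost, and predict the cost of a new design.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
mass = rng.uniform(0.1, 5.0, n)           # part mass, kg
volume = rng.integers(1_000, 100_000, n)  # annual purchase volume
features = rng.integers(1, 20, n)         # count of machined features
# Synthetic historical prices: mass- and complexity-driven, volume discount
price = 3.0 * mass + 0.2 * features + 500.0 / np.sqrt(volume) \
        + rng.normal(0, 0.1, n)

X = np.column_stack([mass, volume, features])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, price)

# Which inputs drive cost? (importances sum to 1)
print(dict(zip(["mass", "volume", "features"],
               model.feature_importances_.round(2))))
# Predict the cost of a new design: 2 kg, 50,000/yr, 10 features
print(model.predict([[2.0, 50_000, 10]]).round(2))
```

The same pattern also flags outliers: parts whose actual price sits far above the model's prediction are candidates for renegotiation.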

WTP Buynamics (What’sThePrice) uses a different statistical approach: index-based costing. It allows the user to parse large buckets of spend into their constituent elements of cost (materials, labor, overheads, etc.). The user can then step the price forward in time from the date it was first sourced/quoted to what the price “should be” today, or project it into the future.
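A minimal sketch of index-based costing, with hypothetical cost-element shares and index movements (not WTP Buynamics' data):

```python
# Sketch of index-based costing: split a quoted price into cost elements,
# then escalate each element by its own index from the quote date to today.
# All shares and indices below are hypothetical.
quoted_price = 100.0  # price at the original quote date

# Hypothetical share of each cost element in the quoted price
breakdown = {"material": 0.50, "labor": 0.30, "overhead": 0.20}

# Hypothetical index movement since the quote (1.10 = +10%)
index_change = {"material": 1.10, "labor": 1.04, "overhead": 1.02}

should_be = sum(quoted_price * share * index_change[elem]
                for elem, share in breakdown.items())
print(round(should_be, 2))  # → 106.6
```

Comparing this “should be” price against the supplier's current quote shows whether price movements are tracking the underlying cost elements or drifting away from them.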

Quo vadis, spend analytics and product cost management software?

Product cost management is a big problem, and there are big rewards for solving it. And, like most complex problems, it takes many tools, acting in symphony and developing over time, to make progress.

What the market will demand next is beyond our scope today, but I will leave the reader with these questions to consider in their own spend analytics and PCM journey:

  • How mature is your firm in the area of product cost management?
  • What tool(s) does your firm use today, and how developed are your PCM processes, tools, culture and team?
  • Is anyone in your firm familiar with the market offerings in spend analytics/PCM software?
  • What goals for spend reduction do you have this year? Which of these tools offers the most bang for the buck in delivering on those targets?