Sourcing Optimization: A Basic Primer From an Expert (Part 2)

Spend Matters would like to welcome back Dr. Jason Brown, a former CTO at CombineNet, who will further share information on how companies can best apply optimization to their overall sourcing practices.

In this next post (see the first post here) introducing Spend Matters readers to the fundamentals of sourcing optimization, we'll turn our attention to the time-savings benefit of applying such a toolset.

Analysis Time Savings

Optimization-derived time savings revolve around efficiency. There are two huge time sinkholes when it comes to sourcing analysis: 1) taking the incoming bids and "cleaning" them so that they can be analyzed; and 2) the actual analysis of the bids once clean.

It is important to realize that this first sinkhole is not optimization, it's automation -- automating the process of cleansing and aggregating the data into a single source so analysis can easily be run. For online sourcing initiatives, if the bids are gathered only via the online user interface, cleansing is usually straightforward and handled by the sourcing application. If the bids are gathered via Excel, then good sourcing applications will generate individual Excel bid sheets, with in-cell validation and sheet locking, for each bidder to download. Once the bidders complete them offline, they upload them, at which point the application re-validates the individual in-cell bid field values, validates bid field values within each bid, validates multiple bids placed on the same items, and validates bids across different items. If the application does not do this, and/or the Excel sheets are not handled this way, the process becomes extremely time-consuming. Additionally, online systems aggregate the cleansed data into a common repository for analysis. In most sourcing initiatives, if this is not automated through the sourcing application, then you are wasting a tremendous amount of valuable resources on this tedious task.
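To make the layered re-validation step concrete, here is a minimal sketch in Python of the kinds of checks described above: field-level checks, within-bid checks, and cross-bid checks. The field names and rules are invented for illustration; they are assumptions, not any particular sourcing application's schema.

```python
# Hypothetical sketch of re-validating uploaded bid data in three layers:
# 1) individual field values, 2) consistency within a bid, 3) checks across bids.
# All field names and rules here are illustrative assumptions.

def validate_bids(bids):
    """Return (clean_bids, errors) after field-, bid-, and cross-bid checks."""
    errors, clean, seen = [], [], set()
    for i, bid in enumerate(bids):
        # 1) field-level checks: required fields present, price a positive number
        if not all(k in bid for k in ("supplier", "item", "price", "capacity")):
            errors.append((i, "missing required field"))
            continue
        if not isinstance(bid["price"], (int, float)) or bid["price"] <= 0:
            errors.append((i, "price must be a positive number"))
            continue
        # 2) within-bid check: offered capacity must cover at least one unit
        if bid["capacity"] < 1:
            errors.append((i, "capacity must be at least 1"))
            continue
        # 3) cross-bid check: flag a second bid by the same supplier on the same item
        key = (bid["supplier"], bid["item"])
        if key in seen:
            errors.append((i, "duplicate bid on item"))
            continue
        seen.add(key)
        clean.append(bid)
    return clean, errors
```

A real application would of course support richer bid structures (tiered bids, conditional discounts), but the point is the same: every uploaded sheet passes through these checks automatically before it reaches the analysis repository.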

The analysis of the cleaned data, however, is an optimization sinkhole. Optimization really hit its heyday when it used to take 3-4 analysts 3-4 months to analyze the bids and determine a single way to award the business, taking into account the business rules. These analysts weren't necessarily concerned with finding the optimal solution; they just wanted to find a "feasible" solution, that is, any award that satisfies all of the business rules. Today, advanced sourcing applications can determine optimal allocations, honoring hundreds if not thousands of business rules across millions of bids from hundreds of suppliers -- in milliseconds. This allows the analysts to try different competing sets of business rules (aka scenarios) and then compare the results. The analysts can then apply their in-depth business acumen to analyze and modify the results, building hundreds of possible scenarios to compare, before finally determining the one they would like to award.
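The scenario workflow can be illustrated with a deliberately tiny sketch: enumerate every possible award, keep only those that satisfy a scenario's business rules (feasibility), and pick the cheapest. The data and rules below are invented, and real optimization engines use mathematical programming rather than brute-force enumeration, but the feasibility-versus-optimality distinction is the same.

```python
# Toy sketch of scenario-based award analysis (invented data; real engines
# use mathematical programming, not brute-force enumeration).
from itertools import product

# prices[item][supplier] = unit bid price
prices = {
    "steel":  {"S1": 100, "S2": 95, "S3": 105},
    "resin":  {"S1": 40,  "S2": 55, "S3": 45},
    "copper": {"S1": 80,  "S2": 70, "S3": 75},
}

def best_award(prices, rules):
    """Cheapest award satisfying every business rule; None if infeasible."""
    items = list(prices)
    best = None
    for combo in product(*(prices[i] for i in items)):
        award = dict(zip(items, combo))           # item -> winning supplier
        if all(rule(award) for rule in rules):    # feasibility: all rules hold
            cost = sum(prices[i][s] for i, s in award.items())
            if best is None or cost < best[1]:
                best = (award, cost)
    return best

# Scenario A: unconstrained lowest cost
scenario_a = best_award(prices, rules=[])
# Scenario B: business rule -- award everything to a single supplier
scenario_b = best_award(prices, rules=[lambda a: len(set(a.values())) <= 1])
```

Comparing the two scenarios shows the analyst exactly what the single-source rule costs relative to the unconstrained optimum, which is the kind of trade-off analysis that used to take a team of analysts months.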

Prior to optimization, consultants used to spend a lot of time "bundling" items and having the bidders bid on these bundles. This had a few drawbacks: 1) it took a lot of time to pre-bundle the items; 2) you could only analyze one scenario, since the bundles inherently encoded the business rules; and 3) you treated all the suppliers as homogeneous when they obviously are not. The last issue forces suppliers to hedge their bids: if a supplier cannot provide every item in a bundle, it cannot bid as aggressively as it otherwise could, because it would have to rely on a third party to supply the outlying items if it won the business.

-- Dr. Jason Brown

Spend Matters would like to thank Jason Brown for his contribution. Jason can be reached at Jason (dot) brown (at) alumni (dot) unc (dot) edu.
