4 Reasons Why Supplier Scorecards Don’t Work


Spend Matters welcomes this guest post from Andy Kohm, CEO and founder of VendOp.

When I think of supplier scorecards, I think of having to be prodded, reminded and occasionally begged to carve out a large chunk of my time to fill out question after question on suppliers I’ve used since who knows when. I cringe when I see the first email come in, knowing that more are to follow.

In my entire career, I cannot recall a single business outcome or strategy changing from a supplier scorecard. In theory, scorecards are perfectly sound tools to measure the effectiveness of key business relationships. But in reality, they’re hideous instruments for the collection of insight into supplier relationships. So most often, they collect dust. Let’s examine why.

They’re Too Long

The first question we all ask ourselves when opening a link to a vendor scorecard is “how long is this going to take?” When people see a survey instrument longer than 20 questions, it gets labeled mentally as a task vs. a quick ask. So it goes on a list of other things they probably would do, could do, but likely won’t.

Vendor scorecard school teaches that the more questions we ask, the more information we’ll gain. That becomes untrue when the length of the scoring instrument turns people off from completing it. It’s common to hear of completion rates under 10% in large enterprises. As they’re typically constructed, vendor scorecards sacrifice response rates for completeness of information. And that decreases their value.

They’re Too Infrequent

Because they’re so long and onerous, scorecards aren’t issued and collected as often as they should be. (Good gets more, bad gets less.) You can’t increase the frequency of long scorecards that fewer than 10% of people are filling out without diminishing returns.

There are two big hazards with the typical annual schedule of supplier evaluations. One is that the product and market-facing contributors filling them out can’t recall and analyze a year’s worth of interaction with each vendor. Try recounting 100 visits to your local coffee shop. You’re going to recall most vividly what was most recent.

So that leads to the second hazard: procurement and sourcing teams are often looking at scorecard data skewed heavily to the recent past. Many large buyers compound the problem by issuing scorecards at the same time of the year. Welcome to Scorecard Season, where the scores only go up.

They’re Too Limited

If you’ve ever been in a group review of scorecard results from an important customer, you know that there is a 90% chance that at some point, someone in the room will ask in frustration, “Why didn’t they just tell us that?!” Even the best of business relationships will suffer from the rigidity of the typical scorecard format. Point-and-click Q&A can deliver unintended messages, especially with low response rates.

No One Uses Them

If your company is anything like the ones I’ve worked at, after you fill out the scorecard for a supplier, that is it. You never see the scores of the vendors you use. Where do they go? How do you see them if you want to? If this happens at all, it often requires an email chain asking to see the scorecard for a particular vendor. Good luck asking the right people and getting the right information back. This highly valuable and material data effectively sits in a black hole.

Objective and subjective intelligence are not mutually exclusive, but scorecards squeeze the humanity out of evaluation. We should not grade vital business relationships without the color, context and additional meaning that stories, anecdotes and language provide. You can’t measure everything through checkboxes.

Vendor scorecards often feel like a census of one. No one wants to make critical decisions based on data like that. Yet too often, when the question of how companies collect institutional knowledge and judge performance about their suppliers arises, scorecards are the only answer.

Voices (6)

  1. Ian Kirk:

    We produce scorecards monthly, with a rolling 12-month average for on-time delivery and quality. We reference the specific rejections and late purchase orders. Due to our volume, we are able to fit the entire scorecard on one piece of paper.

    For key suppliers, the top 10 that usually consume 80-90% of the spend, I would definitely recommend monthly scorecards. For the others, I would recommend quarterly.

    The biggest impact scorecards have is on those organizations that have ISO or AS certification, as their own internal systems and third-party audits utilize those scorecards, and a bad scorecard should trigger an internal audit. If that internal audit does not meet the third-party auditor’s expectations, the supplier can receive a major finding, which, if not swiftly and adequately addressed, can result in their ISO or AS certification being pulled.

    Ultimately, scorecards are only effective if you care and the supplier cares.

  2. George Ellington:

    Hello — I did like this perspective and agree with what was stated. What I am curious about, and what could make this article more complete, is what an improved alternative would be. How do you make this process more efficient and effective? I can certainly see the merit in calling out the common inefficiencies of a very common practice, but I believe that following that with what would improve things, from both an internal and external stakeholder perspective, would be a good way to fill out this article. The article sort of implies it, but how does one shorten the scorecards, run them more frequently, improve the communication, and ensure scorers and companies use the information effectively? Just curious how we improve upon such a standard way of scoring our vendors. Thank you.

    1. Andrew Kohm:

      Hi George,

      Most companies perform long supplier surveys all at once. A better process would be shorter supplier reviews issued more frequently, as close to the completion of the service as possible. This would reduce the chore of providing feedback and yield better feedback, since the reviewer has a fresher recollection of the supplier’s performance.

  3. bitter and twisted:

    I’d add a fifth: there is something fundamentally, epistemologically bogus about combining quantitative and qualitative factors into an overall score.
    That goes for tenders too.

    1. Andrew Kohm:

      I fully agree that there is a huge loss of information in combining quantitative and qualitative factors into an overall score, though it does allow for easier cursory comparison between vendors. Of course, you should also be able to see the breakdown and compare the two sections separately. In many cases, it is hard even to agree on the weighting of the scores between different sections. Different types of suppliers might need a different weighting scheme for the services and products they provide.

  4. Bill Kohnen:

    Some thought provoking comments obviously based on experience.

    I would agree that most organizations make the reviews too complex and not frequent enough, and generally don’t share or use the data effectively to really drive change.

    I would add that many organizations also try to review too many suppliers. Not only does it not make much sense to measure non-critical suppliers, but there is also a significant cost to measuring suppliers effectively. It does not just end with gathering the results and emailing them. You really need to meet internally, and also with the suppliers, to review and take action.

    In terms of frequency, assuming one establishes a review process where objective data is collected automatically and subjective input is limited to a few key inputs via a social-media-like tool, I would say quarterly is the minimum.

    One other point: I would resist the temptation to set up a team that is focused only on collecting and reporting on scorecards. Once that happens, the tendency is to treat the scorecards as an end goal rather than using them as a tool to encourage communication and improve performance.
