Friday Rant — There's No Place for Lazy Reporting

When I got my start in journalism, I was 16 years old. I interned for a local paper, The Main Line News (or was it Times), and wrote a couple of stories looking at localized crime -- e.g., somebody spitting gum on the sidewalk -- along Philadelphia's idyllic Main Line. For a city kid who moved out to the burbs, suburban journalistic irony came easy, I suppose. I later penned columns and reported news for my college paper during my first year at Dickinson, a small liberal arts college in central Pennsylvania (before transferring to Penn after two semesters). In one column, I drove the whole campus into a tizzy when I slammed the multiculturalists bent on driving the fraternities and traditionalists underground. Something about controversy has never left my pen, whether it's black or virtual ink.

Next, at Penn, I wrote for a number of publications as a reporter, a columnist and, later, as an academic, serving as an editor of the undergraduate history journal. I also had the chance to study under two well-known non-fiction writers -- Carlin Romano, literary critic for the Philadelphia Inquirer, and humorist and essayist Cathy Crimmins. After taking both undergraduate and graduate degrees from Penn and spending countless hours learning the art of research, analysis and reporting, I took a job as a management consultant by day, but authored columns by night for CMP publications and other trade rags (e.g., Internet Week, Information Week). By the time I was 25, I had hundreds of bylines under my belt. No doubt I made a lot of mistakes, but it was a period of accelerated learning, to say the least.

Perhaps the most important lesson I originally learned in journalism -- and later as a columnist and blogger -- is the importance of sanity checks when researching a story. There's nothing wrong with using your brain to call into question something a source tells you. If something doesn't add up, I was taught to say something. Now, I'm not perfect. Sometimes my fact-checking is not as thorough as it needs to be -- and I'll be the first to admit it. But there's a difference between getting the occasional number wrong and missing an entire equation. Sometimes reporters miss things and it's an honest mistake. But sometimes they lazily reprint information without probing their sources and asking deeper questions about the big picture to more accurately present what they're reporting to their readers. Which is exactly what happened at both Purchasing and Supply and Demand Chain Executive this week.

Both publications, which I usually hold in high esteem, essentially reiterated the gist of a press release -- or regurgitated discussions they had at ISM with CVM Solutions, a provider that recently released a study of the company spending information in CVM's database. To quote CVM's findings to set the context: "CVM recently completed a detailed cross-industry analysis using the CVM Master Supplier Database, which includes hundreds of data points per supplier including customer/supplier spend levels. The CVM Master Supplier Database and subsequent cross-industry analysis has provided CVM with the ability to track and monitor supplier trends and risks that could affect a supplier's ability, or inability, to support their customers. Through this analysis of the data, CVM identified a core set of 10,500 suppliers that are recipients of 80 percent of Fortune 500 spending. In addition, only 110,000 suppliers make up 95 percent of the total spend. CVM was also able to identify a relatively small set of suppliers with significant spend, 16,060, that are common among the Fortune 500."

Now, on the surface, this sounds like it would make for great ink. But the moment I saw the data in a quick discussion last week, I started asking CVM questions. What were the sources? How much spend did CVM look at? What percentage of the spend for each company was represented in the samples? How many records, etc.? Easy questions. Ones any reporter, analyst or blogger should ask. After doing a bit of digging, it came out that CVM used a subset of the data available to them in an attempt to get this study published in time for ISM. And in my view, given that the analysis is based on a sample data set, the analysis missed its true potential, and could possibly mislead organizations in its current form.

Let me explain. CVM looked at data across multiple years at different companies to which they had access. When I probed, I learned that they looked at 9 million supplier records across 89 companies representing $75 billion in spend. Now, CVM knows the suppliers (and, often, spend levels) for over 350 of the Fortune 1000, and a complete analysis could indeed yield even more significant insights. They have captured this rather unique data set over the years from their supplier diversity and spend enrichment programs for customers.

Still, if you do the math in your head on what they announced in the press release, it becomes clear that 89 companies within the Fortune 500, as claimed, would have significantly more than $75 billion in spend and more than 9 million supplier records. CVM sampled data, yet ended up publishing very specific supplier counts (based on a partial sample over different time frames) rather than either specific figures based on complete spend data sets or directional percentages based on partial spend data sets over a defined period -- a big difference.
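To see why the numbers don't add up, here's a minimal back-of-the-envelope sketch of the sanity check. The revenue and spend-ratio figures below are purely illustrative assumptions of mine, not numbers from CVM's study or the press release:

```python
# Back-of-the-envelope sanity check on the press-release figures.
# Only 'companies' and 'reported_spend' come from the release; the
# rest are hypothetical assumptions for illustration.

companies = 89                       # companies in CVM's sample (per the release)
reported_spend = 75e9                # total spend CVM analyzed, in dollars

# Hypothetical assumptions: a typical large company books on the order
# of $15B in revenue, and external spend often runs around half of revenue.
assumed_revenue_per_company = 15e9
assumed_spend_ratio = 0.5

expected_spend = companies * assumed_revenue_per_company * assumed_spend_ratio
coverage = reported_spend / expected_spend

print(f"Expected spend under these assumptions: ${expected_spend / 1e9:,.1f}B")
print(f"Reported $75B would cover roughly {coverage:.0%} of that")
```

Under even these rough assumptions, 89 large companies should generate several hundred billion dollars in external spend, so a $75 billion total only makes sense as a partial sample -- which is exactly what a few probing questions confirmed.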

Based on this, we could dismiss this research outright. But I won't do that. It's an admirable and gigantic effort to aggregate this type of data. I talked to CVM's Chairman Mike Anguiano and President & CEO Rajesh Voddiraju yesterday, and they told me that CVM intended all along for this to be a two-phased project. Phase 1 would focus on an initial analysis using a "representative sample of customers". They intend Phase 2 to be a broader and more formal research report produced by partnering with an independent research firm.

I hope to share the results from CVM's more thorough and updated research report on Spend Matters in the coming months. Once the analysis covers the larger data set, you'll see it here first. My guess -- this is just the tip of the iceberg in terms of what the data might show ... And it also shows the power of data aggregators in the market to potentially help companies make better sourcing and supplier management decisions (Ariba is now headed in a similar direction with its spend visibility benchmarking programs).

Regardless, this whole experience suggests to me the importance of validating the information you read. After all, if reporters aren't going to do it, you've got to do it for yourself. Even though I've made scores of mistakes over the years as a columnist and blogger, the one sin that I hope I've never committed is being lazy and not asking probing questions to get to the heart of a story -- or a spend data set. Sometimes when you dig an inch, the ground moves a mile. Reporters owe it to their readers -- just as much as analysts and bloggers do -- to not only report, but to develop hypotheses and validate them. In this case, only the reporting part happened.

The fact that reporters from both Supply and Demand Chain Executive and Purchasing reprinted findings from a study without asking questions that would have led them to a different conclusion is not a good sign for the quality of reporting we desperately need from the trades in the Spend Management market today. In Russell Crowe's new movie, State of Play, an old-school, whiskey-swigging political reporter laments the loose reporting standards of a youthful blogger (before they work to solve a crime together, sharing in the spirits along the way). It's a shame that the Spend Management world is the other way around -- and also that we're not yet collaborating more closely and sharing successful reporting toasts more often.

Jason Busch
