Trust in data and how businesses use data affect decision making, according to study by NC State, IBM, CIPS
03/01/2021
Not all data is created equal, and it turns out that companies that don’t treat their data well or that have low “data literacy” don’t make the best business decisions either, according to two authors of the 4th annual Data Quality & Governance Study, published by NC State University, IBM and CIPS.
The results showed that data use is getting better for tactical decisions in procurement and supply chain, but strategic decision making is only as good as how a company is structured to handle its data, creating a further threat beyond the coronavirus crisis that hit all operations in 2020.
“The majority of organizations do not assign the same priority to data assets as they do to physical assets,” the study says.
The report discusses artificial intelligence (AI) and machine learning, and it goes into training, becoming a mature user of data and setting expectations for using things like data lakes and full-on analytics. But the authors have boiled the issues down to a couple of starting points.
“What we see in our study is still a lack of faith in the data,” Joseph Yacura, a supply chain advisor and former senior-level supply chain executive, said. “One of the big areas in our study also shows that lack of data governance.”
Data quality is an ongoing process within the supply chain that relates to how usable the information is. For years, analysts have spent more of their time cleansing and organizing data than analyzing it. That takes less time now, the study reports, and as data quality improves, the information becomes more readily available and accessible.
Meanwhile, data quality works in relation to data governance, which is defined in the study as “the disciplined processes that involve the organization, standardization, classification and coding of data into a database (or data lake) that can be applied across the organization.” It’s an ongoing process for which specific individuals should be responsible.
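In practice, that standardization and coding step can be as simple as mapping free-text values onto an agreed taxonomy before anything lands in the shared database. The snippet below is a minimal, hypothetical sketch of the idea; the category codes and field names are illustrative, not drawn from the study.

```python
# Minimal sketch of the "standardization, classification and coding" step the
# study describes. The taxonomy and field names below are hypothetical.
CATEGORY_CODES = {
    "it hardware": "CAT-100",
    "it software": "CAT-110",
    "logistics": "CAT-200",
    "raw materials": "CAT-300",
}

def standardize_record(raw: dict) -> dict:
    """Normalize a raw spend record into the shared, governed schema."""
    category = raw.get("category", "").strip().lower()
    return {
        "supplier_name": raw.get("supplier", "").strip().upper(),
        "category_code": CATEGORY_CODES.get(category, "CAT-UNCLASSIFIED"),
        "amount_usd": round(float(raw.get("amount", 0)), 2),
    }

# Example: the same supplier spelled two ways collapses to one governed form.
print(standardize_record({"supplier": " acme corp ", "category": "IT Hardware", "amount": "1250.5"}))
```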
The study analyzed responses from 90 employees ranging from emerging practitioners to chief supply chain officers in a range of industries and countries. The results were released in January 2021.
Spend Matters talked with the study authors Yacura and Robert Handfield, PhD, a professor of supply chain management at NC State, to hear more about the findings.
Yacura advocates for businesses to have a structure that makes data vital to the whole business so that the information feeds strategic decisions.
“Sources of the data should be validated — the relevance of the data that’s being used and making that decision (and) the completeness of the data. Does it include all the data elements that you’d want to include in that decision?” Yacura said. “I think we need to sit back and think about it. We need to put a governance in place. Somebody who’s in charge of that data. … Oftentimes, we don’t really ask the right questions about the data before we use it. And as a result, we get sub-optimized decision making. Those decisions could be dramatically improved.”
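In code terms, those pre-decision questions translate into simple checks: is the source validated, and are all the data elements the decision needs actually present? The sketch below assumes hypothetical field names and an approved-source list purely for illustration.

```python
# A minimal sketch of the pre-decision checks Yacura describes: validated
# source, and completeness of the required data elements.
APPROVED_SOURCES = {"erp", "contract_db", "supplier_portal"}
REQUIRED_FIELDS = {"supplier_id", "spend_usd", "delivery_date", "source"}

def ready_for_decision(record: dict) -> tuple[bool, list[str]]:
    """Return (ok, issues) describing whether a record can support a decision."""
    issues = []
    present = {k for k, v in record.items() if v not in (None, "")}
    missing = REQUIRED_FIELDS - present
    if missing:
        issues.append(f"incomplete: missing {sorted(missing)}")
    if record.get("source") not in APPROVED_SOURCES:
        issues.append(f"unvalidated source: {record.get('source')!r}")
    return (not issues, issues)

print(ready_for_decision({"supplier_id": "S-42", "spend_usd": 9800.0,
                          "delivery_date": None, "source": "spreadsheet"}))
# -> (False, ["incomplete: missing ['delivery_date']", "unvalidated source: 'spreadsheet'"])
```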
Key findings of the study
Yacura and Handfield did point out that most of the findings of the survey were positive. It’s not a totally gloomy picture of data quality.
These were some of the key takeaways from the fourth installment of the survey:
- There is ongoing progress on how data is used in supply chain functions. However, there has only been incremental progress in the level of strategic decision making from data.
- Organizations are following a trend of spending less time looking for and cleansing data, indicating that data quality and availability are improving. Data cleaning and organizing still remain a significant workload for data analysts.
- The majority of organizations place a higher priority on physical assets than on data assets; only 45% of organizations fully realize the value of data and assign it the same priority.
- A fundamental building block for supply chain digitization is data literacy. About one-third of organizations consider themselves to have advanced data literacy, even among organizations with annual revenue greater than $2 billion. The study found that 33% of respondents consider themselves at the “extremely limited” or “novice” level of data literacy.
- Evidence is lacking to show that supply chain functions have a clear or defined data strategy. About 42% of survey respondents include data requirements when issuing RFPs to suppliers.
- Decision latency increases where data literacy and data quality are lacking, especially among decision makers.
- Embedded AI technologies can offer a source to connect and correlate internal and external data from disparate sources across the supply chain.
Both Handfield and Yacura observed positive changes in the latest round of results. The adoption, use and strategic application of data are trending better, though only incrementally. Supply chain digitization accelerated and caught up with the 21st century during the Covid-19 pandemic, and because Covid-19 posed only a small threat at the time these results were gathered, even better adoption is likely to show up this year.
“When you look at it overall, companies still have a hard time assigning a value to data,” Yacura said. “Data is obviously an intangible asset, right? … You paid for this. And then you assume that you’re gonna have a lifecycle of X years and you have a depreciation method and you write off the value of that asset. But when you look at data and it’s intangible, how do you establish the value because you may not pay much for it at all? But the value is astronomical.”
Data lakes and better educational opportunities might be the answer to data governance issues
As Handfield and Yacura remain optimistic about data quality and governance going forward, they kept coming back to the idea that people need to learn to trust and have faith in data. That seems to be the biggest hurdle preventing true data quality and governance from gaining a foothold in the supply chain today.
All people in an organization need to understand the value of data. From there, data can become a lifelong friend to any organization.
“The only way to get around this is if you start to create what’s called a data lake, where you kind of keep it separate from all of these legacy systems,” Handfield said. “People can use it as a place to put trusted, cleansed, reliable data, and to use that for your dashboards, your business intelligence and so forth. So it’s reliable. And you only allow data into that data lake. You gotta keep it cornered off from the rest of the organization, all of these other enterprise systems.”
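A rough way to picture that gatekeeping: a small loader that writes only records passing quality checks into a separate, curated store, leaving the legacy systems untouched. SQLite stands in for the data lake here, and the checks and schema are assumptions made for the sketch, not details from the study.

```python
# Sketch of Handfield's point: keep a separate, curated store and only let
# records that pass quality checks into it.
import sqlite3

def load_into_lake(conn: sqlite3.Connection, records: list[dict]) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS trusted_spend "
                 "(supplier_id TEXT, category_code TEXT, amount_usd REAL)")
    loaded = 0
    for rec in records:
        # Gate: reject anything incomplete or unclassified instead of letting
        # it contaminate the dashboards and BI built on top of the lake.
        if not rec.get("supplier_id") or rec.get("category_code") in (None, "CAT-UNCLASSIFIED"):
            continue
        conn.execute("INSERT INTO trusted_spend VALUES (?, ?, ?)",
                     (rec["supplier_id"], rec["category_code"], rec["amount_usd"]))
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
n = load_into_lake(conn, [
    {"supplier_id": "S-42", "category_code": "CAT-100", "amount_usd": 1250.50},
    {"supplier_id": "", "category_code": "CAT-200", "amount_usd": 300.00},  # rejected
])
print(n)  # 1
```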
Another key aspect of growing trust in data is education, a powerful tool that is often overlooked. Handfield and Yacura both said that the next priority for organizations after reading this latest study is to put training and educational programs in place so people can build data literacy.
“Training and educating their people on data literacy,” Yacura said. “What does that mean? How do we use data? How do we present data? How do we interpret data? The tools are really good, but if you don’t know how to use them or how to explain it … It’s not every case, but the majority of the time people hire mathematical wizards, right? People who are really good in quantitative analysis. In most cases, those individuals aren’t as good with their business savvy and ability to explain the outcome of their analysis so that it’s actionable by the business as they are with their analytical skills.
“So I think we need to create these hybrid skills. Like supply chain management and some knowledge about data, maybe some quantitative skill sets, statistical analysis. Not to become the heavy-duty quantitative people, but enough to know what the output of these tools mean and how you should interpret them into some sort of action.”