The Most Boring Problem in Supply Chain — And Potentially the Most Important
- Jeremy Conradie.


This is a post about a boring topic: data management.
But please, stay with me. I know you’d rather read about AI agents, robots, tariffs, or just about anything other than data — but like your mom getting you to eat veggies as a kid, you gotta put data management on your plate in 2026.
The catalyst for this post was a new study published by IBM last month. “Based on insights from 1,700 Chief Data Officers (CDOs) worldwide, the study highlights a widening gap between AI ambition and readiness,” reports the press release. “Although 81% of surveyed CDOs report their organization’s data strategy is integrated with its technology roadmap and infrastructure investments — compared to 52% in 2023 — only 26% are confident their data can support new AI-enabled revenue streams. In addition, barriers such as data accessibility, completeness, integrity, accuracy, and consistency are preventing organizations from fully leveraging enterprise data for AI.”
Yes, you’ve heard this all before — the whole “garbage in, garbage out” spiel about what you get when software applications (and now AI) use outdated, inaccurate, or incomplete data. So, why is this still a problem?
Way back in the Stone Age of May 2001, almost a quarter century ago, the Harvard Business Review published an article titled “The Achilles’ Heel of Supply Chain Management.” Here’s an excerpt:
Ever since retailers equipped their cash registers with bar code scanners, we’ve been promised a brave new world of supply chain management. Stores would automatically track the flow of goods and electronically transmit precise replenishment orders. Suppliers would synchronize their production schedules to real-time demand data. Fewer goods would sit around in warehouses; fewer customers would find products out of stock.
It’s a great vision, and one that may still come to pass. But to get there, retailers will have to clean up their act. In an in-depth study of 35 leading retailers, we were dismayed to discover that the data at the heart of supply chain management are often wildly inaccurate.
Sadly, if the authors were to repeat this study today, they would still be dismayed. Many companies — maybe most — have yet to “clean up their act.”
I’ve written about this problem many times over the years too (see “The Big (Crappy) Data Problem In Supply Chain Management” from May 2014). And in June 2024, we explored this topic with members of our Indago supply chain research community, who are all supply chain and logistics executives from manufacturing, retail, and distribution companies. Three quarters (75%) of our respondents rated the overall quality of the data they receive from external trading partners as either “Average” (54%) or “Poor” (21%). None rated it “Excellent.”

Source: June 2024 Indago survey of 24 qualified and verified supply chain and logistics executives from manufacturing, retail, and distribution companies.
Remember Big Data? That was all the rage not too long ago. What about data warehouses and data lakes? Or maybe you’ve moved on and are now buzzing about newer approaches like those offered by Databricks and Snowflake.
Unfortunately, there is no technological silver bullet for data quality.
I’ll repeat what I said back in 2014: Solving the data quality problem requires answering two basic questions:
1. Who owns data quality management?
2. Do we really need all of this data and complexity?
Many operations people believe that IT is responsible for data quality, while IT points the finger back at operations and the countless trading partners (suppliers, customers, logistics service providers, and so on) that send them data. Simply put, the responsibility for data quality management is not clearly defined at most companies, or it’s assumed that data quality is everybody’s responsibility — but the required governance and accountability structures don’t exist.
Data governance and accountability standards? Sounds boring, doesn’t it?
However, as the gap between AI ambition and AI readiness widens, the leaders who pull ahead in 2026 will be the ones who finally treat data quality as a strategic asset — not an afterthought, not “IT’s job,” and certainly not something to get to “later.”
The bottom line: If you want extraordinary outcomes from your AI and enterprise software investments, you have to start by doing the “boring work” of data quality management.
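What does that "boring work" look like in practice? Here is a minimal sketch (not from the article) of the kind of automated checks a team might run on data received from trading partners, keyed to the barriers named in the IBM study — completeness, accuracy, and consistency. The file layout and column names (order_id, sku, quantity, ship_date) are hypothetical, invented purely for illustration.

```python
# A minimal illustration (not from the article) of routine data quality checks,
# assuming a hypothetical inbound shipment feed from a trading partner.
# Column names (order_id, sku, quantity, ship_date) are made up for this sketch.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Count common issues: completeness, accuracy, and consistency."""
    issues = {}

    # Completeness: required fields should not be missing
    required = ["order_id", "sku", "quantity", "ship_date"]
    issues["rows_with_missing_values"] = int(df[required].isna().any(axis=1).sum())

    # Accuracy: quantities should be positive; ship dates should not be in the future
    issues["nonpositive_quantity"] = int((df["quantity"] <= 0).sum())
    ship_dates = pd.to_datetime(df["ship_date"], errors="coerce")
    issues["future_ship_date"] = int((ship_dates > pd.Timestamp.today()).sum())

    # Consistency: the same order_id should not appear twice in one feed
    issues["duplicate_order_id"] = int(df["order_id"].duplicated().sum())

    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1001, 1002, 1002, 1003],
        "sku": ["A-17", "B-42", "B-42", None],
        "quantity": [10, -5, 12, 8],
        "ship_date": ["2026-01-03", "2026-01-04", "2026-01-04", "2099-12-31"],
    })
    print(run_quality_checks(sample))
```

None of this is sophisticated — which is exactly the point. The hard part isn’t writing the checks; it’s deciding who owns them, who fixes what they find, and who holds trading partners accountable.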
This confirms what Nucleus has found: technology provides useful tools, but the real value lies in the aligned and coordinated incentives of supply chain players.
Source: Talking Logistics
Image Source: Shutterstock


