ML Journal
Digital Transformation Project Success Depends on Quality Data

The quality of data is a critical factor in determining the success or failure of transformational projects.

 

TAKEAWAYS:
Successful transformational projects have enabled manufacturers to gain a competitive edge by lowering costs, providing better customer service and enabling new innovative business models.
Not enough attention has been given to the impact that data and data quality have on determining the success or failure of transformational projects.
Data considerations are becoming even more critical as AI is evolving to become a major component for all future transformational projects.   

 

Whether called Industry 4.0, Manufacturing 4.0 or just digital transformation, it is well understood that manufacturers need to transform to compete effectively and be prepared for the future. Embedded technologies with connected processes are enabling faster innovation, reduced latency and more resilience throughout their organizations. Successful projects have enabled manufacturers to gain a competitive edge by lowering costs, providing better customer service and enabling innovative new business models. The challenge is that not every digital transformation project that a manufacturer embarks on is successful. In a 2023 survey, the World Economic Forum noted that 70% of companies investing in Industry 4.0 technologies never move beyond the pilot phase. Bain & Company noted in 2024 that 88% of business transformations fail to achieve their original ambitions.

Much has been written about why projects fail, with the predominant focus on the three pillars of people, process and technology. Lack of executive commitment, inadequate company resources, immature technologies and unavailable skillsets are all pointed to as reasons that projects fail. One area that has not received enough attention is the impact that data and data quality have on determining the success or failure of transformational projects. Having been involved in many transformational projects, I have seen first-hand the impact that bad or incomplete data have on project success. Automating a business process with bad data only gets you to the wrong result faster.

Data considerations are becoming even more critical as AI evolves into a major component of all future transformational projects. AI has always been heavily dependent on data, generating poor results when tasks such as forecasting, anomaly detection or planning optimization are based on bad data. The common phrase "garbage in, garbage out" reflects the impact that bad data has on results. Now, with generative AI, having the right quality data has become even more critical. With Large Language Models (LLMs) being trained on the public internet and AI models built on incomplete or otherwise suspect data, it should be no surprise that a recent MIT study found that 95% of AI pilot projects failed to deliver financial benefits. While there are other contributing factors, including the isolated nature of these projects, the quality of the data directly impacts the results.

Understanding and proactively addressing data issues can bring business benefit and help resolve challenges before they have a negative project or business impact.

 

Understanding and proactively addressing data issues can bring business benefit and help resolve challenges before they have a negative project or business impact. A big driver of project failure is a lack of active user engagement, or users working outside the system as designed. Having a data strategy early in a project cycle is critical, as obtaining user buy-in or encouraging reliance on the system becomes much more difficult once users lack confidence in the results. For users who want to do a good job but come to believe that the new system is built on or generates bad data, it can be difficult to ever get them successfully reengaged.

Having a comprehensive data strategy is critical to project success and requires an understanding of the challenges and opportunities for addressing those challenges. For digital transformation, the data quality elements that have a direct impact on transformation projects include:

  1. Data accuracy
  2. Data timeliness
  3. Data context
  4. Data volume
  5. Data completeness
  6. Data uniqueness

Data accuracy refers to the correctness of each piece of data. For example, creating a digitized automated process that pushes a customer order through the system falls apart when the item ordered, the build lead time, the bill of material, or the required inventory levels are wrong. Any of these can cause the order to be delayed and the customer to be disappointed. Data accuracy can be improved in a variety of ways including manual checks, automated checks, audits, cycle counting, etc. User acceptance testing (UAT) with real data is a fundamental component of verifying data quality and is normally a key component of any project schedule.
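The automated checks described above can be illustrated with a minimal sketch. The record structure and field names (item_id, lead_time_days, bom, on_hand) are hypothetical, shown only to make the idea of order-data validation concrete:

```python
# A minimal sketch of automated data-accuracy checks on a customer order.
# All field names are hypothetical illustrations, not a specific system's schema.

def validate_order(order: dict, catalog: dict) -> list:
    """Return a list of data-quality issues; an empty list means the order passes."""
    issues = []
    item = catalog.get(order.get("item_id"))
    if item is None:
        issues.append("item_id not found in catalog")
        return issues
    if item.get("lead_time_days", 0) <= 0:
        issues.append("build lead time missing or non-positive")
    if not item.get("bom"):
        issues.append("bill of material is empty")
    if item.get("on_hand", 0) < order.get("quantity", 0):
        issues.append("insufficient inventory on hand")
    return issues

catalog = {"A100": {"lead_time_days": 5, "bom": ["B1", "B2"], "on_hand": 3}}
print(validate_order({"item_id": "A100", "quantity": 10}, catalog))
# → ['insufficient inventory on hand']
```

Checks like these can run before an order enters the automated process, so a bad record is flagged rather than pushed through to a late delivery.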

Directly related to data quality, the timeliness of the data determines whether it is still accurate at the moment it is needed. When ChatGPT first came out, it provided amazing results. However, many of those results were based on data that was one to two years old, making them out of date at the time of the request. That works well for learning about historical events such as the Viking journeys to Iceland; however, it is not very useful in evaluating the current state of your supply chain or your suppliers' risk factors related to their financial health. Every process and data element should have a specific requirement for the timeliness of its data. It is important to know the requirements of the business and develop a strategy that defines those requirements, supported by integrations, inputs, dependencies or data feeds that meet your needs. This strategy should be clearly communicated to support the required business processes and to set expectations properly.
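Per-element timeliness requirements can be expressed as a simple freshness policy. The data types and maximum ages below are hypothetical examples of the kind of requirements a business might define:

```python
# A minimal sketch of per-element data-freshness checks.
# The data types and maximum allowed ages are illustrative assumptions.
from datetime import datetime, timedelta, timezone

MAX_AGE = {
    "inventory_snapshot": timedelta(hours=1),    # near-real-time requirement
    "supplier_financials": timedelta(days=90),   # quarterly refresh is acceptable
}

def is_fresh(data_type, last_updated, now=None):
    """Return True if the data element still meets its timeliness requirement."""
    now = now or datetime.now(timezone.utc)
    return now - last_updated <= MAX_AGE[data_type]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
print(is_fresh("inventory_snapshot", now - timedelta(hours=2), now))   # → False (stale)
print(is_fresh("supplier_financials", now - timedelta(days=30), now))  # → True
```

The point is that "fresh enough" differs by data element: an hour-old inventory snapshot may already be stale, while month-old supplier financials can still meet the requirement.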

Having a comprehensive data strategy is critical to project success and requires an understanding of the challenges and opportunities for addressing those challenges.

 

Context is important. At one injection molding company, connecting the equipment provided real-time performance results. The early results clearly showed that the evening and weekend shifts were underperforming. It would be easy to attribute this to a lack of supervision or to employees slacking off. With additional data, however, it was determined that the real culprit was a lack of support staff on those shifts, including machine maintenance and material movement/stock support. Context is especially important when reporting on and analyzing data. While AI is becoming a useful tool for providing additional context, the way prompts are worded can have a direct impact on the resulting analysis.

While data is critical, more data is not always better. If the data volume is so great that you are unable to analyze or make good use of it, it can be constraining. For example, IoT provides the connectivity that Industry 4.0 depends upon and is a key success factor for Industry 4.0 projects. In the last few years, almost all manufacturing machines have shipped with multiple sensors for connectivity, and companies have aggressively moved to connect their machines with IoT. While there have been good advancements in automated service and maintenance execution, the real benefit received across manufacturing companies has still been somewhat limited. One reason is that IoT produces large amounts of data that are difficult to analyze, and the available analytical tools have not been robust enough to do so. New AI technologies can now help resolve these challenges by quickly reviewing and analyzing vast amounts of data to make sense of it.

Data completeness and uniqueness impact accuracy, creating significant challenges that are made more difficult by a multitude of overlapping data source systems. Manufacturers are addressing this problem by consolidating systems to minimize required integrations. Where this is not possible, a best practice is to have a single master for each data element. While complementary systems can serve to enrich the data, no data element should be mastered in more than one system. This is where a master data management strategy is leveraged to define where data will be housed and how it will be used. Planning for this at the beginning of a project reduces unforeseen issues later.
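The single-master rule above can be audited mechanically: scan the records each system claims to own and flag any data element mastered in more than one place. The system and field names here are hypothetical:

```python
# A minimal sketch of a master-data audit: flag any part number that is
# mastered in more than one system. System/field names are illustrative.
from collections import defaultdict

records = [
    {"system": "ERP", "part": "A100", "description": "Bracket"},
    {"system": "PLM", "part": "A100", "description": "Bracket, steel"},
    {"system": "ERP", "part": "B200", "description": "Housing"},
]

masters = defaultdict(set)
for r in records:
    masters[r["part"]].add(r["system"])

# Parts owned by more than one system violate the single-master rule.
conflicts = sorted(part for part, systems in masters.items() if len(systems) > 1)
print(conflicts)  # → ['A100']
```

Running an audit like this early in a project surfaces the overlapping-ownership conflicts that a master data management strategy then resolves by assigning each element a single system of record.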

As manufacturers continue to embark on these transformational journeys to digitize their business and provide seamless connected processes, focusing on the quality of their data will be a critical factor determining their success. M

About the author:

 

John Barcus is Group Vice President, Manufacturing Industries & Emerging Technologies at Oracle.
