Harnessing data’s power requires the right methodology and organizational alignment. By Sath Rao
Harnessing the power of big data is a primary component of any digital transformation effort and central to realizing the promise of Manufacturing 4.0. According to IDC's Data Age 2025 study, the global datasphere will grow from 33 zettabytes in 2018 to 175 zettabytes in 2025. This requires manufacturing leaders and their teams to develop new digital acumen. The industry also needs a new mindset, based on pragmatic principles and practices that converge seemingly conflicting approaches to extracting insights from data.
To get the most from data, organizations are learning that they must improve the quality and reduce the cycle time of analytics efforts. In a world where resources were key contributors to product costs and profitability, the focus was on achieving economies of scale and maintaining tight control over resources. Under this earlier paradigm, manufacturing organizations attempted to eliminate three types of deviations, known as the Three M's, that contributed to inefficient allocation of resources: muda (waste), mura (unevenness), and muri (overburden). As we move into an era where the business differentiators come from service delivery and customer experience, the focus will expand to economies of learning, driven by a workforce that is trained to recognize and seize opportunities by deriving value from data.
Extracting the 5th V from Big Data
For well over a decade, learning efforts around big data were governed by four primary characteristics, known as the Four V's (volume, velocity, variety and veracity), that help data scientists and others categorize and prepare data for analysis. Organizations have spent countless hours understanding and modernizing the systems needed to wrangle, store and access the increasing flows of data so they can be analyzed. However, most organizations have yet to step into the realm of consistent and broad application of analytics that allows the accelerated extraction of the 5th V – value (see Figure 1).
In this case, value relates to the worth of the specific, actionable insights that can be developed from the data that is often caught in organizational data swamps. What will characterize leaders will be their ability to achieve better value through superior analytics and machine learning/artificial intelligence-based insights generation – this is the holy grail.
We are familiar with the almost incomprehensible statistics about the massive quantities of data that are being generated daily and how there appears to be no end in sight. This unbridled growth places a greater emphasis on extracting value from data, from every corner of an organization. Indeed, having access to endless amounts of data is not valuable in itself. Unless that data can be transformed into insights, it remains essentially useless and without business value.
The Undeniable Need for Data Science and More
Former world chess champion Garry Kasparov recently wrote in The Wall Street Journal that the real promise of a new generation of analytics infused with AI is creating new knowledge, not just good results. This is demonstrated by the recent release of DeepMind Technologies' latest chess project, AlphaZero, a generic machine-learning algorithm that began with no chess knowledge beyond the rules of the game. In a few short hours, AlphaZero taught itself to play better than the strongest traditional programs, which had been tutored by existing knowledge of how humans play based on established moves.
This is significant because it represents a tipping point in how we think about the data that surrounds us and how we might be able to derive value. We can no longer assume that human interpretation is necessarily the best, most insightful perspective on how to bring data together in a way that provides a solid foundation for decision making.
“Unless data can be transformed into insights, it remains essentially useless and without business value.”
In manufacturing settings, it is easy to imagine how machine-generated data will soon dwarf human-generated data as sensors proliferate, edge computing matures, and video systems are deployed throughout the process. Given the shift in how data is generated, leadership in organizations is now fundamentally charged with extracting the value from all types of data and systems. The previous focus was on optimizing parts of systems based on initiatives like engineering design integration, extended supply-chain integration and customer experience. Now the focus has shifted to developing the ability to drive optimization at the system-of-systems level.
Developing this capability will put unprecedented pressure on organizational leadership to focus on developing and changing skill sets for the next-generation workforce. Teams will need to practice a DevOps-style convergence between IT and OT while also grasping Lean practices, domain competency and the foundations of agile. Clearly, this will require not just the addition of data scientists but true collaboration across a range of functional teams.
Enter DataOps – a set of principles and approaches that makes distributed data architecture work at the right speed to deliver business agility. As businesses accelerate their run-grow-transform cycles to manage day-to-day operations and prepare for disruptions ahead, the ability to orchestrate infrastructure agility, data agility and business agility will become a key differentiator.
Rather than simply hiring data scientists in an attempt to solve every issue, one way to look at the new paradigm is to focus on aligning skill sets to deliver the Four C's of the DataOps methodology (Figure 2): Connected, Curated, Contextualized, Cyber-Confidential. The Four C's represent the way organizations handle data and make it available for analytics processing that enables value extraction. The workforce and culture within most organizations do not generally support this direction right now.
How the Converged Four C’s Put DataOps at the Center of Digital Transformation
If you think of an organization's digital transformation as being cyclical and ongoing, then it can be viewed as constantly moving between run and transform states (see Figure 2). In the run-to-transform approach that organizations are deploying to navigate the overall industry shift, attention has been focused predominantly on the architectural transformations brought about by new technologies (e.g., IIoT, AI and cloud-edge capabilities).
Until recently, the focus was on DevOps and IT/OT convergence, not necessarily on developing the competencies that support the Four C's needed for DataOps success. Of course, DevOps and IT/OT convergence are still important, and most organizations are still working to become proficient in these areas. Going forward, however, the convergence of all Four C's with the functional aspects of data integration, data engineering, data quality and data security-privacy is the capability and skill set that organizations need to focus on.
CONNECTED DATA. In the digital transformation journey, an organization starts by integrating data from a variety of sources – leveraging IIoT, cloud and edge connectivity, for example – to move "constrained" data out of silos and into data lakes. Cloud-edge and IIoT technologies have made it widely accepted that data kinetics enable value maximization – that is, data storage decisions should be based on where data can best be accessed and curated rather than on lowest cost. By gaining access to all types of data, organizations can more readily blend, analyze and gain insights that would otherwise remain inaccessible.
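As a minimal illustration of this connected-data step, the sketch below lands two siloed feeds – a flat-file ERP export and a process historian table – in a common raw zone of a data lake. The file names, table layouts and local "lake" path are hypothetical, and pandas with a Parquet engine (pyarrow or fastparquet) is assumed; a real plant would connect through IIoT gateways, historian APIs or cloud ingestion services rather than local files.

```python
# Minimal sketch: landing siloed ERP and historian data in a common lake "raw" zone.
# File names, table names, and column names are hypothetical placeholders.
import sqlite3
from pathlib import Path

import pandas as pd

LAKE_RAW = Path("datalake/raw")          # hypothetical raw zone on local disk
LAKE_RAW.mkdir(parents=True, exist_ok=True)

# Silo 1: a flat-file ERP export of work orders (assumed CSV layout).
erp = pd.read_csv("erp_work_orders.csv", parse_dates=["completed_at"])
erp["source_system"] = "erp"

# Silo 2: a process historian exposed through SQLite for illustration only;
# a real plant would use an OPC UA gateway, an IIoT hub, or the historian's own API.
with sqlite3.connect("historian.db") as conn:
    tags = pd.read_sql_query(
        "SELECT tag_name, value, quality, ts FROM sensor_readings",
        conn,
        parse_dates=["ts"],
    )
tags["source_system"] = "historian"

# Land each feed, tagged by source and load date, so downstream curation can
# find and blend the data without going back to the silos.
load_date = pd.Timestamp.now(tz="UTC").strftime("%Y%m%d")
erp.to_parquet(LAKE_RAW / f"erp_work_orders_{load_date}.parquet", index=False)
tags.to_parquet(LAKE_RAW / f"historian_tags_{load_date}.parquet", index=False)
```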
CURATED DATA. Data kinetics are valuable, but the ability to learn from past behavior, act on customer responses, and drive agility requires data to be curated. As large volumes of structured and unstructured data continue to pour into data lakes, unattended data flows turn the lake into more of a data swamp, where it can be difficult to discern which data is relevant to bring together. For data to be valuable, data engineering must occur so that users gain a curated view of what is available, making access to relevant data easy and effective.
It is important to note that curation is a data engineering function that covers everything from extraction, cleansing, normalization and standardization to entity consolidation, cluster reduction and export. Increasing automation and leveraging platform tools and capabilities to facilitate curation will require the next wave of investments and the development of data science skills by organizations.
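A minimal sketch of one such curation pass is shown below, using pandas to cleanse, normalize, standardize and consolidate a hypothetical supplier master pulled from the raw zone. The column names and file paths are assumptions for illustration; production curation would typically run inside a managed pipeline or platform tool rather than a standalone script.

```python
# Minimal curation sketch: cleanse, standardize, and consolidate supplier records.
# Column names and lake paths are hypothetical placeholders.
import pandas as pd

raw = pd.read_parquet("datalake/raw/supplier_master_20240101.parquet")

curated = (
    raw
    # Cleansing: drop rows missing the business key and exact duplicates.
    .dropna(subset=["supplier_id"])
    .drop_duplicates()
    # Normalizing: consistent casing and whitespace so joins and grouping behave.
    .assign(
        supplier_name=lambda d: d["supplier_name"].str.strip().str.upper(),
        country=lambda d: d["country"].str.strip().str.upper(),
    )
    # Standardizing: coerce lead time to a single numeric unit (days).
    .assign(lead_time_days=lambda d: pd.to_numeric(d["lead_time_days"], errors="coerce"))
)

# Entity consolidation: collapse near-duplicate supplier rows onto one record
# per business key, keeping the most recently updated version.
curated = (
    curated.sort_values("updated_at")
           .groupby("supplier_id", as_index=False)
           .last()
)

# Export the curated view for easy, reliable downstream access.
curated.to_parquet("datalake/curated/supplier_master.parquet", index=False)
```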
CONTEXTUALIZED DATA. To extract actionable value from data, organizations will need to develop expertise in handling both structured and unstructured data, with a premium on adding context based on unique operational situations. It will not be enough to merely blend and curate the data that is received – the next layer of value will be derived by adding data qualifiers that augment the meaning of the original data. Contextualization enables businesses to be more agile by bringing data together in new ways that shed new light on a business or operations issue.
For example, video/LiDAR (Light Detection and Ranging) data of human motions on the factory floor can be combined with parts quality defect data to deliver insight on what corrective action is needed. Knowing where defects occur and whether human factors are involved at those points reveals an entirely new view of the factors affecting quality. In isolation, manufacturing data alone might not have exposed the actual cause of the defects.
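The sketch below illustrates this kind of contextualization under assumed table layouts: each curated defect record is matched to the nearest operator-motion event from a video/LiDAR pipeline at the same station, within a 30-second window, using pandas' merge_asof. The column names, file paths and time window are illustrative choices, not a prescribed design.

```python
# Minimal contextualization sketch: attach the nearest human-motion event
# (from a video/LiDAR pipeline) to each recorded defect at the same station.
# Table layouts and column names are assumptions for illustration.
import pandas as pd

defects = pd.read_parquet("datalake/curated/part_defects.parquet")     # part_id, station, defect_code, ts
motion = pd.read_parquet("datalake/curated/operator_motion.parquet")   # station, motion_class, ts

# merge_asof requires both frames sorted on the time key; match per station,
# within a 30-second window, so each defect picks up the closest operator action.
defects = defects.sort_values("ts")
motion = motion.sort_values("ts")

contextualized = pd.merge_asof(
    defects,
    motion,
    on="ts",
    by="station",
    direction="nearest",
    tolerance=pd.Timedelta("30s"),
)

# Defects with no nearby motion event keep a NaN motion_class, a hint that the
# cause is more likely machine- or material-related than a human factor.
print(contextualized.groupby(["defect_code", "motion_class"], dropna=False).size())
```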
As with other areas of digital transformation, expanding this maturity across the organization and addressing data quality issues will require new skills, prioritization, an understanding of business scenarios, and automated/ML approaches that help apply context at scale.
“It is about getting the right data to the right place at the right time to enable data-driven innovation.”
CYBER-CONFIDENTIAL. One of the most intriguing promises of Manufacturing 4.0 is the prospect of mass customization. The ability to deliver on that promise relies on blending data from a variety of sources, including data from customers, among many others. Because companies will rely on integrating customer-sourced data into production, the emphasis on protecting and securing data will become even more pronounced than it is today. Organizations simply cannot risk exposing detailed customer data when any change to the way data is handled can have unintended consequences. Maintaining confidentiality across more and varied input sources brings new levels of complexity to data governance, security, and privacy.
Similarly, as connected products are adopted, the threat surface increases exponentially. Every product, sensor and edge device is a potential attack point that must be safeguarded, which means that resource allocation to cybersecurity, data privacy, and compliance issues will need to keep pace. Digital twins of products, processes and the customer experience will open new concerns about data security and privacy. Vulnerabilities in legacy control systems and the potential for IP theft also need to be factored in as organizations accelerate their transformation cycles. As attack surfaces expand, threats like advanced malware, worms and advanced persistent threats, coupled with GDPR compliance obligations, will require new workforce skills.
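One simple cyber-confidentiality measure, sketched below under assumed column names and a hypothetical salt source, is to pseudonymize customer identifiers with a salted hash and drop direct identifiers before customer-sourced data is blended into the shared lake. Pseudonymization is not full anonymization, but it limits exposure while preserving the ability to join records on the token.

```python
# Minimal cyber-confidentiality sketch: pseudonymize customer identifiers before
# customer-sourced order data enters the shared lake, so downstream analytics can
# still join on the token without handling raw IDs or email addresses.
# The salt source, file names, and column names are assumptions for illustration.
import hashlib
import os

import pandas as pd

# In practice the salt would come from a managed secret store, not an env default.
SALT = os.environ.get("CUSTOMER_ID_SALT", "replace-with-a-managed-secret")

def pseudonymize(value: str) -> str:
    """Return a deterministic, non-reversible token for a customer identifier."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

orders = pd.read_csv("customer_orders.csv")  # hypothetical customer-sourced feed
orders["customer_token"] = orders["customer_id"].astype(str).map(pseudonymize)

# Drop direct identifiers before the data is blended into shared zones.
orders = orders.drop(columns=["customer_id", "customer_email"])

orders.to_parquet("datalake/curated/customer_orders.parquet", index=False)
```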
DataOps: Accelerating Time-to-Value All the Time
DataOps in manufacturing merges lean manufacturing with the agile methodology utilized by DevOps, which improves responsiveness and operates with iterative, incremental work sequences known as sprints. DataOps creates value by maximizing return on data through the use of advanced analytics and by reducing data’s time-to-value. It is about getting the right data to the right place at the right time to enable data-driven innovation.
Over the last century, the industry progressed by expanding on manufacturing's original 4Ms (men, methods, material, machine) to become the 8Ms, adding mother nature (environmental factors), maintenance, measurement, and management. With DataOps, we now have the convergence of the Four C's, placing a renewed emphasis on managing data factories to realize the factory of the future. For Manufacturing 4.0, DataOps is a key skill set that organizations must build, nurture, and use as a cultural driver.
DataOps for Manufacturing 4.0
The focus of Manufacturing 4.0 will be maximizing return on data. This will entail an integrated approach to extracting value from data – from storage, to enrichment (edge-to-cloud infrastructure), to activation (intelligent data operations), to monetizing the value of data (through data-driven solutions). The milestones on this journey toward increased data maturity run from infrastructure agility to data agility and onwards to business agility. Improving your organization's DataOps quotient and capability will be instrumental in winning this race! M
1. Sath Rao, "Vision 2030 – A Framework for Future Factories," Manufacturing Leadership Council, http://www.mljournal-digital.com/meleadershipjournal/february_2017?pg=25#pg25
2. Garry Kasparov, "Intelligent Machines Will Teach Us—Not Replace Us," The Wall Street Journal, https://www.wsj.com/articles/intelligent-machines-will-teach-usnot-replace-us-1525704147
3. Bill Schmarzo (CTO, IoT & Analytics, Hitachi Vantara), "Data Curation: Weaving Raw Data into Business Gold," https://www.linkedin.com/pulse/data-curation-weaving-raw-business-gold-bill-schmarzo/