Smart Decisions from Smart Data with Digital Twins Powered by MQTT and UNS


The transition from smart data to smart decisions represents a critical evolution in modern manufacturing, and digital twins are at the heart of this transformation. By creating detailed, dynamic virtual replicas of physical assets and processes, digital twins empower organizations to harness the full potential of their data. They provide real-time insights, predictive analytics, and actionable intelligence that drive more informed and effective decision-making across every level of operations. In this blog post, we will cover the transformative impact of digital twins in turning smart data into smart decisions, supported by a robust data management solution powered by MQTT and Unified Namespace (UNS).

What is a Digital Factory and How Does a Digital Twin Fit In?

The term “digital factory” describes a transformative shift in business processes. At its core, the digital factory is about eliminating paper from the manufacturing process. It enables equipment to communicate directly with other equipment, applications, and processes. It enables real-time decisions, ensuring timely and accurate responses to ever-changing manufacturing needs. By harnessing historical data and combining it with current metrics, the digital factory can predict future trends, preempt challenges, and optimize operations like never before.

Fig. 1: Industry 4.0 Technologies enabling the digital factory

In a digital factory, digital twin technology creates a virtual replica of physical objects, linking the physical and digital realms. This enables real-time data flow and enhances decision-making, perfectly aligning with the digital factory’s objectives.

Industry 4.0 provides the technologies that enable the digital factory (Fig. 1): real-time information that transforms decision-making, predictions that help enterprises stay ahead of market trends, autonomous factory operations, entirely new revenue streams for factories, fully integrated systems data, and processes radically transformed by new technologies.

Industrial Digital Twins

Fig. 2: Digital twin representation of a building

The physical object used to create a digital twin in the industrial world could be an operational system such as a production line, an HVAC system, an electrical system, or a data center (Fig. 2). Sensors on the object capture different aspects of asset performance, energy output, and system health, while downstream analytics monitor setpoints, schedules, sequences, or models to identify issues. This data is then relayed to a processing system and applied to the digital copy. A digital twin mirrors the state of an asset, person, process, or organization, and their interactions. It offers spatial context to the unified data, which helps with the decision process; that spatial element also provides deeper context and draws relationships to other elements.

Gartner estimates that by 2027, over 40 percent of large companies worldwide in the industrial sector will be using digital twins in their projects. According to the IoT Analytics Digital Twin Market Report, the digital twin market is expanding, with a projected CAGR of 30% between 2023 and 2027 (Fig. 3). 29% of global manufacturing companies have either fully or partially implemented their digital twin strategies.

Fig. 3: IoT Analytics Digital Twin Market

Digital twins are revolutionizing manufacturing by providing a detailed, real-time understanding of every aspect of production, leading to additional revenue, cost savings, or productivity gains. Whether through optimizing processes, enhancing product design, or ensuring equipment reliability, the use cases for digital twins are broad and transformative. As the technology continues to evolve, its adoption will likely become essential for manufacturers aiming to stay competitive in an increasingly digital and data-driven industry.

Interoperable Digital Twins

Data Assimilation

A digital twin’s value lies in its ability to bring together spatial, live, historical, and static data. Unifying all of that data requires a common data model, so that the information is synchronized, ideally delivered in the same payload, provides a snapshot in time, and carries historical context that lets the user understand the true impact of what is going on in the organization. This data model could be a Unified Namespace (UNS), which we will cover later in this article.
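
As a concrete illustration, here is a minimal sketch of what a message in such a common data model might look like. The asset path and field names are illustrative assumptions, not a prescribed standard:

```python
import json
from datetime import datetime, timezone

# A self-describing snapshot: timestamp, asset path (spatial context),
# value with units, a quality flag, and a slice of recent history so a
# consumer can judge the trend without querying a separate system.
payload = {
    "timestamp": datetime.now(timezone.utc).isoformat(),  # snapshot in time
    "asset": "acme/dallas/packaging/line-4/filler-2",     # spatial context
    "metric": "bearing_temperature",
    "value": 78.4,
    "unit": "degC",
    "quality": "GOOD",
    "history_1h": {"min": 71.2, "max": 79.1, "avg": 74.6},  # historical context
}

print(json.dumps(payload, indent=2))
```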

Semantic Relationship and Data Orchestration

There also needs to be context to the data, including the semantic relationships needed to understand how an event downstream can impact an event upstream, or vice versa. For example, temperature sensor data (while holding valuable information about the factory) cannot provide an operating picture in isolation. It becomes far more powerful in the context of semantic models, where the data is orchestrated to form the digital twin.

Visualization and Consumption Interface

This is all about providing the right amount of data to the right people, in the format they need to make their decisions. If we are going through the effort of constructing this visualization and consumption interface, it has to deliver real value to the operator. It must move the needle and impact their business process. It’s all about making their life easier and delivering better-quality, quicker outcomes.

Data Consumers Have Various Needs

Different data consumers have different use cases. Based on who is consuming the data, the key is surfacing the right data at the right time to provide an intelligent, prepared response with detailed, clear recommendations.

Digital Twin Maturity Model

A digital twin project is always a journey. Such projects start with leadership buy-in and a realization that the world has changed. The workforce with institutional knowledge about operations has retired; to compensate, that institutional knowledge needs to be baked into digital processes that offer recommendations and tools for crunching data and making quick decisions. The starting point in the Digital Twin Maturity Model (Fig. 4) is the System of Systems approach and a Common Data Environment (CDE) that brings together all of the data points that matter.

Fig. 4: Digital Twin Maturity Model (Credit: Digital Twin Consulting)

Level 1: Descriptive

The first step is describing the physical environment and understanding it. At this point, the data is static in specific systems and there are no integrations, communication, or automation built in. 

Level 2: Diagnostic

The second step is diagnosing with a certain level of intelligence. We start to bring some related things together, create some automations, and build dashboards with operational insights.

Level 3: Predictive

The third step is to be able to predict outcomes by consolidating and contextualizing data. This would involve some more automation with connected data and systems. This would also involve machine learning models. 

Level 4: Prescriptive

The fourth step is prescribing solutions based on real-time data from integrated systems and making recommendations for future optimizations by being situationally aware.

Level 5: Autonomous

The fifth and final step is to reach autonomous processes, where continuous decisions can be made based on business intelligence. This level involves heavy use of AI, especially generative AI built on mature data sets, and the system is completely self-sufficient. Many companies are not quite there yet, as they still need the human element to train the models and retain some control.

Best Practices in Setting Up and Maintaining Digital Twins

Fig. 5: Digital Twin setup process (Credit: Digital Twin Consulting)

As shown in Fig. 5, best practice for setting up and maintaining digital twins is a structured approach, with steps ranging from defining objectives and requirements to institutionalizing them. Along the way, a number of actions need to be taken, as described above.

Failures typically happen in an organization because the operational impacts of institutionalizing these processes were not foreseen. It is important to understand the business problems we are trying to solve, what sort of data is needed (such as real-time vs. non-real-time data), and what incremental value the project can drive for the company, and then build on its success to drive the rest of the digital twin projects.

Digital Twin Assessment Process (DTAP™)

Digital Twin Consulting’s Digital Twin Assessment Process (DTAP™) is critical for setting up and maintaining digital twins, from defining objectives to institutionalizing the process and launching enterprise-wide implementations. Successful digital twin programs focus on strategic master planning and project roadmaps, with agile feedback loops to stakeholders to ensure use case deployments are successful and project goals are met.

Fig. 6: DTAP process (Credit: Digital Twin Consulting)

The DTAP process involves three main phases, as described in Fig. 6 above.

Phase 1: Strategic Planning

This step is all about understanding leadership goals, identifying business needs, gathering data about business systems, assessing data quality, identifying various use cases to drive enterprise value, understanding data and system dependencies, assessing required investments, uncovering critical gaps, and aligning with potential financial outcomes. 

Phase 2: Roadmap Development 

This step involves going through a gated process to develop tactical project roadmaps for each digital twin use case implementation, including project planning, estimating costs, scheduling tasks, ensuring that available technologies scale to the use cases, and gathering the necessary data points to support each project roadmap.

Phase 3: Implementation

This step involves oversight of each use case implementation project and program management of digital twin testing, training, launch, and adoption.

Data Management Best Practices for Digital Twins

Traditional Factory Architecture

Most existing industrial systems follow the traditional pyramidal network-and-system architecture (the ISA95 functional model). As depicted in Fig. 7 below, this approach is characterized by a technology stack with factory-floor components at the bottom and enterprise/cloud components at the top. Within this stack, each layer communicates only with the layer directly above or below it, so data moves up or down one layer at a time using point-to-point connections.

Fig. 7: Traditional ISA95 industrial data architecture

This architecture served its purpose when manufacturing systems were isolated, didn’t need to talk to each other, data didn’t need to be combined, and security technologies weren’t mature enough. For Industry 4.0 digital use cases (such as digital twins) that need data from every layer, however, it falls short. It is not scalable, due to the point-to-point integrations needed to move data from shop floor to top floor; it stifles innovation, accrues technical debt, and results in data silos. The traditional Industrial Internet of Things (IIoT) client-server architecture (Fig. 8), which was created to overcome these data silos, is also limiting: it requires multiple dedicated connections, which leads to a spaghetti architecture and strains network bandwidth.

Fig. 8: Typical client-server based IIoT architecture

MQTT was created as an IIoT messaging protocol to overcome these challenges. It is a lightweight, open-standard publish/subscribe messaging protocol used in IIoT for data communication, designed for efficient communication between industrial data sources and smart factory solutions over low-bandwidth, high-latency, or unreliable networks. Instead of setting up direct connections as in client-server architectures, in an MQTT architecture both publishing and subscribing clients communicate through a centralized server known as a message broker (Fig. 9). Devices publish data to the broker, applications subscribe to that data, and multiple enterprise systems can ingest various manufacturing data via MQTT topics. MQTT solutions like HiveMQ Enterprise Broker can address many of the traditional IIoT data challenges faced in digital twin use cases by facilitating real-time data exchange with high reliability, scalability, and security.

Fig. 9: MQTT publish/subscribe data architecture
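
To make the publish/subscribe pattern concrete, here is a minimal sketch using the open-source Eclipse paho-mqtt Python client (assuming paho-mqtt 2.x; the broker hostname and topics are illustrative, not HiveMQ-specific):

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"  # illustrative hostname, e.g. a HiveMQ endpoint
TOPIC_FILTER = "acme/dallas/packaging/line-4/#"  # '#' = multi-level wildcard

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe once connected; this client now receives every metric
    # published under line-4 without knowing who the publishers are.
    client.subscribe(TOPIC_FILTER, qos=1)

def on_message(client, userdata, message):
    print(f"{message.topic}: {message.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)

# Any other client can publish through the broker; producers and consumers
# are fully decoupled, with no point-to-point connections to maintain.
client.publish("acme/dallas/packaging/line-4/filler-2/temperature",
               payload='{"value": 78.4, "unit": "degC"}', qos=1)
client.loop_forever()
```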

Unified Namespace

Unified Namespace (UNS) is a data framework that helps overcome the challenges of the traditional ISA95 model architecture shown above by providing:

  • The single source of truth for all data and information of the manufacturing operations
  • The place where the current state of the industrial operations lives
  • The hub through which the smart assets in the company’s business communicate
  • The architectural foundation of the company’s Industry 4.0 and digital twin use cases.

A UNS takes manufacturing operational technology (OT) data, typically organized in the ISA model of Enterprise->Site->Area->Line->Cell, into a more open architecture (Fig. 10). In that architecture, the OT data is combined with information technology (IT) data, providing a common, contextualized data layer for the whole enterprise that supports advanced use cases like digital twins.

Fig. 10: UNS-based open data architecture
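
As a rough sketch of how that hierarchy becomes a namespace, the ISA95 levels map naturally onto an MQTT topic path; the level names below are illustrative, not a prescribed naming standard:

```python
# Illustrative mapping of the ISA95 hierarchy onto an MQTT topic path.
ISA95_LEVELS = {
    "enterprise": "acme",
    "site": "dallas",
    "area": "packaging",
    "line": "line-4",
    "cell": "filler-2",
}

def uns_topic(levels: dict, metric: str) -> str:
    """Join the hierarchy levels and a metric name into a UNS topic."""
    return "/".join(levels.values()) + "/" + metric

print(uns_topic(ISA95_LEVELS, "temperature"))
# -> acme/dallas/packaging/line-4/filler-2/temperature
```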

While UNS is a framework that can be implemented in many ways, an MQTT broker with its publish/subscribe architecture is an ideal way to implement it. Now let’s take another look at the traditional ISA95 architecture with UNS and MQTT on top (Fig. 11).

Fig. 11: Traditional ISA95 architecture with UNS and MQTT

As can be seen in the above architecture, the data silos that existed in the ISA95 architecture are broken down. Data can be consolidated into the UNS at the Area, Line, Site, or Enterprise level. This eliminates the need for custom programming and enables digital twin use cases at each of those levels. Using broker federation, it is easy to stack the data layers on top of each other and provide controlled access to the relevant data for various applications and personnel, efficiently and securely.
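
As an illustration of that controlled, level-by-level access, MQTT topic wildcards let each consumer tap the UNS at exactly the level it needs. The topic filters below are hypothetical examples following the hierarchy sketched earlier:

```python
# Hypothetical topic filters for consumers at different hierarchy levels.
# '+' matches exactly one level; '#' matches all remaining levels.
SUBSCRIPTIONS = {
    "line_dashboard": "acme/dallas/packaging/line-4/#",  # one line, all metrics
    "site_energy_app": "acme/dallas/+/+/+/energy",       # one metric, whole site
    "enterprise_twin": "acme/#",                         # the full namespace
}

for consumer, topic_filter in SUBSCRIPTIONS.items():
    print(f"{consumer} subscribes to {topic_filter}")
```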

HiveMQ Data Management Enabling Digital Twins

HiveMQ is a proven enterprise MQTT platform that is reliable under real-world stress, is built for flexibility, security, and scale, and provides real-world solutions to support smart manufacturing digital twin data use cases. As can be seen in Fig. 12, HiveMQ is well suited to implementing a UNS data architecture.

Fig. 12: HiveMQ IIoT Enterprise Data Architecture for Smart Manufacturing

Apart from supporting all the features of the MQTT 3.1.1 and MQTT 5 specifications (such as topic wildcards, persistent sessions with offline queuing, retained messages, and quality-of-service levels), HiveMQ offers additional enterprise-grade features in the areas of reliability, scalability, security, and extensibility.
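
Two of those features are especially useful for digital twins: retained messages let a twin recover the last known state the moment it connects, and persistent sessions let the broker queue updates while a client is offline. A minimal sketch with the paho-mqtt Python client (2.x API; broker and topic names are illustrative):

```python
import paho.mqtt.client as mqtt

# clean_session=False requests a persistent session: the broker queues
# QoS 1/2 messages for this client ID while it is offline.
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2,
                     client_id="twin-feeder-01",
                     clean_session=False)
client.connect("broker.example.com", 1883)

# retain=True makes the broker keep the last value on this topic, so a
# digital twin that subscribes later still receives the current state
# immediately instead of waiting for the next update.
client.publish("acme/dallas/packaging/line-4/filler-2/state",
               payload='{"status": "RUNNING"}',
               qos=1,
               retain=True)
client.disconnect()
```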

Specifically for digital twin use cases, the key need is real-time, high-quality data that is consolidated at the edge, transformed, normalized, and contextualized. This ensures that advanced digital twins can be built to support the various use cases mentioned above.

HiveMQ Edge is a gateway that can translate device sensor data from formats such as OPC UA and Modbus into MQTT; store and forward the data if the connection is lost; transform, normalize, and contextualize the data through Data Hub so that it can be interpreted easily; and bridge to an enterprise broker. Data can be further consolidated, transformed, and contextualized in the enterprise MQTT broker using the Data Hub feature. This enables creation of the UNS at the different levels needed to serve digital twin use cases.
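
The following is a conceptual Python sketch of the kind of normalization and contextualization step described above. It is not HiveMQ Data Hub syntax, and the raw field names are assumptions for illustration:

```python
# Conceptual illustration only: the kind of transform a Data Hub policy
# performs, expressed as plain Python. Raw field names are hypothetical.
def normalize(raw: dict) -> dict:
    """Convert a raw device reading into the common data model."""
    value = raw["val"]
    if raw.get("unit") == "degF":  # normalize all temperatures to Celsius
        value = (value - 32) * 5.0 / 9.0
    return {
        "metric": raw["tag"],
        "value": round(value, 2),
        "unit": "degC",
        "source_protocol": raw.get("proto", "unknown"),  # provenance context
    }

raw_reading = {"tag": "bearing_temperature", "val": 173.1,
               "unit": "degF", "proto": "modbus"}
print(normalize(raw_reading))
# -> {'metric': 'bearing_temperature', 'value': 78.39, 'unit': 'degC',
#     'source_protocol': 'modbus'}
```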

Digital Twin Case Study – DFW Airport

At DFW Airport, the second largest airport in the world, HiveMQ supports digital twin use cases for smart city and logistics needs. Challenges included the need for reliable real-time communication across five different control systems, legacy communication protocols, and remote end-device communications. HiveMQ provided the enterprise MQTT platform solution, which standardized IoT with Sparkplug B and streamed data between the digital twin, remote devices, and controls. This enabled real-time fault detection and diagnosis (FDD) for the digital twins, helping reduce energy use by 20% and improve operational workforce efficiency by 25%. It also enabled district water metering, reducing unaccounted water loss from 25% to 10%, and increased the frequency of flow/pressure/quality reporting from 6-hour to 15-minute intervals. Other benefits included integration of flight data with controls, fault detection, operational oversight for passenger-bridge perfect turns, asset condition monitoring, and progress toward jet-fuel burn reduction goals.

Harnessing the Power of Digital Twins to Make Smart Decisions

For a digital twin implementation to be effective, a robust data management solution is necessary. The IIoT data management solution from HiveMQ, encompassing MQTT and the UNS data framework, can provide that foundation for digital twin use cases. To learn more about HiveMQ, download and try the software for free. To get started on your digital twins and get your DTAP assessment, contact Digital Twin Consulting.


This blog is a guest article from Ravi Subramanyan.
Ravi Subramanyan is Director of Industry Solutions, Manufacturing at HiveMQ. He has extensive experience delivering high-quality products and services that have generated revenues and cost savings of over $10B for companies such as Motorola, GE, Bosch, and Weir. Ravi has successfully launched products, established branding, and created product advertisements and marketing campaigns for global and regional business teams.