
Why composition is key to scaling digital twins



Digital twins allow enterprises to model and simulate buildings, products, production lines, facilities and processes. This can improve performance, identify quality errors quickly and support better decision-making. Today, most digital twin projects are one-off endeavors. A team might create one digital twin for a new gearbox, then start from scratch when modeling the wind turbine that contains this part, or the business process that repairs it.

Ideally, engineers would like to quickly assemble more complex digital twins to represent turbines, wind farms, power grids and utility companies. This is complicated by the various components that go into the digital twin alongside the physical models, such as data management, semantic labels, security, and the user interface (UI). New approaches to assembling digital elements into larger assemblies and models can help simplify this process.

Gartner has predicted that the digital twin market will cross the chasm in 2026 and reach $183 billion by 2031, with composite digital twins offering the greatest opportunity. It recommends that product leaders build ecosystems and libraries of prebuilt capabilities and vertical-market templates to drive competitiveness in the digital twin market. The industry is starting to take note.

The Digital Twin Consortium recently released its Capabilities Periodic Table (CPT) framework to help organizations develop composable digital twins. It organizes the landscape of supporting technologies to help teams lay the groundwork for integrating individual digital twins.


A new kind of model

There are significant similarities and differences in the modeling used to build digital twins compared to other analytics and artificial intelligence (AI) models. All these efforts begin with appropriate and timely historical data to inform the model design and calibrate the current state with model results.

However, digital twin simulations are unique compared to traditional statistical learning approaches because the model structures are not learned directly from the data, Bret Greenstein, data, analytics and AI partner at PwC, told VentureBeat. Instead, a model structure is surfaced by modelers through interviews, research and design sessions with domain experts to align with the strategic or operational questions that are predefined.

As a result, domain experts should be involved in informing and validating the model structure. This time investment can limit the scope of simulations to applications where ongoing scenario analysis is required. Greenstein also believes that developing a digital twin model is an ongoing exercise. Model granularity and system boundaries must be carefully considered and defined to balance time investment and model suitability with the questions they should support.

“If organizations are unable to effectively draw boundaries around the details a simulation model captures, ROI will be extremely difficult to achieve,” Greenstein said.

For example, an organization can create a digital network twin on a millisecond timescale to model the resiliency and capacity of the network. It can also have a customer acceptance model to understand the demand on the scale of months. This exploration of customer demand and user behavior at the macro level can serve as input for a micro-simulation of the network infrastructure.
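The macro-to-micro coupling described above can be sketched in a few lines. This is a purely illustrative toy model, not any vendor's implementation: a coarse monthly demand forecast parameterizes a fine-grained network load simulation, with all function names and figures invented for the example.

```python
def monthly_demand(month: int, base_users: int = 10_000, growth: float = 0.02) -> int:
    """Macro model: projected subscribers after `month` months of compound growth."""
    return round(base_users * (1 + growth) ** month)

def simulate_network_load(users: int, capacity: int, steps: int = 1_000) -> float:
    """Micro model: fraction of simulated ticks where demand exceeds capacity.
    A trivial stand-in for a real millisecond-scale network simulation."""
    overloaded = 0
    for step in range(steps):
        # Crude per-tick demand curve: load ramps up over each 100-tick cycle.
        active = users * (0.5 + 0.5 * (step % 100) / 100)
        if active > capacity:
            overloaded += 1
    return overloaded / steps

# Couple the two twins: the macro forecast feeds the micro simulation.
users_in_12_months = monthly_demand(12)
overload_ratio = simulate_network_load(users_in_12_months, capacity=11_000)
print(f"{users_in_12_months} users, overloaded {overload_ratio:.0%} of ticks")
```

The point of the sketch is the interface between the two models: the customer-demand twin and the network twin stay independent, and only a small, well-defined output (projected users) crosses the boundary.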

Composable Digital Twins

This is where the DTC’s new CPT framework comes in handy. Pieter van Schalkwyk, CEO at XMPRO and co-chair of the Natural Resources Work Group at the Digital Twin Consortium, said the CPT provides a common approach for multidisciplinary teams to collaborate earlier in the development cycle. An important element is a frame of reference covering six capability categories: data services, integration, intelligence, UX, management and trustworthiness.

This can help companies identify composability gaps that they need to address internally or with external tools. The framework also helps identify integrations at the level of specific capabilities. As a result, organizations can think about building a portfolio of reusable capabilities, which reduces duplication of effort.
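A minimal sketch of this kind of gap analysis: the six capability categories become an enum, a planned twin is described purely as a set of required capabilities, and set difference against the organization's existing portfolio surfaces what must be built or bought. The category names follow the DTC's published table; everything else here is hypothetical.

```python
from enum import Enum

class Capability(Enum):
    DATA_SERVICES = "data services"
    INTEGRATION = "integration"
    INTELLIGENCE = "intelligence"
    UX = "ux"
    MANAGEMENT = "management"
    TRUSTWORTHINESS = "trustworthiness"

# Reusable capabilities the organization has already built (illustrative).
portfolio = {Capability.DATA_SERVICES, Capability.INTEGRATION, Capability.UX}

# A use case described only in terms of the capabilities it requires.
wind_turbine_twin = {
    Capability.DATA_SERVICES,
    Capability.INTELLIGENCE,
    Capability.TRUSTWORTHINESS,
}

# Gap analysis: capabilities to address internally or with external tools.
gaps = wind_turbine_twin - portfolio
print(sorted(c.value for c in gaps))  # → ['intelligence', 'trustworthiness']
```

Keeping the requirement description technology-agnostic, as the CPT intends, is what makes the same analysis reusable across vendors and architectures.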

This approach goes beyond how engineers currently integrate multiple components into larger structures in computer-aided design tools. Schalkwyk said: “Design tools allow engineering teams to combine models such as CAD, 3D and BIM in design assemblies, but are typically not suitable for instantiating multi-use case digital twins and synchronizing data at a required twinning rate.”

Packaging capabilities

In contrast, a composable digital twin draws from six clusters of capabilities that help manage the integrated model and other digital twin instances based on the model. It can also combine IoT and other data services to provide an up-to-date representation of the entity the digital twin represents. The CPT lays out these capabilities as a periodic table, making it agnostic to any particular technology or architecture.

“The goal is to describe a business requirement or use case only in terms of capabilities,” explains Schalkwyk.

Describing the digital twin in terms of capabilities helps align a specific implementation with the technologies that provide the right capabilities. This reflects the broader industry trend towards composable business applications. This approach allows different roles such as engineers, scientists, and other subject matter experts to assemble and reassemble digital twins for different business requirements.

It also creates an opportunity for new packaged business capabilities that can be used across industries. For example, a “leak detection” packaged business capability could combine data integration and analytics into a reusable component applicable to a wide range of digital twin use cases, Schalkwyk explained. It could serve digital twins in oil and gas, process manufacturing, mining, agriculture and water utilities.
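The reuse pattern can be illustrated with a toy leak detector. This is a hypothetical sketch, not a real product interface: the detection logic is deliberately trivial, and the point is that one packaged capability, configured differently, plugs into twins from different industries.

```python
from dataclasses import dataclass

@dataclass
class LeakDetector:
    """Reusable packaged capability: flags a leak when downstream
    flow falls too far below upstream flow."""
    tolerance: float = 0.05  # fractional loss allowed before flagging

    def check(self, upstream: float, downstream: float) -> bool:
        # Flag a leak when the fractional loss exceeds the tolerance.
        return (upstream - downstream) / upstream > self.tolerance

# The same packaged capability plugged into two different digital twins,
# each tuned to its industry's norms (thresholds are invented):
pipeline_twin = {"industry": "oil & gas", "leak": LeakDetector(tolerance=0.02)}
water_twin = {"industry": "water utility", "leak": LeakDetector(tolerance=0.10)}

print(pipeline_twin["leak"].check(100.0, 97.0))  # 3% loss → True at 2% tolerance
print(water_twin["leak"].check(100.0, 97.0))     # 3% loss → False at 10% tolerance
```

A real packaged business capability would wrap far more (data ingestion, model management, alerting), but the composition idea is the same: the consuming twin depends only on the capability's interface, not its internals.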

Composability Challenges

Alisha Mittal, practice director at Everest Group, said, “Many digital twin projects today are in pilot stages or targeting very unique assets or processes.”

Everest research has found that only about 15% of enterprises have successfully implemented digital twins across multiple entities.

“While digital twins offer huge potential for operational efficiency and cost reduction, the main reason for this slow scaled adoption is the composability challenges,” said Mittal.

Engineers are struggling to integrate the different ways equipment and sensors collect, process and format data. This complexity is exacerbated by the lack of common standards and frames of reference to allow easy data exchange.

Suseel Menon, senior analyst at Everest Group, said some of the critical challenges they heard from companies trying to scale digital twins include:

  • Nascent data landscape: Getting data architectures and data flows in order is often one of the biggest hurdles to overcome before digital twins can be fully scaled to factory or enterprise level.
  • System complexity: It is rare for two physical things to be similar in a large operation, which makes integration and scalability difficult.
  • Talent Availability: Companies struggle to find talent with the right technical and IT skills.
  • Limited verticalization in out-of-the-box platforms and solutions: Solutions that work for assets or processes in one industry may not work in another.

Stringing the pieces together

Schalkwyk said the next step is to develop a second layer of the composability framework with more detailed descriptions of the capabilities. A separate effort on a digital-twin-capabilities-as-a-service model will describe how digital twin capabilities can be delivered in a zero-touch approach through a capability marketplace.

Ultimately, these efforts could also lay the foundation for digital threads that help connect processes that span multiple digital twins.

“For the foreseeable future, we believe a digital-thread-centric approach will be central to enabling integration, both at the data platform silo level and at the organizational level,” said Mittal. “DataOps-as-a-service for data transformation, harmonization and integration across platforms will be a critical opportunity to enable composable and scalable digital twin initiatives.”



