Digital Twin

Digital twin technology can deliver significant value to enterprises, yet many organizations struggle to realize the benefits they anticipate. Our interviews with experts identified seven common mistakes they cited as reasons why some organizations fail to implement the technology effectively.

When designing, creating and implementing digital twins, it’s crucial to avoid: 

Oversimplification. Experts warn that, in a rush to implement digital twins, organizations move too swiftly and oversimplify their design, implementation and potential impact. By treating digital twins as simple virtual replicas, they fail to account for the intricate interplay among data, analysis and real-time data pipeline synchronization.

“Many organizations try to create a ‘one size fits all’ digital twin. This doesn’t account for unique aspects within different processes or systems,” says Devin Yaung, senior vice president of group enterprise IoT products and services at telecommunications and IT services provider NTT.

“Digital twins should not be an afterthought,” Yaung warns. “Digital twins need to be done in a holistic manner, with every consideration tied back to what your digital twin is trying to accomplish. This will be different for every industry and every business.”
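To make the synchronization point concrete, here is a minimal sketch of a twin kept in step with live sensor readings rather than treated as a static replica. All names here (the class, its fields, the staleness threshold, the sensor metrics) are hypothetical, invented purely for illustration:

```python
import time
from dataclasses import dataclass, field

# Illustrative sketch only: a digital twin as a living model kept in sync
# with its physical counterpart, not a static replica. All names are invented.

@dataclass
class PumpTwin:
    asset_id: str
    state: dict = field(default_factory=dict)  # last-known sensor values
    last_sync: float = 0.0                     # timestamp of the last update
    max_staleness_s: float = 5.0               # tolerated pipeline lag

    def ingest(self, reading: dict) -> None:
        """Fold one timestamped sensor reading into the twin's state."""
        self.state.update(reading["values"])
        self.last_sync = reading["ts"]

    def is_synchronized(self) -> bool:
        """A twin that has drifted from reality should not drive decisions."""
        return (time.time() - self.last_sync) <= self.max_staleness_s

twin = PumpTwin(asset_id="pump-17")
twin.ingest({"ts": time.time(), "values": {"vibration_mm_s": 2.4, "temp_c": 61.0}})
print(twin.state, twin.is_synchronized())
```

The staleness check is the part the “simple virtual replica” view misses: a twin is only useful while its state provably tracks the physical asset.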

A lack of clear objectives. Failing to define specific business goals or use cases for the digital twin leads to wasted effort and resources. Without clearly defined goals, organizations get lost in options and are unlikely to realize the full potential of their digital twins. Teams flail in digital twin creation, budgets and time are wasted, and the resulting poor return on investment drains executive enthusiasm for funding future model development. No organization will keep investing in flashy but ineffective digital twin models that don’t contribute directly to improved business outcomes.

“You have to build a strong business case for digital twins, as well as investments in automation, and understand how these investments will impact overall productivity and costs,” adds Keith Moore, CEO at AutoScheduler.ai.

Poor data quality. Digital twins are only as good as the data they’re built on. Inaccurate or incomplete data, or faulty model assumptions, will produce unreliable outputs. Even the most carefully designed model cannot be trusted if the data feeding it is faulty, and the fallout reaches far beyond the project itself as executives lose trust in the digital twin concept.

“When you try to create a digital twin, you must understand all of the processes involved, and the inputs must be accurate as well as the model itself,” says Ilya Smirnov, head of AI/ML at software development firm Usetech.

“And without a deep comprehension of the underlying processes, organizations risk developing digital twins that cannot accurately predict and simulate real-world scenarios,” adds Smirnov.
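Smirnov’s point lends itself to a simple guard: screen inputs before they ever reach the model. A minimal sketch follows; the field names and plausibility bounds are assumptions made for illustration, not part of any particular product:

```python
# Illustrative sketch only: screen readings before they reach the model,
# so bad inputs are quarantined instead of silently corrupting outputs.
# Field names and plausibility bounds are assumptions.

PLAUSIBLE_RANGES = {
    "vibration_mm_s": (0.0, 50.0),
    "temp_c": (-20.0, 150.0),
}

def validate_reading(reading: dict) -> list:
    """Return a list of data-quality problems; an empty list means usable."""
    problems = []
    for name, (lo, hi) in PLAUSIBLE_RANGES.items():
        value = reading.get(name)
        if value is None:
            problems.append(f"missing field: {name}")
        elif not lo <= value <= hi:
            problems.append(f"{name}={value} outside plausible range [{lo}, {hi}]")
    return problems

issues = validate_reading({"vibration_mm_s": 2.4})  # temp_c is missing
if issues:
    print("quarantined:", issues)  # don't feed the model untrustworthy data
```

Quarantining a bad reading is cheap; rebuilding executive trust after a twin makes a wrong call on bad data is not.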

Neglecting cultural and human factors. According to Yaung, plenty of experienced workers resist technological change, and digital twin initiatives that ignore this resistance run into trouble.

According to Yaung, this is especially true in areas where physical and digital are converging, such as with mechanics and factory floor workers. “They’ve been doing their jobs since before data was available in the OT [operational technology] environment,” Yaung notes. “The notion is: My intuition is going to be better than some college kid telling me all these vibration readings will lead to a failure, because I’ve been managing this pump for 20 to 30 years.”

The solution, Yaung says, is to get these workers involved in the digital twin development process; doing so increases both the models’ accuracy and the workers’ trust in the system. “Focusing solely on technology without considering the people who will use it can lead to resistance and implementation challenges, which is why it’s critical to have cross-functional input and a unified strategy, not just focus on the technology,” he says.

Inadequate understanding of physical processes. Whether it models a new vehicle design, a factory floor, chemical plant operations or warehouse operations, a digital twin must incorporate the physical dynamics of the system it represents.

Moore stresses the importance of understanding a warehouse’s physical processes and constraints when using digital twins to model smart warehouse technologies and automation. “Warehouses are inherently volatile and uncertain, with many factors outside the warehouse’s control,” explains Moore. “This means warehouse operators simply can’t model the warehouse down to the ‘nuts and bolts’ level because there is too much uncertainty and variability in real-world operations.” Instead, Moore advocates for what he terms an “operational twin,” which models the high-level flow of goods in and out of the warehouse and considers the key constraints and processes rather than simulating every detail. “There’s no digital twin for a full warehouse that plays forward the future for the next 24 to 48 hours; that just doesn’t yet exist,” he says.
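A toy sketch of the operational-twin idea might look like the following. This is not Moore’s actual system; the flow rates, capacity and horizon are invented. The point is the level of abstraction: hourly flows against a capacity constraint, not a nuts-and-bolts simulation:

```python
# Toy sketch of an "operational twin": play the high-level flow of goods
# forward hour by hour against a capacity constraint. All figures invented.

def play_forward(on_hand: int, capacity: int,
                 inbound_per_hr: list, outbound_per_hr: list):
    """Yield (hour, projected_on_hand, alert) for each simulated hour."""
    for hour, (inb, outb) in enumerate(zip(inbound_per_hr, outbound_per_hr), 1):
        on_hand = max(0, on_hand + inb - outb)
        alert = "over capacity" if on_hand > capacity else ""
        yield hour, on_hand, alert

# Project the next 48 hours: steady inbound, outbound slows on day two.
inbound = [120] * 48
outbound = [100] * 24 + [60] * 24
for hour, units, alert in play_forward(on_hand=5_000, capacity=6_500,
                                       inbound_per_hr=inbound,
                                       outbound_per_hr=outbound):
    if alert:
        print(f"hour {hour}: projected {units} units, {alert}")
        break
```

Even this crude projection surfaces the kind of actionable warning Moore describes (a capacity breach roughly 42 hours out, given these assumed flows) without pretending to model every pallet and conveyor.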

Ignoring data scalability. Failing to consider how the digital twin will scale as more data and complexity are added can limit its long-term usefulness, says Joseph Batista, a business technology consultant and former chief creatologist at Dell Technologies. “Scalability isn’t just about handling more data; it’s also about integrating diverse data sources and technologies. As digital twins evolve, they often need to incorporate data from various systems, sensors and external sources. A scalable architecture allows for easier integration of new data streams and technologies without major overhauls,” says Batista.
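One common way to realize the architecture Batista describes is a pluggable adapter interface: each data source registers itself behind a shared contract, so adding a stream does not force an overhaul downstream. A minimal sketch, with hypothetical source names:

```python
from typing import Callable, Iterator

# Sketch only: new data sources plug in behind one interface.
# Adapter names and record fields are hypothetical.

SOURCE_ADAPTERS: dict = {}

def register_source(name: str) -> Callable:
    """Decorator: register an adapter that yields normalized records."""
    def wrap(fn):
        SOURCE_ADAPTERS[name] = fn
        return fn
    return wrap

@register_source("plc_sensors")
def plc_sensors() -> Iterator[dict]:
    yield {"source": "plc_sensors", "metric": "vibration_mm_s", "value": 2.4}

@register_source("wearables")  # added later: no changes anywhere else
def wearables() -> Iterator[dict]:
    yield {"source": "wearables", "metric": "resting_hr", "value": 58}

for name, adapter in SOURCE_ADAPTERS.items():
    for record in adapter():
        print(record)  # downstream code sees one normalized record shape
```

The design choice that matters is the normalized record shape: as long as new adapters emit it, the twin’s ingestion layer never has to be rebuilt.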

Batista cites the data storage demands for a health care payer collecting standard sleep metrics: sleep apnea, Fitbit and EKG data from 75 million members could require roughly 75 terabytes of incremental storage every week. “You must understand the data needs and be able to handle the scale,” he says.
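A quick back-of-the-envelope check, using only the figures in Batista’s example (decimal units):

```python
# Sanity-check Batista's figures: 75 TB/week across 75 million members.
members = 75_000_000
weekly_tb = 75

per_member_mb = weekly_tb * 1_000_000 / members  # 1 TB = 1,000,000 MB
yearly_pb = weekly_tb * 52 / 1_000               # 1 PB = 1,000 TB

print(f"{per_member_mb:.1f} MB per member per week")   # 1.0
print(f"~{yearly_pb:.1f} PB of new storage per year")  # ~3.9
```

A modest-sounding 1 MB per member per week compounds to nearly 4 petabytes of new storage a year, which is exactly the kind of growth curve an unscalable architecture cannot absorb.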

Batista adds that organizations that fail to plan for the exponential growth of digital twin data, both the data needed to feed their twins and the data those twins generate, will find themselves unable to scale their models.

Lack of cross-functional collaboration. A clear message from all of the experts we interviewed: Failing to involve all relevant departments in the development process can result in a digital twin that doesn’t meet the needs of every stakeholder. When teams work in silos, they lose out on the diverse expertise across the organization, and their digital twins fail to capture the full complexity of the systems they are meant to model. This lack of collaboration also feeds the cultural and human problems mentioned earlier.

“Organizations that don’t foster cross-functional teamwork in their digital twin initiatives risk developing a disjointed and incomplete model, and their digital twins will fall short of their transformative potential,” says Yaung.

By avoiding these mistakes and being thoughtful and comprehensive in digital twin development, organizations can create more effective and valuable tools for modeling their systems and operations.