The concept of technology-enabled digital twins has been around for at least two decades, and many industries, including energy and utilities, have invested in digital twin programs in recent years. The energy and utility industry, however, has faced particular challenges in its digital transformation initiatives when attempting to scale the value of its digital twin efforts. Now, maturing industry standards and open architectures give energy producers and utility operators new tools to scale digital twins and find new applications for them.
Digital twins are virtual models of physical systems that can simulate, monitor, analyze and optimize the physical world. Despite that complexity, the industry has built digital twins with relative ease for specific proofs of concept (POCs). What has proved difficult is getting the necessary data out of silos and legacy systems so digital twins can scale. In energy and utilities especially, many digital twins stalled in the proof-of-concept phase, relying on proprietary programs with their own ways of naming and storing data. As a result, many organizations in the sector haven’t yet realized a strong return on their initial digital twin investments.
New Incentives and Solutions for Digital Twin Advancement
The Inflation Reduction Act offers incentives for energy producers and utility providers to modernize, including Energy Infrastructure Reinvestment loans and clean energy loans. Plans that include digital twin technology to reduce emissions, model the transition to renewable energy sources, or help harden the grid against climate events stand a better chance of getting funded than plans without technological foundations. However, digital twin proposals will only deliver ROI if organizations get new fundamentals in place.
Those fundamentals start with a clear scope for digital twin projects. Rather than try for an ambitious end-to-end, full-lifecycle digital twin, it’s better to start with one part of a process, role or asset that could benefit the most from digital twin-enabled modeling and optimization and then identify one stage of the lifecycle to model. For example, a digital twin for maintenance at a chemical plant that’s already operational will look very different from a plant design digital twin for a chemical plant that’s still in the planning stages.
With the scope of the twin defined, the next fundamental is interoperability, meaning data decoupled from the software that uses it. Interoperability matters most because the digital twin will need data from multiple sources, and while each software solution classifies and stores data in its own way, all of that data must be standardized and consistent for the twin to use it. Organizations in all kinds of industries are working to standardize their data as part of the digital transition, but energy and utility organizations face an additional challenge: standardizing the operational technology devices that provide the data, so that those devices have a consistent nomenclature based on their role and their location within the facility. For example, a pressure transmitter monitoring a particular process on one line may be the exact same model as transmitters monitoring different processes in other parts of the plant. And the same array of transmitters, each monitoring a different process, may exist at each of the organization’s plants around the world.
The naming standard for those transmitters needs to be consistent within and across facilities, and it needs to factor in the transmitter’s process and location. If these data sources are labeled consistently, the data they produce will have a common name that makes storage and retrieval easier. That saves the organization from needing to constantly clean up data, which would impede the ability to scale.
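As a sketch, a naming standard like the one described above can be encoded and validated programmatically, so off-standard tags are caught before their data reaches the twin. The tag format used here (site code, process unit, device type, loop number) is a hypothetical illustration, not an industry standard; ISA-style type codes such as "PT" for pressure transmitter are common, but real conventions vary by organization:

```python
import re
from dataclasses import dataclass

# Hypothetical tag format: SITE-UNIT-TYPE-NNN, e.g. "HOU-U2-PT-104".
# The layout is illustrative only; adapt it to your own convention.
TAG_PATTERN = re.compile(
    r"^(?P<site>[A-Z]{3})-(?P<unit>U\d+)-(?P<dtype>[A-Z]{2})-(?P<loop>\d{3})$"
)

@dataclass(frozen=True)
class DeviceTag:
    site: str   # facility code, e.g. "HOU"
    unit: str   # process unit, e.g. "U2"
    dtype: str  # device type, e.g. "PT" (pressure transmitter)
    loop: str   # loop/sequence number, e.g. "104"

    def __str__(self) -> str:
        return f"{self.site}-{self.unit}-{self.dtype}-{self.loop}"

def parse_tag(tag: str) -> DeviceTag:
    """Parse and validate a device tag; reject anything off-standard."""
    m = TAG_PATTERN.match(tag)
    if not m:
        raise ValueError(f"Tag {tag!r} does not follow the naming standard")
    return DeviceTag(m["site"], m["unit"], m["dtype"], m["loop"])

# The same transmitter model can appear many times across plants; the
# tag, not the hardware model, identifies its role and location.
tag = parse_tag("HOU-U2-PT-104")
print(tag.site, tag.dtype)  # HOU PT
```

Enforcing the convention at ingestion time, rather than cleaning data after the fact, is what keeps the twin's data consistent as it scales.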
Organizations are now working to adopt standardized data formats and ontologies. For example, ISO 14224 and CFIHOS standardize the collection and sharing of data on oil and gas equipment. These standards, and others like them, allow enterprises to access and use their data across design, operations and maintenance, and across the lifecycle of the plant. Instead of pooling non-standardized data in data lakes, organizations can store and update standardized data in the cloud. Standardized data also lets organizations pool data for modeling and process management from multiple facilities built years or decades apart on different technologies.
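To illustrate why a shared schema matters, consider a standardized equipment failure record. The field names below are loosely inspired by the kind of taxonomy ISO 14224 defines (equipment class, failure mode, downtime), but they are a simplified assumption for this sketch, not the standard itself:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EquipmentFailureRecord:
    # Simplified, ISO 14224-inspired fields; the real standard defines
    # a much richer taxonomy with controlled vocabularies.
    facility: str          # e.g. "Plant A"
    equipment_class: str   # e.g. "pump", "compressor"
    equipment_id: str      # standardized tag of the failed unit
    failure_mode: str      # e.g. "external leakage"
    failure_date: date
    downtime_hours: float

records = [
    EquipmentFailureRecord("Plant A", "pump", "PLA-U1-PU-012",
                           "external leakage", date(2023, 3, 14), 6.5),
    EquipmentFailureRecord("Plant B", "pump", "PLB-U3-PU-007",
                           "external leakage", date(2023, 5, 2), 4.0),
]

# Because the records share one schema, data from plants built years
# apart can be pooled and analyzed together.
total_downtime = sum(r.downtime_hours for r in records)
print(total_downtime)  # 10.5
```

The payoff is the last two lines: fleet-wide questions become simple aggregations once every facility reports in the same shape.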
Cloud technology now enables digital continuity that wasn’t possible in the early days of digital twins. As cloud-based platforms continue to evolve new capabilities, energy and utility organizations can manage end-to-end product and asset life cycles, supply chains and other processes more efficiently.
Digital Twin Benefits for Energy and Utilities
Digital-twin modeling and cloud-based process management, driven by standardized, unified data, can deliver a range of benefits. These include delivery optimization, emergency preparedness, and enhanced resiliency in the face of new climate challenges. For example, a digital twin survey found that organizations using digital twins for efficiency improvements “have realized an average improvement of 16% in sustainability.” The survey also found that half of energy and utilities companies are already using digital twins for emissions predictions, to assist with planning and to track progress on greenhouse gas emissions reduction.
Digital twin-enabled models are also changing the employee experience for energy producers, in ways that make the work safer, greener and more efficient. One oil and gas company cited in the survey report is using digital twins of its platforms to allow engineers and other experts to do more than 4,000 hours of work onshore, rather than traveling to offshore platforms. Another producer has cut offshore hours by up to 50%, reducing the amount of time employees are exposed to risks on their platforms at sea.
Moving Beyond Data Fundamentals With Digital Twins
Once energy and utility organizations have their data unified, labeled and organized consistently, they can focus on other critical drivers that use the data: new technology architectures, new or revised roles for people who use the data to drive outcomes, and the optimal cloud platforms to drive maturity.
The digital twin maturity model starts with data, which is why we’ve spent so much time looking at it here. With data in place, the digital twin can begin modeling scenarios and facilities. Over time, the twin can be expanded or scaled up to provide performance assessment based on process and equipment data. Next, the addition of AI and machine learning capabilities can help organizations plan and deploy autonomous modeling operations. Finally, organizations that build use-case- and results-oriented models can deliver those results through a platform-as-a-service offering of their own.
All these benefits rest on a solid data foundation. Now that the technology and the incentives are here to make the most of digital twins, it’s time to take the first step toward realizing them.