Sometimes it’s the smallest thing that makes a difference. The “discovery” of the number zero transformed science and math and paved the way for our current technology-enabled world. Modern epidemiology and GIS (geographic information systems) have their roots in John Snow’s seemingly straightforward work in 1850s London to plot cholera cases on a map. 

Even today, when the impacts of those discoveries are all around us, the transformative effects of seemingly minor changes continue to be seen. Take digital twins as just one example: they are often cited as one of the “hero technologies” for the digital transformation of oil and gas, yet it is the often-overlooked areas, such as automated engineering tag management, that help them retain their value.

Digitalization dream

This is something that engineering companies have been wrestling with for some time as they provide digital twins to their asset-owning clients at the point of project handover. With its advanced analytics, visualizations, and communications technology, the digital twin is expected to give operations and maintenance teams throughout the oil and gas sector seamless access to trusted, fail-safe data supported by relevant documentation.

In the best-case scenario, a digital twin significantly increases operational efficiency while reducing HSE and compliance risk — especially valuable in high-risk offshore environments. Operations teams spend less time searching for content and can instead focus on value-added engineering tasks.

The prize is a great one: imagine the value of an offshore platform, fully mapped out in such a way, both to those conducting operations and maintenance tasks on the platform and to land-based engineers. And so, engineering firms put in place expensive and laborious processes for compiling a 3D digital model that incorporates varying degrees of design and operational data.

But, if that is all it is, then what they hand over isn’t a digital twin. It’s an exercise in cartography. They’ve handed over a map different in form, but not in content, from the one that John Snow used as a starting point for his studies in the 19th century. That can be useful in the right hands, as Snow himself proved, but the point and potential of a digital twin are surely that the intelligence is built in, not applied from the outside by a brilliant mind.

Missed potential

The map is a snapshot of a moment in time. It can be a useful navigational aid and yield plenty of valuable information about that moment to a well-trained eye, but it is not a real-time representation of real-world topography. Such a map cannot drive greater efficiency and safety into the operation of the asset.

Nor can it solve the problems that arise in any scenario where data and documents are out of date: it still takes too long to locate the correct documents or data needed for a routine task, and there is a real risk of acting on out-of-date information in an operational environment. The results can be catastrophic for critical oil and gas activity if, for example, a shut-off procedure has changed but the documentation has not been updated.

We got to this point because, for many, a digital twin is conceived of as a smart-looking replacement for the reams of paperwork that previously accompanied a major asset handover: a technological upgrade rather than a true digital transformation. Although digital documents and 3D visuals are valuable, and certainly an improvement on centrally located physical files, this approach is really just dipping a toe in the water of what can be achieved.

A digital twin is supposed to be living and dynamic. It is updated in lock-step with its real-world counterpart and offers the user an ever-evolving array of related, updated data at the click of a mouse. It is, in fact, closer to a 4D model with time — and the changes it brings — being the crucial element that manual processes and basic automation cannot capture. 

An obvious question then is how the engineering company can offer an evolving digital record of the asset it has designed and delivered, as well as its ongoing operations, once its team has handed over the keys and stepped back from a completed project.

This is where automated tag management comes in. 

Manual leftovers

For a digital twin to be fully useful, it needs to be “tagged.” In other words, every component or system needs a tag attached that associates it with the relevant technical documentation, operational history, maintenance information and so on.
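
Conceptually, each tag is simply a key that links a physical item to everything known about it. The sketch below is a minimal illustration in Python, using a hypothetical tag number and document names rather than any particular vendor’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class Tag:
    """One engineering tag and the records it links together (illustrative only)."""
    tag_number: str                  # hypothetical numbering convention, e.g. "20-PT-1234"
    description: str
    documents: list[str] = field(default_factory=list)            # datasheets, P&IDs, procedures
    maintenance_history: list[str] = field(default_factory=list)  # work orders, inspections
    operational_data: list[str] = field(default_factory=list)     # trends, alarms, readings

# A pressure transmitter whose datasheet and shut-off procedure are one lookup away
pt = Tag("20-PT-1234", "Separator inlet pressure transmitter",
         documents=["DS-0456 (datasheet)", "PR-0789 (shut-off procedure)"])
```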

Traditionally, tagging has been done manually or subcontracted to a third party to do manually. It is an immense job, whoever does it, and it adds huge amounts of time and expense to a project. 

Consider a large asset such as a North Sea platform, which will typically have somewhere between 100,000 and 200,000 documents attached to it, associated with 50,000 to 100,000 tags. Even at the lower end of that range, if each document requires only 20 minutes of work to extract and validate tags (10 minutes from the document controller plus 10 minutes from an engineer), that comes to nearly 4,200 working days. Even smaller semi-submersible or FPSO-based projects could produce months of delay and expensive labor.
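
For transparency, the arithmetic behind those figures can be reproduced in a short Python sketch, assuming the lower bound of 100,000 documents, 20 minutes per document and 8-hour working days (the 11 man-year figure cited below appears to follow from dividing those days by 365):

```python
documents = 100_000           # lower end of the 100,000-200,000 document range
minutes_per_document = 20     # 10 min from the document controller + 10 min from an engineer
hours_per_working_day = 8     # assumed working day

total_hours = documents * minutes_per_document / 60
working_days = total_hours / hours_per_working_day
print(f"{working_days:,.0f} working days")      # ~4,167, i.e. "nearly 4,200 days"
print(f"{working_days / 365:,.1f} man-years")   # ~11.4, the "11 man-years" cited below
```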

After the equivalent of 11 man-years has been spent on tagging, those tags then need to be kept up to date if the digital twin is to remain a reflection of the live asset. That requires repeat tag-extraction and data-gathering projects, either at regular intervals in the asset’s life or during standard project execution.

All of that is before we consider the need to prioritize tagging projects such that the most important information and facility-critical data are handled first, or the errors that inevitably occur when manual processes are long, detailed and repetitive.

Given the scale, time and expense of the task, it is perhaps clearer why digital twins are often not kept live and up to date. The sheer volume of asset documents and data to maintain can be overwhelming. And digital twins have been underachieving as a consequence. 

Automated tag management

Very simply put, automated tag management replaces these severely sub-optimal manual processes and eliminates the problems associated with them. As the name suggests, it automatically extracts all the relevant tags associated with the asset and assigns them to the right data and documentation. The key to success is making data gathering a seamless part of project execution: with a centralized project collaboration and document control solution that the entire oil and gas supply chain is connected to, data is gathered automatically and on an ongoing basis.
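
As an illustration only, tag extraction can be as simple as scanning document text for strings that match the project’s tag register. The Python sketch below assumes a hypothetical tag-numbering convention and an in-memory document store; it is not a description of how Idox’s EIM or any specific product works:

```python
import re
from collections import defaultdict

# Hypothetical tag convention: two-digit area, letter code, four-digit sequence, e.g. "20-PT-1234".
TAG_PATTERN = re.compile(r"\b\d{2}-[A-Z]{2,4}-\d{4}\b")

def build_tag_index(documents: dict[str, str]) -> dict[str, set[str]]:
    """Scan each document's text and map every tag found to the documents that mention it."""
    index: dict[str, set[str]] = defaultdict(set)
    for doc_id, text in documents.items():
        for tag in TAG_PATTERN.findall(text):
            index[tag].add(doc_id)
    return dict(index)

docs = {
    "PR-0789": "Shut-off procedure covering 20-PT-1234 and 20-ESV-0042 ...",
    "DS-0456": "Datasheet for pressure transmitter 20-PT-1234 ...",
}
print(build_tag_index(docs))  # each tag mapped to the documents that reference it
```

Run continuously as documents flow through a centralized document control system, an index like this stays current without a separate tagging project.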

As a bonus, it considerably streamlines the task of creating the digital twin in the first place, providing the solid foundations on which the digital twin is built.

Automated tag management is beautifully simple and truly transformative at the same time. Asset owners have reported that the time operations and maintenance staff spend locating necessary documents has been cut by 50% because they are no longer chasing down missing or incorrect tag data.

Those achievements are substantial in their own right. But if we pan out, we can see there is even more at stake. There is now a record of failed initiatives and of companies wasting millions on projects that have been underwhelming at best. If digital twins and related digitalization projects continue to underdeliver, they become a barrier to further investment and risk stunting a very necessary digital transformation for the oil and gas industry, particularly for companies that need to demonstrate progress against the backdrop of the energy transition.

The promise of digitalization was always a better use of resources, lower costs, greater safety, and even improved sustainability: advances that cannot be ignored. Digital twins and smart project management solutions help realize that promise, and automated tag management is the unsung hero technology that brings these tools to life.

About the author: Steve Bruce, Product Director at Idox, leads product strategy and development across Idox’s wide-ranging asset portfolio, which includes EIM. He has over 20 years of experience in the software business and has been instrumental in leading Idox’s Cloud development.

Having joined Idox in 2006, Steve now leads a multi-national team of developers, engineers and product managers.

A graduate of Strathclyde University, he has a BEng in Computer & Electronic Systems.
