Digital Twin

Industry 4.0 is a popular topic these days. The “fourth industrial revolution” encompasses ambitious ideas including smart manufacturing, smart cities, and the industrial internet of things. Residing at the heart of smart manufacturing, and arguably at the vanguard of Industry 4.0 as a whole, is the “Digital Twin” concept.

We’re all familiar with the physical products we encounter each day: our cars, our appliances, our food. Well, all of these physical products have to come from somewhere, and manufacturing facilities, farms, and distribution centers all touch them at some point along the supply chain. The more complex the manufacturing or delivery process, the higher the chance that our products are altered in some way.

It would be ideal if our frozen fish stayed frozen for the duration of its 400-mile trip from the coast – if our phone battery never exceeded its maximum temperature tolerance on the manufacturing line – and if we could see that our car came from a provably high-quality production batch before buying it. As of now, however, we don’t have a way of reliably assessing any of these metrics. This is where the digital twin comes into play.

The idea was first proposed in David Gelernter’s 1991 book Mirror Worlds. Michael Grieves was the first to apply the concept to manufacturing and publicly discuss his experiences with it, in 2002. John Vickers of NASA subsequently formalized the idea, publishing a report that separated the digital twin into three parts:

  1. Physical product
  2. Virtual product
  3. Connections between the two products

Gartner threw its reputation behind the idea in both 2017 and 2018 by listing the digital twin as a top strategic technology trend, boldly predicting that digital twins would exist for billions of things in the “near future”.

As industry began to grasp the promise of digital twins, the concept was quickly subjected to an onslaught of three-letter initialisms (fun fact: acronyms are abbreviations pronounced as words, while initialisms are abbreviations pronounced as their constituent letters. Think “NASA” vs “ECB”). Trust me, skip this next paragraph.

The digital twin prototype (DTP) is made up of everything that goes into the physical product … the digital twin instance (DTI) is a digital replica of each individual product once it is created … aggregating the DTIs creates the digital twin aggregate (DTA) … which in the workplace is seen as a subunit of robotic process automation (RPA) … and can then be analyzed downstream and throughout the supply chain, as well as within product life-cycle management (PLM).

Hopefully you skipped the initialism black hole above, because we can put this into layman’s terms and make it easier:

A digital twin is a digital replica of something that exists within the physical realm. As the product evolves and reacts to its environment in the physical realm, its digital twin simultaneously reflects these changes in the realm of bits. It’s as simple as that.
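
To make that mirroring concrete, here’s a minimal, hypothetical sketch in Python – the asset ID, sensor name, and values are invented purely for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DigitalTwin:
    """A digital replica mirroring the state of one physical asset."""
    asset_id: str
    state: dict = field(default_factory=dict)     # latest known readings
    history: list = field(default_factory=list)   # every change, timestamped

    def apply_reading(self, sensor: str, value: float) -> None:
        # A change in the physical realm is reflected in the realm of bits.
        self.state[sensor] = value
        self.history.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "sensor": sensor,
            "value": value,
        })

# The physical shipment reports a reading; its twin mirrors it instantly.
twin = DigitalTwin(asset_id="shipment-0042")
twin.apply_reading("temperature_c", -18.5)
print(twin.state)  # {'temperature_c': -18.5}
```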

Traditionally, digital twins have been thought of as aiding firms with in-house manufacturing and quality control. Yet, as sensor technology improves and data storage becomes commoditized, there’s an increasing push for digital twins that play a role not only in product origination and the supply chain, but also in end-user interactions.

A German software company called Pickert is blazing the trail for end-to-end digital twin usage. Pickert specializes in software solutions for discrete manufacturing, and serves more than 350 clients both at home and abroad. Their reach ultimately includes hundreds of thousands of end users.

It was announced this month that Pickert is using the Tangle as their underlying data transfer and storage layer. It’s often difficult to get a glimpse into why companies choose one solution over another, or how they plan to implement that solution in their tech stack. This is not one of those times!

Sven Rimmelspacher, a Pickert managing partner, graciously provided an in-depth exploration of the IOTA collaboration from his company’s perspective.

Sven highlights the fact that modern enterprises are pressured to expedite the design and manufacturing process, often at the expense of accuracy and quality. As a result, mistakes are only realized after production and shipment. This is extraordinarily costly! Both Sven and IOTA’s Holger Kother mention the “rule of 10,” which says that as a product moves to the next step of production, the cost of fixing a defect increases 10-fold. In Germany alone, SMEs spend nearly 50 billion euros annually correcting low-quality goods. Sven goes on to point out that companies spend 25% of their time correcting, rather than preventing, manufacturing mistakes. Shockingly, 85% of these mistakes are made in the early stages – apply the “rule of 10” to this number to get a feel for the cumulative costs.
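
To get a feel for how quickly the rule of 10 compounds, here’s a toy calculation – the stage names and the one-euro base cost are illustrative assumptions, not Pickert’s figures:

```python
# Toy illustration of the "rule of 10": fixing the same defect costs
# ten times more at each later production stage.
BASE_COST_EUR = 1.0  # hypothetical cost of a fix caught at the design stage
STAGES = ["design", "planning", "production", "final inspection", "field"]

for i, stage in enumerate(STAGES):
    print(f"Defect fixed at {stage}: {BASE_COST_EUR * 10 ** i:>9,.0f} EUR")
# design: 1 EUR ... field: 10,000 EUR -- and 85% of defects originate early.
```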

Pickert is clear about what they’re striving to achieve: ZERO defects. They’re so committed that they have a zerodefects website and include “ZERO defects” in their company vision. It’s also the name of one of their products.

They recognize the industry’s hope for IT salvation, but point out that this hope often amounts to misguided IT investment culminating in little more than a few extra consulting days. Data silos and communication mismatches remain prevalent.

To take a substantial leap past the status quo, they turned to IOTA. Here are Sven’s thoughts on IOTA specifically:

It is an open, secure and feeless infrastructure, globally accessible for everyone to document and exchange information and digital assets. IOTA has the advantage that it does not require any fees, or even the use of cryptocurrency, to store data in a decentralized manner, and thus provides the optimal setting for digital twins to be safe against manipulation. It provides the freedom to share data globally across producers and with production partners, which can add their own data and documentation. This way multiple partners are able to create an immutable digital twin across different production environments. Compared to other blockchain technologies, where fees cannot be forecast and introduce a high financial risk, IOTA provides a strong economic advantage and security, as the fee is always zero.

As Sven says, with a digital twin implemented on top of an open, distributed, immutable information hub like IOTA, it’s easy to provide end users with all the data that applies to the product they’re interacting with. He goes on to laud IOTA’s immutability: there’s no more worrying about data tampering or security once the data lives on a distributed ledger. It’s easy for a malicious actor to surreptitiously alter data stored in their own private databases. On the Tangle, this is impossible.
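
For a feel of what publishing twin data to the Tangle might look like, here’s a minimal sketch using the PyOTA client library as it existed around the time of writing. The node URL, address, tag, and JSON payload are placeholders, and a production system would more likely use IOTA’s dedicated data protocols (MAM/Streams):

```python
from iota import Iota, ProposedTransaction, Address, Tag, TryteString

# Connect to a public node (placeholder URL).
api = Iota('https://nodes.thetangle.org:443')

# A hypothetical digital-twin event for one asset.
event = '{"asset": "shipment-0042", "sensor": "temperature_c", "value": -18.5}'

tx = ProposedTransaction(
    address=Address(b'9' * 81),        # placeholder twin address
    value=0,                           # zero-value: no tokens move, no fee
    tag=Tag(b'DIGITALTWIN9EXAMPLE'),   # searchable tag for this twin
    message=TryteString.from_unicode(event),
)

# Attach the event to the Tangle, where it becomes tamper-evident.
result = api.send_transfer(transfers=[tx], depth=3)
print(result['bundle'].hash)
```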

Need to know the batch number of your towel rack as an end user? No problem. What about the intermediary company that needs granular information about one piece of the manufacturing process that preceded it? No problem. All sorts of metadata can be recorded and shared with every type of stakeholder.
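
And retrieval works the same way for every stakeholder. A rough sketch, again with PyOTA and the same placeholder tag as above – anyone with a node connection could look the data up:

```python
from iota import Iota, Tag

api = Iota('https://nodes.thetangle.org:443')  # placeholder public node

# Find every zero-value transaction tagged for this (hypothetical) twin.
hashes = api.find_transactions(tags=[Tag(b'DIGITALTWIN9EXAMPLE')])['hashes']

# Decode the recorded metadata from the first matching bundle.
for tx in api.get_bundles(hashes[:1])['bundles'][0]:
    message = tx.signature_message_fragment.decode(errors='ignore')
    if message:
        print(message)
```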

Go play around on this ZERO (DEFECTS!) traceability application to see just how detailed the information can get. Pickert is operating within their core competency of discrete manufacturing. But it’s difficult not to extrapolate forward when you see the roots of a great solution like this.

This week, more big news came across the desk: Object Management Group (OMG), the international standards body with which IOTA has been working closely, announced the formation of the Digital Twin Consortium.

The new Digital Twin Consortium’s mission statement couldn’t align with IOTA any better:

Digital Twin Consortium drives the adoption, use, interoperability and development of digital twin technology. It propels the innovation of digital twin technology through consistent approaches and open source development. It is committed to accelerating the market and guiding outcomes for users.

They openly acknowledge in their press release that digital twin solutions can be difficult to implement due to a “lack of open-source software, interoperability issues, market confusion, and high costs.” Dr. Soley highlights the difficulties arising from the lack of understanding and the lack of standardization.

There appear to be four founding Consortium members: Microsoft, Dell, Ansys, and Lendlease. Another 30 companies that have been members of the Digital Twin Consortium since its launch are listed as “Groundbreakers” – IOTA is among them.

We look forward to collaborating closely with @ObjectMgmtGroup & other leading companies in the Digital Twin Consortium to accelerate the use of #digitaltwin technology across different industries. https://t.co/wKRrHMvCM1 #IOTA

— IOTA (@iotatoken) May 19, 2020

Sensors will only continue to get smaller and cheaper. Interoperability standards will only continue to mature. Imagine a world in which the Tangle hosts not just full manufacturing processes, but full supply chain tracking too. We could someday use a phone app to scan the barcode on our fresh produce and see just how fresh it really is. Bright days are ahead for ambitious companies like Pickert that decide to build on the IOTA base layer. The dream of end-to-end digital twins is speeding forward!

For our beloved German community, we’ve provided a text-only German translation of this article HERE!
