A common first step in developing a more resilient supply chain is to better understand the behaviour and characteristics of the supply chain as it stands today. Not how it is documented, but how it “actually” works.
Where are the bottlenecks? How are resources being utilised across the supply chain? What instrumentation points exist? How trustworthy or secure are the interactions? How compliant are the supporting processes?
Answers to such questions are critical to understanding how the supply chain needs to evolve to respond better to disruptions, small or large. To find these answers, we can borrow from established techniques such as process mining. But instead of analysing application logs within the context of a single organisation, we can apply similar techniques to logs – and, subsequently, to the log-producing applications and processes – that span organisations.
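As a minimal sketch of what process mining over cross-organisation logs looks like, the snippet below groups hypothetical events into per-case traces, counts which activities directly follow which, and flags the transition with the longest average wait as a bottleneck candidate. The event names and timestamps are invented for illustration; real logs would come from partners' systems.

```python
from collections import Counter, defaultdict

# Hypothetical cross-organisation event log: (case_id, activity, timestamp).
# In practice these events would be extracted from multiple partners' logs.
events = [
    ("PO-1", "order_placed", 0), ("PO-1", "supplier_confirmed", 4),
    ("PO-1", "shipped", 30), ("PO-1", "delivered", 50),
    ("PO-2", "order_placed", 1), ("PO-2", "supplier_confirmed", 2),
    ("PO-2", "shipped", 40), ("PO-2", "delivered", 48),
]

def directly_follows(events):
    """Group events into per-case traces, then count activity transitions
    and the average waiting time between them (a basic process-mining view)."""
    traces = defaultdict(list)
    for case, activity, ts in sorted(events, key=lambda e: (e[0], e[2])):
        traces[case].append((activity, ts))
    transitions = Counter()
    waits = defaultdict(list)
    for trace in traces.values():
        for (a, t1), (b, t2) in zip(trace, trace[1:]):
            transitions[(a, b)] += 1
            waits[(a, b)].append(t2 - t1)
    return transitions, {k: sum(v) / len(v) for k, v in waits.items()}

transitions, avg_wait = directly_follows(events)
# The transition with the longest average wait is a bottleneck candidate.
bottleneck = max(avg_wait, key=avg_wait.get)
```

Here the long gap between confirmation and shipping would surface as the place to investigate first, regardless of which organisation owns that step.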
Capture data from key measurement points across the supply chain – the internet of things (IoT) can help here if the number or quality of instrumentation points is insufficient. Then analyse this data using advanced analytics to correlate it and derive insight into supply chain transaction behaviour, and identify areas for improvement. Without this information, it is difficult to determine where to start with any supply chain transformation initiative.
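One simple analytic pass over captured measurement-point data is an outlier screen: compare each point's readings against its own baseline and flag values that deviate sharply. The sketch below uses a basic z-score test; the measurement-point names and readings are hypothetical.

```python
import statistics

# Hypothetical readings per instrumentation point (e.g. hourly throughput
# from IoT sensors at warehouses and ports); names are illustrative only.
readings = {
    "warehouse_a": [102, 98, 105, 99, 101, 100, 103, 97, 102, 100],
    "port_gate_3": [40, 42, 39, 41, 40, 38, 41, 15, 40, 42],  # one dip
    "dc_east":     [75, 77, 74, 76, 75, 76, 74, 75, 77, 75],
}

def flag_anomalies(readings, z_threshold=2.0):
    """Flag measurement points containing values far from that point's
    own mean (a simple z-score screen over a recent window)."""
    flagged = {}
    for point, values in readings.items():
        mean = statistics.mean(values)
        stdev = statistics.pstdev(values)
        if stdev == 0:
            continue
        outliers = [v for v in values if abs(v - mean) / stdev > z_threshold]
        if outliers:
            flagged[point] = outliers
    return flagged
```

The dip at the port gate stands out against its baseline and becomes a starting point for investigation, which is exactly the kind of signal a transformation initiative needs before deciding where to act.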
Next, it often makes sense to create a digital twin of the supply chain itself, or at least of key processes within the supply chain. This digital twin is similar in concept to the digital twin discussed in IoT circles, but in this case we are creating a model of one or more supply chain processes or interactions. This allows us to “test” (simulate) changes to the elements that make up a supply chain without having to physically implement all of the changes to the actual processes.
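In its simplest form, such a twin is just a model you can run with different parameters. The toy discrete-time model below simulates one stage of a supply chain – a fulfilment centre with a daily processing capacity – and compares backlog under the current capacity against a proposed increase, without touching the real process. All demand figures and capacities are invented.

```python
# A minimal "digital twin" sketch: a toy model of one supply chain stage,
# used to simulate a capacity change before implementing it for real.

def simulate(daily_arrivals, capacity_per_day):
    """Return the order backlog at the end of each day for a single stage
    that can process at most `capacity_per_day` orders per day."""
    backlog, history = 0, []
    for arrivals in daily_arrivals:
        backlog += arrivals
        backlog = max(0, backlog - capacity_per_day)
        history.append(backlog)
    return history

demand = [90, 120, 150, 110, 95, 100, 130]  # hypothetical orders per day

as_is = simulate(demand, capacity_per_day=100)  # current process
to_be = simulate(demand, capacity_per_day=120)  # proposed change
```

Running both scenarios shows how the proposed capacity clears the backlog that the current configuration accumulates. A production twin would model many interacting stages, lead times and stochastic demand, but the principle – test the change in the model first – is the same.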
This digital twin, of course, is built on a combination of capabilities, including advanced analytics, cloud, artificial intelligence (AI)/machine learning/statistical models, and various data management/data governance mechanisms. Techniques such as knowledge graphs can also be useful, as can the incorporation of enterprise logic in the form of application programming interfaces (APIs).
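To make the knowledge-graph idea concrete, the sketch below represents supply chain relationships as edges ("supplies/feeds") and answers an impact question by graph traversal: if a supplier is disrupted, what downstream parts and product lines are affected? The entities and relations are illustrative, not drawn from any real dataset.

```python
from collections import defaultdict, deque

# Hypothetical supply chain knowledge graph: an edge (a, b) means
# "a supplies or feeds b". All names are invented for illustration.
edges = [
    ("SupplierA", "ChipX"), ("ChipX", "ControllerBoard"),
    ("ControllerBoard", "ProductLine1"), ("SupplierB", "Casing"),
    ("Casing", "ProductLine1"), ("SupplierA", "SensorY"),
    ("SensorY", "ProductLine2"),
]

def downstream_impact(edges, start):
    """Breadth-first traversal: everything reachable from `start`,
    i.e. what a disruption at `start` could affect."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

A disruption at SupplierA touches both product lines here, while SupplierB's casing line is unaffected. In practice the same query would run against a proper graph store, with enterprise logic exposed via APIs feeding the graph.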
You probably won’t implement all of the supply chain as a set of digital twins, but such an approach can be very useful for simulating and testing changes to key stages of the supply chain.
Finally, based on the developed digital twins or other types of analysis, improvements can be made to the supply chain itself. Again, technologies such as cloud, edge computing, AI/machine learning, IoT, or even blockchain can be useful to create a supply chain with greater levels of visibility, flexibility and adaptability.
Incorporate more “real-time” behaviours into the supply chain mechanisms, and turn the supply chain into a type of “responsive mesh” that is based on dynamic data capture, advanced/real-time analysis and responses, “predictive” behaviours, and the dynamic assembly of supply chain components.
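A small illustration of such “responsive” and “predictive” behaviour: watch a stream of demand readings, keep a short moving window, forecast demand over the replenishment lead time with a naive moving average, and trigger a reorder before stock runs out. The order size, lead time and demand stream are all invented for the sketch.

```python
from collections import deque

# Sketch of a responsive rule: react to streaming demand data with a
# simple predictive reorder trigger. All parameters are hypothetical.

def respond(stream, stock, window=3, lead_time=2):
    """Return a list of (action, demand, remaining_stock) tuples;
    reorder when forecast demand over the lead time would exhaust stock."""
    recent = deque(maxlen=window)
    actions = []
    for demand in stream:
        recent.append(demand)
        stock -= demand
        forecast = sum(recent) / len(recent)  # naive moving-average forecast
        if stock < forecast * lead_time:
            actions.append(("reorder", demand, stock))
            stock += 50  # hypothetical replenishment quantity
        else:
            actions.append(("hold", demand, stock))
    return actions
```

For example, `respond([10, 12, 11, 20, 25], stock=100)` holds while demand is steady and reorders only once the rising trend threatens to outpace remaining stock – a toy version of the dynamic capture, analysis and response loop described above.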
Include the right levels of monitoring, governance and security, and you can promote a supply chain that is more resilient in the face of adversity.
Nelson Petracek is global chief technology officer at Tibco.