Our actions and decisions are only as good as the data we use to inform them. That being true, when designing complicated systems, ensuring the integrity of the data sets involved becomes enormously important. Obviously, the data that resides in a system should be accurate. However, it also needs to have integrity, meaning it is consistent and not unnecessarily redundant.
A good data set is invaluable. Done well, it puts all of the data elements in one place, making it easy to share, analyze, extrapolate, and use. Which system owns the data is potentially even more important. If you want to fundamentally alter the data based on a business rule or business logic, the change should be made in the system that owns the data. That makes it critical to track the lifecycle of the data and to refer users to the primary system for specific data.
For example, sales order (SO) information is always owned by an enterprise resource planning (ERP) system, since that is where all aspects of the order are handled and tracked throughout the lifecycle of the sale. That same sales order might be downloaded to a warehouse management system (WMS) for fulfillment. The WMS will use the order information to fulfill the order, to optimize the fulfillment process, to generate packaging instructions, and more. However, the WMS should only allow information lookup; any changes to the data should be made in the ERP.
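The ownership pattern above can be sketched in a few lines of code. This is a minimal illustration, not any particular ERP or WMS API: the class and method names (`ERP`, `WMS`, `create_order`, `lookup`) are made up for the example. The point is structural: the WMS holds only a read path to the sales order, so every mutation has to go through the owning system.

```python
from dataclasses import dataclass

@dataclass
class SalesOrder:
    order_id: str
    items: dict          # SKU -> quantity
    status: str = "OPEN"

class ERP:
    """System of record: owns the sales-order lifecycle."""
    def __init__(self):
        self._orders: dict[str, SalesOrder] = {}

    def create_order(self, order_id: str, items: dict) -> None:
        self._orders[order_id] = SalesOrder(order_id, dict(items))

    def update_status(self, order_id: str, status: str) -> None:
        # Business-rule changes happen only here, in the owning system.
        self._orders[order_id].status = status

    def get_order(self, order_id: str) -> SalesOrder:
        return self._orders[order_id]

class WMS:
    """Downstream consumer: reads order data for fulfillment."""
    def __init__(self, erp: ERP):
        self._erp = erp

    def lookup(self, order_id: str) -> SalesOrder:
        # Lookup is allowed; no mutating methods are exposed at all.
        return self._erp.get_order(order_id)
```

Because the WMS exposes no write path, a status change made in the ERP is immediately visible downstream, and there is no second copy to drift out of sync.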
Let me offer a real-world example. When I was working at a multinational computer company, we sold computers all over the world. We faced the complicated challenge of streamlining the system integrations pertaining to shipment data from various manufacturers worldwide. These suppliers would share advance shipment notifications (ASNs) through electronic data interchange (EDI), using the 856 transaction set, which provided all the shipment information the recipient needed to be ready to receive the product. Our goal was to create a single consistent architecture so that the Global Supply Chain Systems Group could support it.
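To make the 856 concrete, here is a heavily simplified sketch of pulling a shipment identifier out of an ASN payload. The sample data is fabricated for illustration; real X12 interchanges carry ISA/GS envelopes and trading-partner-specific delimiters, and this assumes `~` as the segment terminator and `*` as the element separator.

```python
# Fabricated, simplified 856 fragment: ST (transaction set header),
# BSN (beginning segment for ship notice), HL (hierarchical levels),
# SE (transaction set trailer).
SAMPLE_ASN = (
    "ST*856*0001~"
    "BSN*00*SHIP123*20240101*1200~"   # BSN02 carries the shipment id
    "HL*1**S~"                        # hierarchical level: shipment
    "HL*2*1*O~"                       # hierarchical level: order
    "SE*5*0001~"
)

def parse_segments(edi: str) -> list[list[str]]:
    """Split an 856 payload into [segment_id, element, ...] lists."""
    return [seg.split("*") for seg in edi.strip("~").split("~")]

segments = parse_segments(SAMPLE_ASN)
shipment_id = next(s[2] for s in segments if s[0] == "BSN")
```

Even a toy parser like this makes the integration problem visible: every supplier's feed has to land in one agreed-upon shape before any downstream system can treat it as trustworthy.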
The company, which was 30 years old, had gone through multiple acquisitions and had teams all over the globe. Any change had the potential to impact a lot of people.
So how do you skin this cat? How do you streamline all these system integrations in a smart way that keeps the system manageable? We ended up leveraging GT Nexus, a global supply chain management platform. In its infancy, the company created a system that provided visibility into ocean shipment data, including current status, origin, destination, ETA, route, mode of transport, and more. GT Nexus began accumulating this information, and eventually it became the worldwide repository for it. It was easy to log in to the system and get all the information about a shipment as long as you had a shipment number. By integrating with the platform, we made GT Nexus the system of record for all shipment data until we completed our streamlining.
It is very important to have a system that owns the lifecycle of the data set, creating a single source of truth. More importantly, maintaining a single, simple architecture eases integration with other systems.
What are some of the data integrity and system-of-record challenges you have run into while optimizing, integrating, streamlining, and maintaining the different supply chain systems in your organization? How did you overcome those challenges? Let us know your thoughts in the comments section below.