As a design specialist, the software developer understands logistics service providers' requirements like few others. He is passionate about secure and efficient information exchange, which in turn speeds up the physical logistics process.
Boost efficiency with central master data
Every logistics process starts with master data: information about articles, customers and suppliers. The better this data is maintained, the more successful a freight forwarder can be. This is especially true when the forwarder uses decentralised IT systems, which need to obtain their master data from a central source. Central provisioning supports process quality because:
- The master data is required in each work step
- Process automation requires exact master data
- Master data is extremely difficult to synchronise
- Costly re-editing of incorrect master data lowers contribution margins
Deliveries can certainly be made without central master data. However, an efficient process requires error-free information in all production systems.
Decentralised systems – one master database
As logistics processes become increasingly specialised, a large number of microservices tailored to these processes has emerged. They support complex workflows and run independently of the transport management system (TMS): the TMS manages the shipping orders, while the microservices carry out the individual processing steps. Every single step in order processing must therefore be linked to the shipping order, and this link can only be established through the master data. Caution is in order, however: if the data is captured independently in each microservice, every entry can introduce inconsistencies. Depending on the number of services in use, synchronising all these inconsistent entries quickly becomes very difficult. There is also the risk of overwriting correct master data with incorrect, outdated data if synchronisation runs in the wrong direction at the wrong time. A central master database solves this problem. As the single point of truth, it manages the master data for all the processes carried out at the freight forwarding company, and it is the lead system in which data is captured or modified. This system, in turn, distributes changes to the microservices through an event sourcing platform.
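The single-writer pattern described here can be sketched in a few lines. This is a minimal illustration, not a reference to any particular product: the names `MasterDataEvent` and `EventLog` are invented for the example, and a real event sourcing platform (such as a message broker) would replace the in-memory list.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MasterDataEvent:
    """One change to the central master data, e.g. an updated customer record."""
    entity: str   # "customer", "article", "supplier", ...
    key: str      # business key of the record
    payload: dict  # the new field values


class EventLog:
    """Minimal append-only log: only the master database writes to it,
    the microservices read from it."""

    def __init__(self) -> None:
        self._events: list[MasterDataEvent] = []

    def publish(self, event: MasterDataEvent) -> int:
        """Append a change and return its offset in the log."""
        self._events.append(event)
        return len(self._events) - 1

    def read_since(self, offset: int) -> list[MasterDataEvent]:
        """Return every change published at or after the given offset."""
        return self._events[offset:]


# The central master database is the only writer:
log = EventLog()
log.publish(MasterDataEvent("customer", "C-100", {"name": "Acme Logistics GmbH"}))
```

Because the log is append-only and has a single writer, the "wrong direction at the wrong time" failure mode disappears: changes flow one way, from the lead system to the consumers.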
One master database – many connected services
All freight forwarding applications – from freight payment terms to packaging data to time slots – use the same master data. When the master data changes, it is first carefully processed and stored in the central master database, which then sends the change to the event sourcing platform. The individual microservices repeatedly query the platform for changes to master data that is relevant to them. If there are changes, they accept a copy of the data in an appropriate format. The accounting software, for example, will accept changes to payables data; the notification system will not, because that data is not relevant to it. The advantage: no point-to-point synchronisation is required, resulting in far fewer simultaneous queries. The data exchange remains asynchronous, which reduces the load on IT systems and makes it possible to work offline. If the event source is not accessible, the systems continue to work autonomously and apply the changes once the connection is restored.
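The relevance filter and the offline catch-up can be illustrated together. The sketch below assumes each event is a simple `(entity_type, key, payload)` tuple and that each service remembers how far it has read; the class name `MicroserviceConsumer` and the entity names are invented for the example.

```python
# Illustrative event shape: (entity_type, key, payload)
Event = tuple[str, str, dict]


class MicroserviceConsumer:
    """Polls a shared event list, keeps only entity types relevant to this
    service, and remembers its read offset so it can catch up after
    working offline."""

    def __init__(self, event_log: list[Event], relevant_entities: set[str]) -> None:
        self.event_log = event_log
        self.relevant = relevant_entities
        self.offset = 0
        self.local_copy: dict[tuple[str, str], dict] = {}

    def poll(self) -> None:
        new_events = self.event_log[self.offset:]
        self.offset = len(self.event_log)  # remember how far we have read
        for entity, key, payload in new_events:
            if entity in self.relevant:  # e.g. accounting accepts payables changes
                self.local_copy[(entity, key)] = payload


# The accounting service cares about payables; the notification service does not:
log: list[Event] = []
accounting = MicroserviceConsumer(log, {"payables", "customer"})
notifier = MicroserviceConsumer(log, {"customer"})

log.append(("payables", "P-7", {"terms": "30 days net"}))
accounting.poll()
notifier.poll()
# accounting.local_copy now holds P-7; notifier.local_copy stays empty.
```

Because each consumer tracks its own offset, a service that was offline simply calls `poll()` after reconnecting and receives every change it missed, in order.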
Enter or consolidate
It does not matter which system a forwarding company chooses as its central master database. What matters is that the data is diligently maintained, because errors in the initial entries propagate to all connected processes. This is why experts do not recommend transferring data directly from legacy systems. Before the data is used, it should at least be cleansed and checked for duplicates, for example with artificial intelligence that consolidates redundant records. The cleanest and safest way to build an error-free master database is to recreate the entire database and then maintain it accurately. It helps to designate a dedicated process officer for these tasks. Once the master data has been verified and is properly maintained, it offers benefits that go well beyond operational processes: it becomes comparatively easy to analyse data from day-to-day operations with automated processes and to generate reports on the results with BI systems. Forwarding agents can, for example, report on their contribution margins with a particular customer. However, this only works if all the systems use identical customer data; otherwise, records for the same customer stored under different spellings could not be reliably combined. So what initially seems to be drudgery turns out to be the starting point for efficiency and the key to effective financial analysis: central master data.
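To make the "different spellings" problem concrete, here is a deliberately simple, rule-based duplicate check. It is a stand-in for the AI-assisted consolidation mentioned above, not an implementation of it: the normalisation rules (lower-casing, stripping punctuation and a few legal suffixes) are illustrative assumptions.

```python
import re
from collections import defaultdict


def normalise(name: str) -> str:
    """Crude matching key: lower-case, strip punctuation and common legal suffixes."""
    key = re.sub(r"[^a-z0-9 ]", " ", name.lower())
    key = re.sub(r"\b(gmbh|ag|ltd|inc)\b", " ", key)
    return " ".join(key.split())


def find_duplicates(customers: list[str]) -> list[list[str]]:
    """Group customer names that collapse to the same key: candidates for consolidation."""
    groups: defaultdict[str, list[str]] = defaultdict(list)
    for name in customers:
        groups[normalise(name)].append(name)
    return [group for group in groups.values() if len(group) > 1]


print(find_duplicates(["Acme Logistics GmbH", "ACME-Logistics", "Beta Cargo Ltd"]))
# -> [['Acme Logistics GmbH', 'ACME-Logistics']]
```

Even this naive check shows why a contribution-margin report per customer fails without consolidation: "Acme Logistics GmbH" and "ACME-Logistics" would otherwise be counted as two different customers.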
How do you manage your master data?