
Industry 4.0 improves the position of the Netherlands as a distribution country

Niels Noordijk, business consultant for Logistics

  • 4 May 2022
  • 6 min

Situated by the sea, the Netherlands is a major logistics hub in Europe. To maintain this leading position, the logistics sector in our country will have to keep improving its efficiency and reducing its CO2 emissions. Industry 4.0 could make an important contribution.

The port of Rotterdam aims to be the world's smartest port. The port operator focuses on the energy transition, digitalisation, and innovation. Following its example, other distribution and logistics companies are also starting to understand that these are the main issues for the future. Issues that have everything to do with 'smart data handling'. In this context, ICT developed a data factory for a large logistics service provider: a data management and governance platform that enables businesses to take control themselves and develop the dashboards and other tools they need for a data-driven business strategy. Niels Noordijk explains.

Adjust and predict in real time

Niels: “Every minute that a ship is moored alongside the quay costs money. Every lorry that returns empty uses diesel unnecessarily. The more efficient you make your logistics processes, and the better you make them fit together, the better you will do in terms of money, FTEs, time, and CO2 emissions.”

A data platform can have two paths: a hot path or a cold path. A hot path is intended to stream data and respond to them in real time. If you use a cold path, it will take longer for data to reach you, either because more calculations are made or because the source systems are unable to stream data in real time. Niels: “With a data factory, you can develop both types of data products. For example: alerts that indicate that something is not going according to schedule, dashboards that provide insight in real time, predictions based on which you can improve your planning, and analyses that will help you determine your strategy.”
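A minimal sketch of the difference, with made-up event data and hypothetical field names (not the platform's actual API): the same stream of quay events feeds a hot-path alert that fires immediately and a cold-path aggregate that is computed later for a dashboard.

```python
# Sketch only: one event stream, two paths.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class QuayEvent:
    terminal: str
    moves_per_hour: float    # measured crane productivity
    planned_per_hour: float  # productivity assumed in the planning

def hot_path_alert(event: QuayEvent, tolerance: float = 0.8) -> str | None:
    """Hot path: evaluate each event as it streams in and alert immediately."""
    if event.moves_per_hour < tolerance * event.planned_per_hour:
        return (f"ALERT {event.terminal}: {event.moves_per_hour:.0f}/h "
                f"vs planned {event.planned_per_hour:.0f}/h")
    return None

def cold_path_report(events: list[QuayEvent]) -> dict[str, float]:
    """Cold path: aggregate a full batch later, e.g. for a daily dashboard."""
    totals, counts = defaultdict(float), defaultdict(int)
    for e in events:
        totals[e.terminal] += e.moves_per_hour
        counts[e.terminal] += 1
    return {t: totals[t] / counts[t] for t in totals}

events = [QuayEvent("T1", 22, 30), QuayEvent("T1", 28, 30), QuayEvent("T2", 31, 30)]
for e in events:
    if (msg := hot_path_alert(e)):
        print(msg)                   # real-time signal
print(cold_path_report(events))      # slower, aggregated insight
```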

Combining internal and external data

Companies often start with their own internal data, created by multiple systems such as WMS and ERP, and with data from IoT devices. After a while, they need to unlock external data as well, such as data from the entire supply chain. Niels: “At first, this often involves data from customers, such as orders. Then it involves increasing amounts of data from public sources, such as Eurostat, the IMF, or weather and traffic information.” To make it easier to add such internal and external sources, ICT develops data factories so that businesses can decide for themselves which data they want to use. We often use Data Vault for this, a modelling method that lets you add new data very flexibly through an automated process. That makes it very agile.
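A simplified, in-memory sketch of the Data Vault idea (real implementations live in a database and use hashed business keys, link tables, and generated loading code; the source names and fields here are invented): a new external source is added as an extra satellite around a stable hub, without touching the structures that are already there.

```python
# Sketch: insert-only hub and satellites; adding a source means adding a satellite.
import hashlib
from datetime import datetime, timezone

def hub_key(business_key: str) -> str:
    """Hash of the business key, the stable identifier in the hub."""
    return hashlib.sha256(business_key.encode()).hexdigest()[:16]

hub_shipment: dict[str, str] = {}   # hub_key -> business key
sat_wms: list[dict] = []            # satellite with internal WMS attributes
sat_weather: list[dict] = []        # satellite added later for external weather data

def load_satellite(satellite: list[dict], business_key: str, record_source: str, **attributes):
    """Insert-only load: attach attributes to the hub with source and load timestamp."""
    hk = hub_key(business_key)
    hub_shipment.setdefault(hk, business_key)
    satellite.append({"hub_key": hk, "record_source": record_source,
                      "load_dts": datetime.now(timezone.utc), **attributes})

# Internal data first...
load_satellite(sat_wms, "SHIP-4711", "WMS", status="loaded", dock="R12")
# ...and later an external source, without changing the hub or the WMS satellite.
load_satellite(sat_weather, "SHIP-4711", "KNMI", wind_bft=6, visibility_km=2.5)
```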

 

In the data factory, we record what data sources we connect to it, where they are located, and what data team is responsible for what source. With data definitions, we make sure that the platform keeps apples apart from oranges.

Niels Noordijk
Business consultant Logistics

If one application states a date as 1-2-2022, another as 1 February 2022, and a third uses the American notation 2-1-2022, these data are automatically harmonised. This enables you to consistently analyse data from different sources. The company can then use the self-service ‘data factory’ to develop its own predictions, dashboards, and alerting systems.
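An illustrative sketch of the kind of harmonisation such a data definition enforces (the format names are invented for the example): three source notations of the same date are parsed into one canonical value.

```python
# Sketch: normalise source-specific date notations into one canonical date.
from datetime import date, datetime

def normalise_date(value: str, source_format: str) -> date:
    """Parse a source-specific notation into the canonical date defined for the platform."""
    formats = {
        "dutch_numeric": "%d-%m-%Y",   # 1-2-2022
        "long_form": "%d %B %Y",       # 1 February 2022
        "us_numeric": "%m-%d-%Y",      # 2-1-2022 (month first)
    }
    return datetime.strptime(value, formats[source_format]).date()

assert normalise_date("1-2-2022", "dutch_numeric") == date(2022, 2, 1)
assert normalise_date("1 February 2022", "long_form") == date(2022, 2, 1)
assert normalise_date("2-1-2022", "us_numeric") == date(2022, 2, 1)
```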

Automatic data quality, performance, and security monitoring

As a business consultant, Niels often works on the quality of the platform, including data quality, performance, security, and the functioning of the user interface and of the platform in general. Niels: “We achieve data quality with well-implemented data definitions and a variety of quality checks. For instance, we can run a fully automated check to see whether a certain reference value actually exists and whether the data are complete. Another aspect of data quality is being able to trace where data come from and recording how certain KPIs are structured. An example is the calculation of the number of crane movements per hour. You can make this calculation based on the total number of working hours, but you can also deduct the time that the crane was not operational. If you want to compare KPIs from different terminals, teams, or cranes with each other, it is important that the same calculation method is used and that you are able to explain how you arrived at the numbers. That is also important if you are going to develop new versions of your dashboard in which you change the formulas or data sources. In that event, you will want to be able to follow and test the changes. A test management module ensures that the quality of your KPIs remains guaranteed in the future as well.”
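An illustrative sketch of both points, with hypothetical data and field names: a fully automated reference-value and completeness check, and the two definitions of the crane KPI that give different numbers for the same shift.

```python
# Sketch: basic quality checks plus the two crane-KPI calculation methods.
def check_quality(records: list[dict], required: list[str], valid_terminals: set[str]) -> list[str]:
    """Automated check: are required fields filled and do reference values exist?"""
    issues = []
    for i, rec in enumerate(records):
        for field in required:
            if rec.get(field) in (None, ""):
                issues.append(f"record {i}: missing {field}")
        if rec.get("terminal") not in valid_terminals:
            issues.append(f"record {i}: unknown terminal {rec.get('terminal')!r}")
    return issues

def moves_per_hour(total_moves: int, working_hours: float,
                   downtime_hours: float = 0.0, deduct_downtime: bool = False) -> float:
    """Same KPI, two definitions: over all working hours, or operational hours only."""
    hours = working_hours - downtime_hours if deduct_downtime else working_hours
    return total_moves / hours

records = [{"terminal": "T1", "moves": 240}, {"terminal": "T9", "moves": None}]
print(check_quality(records, required=["moves"], valid_terminals={"T1", "T2"}))

# Same crane, same shift: 240 moves in 8 working hours, 1 hour of downtime.
print(moves_per_hour(240, 8))                           # 30.0 moves/h over all working hours
print(moves_per_hour(240, 8, 1, deduct_downtime=True))  # ~34.3 moves/h over operational hours only
```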

Performance is mostly about the platform's processing speed. Niels: “The entire process of retrieving data from data sources, distributing them to the various applications, and making the analyses is orchestrated by the platform. The individual tasks have to link up with each other, and the whole chain has to complete within an acceptable time frame, to make sure that the users get their information on time.”
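A minimal orchestration sketch with placeholder tasks (the real platform uses a proper scheduler; the time budget here is invented): the steps run in dependency order and the run is checked against an agreed time frame.

```python
# Sketch: run chained tasks in order and verify the chain stays within budget.
import time

def ingest():     time.sleep(0.1)   # retrieve data from the source systems
def transform():  time.sleep(0.1)   # prepare and distribute data to the applications
def analyse():    time.sleep(0.1)   # build the analyses and refresh the dashboards

PIPELINE = [ingest, transform, analyse]   # each task links up with the previous one
MAX_RUNTIME_S = 1.0                       # acceptable time frame agreed with the users

start = time.monotonic()
for task in PIPELINE:
    task()
elapsed = time.monotonic() - start
print(f"pipeline finished in {elapsed:.2f}s, within budget: {elapsed <= MAX_RUNTIME_S}")
```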

Security is a key priority as well, says Niels. “An organisation stores all data in one central place. As a rule, that is quite a risk. Data are moved, stored, and distributed, and some of those data are marked as sensitive. This has to be done in a controlled and secure way. The business itself can indicate which data are to be regarded as sensitive, and extra security procedures will be applied to them. Additionally, we make every step that involves data relocation, storage, or dissemination subject to a review and approval process.”
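A small sketch of one such extra procedure, with assumed field names: fields flagged as sensitive by the business are masked before a record leaves the platform.

```python
# Sketch: apply extra handling (here, masking) to fields the business marked as sensitive.
SENSITIVE_FIELDS = {"customer_name", "contract_value"}   # indicated by the business itself

def mask(value) -> str:
    return "***"

def prepare_for_distribution(record: dict) -> dict:
    """Mask sensitive fields before the data are distributed outside the platform."""
    return {k: (mask(v) if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

shipment = {"shipment_id": "SHIP-4711", "customer_name": "Acme BV", "contract_value": 125000}
print(prepare_for_distribution(shipment))
# {'shipment_id': 'SHIP-4711', 'customer_name': '***', 'contract_value': '***'}
```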

Short time to market for data products

Data solutions are used by many companies, but also by students and scientists. In an ocean of data, they try to discover patterns on which they can base predictions. One example of a large field of research is an ETA (Estimated Time of Arrival) predictor for the various modalities (ships, trains, and lorries).
Niels: “But the most interesting thing about a data management and governance platform is that you can do everything yourself without any specific data engineering knowledge. All you need is domain knowledge and strategic knowledge. That leads to a very short time to market for data products, and allows companies to respond much faster to current demands, which makes them a lot more flexible.”
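As a deliberately simple illustration of such a predictor (the speeds and delay factor are invented and bear no relation to any actual research model): an ETA per modality derived from the remaining distance, a recent average speed, and a correction for weather or traffic.

```python
# Sketch: a naive ETA per modality; real predictors learn these factors from data.
from datetime import datetime, timedelta

AVERAGE_SPEED_KMH = {"ship": 22.0, "train": 70.0, "lorry": 65.0}  # assumed figures

def predict_eta(modality: str, remaining_km: float, now: datetime,
                delay_factor: float = 1.0) -> datetime:
    """ETA = now + remaining distance / speed, scaled by e.g. weather or traffic."""
    hours = remaining_km / AVERAGE_SPEED_KMH[modality] * delay_factor
    return now + timedelta(hours=hours)

now = datetime(2022, 5, 4, 8, 0)
print(predict_eta("ship", 110, now))                      # fair weather
print(predict_eta("lorry", 110, now, delay_factor=1.3))   # traffic slows the run by 30%
```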

His conclusion: “Any company that intends to maintain its competitive position in the future will have to become data-driven, in line with Industry 4.0. The first requirement for this is to have a data platform and to use it properly.”