
Your trucks are moving. Your drivers are scanning. Your GPS is tracking. Every trip leaves a digital trail: who drove where, how fast, for how long, with what load and for which client. That is operational data that keeps your business running — and it is personal data. In part, it is legally mandated data.

For most transport companies, the AI conversation starts with: “Can we automate our route planning?” The conversation that needs to happen first is: “What are we already processing, for whom, on whose servers — and what are our obligations?” That sequence makes the difference between a useful tool and a legal liability.


Your Driver Is Your Most Monitored Employee

Continuous GPS tracking, tachograph records, driving behaviour scores, rest period logs — the volume of personal data you process about your drivers exceeds that of almost any other employee in a small business. The GDPR requires a lawful basis for that processing — usually legitimate interest for operational purposes — but that basis has clear limits.

Proportionality is the key word. Location data collected for route planning and customer service cannot simply be repurposed for appraisals or disciplinary procedures. Your drivers have the right to know what data is collected, for what purpose, for how long it is retained and who has access. Knowing there is a GPS in the truck is not the same as knowing that their location data is stored for three years on a server outside the EU and used to train an AI model. That distinction is legally significant.

Works councils and union representatives in both Belgium and the Netherlands hold consultation rights before monitoring systems are introduced. In Belgium, Joint Committee 140 (PC 140) governs the collective labour agreement for road transport — the introduction of AI monitoring systems must be assessed against information and consultation obligations under social law. Febetra and TLN both publish sector guidance on driver monitoring that functions as a practical reference.



The AI Act Makes It Concrete: Annex III, Point 4

The moment you use AI to analyse your drivers’ driving behaviour — braking patterns, speed profiles, cornering — and that analysis feeds into any assessment or decision about those drivers, you are the deployer of a high-risk AI system under AI Act Annex III, Point 4: employment and worker management.

That has concrete consequences. You must document the system. You must be transparent with your drivers about how the AI works and which decisions it influences. You must ensure human oversight: a score automatically forwarded to HR without human review is legally problematic. A driver who receives a poor safety rating has the right to understand how that assessment was reached.
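One way to make that human-oversight requirement concrete in software is to hold every AI-generated score in a review queue until a named person signs off, so nothing reaches HR automatically. This is a minimal sketch under assumed names (`DriverScore`, `ReviewQueue` are illustrative, not from any specific fleet platform):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DriverScore:
    driver_id: str
    score: float                      # AI-generated safety score
    explanation: str                  # plain-language rationale the driver can request
    reviewed_by: Optional[str] = None # filled in only by a human reviewer
    released: bool = False

class ReviewQueue:
    """Holds AI-generated scores until a human planner signs off."""

    def __init__(self):
        self._pending: list[DriverScore] = []

    def submit(self, score: DriverScore) -> None:
        # Scores land here; they are never forwarded onward automatically.
        self._pending.append(score)

    def release(self, driver_id: str, reviewer: str) -> DriverScore:
        # Only an explicit, attributed human decision moves a score out.
        for s in self._pending:
            if s.driver_id == driver_id:
                s.reviewed_by = reviewer
                s.released = True
                self._pending.remove(s)
                return s
        raise KeyError(driver_id)
```

The `explanation` field matters as much as the gate: it is what lets you answer a driver who asks how a poor rating was reached.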

It is the use of the scores, not the system itself, that determines the legal risk. Driving behaviour analysis used for safety coaching is one thing. Applying those same scores in wage negotiations or disciplinary proceedings is different legal territory, where collective labour agreement obligations also apply. That line is easier to cross than it looks.

“Your vehicles generate continuous location and behaviour data for every driver, every shift. That data has a legal basis — but it also has GDPR boundaries. And if you use AI to analyse it, the AI Act classifies that as high-risk. That is not theoretical. It means documentation, transparency to drivers, and human oversight of every AI-generated assessment.”


A Scenario You Will Recognise

A transport company with thirty drivers switches to a modern cloud fleet management platform — route planning, driving behaviour scoring and customer communication all in one environment. The platform is certified. The provider “handles GDPR compliance.”

But every driver’s GPS position is transmitted to servers outside the EU every thirty seconds. The driving behaviour score is calculated daily using aggregated platform data — including your drivers’ historical data, which now trains a model shared with competitors. If the provider suffers a data breach, you are the party with the notification obligation: you are the controller, not the provider. When the social inspectorate requests tachograph records for an investigation, the data sits on servers you do not control. And if a union representative requests the AI scoring model documentation on behalf of a driver who received a poor rating, you cannot provide it.

The provider is a processor. You are the controller. A data processing agreement allocates responsibility — it does not eliminate it. The liability in a breach, the reporting obligation to the supervisory authority and the accountability to your drivers all remain with you.



Three Steps That Make the Difference

Local AI does not solve everything automatically, but it returns the control you are already legally required to have. Route optimisation based on your own client addresses and route history, tachograph compliance checked on your own server, driving behaviour analysis that does not leave your network — that is the architecture that makes both GDPR and AI Act compliance manageable.

Step 1: Map your processing activities. What data do you process about your drivers? Where does it sit? Who has access — internally and externally? For each processing activity: what is the lawful basis, how long is the data retained, and for what purpose is it permitted to be used? This record is not a bureaucratic exercise — it is the foundation of any GDPR defence in an inspection or breach scenario.
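That record can start as something very simple. A sketch of one entry, with hypothetical field names (`ProcessingActivity` and the example values are illustrative, not prescribed by the GDPR text):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProcessingActivity:
    name: str                      # what the processing is
    data_categories: tuple[str, ...]
    lawful_basis: str              # e.g. legitimate interest, legal obligation
    retention_days: int            # how long before deletion
    recipients: tuple[str, ...]    # internal roles and external processors
    storage_location: str          # own server (EU) vs vendor cloud (non-EU)

register = [
    ProcessingActivity(
        name="GPS tracking for route planning",
        data_categories=("location", "timestamps"),
        lawful_basis="legitimate interest (Art. 6(1)(f) GDPR)",
        retention_days=90,
        recipients=("dispatch", "fleet platform vendor"),
        storage_location="vendor cloud (non-EU)",
    ),
]
```

One entry per processing activity, reviewed when a system changes — that is already a defensible starting point for an Article 30 record.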

Step 2: Separate compliance data from analysis data. Retaining tachograph data for inspectorate access is a legal obligation under EU Regulation 561/2006. Feeding that same data into an AI model for driving behaviour scoring is a separate processing activity — with its own lawful basis, documentation requirements, and AI Act obligations. The two do not automatically coincide, even when the underlying data source is the same.
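The separation can be made explicit by keying retention and lawful basis on the pair of data source and purpose, so the same tachograph records appear twice, each with its own basis and its own clock. A minimal sketch (the `RETENTION` map and its values are illustrative assumptions, not legal advice on retention periods):

```python
# Same data source, two distinct processing purposes, each documented separately.
RETENTION = {
    ("tachograph", "compliance archive"): {
        "lawful_basis": "legal obligation (Regulation (EC) 561/2006)",
        "retention_days": 365,
    },
    ("tachograph", "AI behaviour scoring"): {
        "lawful_basis": "legitimate interest, separately documented",
        "retention_days": 30,
        "ai_act_high_risk": True,   # Annex III, point 4: worker management
    },
}

def purposes_for(data_source: str) -> list[str]:
    """List every documented purpose for which a data source is processed."""
    return [purpose for (src, purpose) in RETENTION if src == data_source]
```

If a purpose is not in the map, the data is not used for it — which is exactly the discipline Step 2 asks for.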

Step 3: Choose local processing for employee data. GPS location data and tachograph data belong in a secure environment you control. Route optimisation and compliance analysis can run locally — your operational intelligence does not need to feed an external platform. Most of the business-specific value lies in your own data: your client addresses, your delivery windows, your historical route times. That value does not need to leave the company.
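A simple way to enforce that boundary in your own tooling is an egress guard: driver data may only be written to hosts you control. A sketch with hypothetical host names (`LOCAL_HOSTS` and the example hosts are assumptions for illustration):

```python
# Hosts inside the controlled network; anything else is refused.
LOCAL_HOSTS = {"fleet-db.internal", "localhost"}

def assert_local(destination_host: str) -> None:
    """Refuse to ship driver data to any host outside the controlled network."""
    if destination_host not in LOCAL_HOSTS:
        raise PermissionError(
            f"driver data may not leave the network: {destination_host}"
        )
```

Called before every export or sync job, a guard like this turns "the data stays local" from a policy statement into a property the code actually enforces.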

Ron Spoelstra — Belgium · March 2026 · info@ronspoelstra.be