
A general practitioner seeing thirty patients a day spends an average of four minutes per consultation on documentation. That is two hours daily — every working day, year after year. Referral letters, medical certificates, and correspondence with specialists come on top of that. AI can take over a large part of that administrative burden.

The promise is concrete: dictate a brief summary after each consultation, and an AI generates the structured consultation note. Referral letters in minutes. Certificates for employers or schools in seconds. But only if the architecture is right — because medical data sits at the top of what European law protects most strictly.


The most sensitive data category that exists

Medical and dental data is special category data under GDPR Article 9. That is not overstated: cancer diagnoses, psychiatric conditions, addictions, HIV status, genetic information — this is information that affects someone’s insurability, can damage their career, and can destroy their personal relationships. Information that people carefully shield from their surroundings.

Patients share this information with their doctor because they know it goes no further. That trust is the foundation of medicine itself. If it were to break — if patients knew that their consultation notes were being processed via a server in the United States — they would change doctors. And rightly so.


What the law says: three layers of protection


GDPR Article 9. The healthcare exception covers the doctor themselves and care providers under their supervision. It does not cover their cloud AI tool. That tool is an external processor, and transferring Article 9 data to an external processor requires a valid legal basis — a data processing agreement (DPA) is necessary but not sufficient.

Medical professional secrecy. In Belgium, Article 458 of the Criminal Code. Breaching it is a criminal offence, not merely a disciplinary matter. In the Netherlands, Article 7:457 of the Civil Code (WGBO) and Article 88 of the BIG Act provide equivalent protection. The question is not whether a cloud AI tool has a DPA in place — the question is whether the processing itself is legally defensible.

eHealth regulation. In Belgium, the eHealth platform governs the regulated exchange of medical data. In the Netherlands, the Nuts network plays a comparable role. Medical data processed via cloud AI leaves that regulated ecosystem. That is an additional compliance layer on top of GDPR — and an argument that carries weight with supervisory authorities.

“Your patients assume their medical information stays in your practice. If they knew their diagnosis was processed via a server in the US, they would switch doctors.”

A concrete scenario

A busy GP uses a cloud speech-to-text tool combined with an AI assistant for his documentation. After the afternoon session he dictates: “Patient Jan Janssen, 52, smoker, type 2 diabetes, recent blood work 8.4, considering insulin, domestic stress mentioned, referral to psychologist discussed.”

Name, age, diagnoses, lab results, psychosocial context — everything is now on a server in the United States, subject to the CLOUD Act. There is no valid GDPR legal basis for that transfer. Medical professional secrecy has potentially been breached. The patient has given no consent. And he does not know it happened.


The solution: local speech, local structure

The technical solution exists and is available today. whisper.cpp is open-source speech-to-text software that runs entirely locally on a standard PC or laptop, with accuracy comparable to cloud alternatives and no external connection. Ollama with an open-weight language model then structures the transcript into a SOAP note (Subjective, Objective, Assessment, Plan) or a referral letter.
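To make this concrete, here is a minimal sketch of how such a local pipeline can be wired together. The binary name, model file, prompt wording, and model choice are assumptions, not a prescribed setup: whisper.cpp ships a command-line tool (historically `main`, now `whisper-cli`), and Ollama exposes a local HTTP API on port 11434.

```python
import json

# Assumed local paths and endpoints -- adjust to your installation.
WHISPER_BIN = "./whisper-cli"             # whisper.cpp CLI binary
WHISPER_MODEL = "models/ggml-medium.bin"  # a multilingual ggml model file
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's local HTTP API

SOAP_PROMPT = (
    "Structure the dictated consultation summary below into a SOAP note "
    "(Subjective, Objective, Assessment, Plan). Do not invent findings.\n\n"
    "Transcript:\n{transcript}\n"
)

def whisper_cmd(audio_path: str, language: str = "nl") -> list[str]:
    """Argument list for local transcription with whisper.cpp.

    -otxt writes the transcript to a plain .txt file next to the audio;
    nothing in this step touches the network.
    """
    return [WHISPER_BIN, "-m", WHISPER_MODEL, "-f", audio_path,
            "-l", language, "-otxt"]

def soap_payload(transcript: str, model: str = "mistral") -> dict:
    """JSON body for Ollama's /api/generate endpoint (one full reply, no streaming)."""
    return {"model": model,
            "prompt": SOAP_PROMPT.format(transcript=transcript),
            "stream": False}

if __name__ == "__main__":
    cmd = whisper_cmd("consult_0932.wav")  # hypothetical recording
    payload = soap_payload("Patient, 52, smoker, type 2 diabetes, HbA1c 8.4, "
                           "insulin considered, referral to psychologist discussed.")
    # subprocess.run(cmd) would transcribe locally; POSTing the payload goes to
    # localhost only -- patient data never leaves the machine.
    print(len(json.dumps(payload)))
```

The point of the sketch is the data flow, not the specific tools: every step runs on hardware inside the practice, so the Article 9 transfer question never arises.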

The result for the GP: dictate findings after each consultation, the system processes them locally, the structured note is ready for import into the EHR. Two hours of daily administration becomes thirty minutes. And patient data never leaves the practice — meaning GDPR, professional secrecy, and eHealth compliance all remain intact.


Three steps you can take now

1. Audit your current AI use
Are you using cloud dictation, an AI writing assistant, or cloud-based planning software? Which categories of patient data are involved? An honest audit is your starting point.

2. No patient names or diagnoses in cloud AI tools
Until a local architecture is in place, this is the minimum: whatever cloud AI tool you use, ensure no patient-specific information enters it. No name, no diagnostic code, no lab result.

3. Talk to your EHR provider about local AI
An increasing number of EHR providers offer integration options for local AI, or have on-premise versions of their software. The conversation is worth having. Local speech-to-text combined with your existing EHR is not a future prospect — it exists today.
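Step 2 above can also be supported mechanically. The sketch below is a crude illustrative guard, not a de-identification solution: the patterns are assumptions and will miss plenty, which is exactly why the local architecture of step 3 remains the real goal. It simply refuses to pass a prompt to a cloud tool when the text looks patient-specific.

```python
import re

# Illustrative patterns only -- real de-identification needs far more than this.
SUSPECT_PATTERNS = [
    re.compile(r"\b[A-Z]\d{2}(\.\d+)?\b"),         # ICD-10-like diagnostic codes
    re.compile(r"\b\d{1,2}[.,]\d\b"),              # lab-value-like numbers (e.g. 8.4)
    re.compile(r"\bpatient\s+[A-Z][a-z]+", re.I),  # "patient <Name>"
]

def looks_patient_specific(prompt: str) -> bool:
    """Crude guard: flag prompts that appear to contain patient-specific data."""
    return any(p.search(prompt) for p in SUSPECT_PATTERNS)

def cloud_safe(prompt: str) -> str:
    """Raise instead of sending when the prompt looks patient-specific."""
    if looks_patient_specific(prompt):
        raise ValueError("Prompt appears to contain patient data; keep it local.")
    return prompt
```

A generic request like "Draft a template no-show reminder letter" passes; "Patient Janssen, HbA1c 8.4, E11.9" is blocked. The guard errs on the side of false positives, which is the right failure mode here.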

Ron Spoelstra — Belgium · March 2026 · info@ronspoelstra.be