
Therapists and psychologists hear things people do not say at home. Trauma, addiction, suicidal thoughts, abuse, relationship breakdowns that no one is supposed to know about. The trust between therapist and client is the core of the therapeutic work itself. Without absolute confidentiality, that trust cannot exist.
AI can help therapists significantly: structuring session notes in SOAP or DAP format, drafting treatment plans, writing progress reports. Five to seven hours of administrative work per week can be reduced by more than half. But seeking that efficiency through cloud AI means buying administrative convenience at the price of a serious legal and ethical problem.
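For reference, a SOAP note splits the record into four fixed sections (DAP is a three-part variant: Data, Assessment, Plan). A minimal, illustrative skeleton — the field contents are invented placeholders, not a clinical standard:

```text
S (Subjective):  The client's own account: mood, complaints, reported events
O (Objective):   Observed affect, behaviour, mental-status findings
A (Assessment):  Clinical impression, progress against treatment goals
P (Plan):        Next interventions, homework, follow-up date
```

This is exactly the kind of repetitive structuring work a language model handles well, which is why the temptation to paste session content into a cloud tool is so strong.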
The most sensitive data category that exists
Medical and psychological data is the most protected category of personal data in European law. Not arbitrarily — because a leaked therapy file can destroy a person’s life. Psychiatric diagnoses, addiction histories, sexual abuse, suicidal episodes: information people carefully shield from employers, family members, and neighbours.
The GDPR groups health data under Article 9 as a special category. Processing it is in principle prohibited, unless an exception applies. The healthcare exception covers the therapist themselves — not the cloud tools they use.
Two legal problems with cloud AI

Problem 1: professional secrecy is a criminal law matter. Article 458 of the Belgian Criminal Code makes breaching professional secrecy a criminal offence. The therapist’s duty of confidentiality is not only a deontological obligation — it is a criminally protected principle. In the Netherlands, the BIG Act and the WGBO provide equivalent protection.
When a therapist enters client information into a cloud AI tool, that information travels to an external server. The server is operated by a company that is not a therapist, is not bound by professional secrecy, and sits outside the direct control of the practitioner. That is a potential breach of Article 458 — even if the intent is purely administrative.
Problem 2: consent does not work here. You cannot ask a client to consent to their session data being processed by an external AI. The therapeutic relationship is characterised by a fundamental power imbalance: the client is in a vulnerable position, the therapist in a position of authority. Under the GDPR (Recital 43, read with Article 7), consent is not freely given where there is a clear imbalance or dependency. For minor clients, even stronger protections apply.
“The therapeutic relationship is trust in its most absolute form. Whoever outsources that trust to an external server gives it away.”
A concrete scenario
A therapist has five sessions to document after a long day. She dictates her notes into a cloud speech-to-text tool: the client’s name, age, the content of what was discussed, suicidal thoughts that came up, the next step in the treatment plan.
The audio goes to a server in the United States. The transcript goes to a cloud AI for structuring. A neatly formatted SOAP note appears on screen. Sixty minutes saved.
Legally, in that moment: a transfer of GDPR Article 9 data without valid legal basis, a potential breach of criminal professional secrecy law, a violation of trust that would end the treatment relationship if the client knew — and data on servers subject to the CLOUD Act.
The solution: local dictation, local structuring
The alternative is architectural and available today. whisper.cpp is open-source speech-to-text software that runs entirely locally. Ollama with an open-weight language model processes the transcript and generates the SOAP note. Everything happens on hardware inside the practice. Nothing leaves the building.
The result is identical — structured session notes in seconds. The difference is that the most sensitive information stays in your practice. GDPR-compliant. Professional secrecy intact. And above all: consistent with the trust your clients place in you.
Three steps you can take now
1. Stop using cloud dictation for session notes
Tools such as Otter.ai, Google Speech, or similar cloud services are legally indefensible for therapeutic work. Even without names: the combination of complaint, age, and therapeutic context is often enough to identify a person.
2. Separate administrative and therapeutic tasks
Not all AI use carries the same risk. Using a cloud AI to draft a neutral information leaflet or manage your scheduling does not put client data at risk. The line runs at anything that contains even indirectly identifying, client-specific information.
3. Explore local speech-to-text
whisper.cpp runs on an ordinary laptop. Local transcription quality is comparable to cloud alternatives. Setup requires technical knowledge — once. After that, you dictate locally, a local model structures the notes, and the result exports to your EHR. Your client data never leaves the practice.
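As a sketch of what that local pipeline can look like in practice. The binary name (`whisper-cli`), model file path, and Ollama model name are assumptions about a typical install, not requirements; adjust them to your setup. Nothing in this flow touches the network beyond localhost:

```python
import json
import subprocess
import urllib.request

def build_soap_prompt(transcript: str) -> str:
    """Wrap a raw session transcript in an instruction for the local model."""
    return (
        "Structure the following therapy-session transcript as a SOAP note "
        "with the headings Subjective, Objective, Assessment, Plan. "
        "Do not invent details that are not in the transcript.\n\n"
        f"Transcript:\n{transcript}"
    )

def transcribe_locally(audio_path: str) -> str:
    """Run whisper.cpp on-device; the audio never leaves the machine.
    Binary name and model path are assumptions about your local build."""
    result = subprocess.run(
        ["whisper-cli", "-m", "models/ggml-base.bin",
         "-f", audio_path, "--no-timestamps"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def structure_locally(transcript: str, model: str = "llama3.1") -> str:
    """Send the transcript to an Ollama server running on this machine
    (default port 11434) and return the generated SOAP note."""
    payload = json.dumps({
        "model": model,
        "prompt": build_soap_prompt(transcript),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Wiring it together is then `print(structure_locally(transcribe_locally("session.wav")))`, run on the machine that recorded the audio. The resulting text can be pasted or exported into the EHR as usual.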