
An accountant knows their clients in ways their own bank does not. Bank statements, payslips, tax returns, annual accounts, contracts — the complete financial story of someone’s life or business sits in their files. That trust is the foundation of the profession. And professional secrecy is its legal anchor.
AI tools can save accountants significant time — transaction classification, invoice processing, report generation, answering tax questions. But most cloud AI tools carry a risk that is widely underestimated: they process your client data outside your office, on servers you do not control.
What is at stake
The data an accountant handles every day is not ordinary business information. Bank statements are personal data under the GDPR. Payslips contain financial information about employees and their families. Tax returns reveal a complete picture of someone’s income, assets, and liabilities.
When you enter that data into a cloud AI tool — even just for a quick question or a summary — that data leaves for an external server. Legally, that is a transfer to a third party. And that opens two problems at once.
Professional secrecy: more than a DPA clause

Professional secrecy for ITAA accountants (Belgium) and NBA registered accountants (Netherlands) is enshrined in law. Article 58 of the ITAA Act imposes confidentiality on all information received in the course of professional duties. Dutch registered accountants are bound by equivalent confidentiality duties under the NBA's professional rules, which list confidentiality as a fundamental principle.
The problem with cloud AI is not that a data processing agreement is missing — you can sign one. The problem is more fundamental: professional secrecy is designed to restrict the transfer of information itself. A DPA governs how a processor handles data once received. Professional secrecy determines whether they may receive it at all.
“A data processing agreement is not a licence to process client data outside your office. It governs what happens afterwards — but the transfer itself is already the problem.”
Legal opinions differ on where precisely the line falls. But the direction is clear: the more sensitive the client data, the greater the legal risk in cloud processing.
DORA and the supply chain pressure
There is a second route by which AI risks can reach your practice: through your clients. The DORA regulation requires financial institutions — banks, insurers, investment firms — to assess and document the ICT security of their suppliers. An accountant serving financial institutions is exactly such a supplier.
In practice: your client asks which cloud AI tools you use and how you protect their data. If your answer is “ChatGPT with a DPA,” that is insufficient for most financial institutions. Local AI becomes a competitive argument: your answer becomes “nothing leaves our office.”
A concrete scenario
A busy accountant copies a client’s bank statements into ChatGPT Team and asks for a summary of the largest expense categories. The tool delivers a clear answer. Twenty minutes saved.
What has happened legally: the client’s name, account number, and transactions with amounts and counterparties now sit on a server in the United States, subject to the CLOUD Act. It is doubtful that any valid GDPR legal basis covers that transfer. Professional secrecy has potentially been breached. The client does not know.
If the client knew, they would switch accountants.
The solution: AI that stays in the office
Local AI runs on your own hardware — in your office, on your server, behind your firewall. No cloud. No transfer. The time savings are identical: transaction classification, invoice processing via OCR, report generation, answering tax questions. The difference is architectural: every byte of client data stays within your office.
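To make the architecture concrete, here is a minimal sketch of transaction classification against a local model. It assumes Ollama is running on the machine at its default endpoint (localhost:11434) with an open-weight model pulled under the name "llama3" — the model name, the category list, and the prompt wording are illustrative choices, not prescriptions. The point is in the URL: the request never leaves the machine.

```python
import json
import urllib.request

# Ollama's default local endpoint — the request stays on this machine
OLLAMA_URL = "http://localhost:11434/api/generate"

# Example expense categories; adapt to your own chart of accounts
CATEGORIES = ["office costs", "travel", "software", "payroll", "other"]

def build_classification_prompt(description: str, amount: float) -> str:
    """Build a prompt asking the local model to classify one transaction."""
    return (
        "Classify this bank transaction into exactly one of these categories: "
        f"{', '.join(CATEGORIES)}.\n"
        f"Transaction: {description}, amount EUR {amount:.2f}\n"
        "Answer with the category name only."
    )

def classify_locally(description: str, amount: float, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return its answer."""
    payload = json.dumps({
        "model": model,
        "prompt": build_classification_prompt(description, amount),
        "stream": False,  # single JSON response instead of a token stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

The same pattern extends to summarisation or report drafting: only the prompt changes, the data path does not.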
For an ITAA accountant or registered accountant, that is not only legally sound. It is the only model consistent with professional secrecy — and with the trust that clients place in you when they hand over their most sensitive financial information.
Three steps you can take now
1. Audit which AI tools you currently use for client work
Are you using ChatGPT, Copilot, or another cloud AI for tasks involving client data? What data leaves with each conversation? An honest audit is the foundation of every next step.
2. Stop entering client data into cloud AI
Until you have a local solution in place, the safest measure is the simplest one: no client data in cloud AI tools. Not for a quick question, not for a test, not “just internally.”
3. Explore local alternatives
Ollama with an open-weight model on a mini-PC in your office is available and affordable today. Setup requires technical knowledge — but only once. After that it runs daily, without a subscription, without any data transfer, with full control over your client data.
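As a sketch of what that one-time setup looks like, the commands below install Ollama on a Linux machine, pull an open-weight model, and run a first local query. The model name "llama3" is an example; pick whichever open-weight model fits your hardware, and see ollama.com for macOS and Windows installers.

```shell
# One-time install (Linux; macOS/Windows have installers on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download an open-weight model to local disk (example model name)
ollama pull llama3

# Run a first query — entirely on this machine, no data transfer
ollama run llama3 "Summarise the VAT treatment of a company car in two sentences."
```

From that point on, the model answers on your own hardware, with no subscription and no client data leaving the office.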