You probably already use AI. A chatbot to draft an email. A tool that summarises documents. An assistant that answers questions. Perhaps for longer than you realise.
You are not alone. Millions of European business owners do the same. Fast, practical, free or nearly free. It works. Why would you stop to think about it?
Because something is happening that most business owners don’t yet have on their radar. Over the past few years, Europe has built a series of laws that together say one thing: the way you handle data and AI is no longer your business alone.
Why Europe Is Regulating This
There is an economic reality most business owners don’t consider. The AI tools you use today run on servers in the United States. Three American companies together control roughly two thirds of the global cloud market. The data you enter — about your customers, your employees, your business — leaves Europe. And US law gives US authorities the right to request that data, regardless of where it is physically stored.
That is not a conspiracy theory. That is what the law says. Europe has responded. Not with one law, but with a series of them. They build on each other, reinforce each other, and together make clear that the coming years will be fundamentally different from the past ten.
The Laws in Plain Language
GDPR — the foundation
You know it. In force since 2018. Regulates how you may process personal data of customers, employees and others. The core principle: you may only use data for what is necessary, you must be transparent, and you must be able to demonstrate compliance. Fines up to twenty million euros or four percent of your global annual turnover, whichever is higher.
What most business owners don’t yet realise: every AI tool you use for work involving personal data is a third party processing that data. You need a legal basis for that. And a data processing agreement. And oversight.
AI Act — the new layer
The world’s first binding AI law. The deadline for high-risk systems is expected to move to 2 December 2027: Parliament committees and the Council both agreed in March 2026, pending plenary and trilogue. It classifies AI systems by risk level. The more sensitive the application, the heavier the obligations.
What falls under high risk? More than you might think. Screening CVs with AI. Assessing creditworthiness. AI in healthcare. AI in education. AI that makes decisions affecting people. For all these applications, strict requirements apply: documentation, transparency, human oversight, explainability. Fines up to thirty-five million euros or seven percent of your global turnover, whichever is higher.
Read more: The AI Act in detail — risk model, obligations and three steps you can take now →
NIS2 — cybersecurity becomes mandatory
In force since October 2024. Extends cybersecurity obligations to eighteen sectors — far broader than before. Energy, transport, healthcare, food, chemicals, public services. And crucially: the obligations flow down the supply chain. If you supply to a company covered by NIS2, you may be asked to document and demonstrate your own security practices. Directors face personal liability if the organisation fails to comply.
DORA — specifically for finance
In force since January 2025. Aimed at the financial sector. Banks, insurers, investment firms, but also insurance brokers and accounting firms providing financial services. Requires a register of all external ICT providers, including AI tools. Exit strategies. Contractual requirements. Initial incident reports within four hours of classifying an incident as major.
A cloud AI service is an ICT provider under DORA. It belongs in the register. With all the obligations that entails.
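To make the register idea concrete, here is a minimal sketch of what a starting point could look like in code. The field names below are illustrative assumptions, not the templates DORA actually prescribes for the register of information; the point is only that each tool, the data it receives, and an exit route get written down somewhere auditable.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical fields for a minimal provider-register sketch.
# DORA's register of information has its own prescribed format;
# these names are assumptions for illustration only.
@dataclass
class IctProvider:
    name: str           # the vendor behind the tool
    service: str        # what the tool does for you
    data_shared: str    # which data leaves your organisation
    location: str       # where the data is processed
    has_dpa: bool       # is a data processing agreement in place?
    exit_strategy: str  # how you would leave this provider

register = [
    IctProvider(
        name="ExampleCloud AI",          # hypothetical vendor
        service="document summarisation",
        data_shared="client correspondence",
        location="US",
        has_dpa=True,
        exit_strategy="export data, switch to EU-hosted alternative",
    ),
]

# Flag entries that need attention: data processed outside
# the EU, or no data processing agreement on file.
flagged = [p.name for p in register
           if p.location != "EU" or not p.has_dpa]

print(json.dumps([asdict(p) for p in register], indent=2))
print("Needs review:", flagged)
```

Even a list this simple answers the questions a supervisor, or a NIS2-covered client, will ask first: what runs where, on whose data, under which contract.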
Data Act — the right to leave
In force from September 2025. Gives you the right to request your data back from cloud providers. Bans switching charges from January 2027. Sounds technical, but the message is simple: Europe wants to ensure you are not trapped with a single provider.
What They Mean Together
Five laws. Different focus. But one shared direction.
Europe is saying: the data of your customers, your employees, your business is not a free-for-all. The intelligence you hire to process that data is not a free-for-all. You are responsible. You must be able to demonstrate it. And if something goes wrong, the consequences are real.
That is not a hostile message. It is an honest one. Europe is building a framework that protects businesses — including yours. But protection has a price: awareness and preparation.
What You Need to Know Now
Not everything is urgent at the same moment. The AI Act brings the heaviest obligations for those who use AI in sensitive contexts — the expected deadline is now 2 December 2027.
The first question to ask yourself is simple: which AI tools do I use today, for which tasks, and which data travels with them?
Not to stop everything. Not to panic. But to know where you stand. Because the business owner who gets informed now has choices. The one who waits until enforcement begins no longer has them.
In the following articles we go sector by sector. What does this mean concretely for an accountant, a therapist, a lawyer, a doctor? Which risks are greatest, which steps are most urgent, and which solutions already exist?
But those conversations begin here. With this foundation. With the question that each of them ultimately asks: whose intelligence are you using — and where does it live?