What Article 4 says

Article 4 of the EU AI Act — "AI Literacy" — has applied since 2 February 2025. It requires providers and deployers of AI systems to ensure that their staff, and other persons dealing with AI on their behalf, have a sufficient level of AI literacy.

This is not limited to companies that build AI. If your business uses ChatGPT, AI-powered scheduling tools, AI features in your CRM, or any software that mentions "AI" or "machine learning" in its capabilities — you are a deployer. Article 4 applies to you.

Key date: Enforcement of Article 4 begins on 2 August 2026. From that date, national market surveillance authorities can audit and impose penalties for non-compliance.

Who is affected

The short answer: almost every business in the EU that uses modern software.

The regulation applies proportionally — a 5-person clinic does not need the same depth of training as a bank. But the obligation exists regardless of company size.

What "AI literacy" means in practice

The regulation does not prescribe a specific curriculum. It requires that training be appropriate to the context, taking into account staff members' technical knowledge, experience, education and training, the context in which the AI systems are used, and the persons or groups of persons on whom they are used.

In practical terms, this means your team needs to understand: what AI tools they use, what those tools can and cannot do, the risks involved, and how to use them appropriately within business processes.

What most companies get wrong

The biggest mistake is treating this as an IT problem. AI literacy is not about developers understanding neural networks. It is about the receptionist understanding that the chatbot can give wrong information, the marketing manager knowing that AI-generated content needs human review, and the team lead understanding the limitations of AI-assisted decision-making.

The second mistake is buying generic e-learning courses. Article 4 explicitly requires that training take the context of use into account. A generic "what is AI" course does not satisfy that requirement.

How to prepare

1. Map AI usage across the company

Before training anyone, you need to know what AI tools the company actually uses. This is often more than you think — AI features are embedded in everyday software that employees do not even recognise as AI.
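One lightweight way to run this mapping is to capture each tool in a structured inventory as you survey the teams. A minimal sketch — the tool names and team names below are hypothetical examples, not prescribed by the Act:

```python
from dataclasses import dataclass

@dataclass
class AITool:
    name: str        # e.g. "ChatGPT"
    vendor: str
    used_by: list    # teams or roles that use it
    purpose: str     # what it does inside the workflow

# Hypothetical starting inventory; grow it as each team reports its tools.
inventory = [
    AITool("ChatGPT", "OpenAI", ["marketing", "support"], "drafting copy and replies"),
    AITool("CRM lead scoring", "CRM vendor", ["sales"], "ranking inbound leads"),
]

def teams_using_ai(inventory):
    """Return the set of teams that touch at least one AI tool."""
    return {team for tool in inventory for team in tool.used_by}
```

Even a spreadsheet with the same four columns works; the point is that the inventory, not memory, decides who needs which training.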

2. Assess the risk level

The EU AI Act classifies AI systems by risk level: unacceptable, high, limited and minimal. Most SME tools fall under limited or minimal risk, but some — particularly in HR, healthcare or finance — may be high risk. The training requirement scales with the risk.
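The four tiers can be attached directly to the inventory from step 1. A sketch under the assumption that you record one tier per tool — the example classifications are illustrative, and the real classification depends on the system's intended purpose under the Act:

```python
# The four risk tiers of the EU AI Act, ordered from lowest to highest.
RISK_TIERS = ["minimal", "limited", "high", "unacceptable"]

# Hypothetical examples of where common SME tools might land.
EXAMPLE_TIERS = {
    "spam filter": "minimal",
    "customer chatbot": "limited",   # transparency obligations apply
    "CV screening tool": "high",     # employment use is listed as high-risk
}

def needs_deeper_training(tool_tier: str) -> bool:
    """Training depth scales with risk: flag tools at 'high' or above."""
    return RISK_TIERS.index(tool_tier) >= RISK_TIERS.index("high")
```

Flagging the high-risk tools this way gives you a short list of where the most thorough, role-specific training belongs.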

3. Train in the real context

The most effective approach: train people on the specific AI tools they use, in the context of their actual workflows. Not abstract theory — practical understanding of how AI behaves in their daily work.

4. Document everything

Compliance is about proof. Keep records of what training was given, to whom, when and what it covered. If you are audited, you need to demonstrate that you took the obligation seriously.
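A training register can be as simple as a CSV with those four facts per session. A minimal sketch, assuming one row per employee per session — the field names and helper functions are illustrative, not mandated by the Act:

```python
import csv
import io
from datetime import date

# Hypothetical minimal register: who was trained, when, on what.
FIELDS = ["employee", "date", "topic", "tools_covered"]

def record_training(rows, employee, topic, tools):
    """Append one training record, stamped with today's date."""
    rows.append({
        "employee": employee,
        "date": date.today().isoformat(),
        "topic": topic,
        "tools_covered": ";".join(tools),
    })

def export_register(rows) -> str:
    """Render the register as CSV text, ready to archive or hand to an auditor."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Whatever format you choose, the test is the same: could you hand an auditor a dated record showing who was trained on which tools, and what the training covered?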

The opportunity behind the obligation

Most companies see Article 4 as a burden. It does not have to be. The same process of mapping AI usage and training your team also reveals automation opportunities you did not know existed. Companies that approach this proactively — not just for compliance, but as a genuine assessment of how AI fits into their operations — consistently find ways to save significant time and money.

Need help preparing?

D'One helps you map AI usage across your company, assess the risks and prepare your team for EU AI Act compliance — with practical training, not generic slides.

Get in touch