AI Act compliance checklist by role: DPO, HR, CIO, Executive Board

Interactive checklist to verify your AI Act compliance by role. DPO, HR Director, CIO, Executive Board and SME leaders.

AI Act compliance is a collective effort

The most common mistake when facing the AI Act: believing that a single department can carry compliance alone. The DPO cannot do everything. Neither can the CIO. AI Act compliance mobilises legal, technical, HR and strategic competencies — it can only succeed if each role assumes its specific responsibilities.

This chapter offers an interactive checklist by role to identify precisely what concerns you and track your progress.

Who does what in AI Act compliance

The DPO: cartographer and compliance guarantor

The DPO (or compliance officer) plays a central role. They must inventory all AI systems, classify them according to the risk levels defined by the regulation, and ensure that prohibited practices are not in use within the organisation. In the UK, this role interfaces closely with the ICO for data protection aspects and with the UK AI Safety Institute for broader AI governance.

They are also the link between the board, the CIO and the HR Director: it is their role to coordinate efforts and prepare evidence in view of a potential audit by national competent authorities — or, in the UK, by the ICO, FCA or other sector-specific regulators.

Article 26, paragraph 5, Regulation (EU) 2024/1689

Deployers of high-risk AI systems shall monitor the operation of the high-risk AI system on the basis of the instructions of use and […] shall inform the provider or distributor and the relevant market surveillance authority.

The HR Director: pillar of upskilling

Article 4 places AI training at the heart of compliance. The HR Director is best placed to deploy an upskilling programme tailored to each staff profile.

This goes beyond organising a webinar. You need to be able to document the training pathway, measure the outcomes and produce evidence that the level of AI literacy is proportionate to each person’s role.

📄AI Act Article 4: the AI training obligation explained

The CIO: technical vigilance and traceability

The CIO is responsible for mapping all deployed AI systems — including those adopted by business units without approval (shadow AI). For each system classified as high-risk, they must implement the technical documentation, ensure the traceability of algorithmic decisions and integrate the required human-oversight mechanisms.

The most common challenge: AI tools embedded in third-party software (CRM, ERP, office suites). The CIO must identify these “invisible” AI components and verify their risk level. For UK organisations, this includes AI features within NHS clinical systems, HMRC automated workflows and FCA-regulated platforms at institutions such as Barclays and HSBC.

📄Shadow IT in business: risks and solutions in 2026

The Executive Board: strategic vision and resource allocation

Senior leadership must approve the compliance strategy, allocate the budgets (particularly for Article 4 training) and appoint an AI Act lead. They track progress through a compliance dashboard and anticipate regulatory deadlines to avoid unpleasant surprises.

The main risk for the board: budget underestimation. AI Act compliance has a cost — but it is far lower than the potential penalties (up to EUR 35 million or 7% of worldwide turnover).

The SME leader: pragmatism and prioritisation

SMEs are not exempt from the AI Act. But the approach must be proportionate: identify whether the business uses AI systems (even indirectly through third-party software), train teams in good practices, verify supplier compliance and document usage.

For an SME, the absolute priority is Article 4: a documented training programme, even a simple one, is enough to demonstrate the organisation’s good faith.

Your interactive checklist

Select your role and tick the actions already completed. Your progress is saved locally in your browser — you can come back to it at any time.

Select your role:


How to coordinate efforts

The checklist by role is useful for clarifying individual responsibilities. But AI Act compliance requires active coordination between functions. Here are the essential mechanisms.

1. Create a cross-functional AI Act committee

Bring together the DPO, HR Director, CIO and a board member in a dedicated committee that meets monthly. The objective: share progress, unblock obstacles and arbitrate priorities. Without this committee, each function advances in a silo and blind spots multiply.

2. Centralise the AI system inventory

The DPO coordinates, the CIO provides the technical mapping, the HR Director identifies business usage. The inventory must be unique, shared and regularly updated. A spreadsheet may suffice to start — the important thing is that it exists and is comprehensive.
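As a rough sketch, a shared inventory can start as a handful of columns in a single file. The field names below are illustrative assumptions, not a format prescribed by the regulation:

```python
import csv
import io

# Illustrative columns for a shared AI system inventory.
# These field names are an assumption; the AI Act does not mandate a format.
FIELDS = ["system", "business_owner", "vendor", "risk_level", "last_reviewed"]

rows = [
    {"system": "CV screening module", "business_owner": "HR",
     "vendor": "third-party ATS", "risk_level": "high",
     "last_reviewed": "2025-01-15"},
    {"system": "Chat assistant", "business_owner": "Support",
     "vendor": "SaaS office suite", "risk_level": "limited",
     "last_reviewed": "2025-02-01"},
]

def to_csv(entries):
    """Serialise the inventory so every function works from the same file."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(entries)
    return buf.getvalue()

# The high-risk subset drives documentation and training priorities.
high_risk = [r["system"] for r in rows if r["risk_level"] == "high"]
print(high_risk)
```

Whether it lives in a spreadsheet or a script, the point is a single source of truth that the DPO, CIO and HR Director all update rather than maintaining parallel lists.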

3. Align training with risk classification

The training programme led by the HR Director must be calibrated according to the risk classification established by the DPO. Staff who use high-risk systems need more in-depth training than those who use a simple conversational assistant.

4. Prepare audit evidence now

Do not wait for an inspection to build your file. Every compliance action must be documented with date, owner and deliverable. The burden of proof rests with the organisation: it is up to you to demonstrate that you have acted.
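A minimal evidence log only needs the three elements named above: date, owner, deliverable. The structure below is a hypothetical sketch of such a log, not a mandated format:

```python
from dataclasses import dataclass, asdict
from datetime import date

# One record per compliance action, capturing date, owner and deliverable.
# Field names and example entries are illustrative assumptions.
@dataclass
class EvidenceEntry:
    action: str
    owner: str
    completed_on: date
    deliverable: str  # reference to the document produced

log = [
    EvidenceEntry("Article 4 training session, HR team", "HR Director",
                  date(2025, 3, 10), "training-register-2025Q1.pdf"),
    EvidenceEntry("AI system inventory review", "DPO",
                  date(2025, 3, 28), "ai-inventory-v3.xlsx"),
]

def audit_export(entries):
    """Flatten the log into plain records ready for an audit file."""
    return [asdict(e) for e in entries]

print(len(audit_export(log)))
```

Appending one entry per completed action, as it happens, is far cheaper than reconstructing a paper trail under inspection deadlines.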

📄AI governance in business: 5 steps to get structured

The traceability imperative

The AI Act does not demand perfection — it demands proof of effort. A documented training programme, an up-to-date AI system inventory, compliance committee minutes: these elements form your first line of defence in the event of an inspection. A total absence of documentation, on the other hand, is a clear-cut failing.

📄AI usage charter: the 8 essential points