
The Complete Guide to EU AI Act Deployer Obligations

Everything organizations deploying AI in the EU need to know about their obligations under Articles 26, 27, and 50: what's required, when enforcement begins, and how to prepare.

TraceGov Team
February 20, 2026

The EU AI Act (Regulation (EU) 2024/1689) is the world's first comprehensive AI regulation, and it fundamentally changes how organizations must govern their use of artificial intelligence. While much attention has focused on AI providers, the companies building AI systems, the obligations for deployers are equally significant and often less understood.

Who Is a "Deployer" Under the EU AI Act?

A deployer is any natural or legal person, public authority, agency, or other body that uses an AI system under its authority, except where the use is part of a personal, non-professional activity. If your organization uses ChatGPT, GitHub Copilot, AI-powered analytics tools, or any other AI system in its operations, you are a deployer.

This is a critical distinction. You don't need to build AI to have obligations under the EU AI Act. You just need to use it.

Key Deployer Obligations (Articles 26, 27, and 50)

Article 26: Fundamental Use Requirements

Deployers of high-risk AI systems must:

  • Implement appropriate technical and organizational measures to ensure the AI system is used in accordance with its instructions for use
  • Assign human oversight to natural persons who have the necessary competence, training, and authority
  • Monitor the operation of the AI system on the basis of the instructions for use
  • Inform the provider, and where relevant the distributor and the market surveillance authority, when the AI system presents a risk, and suspend its use
  • Keep the logs automatically generated by the system, to the extent they are under the deployer's control, for at least six months (longer where other EU or national law requires it)
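The six-month log-retention floor can be turned into an automated check. A minimal sketch in Python, assuming logs are timestamped at creation; the function and constant names are illustrative, not part of any regulation or of TraceGov's API:

```python
from datetime import date, timedelta

# Retain automatically generated logs for at least six months,
# longer where other EU or national law requires it.
MIN_RETENTION = timedelta(days=183)  # ~six months

def may_delete(log_created: date, today: date,
               extra_retention: timedelta = timedelta(0)) -> bool:
    """True only once the log has aged past the legal retention floor."""
    return today - log_created >= MIN_RETENTION + extra_retention

# A log created on 1 January may not be deleted on 1 June...
assert not may_delete(date(2026, 1, 1), date(2026, 6, 1))
# ...but may be deleted by mid-July.
assert may_delete(date(2026, 1, 1), date(2026, 7, 15))
```

In a real pipeline this check would gate the deletion job, with `extra_retention` supplied per jurisdiction.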

Article 27: Fundamental Rights Impact Assessment

Article 27 applies to deployers that are bodies governed by public law or private operators providing public services, as well as deployers of certain Annex III systems such as credit scoring and life or health insurance risk pricing. Before putting a high-risk AI system into use, these deployers must conduct a fundamental rights impact assessment covering:

  • The deployer's processes in which the AI system will be used
  • The period of time and frequency of use
  • The categories of natural persons and groups likely to be affected
  • The specific risks of harm likely to impact those persons
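The four elements above map naturally onto a structured record that can be stored and versioned alongside the system. A hedged sketch, assuming a Python-based governance workflow; the class and field names are hypothetical, not drawn from the regulation's wording:

```python
from dataclasses import dataclass, field

@dataclass
class FundamentalRightsImpactAssessment:
    """Illustrative record mirroring the Article 27 elements."""
    deployer_processes: list[str]      # processes in which the system is used
    period_and_frequency: str          # intended period and frequency of use
    affected_groups: list[str]         # categories of persons likely affected
    specific_risks: list[str]          # risks of harm to those persons
    mitigation_measures: list[str] = field(default_factory=list)

fria = FundamentalRightsImpactAssessment(
    deployer_processes=["CV screening for engineering roles"],
    period_and_frequency="Continuous use during hiring cycles",
    affected_groups=["Job applicants"],
    specific_risks=["Indirect discrimination in shortlisting"],
)
```

Capturing the assessment as data, rather than a free-text document, makes it easy to diff across system versions and to export when a market surveillance authority asks for it.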

Transparency Obligations (Articles 26(11) and 50)

Under Article 26(11), deployers of Annex III high-risk AI systems that make decisions about natural persons, or assist in making them, must inform those persons that they are subject to the use of the system. Article 50 adds disclosure duties for certain other systems, such as emotion recognition, biometric categorisation, and deepfakes. In practice, clear communication should cover:

  • The fact that an AI system is being used
  • The purpose of the AI system
  • The contact details of the deployer
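The three disclosures above can be bundled into a reusable notice template so every AI-assisted touchpoint communicates consistently. An illustrative sketch; the function name and wording are assumptions, not statutory text:

```python
def ai_use_notice(purpose: str, deployer_contact: str) -> str:
    """Compose a plain-language notice covering the three disclosures."""
    return (
        "This process uses an AI system.\n"
        f"Purpose: {purpose}\n"
        f"Deployer contact: {deployer_contact}"
    )

print(ai_use_notice("Automated triage of support tickets",
                    "ai-governance@example.com"))
```

The same payload can feed a web banner, an email footer, or a call-center script, so the disclosure never drifts out of sync across channels.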

Record-Keeping and Documentation

The Act has no single record-keeping article for deployers; the duties are spread across Article 26 (log retention, monitoring) and Article 27 (impact assessments and their notification). In practice, deployers should maintain documentation of their AI governance, including:

  • Risk assessments and mitigation measures
  • Human oversight procedures
  • Evidence of compliance monitoring
  • Incident reports and corrective actions

Enforcement Timeline

The Act entered into force on August 1, 2024 and applies in stages: prohibited practices from February 2, 2025, general-purpose AI rules from August 2, 2025, and most remaining obligations, including those for Annex III high-risk systems, from August 2, 2026. High-risk systems embedded in Annex I regulated products follow on August 2, 2027. Penalties are tiered: up to €35 million or 7% of worldwide annual turnover for prohibited-practice violations, and up to €15 million or 3% for most other infringements, including breaches of deployer obligations.
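The fine cap is the higher of a fixed amount and a share of worldwide annual turnover, so exposure scales with company size. A quick sketch of the arithmetic, using the top-tier figures for illustration only:

```python
def max_fine(turnover_eur: float,
             fixed_cap: float = 35_000_000,   # top-tier fixed amount
             turnover_share: float = 0.07) -> float:
    """Cap is whichever is HIGHER: the fixed amount or the turnover share."""
    return max(fixed_cap, turnover_share * turnover_eur)

# 7% of €2bn is €140m, which exceeds the €35m floor.
assert max_fine(2_000_000_000) == 140_000_000.0
# For a €100m-turnover company, the €35m fixed cap dominates.
assert max_fine(100_000_000) == 35_000_000
```

Lower tiers work the same way with smaller figures (for example, €15 million or 3% for most deployer-obligation breaches).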

How TraceGov Helps

TraceGov was built specifically for deployer governance. Our TRACE Protocol scores every AI interaction across five dimensions — Transparency, Reasoning, Auditability, Compliance, and Explainability — giving you continuous governance signals, not just periodic compliance checks.

The Merkle-chain audit trail provides cryptographic proof of your governance activities, and the Governance Library maps your obligations across 50+ regulatory frameworks including the EU AI Act, GDPR, DORA, and more.
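Hash-chained audit trails of this kind are easy to reason about: each entry's digest commits to everything before it, so altering any record invalidates every later hash. A minimal sketch of the general technique in Python; this illustrates the idea, not TraceGov's actual format:

```python
import hashlib

def chain(entries: list[str]) -> list[str]:
    """Return a digest per entry, each committing to all prior entries."""
    digests, prev = [], "0" * 64  # genesis value
    for entry in entries:
        prev = hashlib.sha256((prev + entry).encode()).hexdigest()
        digests.append(prev)
    return digests

log = ["risk assessment approved", "oversight review", "incident filed"]
original = chain(log)
tampered = chain(["risk assessment approved", "OVERSIGHT SKIPPED",
                  "incident filed"])

# Tampering with entry 2 changes its digest and every digest after it.
assert original[0] == tampered[0]
assert original[1] != tampered[1] and original[2] != tampered[2]
```

Publishing only the latest digest is enough to let an auditor later verify that no historical record was silently rewritten.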

Start mapping your obligations today. Sign up for free — no credit card required.
