Before a high-risk AI system can be deployed in the EU, it must undergo a conformity assessment. This is the formal process by which a provider demonstrates that their AI system meets the EU AI Act's requirements. Until the conformity assessment is complete, you cannot legally place the system on the EU market.
This article explains what conformity assessment involves, who conducts it, and what you need to prepare.
What Conformity Assessment Determines
The conformity assessment verifies that a high-risk AI system:
- Has complete and accurate technical documentation
- Was developed with appropriate data governance
- Meets accuracy, robustness, and cybersecurity requirements
- Has transparency and human oversight measures in place
- Has been tested and validated before deployment
- Complies with any applicable harmonised standards
A successful conformity assessment results in CE marking (where applicable) and allows the provider to register the system in the EU AI database and deploy it in the EU.
Two Routes: Self-Assessment vs Third-Party Notified Body
Not all high-risk AI systems require third-party assessment. The AI Act distinguishes between two routes:
Route 1: Internal Self-Assessment
Available for most high-risk AI systems listed in Annex III (excluding those embedded in certain regulated products). The provider conducts the conformity assessment themselves by:
- Auditing the system against all AI Act requirements
- Preparing the full technical documentation package
- Completing an internal review checklist
- Signing a Declaration of Conformity
- Affixing CE marking (where applicable)
- Registering in the EU AI database
Self-assessment does not mean a light-touch process. The documentation requirements are substantial, and the provider is legally responsible for the accuracy of the assessment. Market surveillance authorities can audit self-assessed systems.
Route 2: Third-Party Notified Body
Required for AI systems that are:
- Safety components in products covered by the EU product safety legislation listed in Annex I (e.g., medical devices under MDR/IVDR, machinery), where that legislation already involves a notified body
- Biometric identification systems under Annex III point 1, where the provider has not fully applied harmonised standards or common specifications covering the requirements
Other Annex III high-risk systems, including those used by public authorities in law enforcement, migration, and justice contexts, follow the internal self-assessment route.
Notified bodies are independent conformity assessment organisations designated by EU member states. They review the technical documentation, may conduct additional testing, and issue a conformity certificate.
For most B2B SaaS companies, self-assessment is the applicable route unless the product is embedded in an MDR-regulated medical device.
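The routing decision above can be sketched as a small triage helper. This is a simplified illustration, not legal logic: the class and field names are assumptions, and edge cases (such as systems covered by several product regulations at once) are omitted.

```python
from dataclasses import dataclass


@dataclass
class AISystem:
    # Illustrative flags; field names are assumptions, not AI Act terminology.
    safety_component_of_annex1_product: bool   # e.g. embedded in an MDR/IVDR device
    biometric_identification: bool             # Annex III point 1
    harmonised_standards_fully_applied: bool   # relevant for biometric systems


def assessment_route(system: AISystem) -> str:
    """Pick the conformity assessment route (simplified sketch)."""
    if system.safety_component_of_annex1_product:
        return "notified body"
    if system.biometric_identification and not system.harmonised_standards_fully_applied:
        return "notified body"
    return "self-assessment"
```

A typical B2B SaaS screening tool, for example, would hit the final branch and self-assess.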
What You Need for Self-Assessment
Technical Documentation
The core of any conformity assessment. Must include:
- System description: Purpose, intended use cases, user population
- Risk classification: Justification for Annex III classification
- Architecture description: How the system works (not necessarily full model weights, but a functional description)
- Training methodology: How the model was developed, key design choices
- Dataset documentation: Training data sources, composition, known gaps and biases
- Performance metrics: Accuracy, reliability, false positive/negative rates, broken down by relevant subgroups
- Bias and fairness assessment: What testing was conducted, what was found, what mitigations are in place
- Human oversight description: How the system enables and supports human review
- Transparency measures: How outputs are communicated to users and affected individuals
- Security measures: How the system is protected against adversarial manipulation
- Post-market monitoring plan: How you will track performance after deployment
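One lightweight way to track progress on the package is a completeness check over the sections listed above. The section keys below are informal labels derived from this list, not an official schema:

```python
# Required sections of the technical documentation package (per the list above).
REQUIRED_SECTIONS = {
    "system_description",
    "risk_classification",
    "architecture_description",
    "training_methodology",
    "dataset_documentation",
    "performance_metrics",
    "bias_and_fairness_assessment",
    "human_oversight_description",
    "transparency_measures",
    "security_measures",
    "post_market_monitoring_plan",
}


def missing_sections(documentation: dict) -> set:
    """Return required sections that are absent or empty in a documentation draft."""
    return {s for s in REQUIRED_SECTIONS if not documentation.get(s)}
```

Running this against a draft during internal review gives an immediate gap list before sign-off.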
Declaration of Conformity
A formal document signed by the provider (or authorised representative) stating that the system meets all applicable AI Act requirements. This document:
- Identifies the system (name, version, intended purpose)
- References the specific AI Act requirements satisfied
- Names the provider and responsible officer
- Is kept on file and available to authorities on request
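The fields above map naturally onto a simple internal record. The structure and wording below are illustrative, not an official template:

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DeclarationOfConformity:
    """Illustrative record of the fields listed above; not an official template."""
    system_name: str
    system_version: str
    intended_purpose: str
    ai_act_requirements: list  # e.g. references to the articles satisfied
    provider_name: str
    responsible_officer: str
    signed_on: date

    def render(self) -> str:
        """Produce a plain-text declaration suitable for keeping on file."""
        reqs = ", ".join(self.ai_act_requirements)
        return (
            f"EU Declaration of Conformity\n"
            f"System: {self.system_name} v{self.system_version}\n"
            f"Intended purpose: {self.intended_purpose}\n"
            f"Requirements satisfied: {reqs}\n"
            f"Provider: {self.provider_name}\n"
            f"Responsible officer: {self.responsible_officer}\n"
            f"Signed: {self.signed_on.isoformat()}"
        )
```

Keeping the declaration as structured data makes it easy to regenerate and version alongside the technical file.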
CE Marking
Required for high-risk AI systems placed on the EU market, and applied after successful conformity assessment. For a standalone AI system, the CE marking signals AI Act compliance; where the system is part of a product already CE-marked under other EU legislation, a single CE marking indicates conformity with all applicable requirements.
Conformity Assessment for AI in Medical Devices
If your AI system is embedded in a CE-marked medical device (MDR or IVDR), the conformity assessment is conducted through the existing notified body process. The AI Act requirements are incorporated into the MDR/IVDR assessment:
- The notified body assesses both the medical device aspects and the AI aspects
- The technical documentation must satisfy both MDR/IVDR and AI Act requirements
- CE marking under MDR/IVDR covers the AI Act obligations for the device
The EU has committed to aligning the two frameworks to avoid duplicate assessments. In practice, there will be additional AI Act-specific sections in the technical file.
How Long Does Conformity Assessment Take?
For self-assessment: the time is driven by how long it takes to prepare complete technical documentation. For a well-prepared company with documented development processes:
- 4–8 weeks for documentation preparation
- 2–4 weeks for internal review
- 1–2 weeks for signing and registration
In total, plan for roughly 7–14 weeks.
For third-party notified body assessment: 3–9 months, depending on notified body capacity and system complexity. Notified body capacity for AI systems is limited in 2026, so early engagement is advisable.
What Triggers Re-Assessment
A completed conformity assessment is not permanent. You must re-assess or update the assessment when:
- The AI system is substantially modified (new training data, significant architecture changes, new use cases)
- New evidence emerges about performance or failure modes that was not known at assessment
- The system is deployed in a significantly different context than assessed
Minor updates (bug fixes, UI changes, minor model improvements) typically do not require full re-assessment but must be documented.
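The distinction between substantial and minor changes can be wired into release tooling as a simple triage step. The change-type labels below are assumptions for illustration, not AI Act terms:

```python
# Illustrative change categories derived from the text above.
SUBSTANTIAL = {"new_training_data", "architecture_change", "new_use_case"}
MINOR = {"bug_fix", "ui_change", "minor_model_improvement"}


def reassessment_action(change_type: str) -> str:
    """Map a change type to the follow-up suggested above (sketch only)."""
    if change_type in SUBSTANTIAL:
        return "re-assess"
    if change_type in MINOR:
        return "document only"
    return "review with compliance before deciding"
```

Unknown change types fall through to a human review rather than silently passing, which errs on the safe side.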