Conformity Assessment Under the EU AI Act
Guide to conformity assessment procedures for high-risk AI systems — internal control, third-party assessment, CE marking, and EU declaration of conformity explained.
Before a high-risk AI system can be placed on the EU market or put into service, it must undergo a conformity assessment. This is the process by which the provider demonstrates — and a competent authority can verify — that the AI system meets all applicable requirements of the EU AI Act (Regulation 2024/1689). Without a completed conformity assessment, a provider cannot issue an EU declaration of conformity, apply the CE marking, or legally make the system available in the European Union.
Conformity assessment is not a new concept in EU product regulation. It has been used for decades in sectors like medical devices, machinery, and personal protective equipment. What is new is its application to AI systems. The EU AI Act extends this well-established framework to artificial intelligence, adapting it to account for the specific risks and characteristics of AI technology.
This guide explains the conformity assessment procedures under the EU AI Act, when third-party assessment is required, what happens during the process, and how to prepare for it.
The Legal Framework: Articles 43 and 44
Article 43 of the EU AI Act establishes the conformity assessment procedures for high-risk AI systems. It provides two pathways:
- Conformity assessment based on internal control (Annex VI): The provider conducts the assessment themselves, without third-party involvement.
- Conformity assessment involving a notified body (Annex VII): A third-party organization designated by an EU Member State reviews the AI system and issues a certificate.
Which pathway applies depends on whether the AI system falls under existing EU harmonization legislation (such as the Medical Devices Regulation, the Machinery Regulation, or the Radio Equipment Directive) and whether the specific high-risk category requires third-party involvement.
Article 44 establishes the requirements for certificates issued by notified bodies, including their maximum validity period (five years for AI systems covered by Annex I, four years for those covered by Annex III) and the conditions under which they may be suspended or withdrawn.
When Is Third-Party Assessment Required?
This is one of the most consequential questions for providers of high-risk AI systems. The answer depends on the classification of your system.
Internal Control (Most High-Risk AI Systems)
For the majority of high-risk AI systems listed in Annex III of the regulation — including systems used in employment, education, credit scoring, law enforcement, migration, and access to essential services — the provider may conduct the conformity assessment through internal control under Annex VI. No third-party notified body is required.
This might seem surprising given the potential impact of these systems, but the regulation's logic is deliberate: for standalone AI systems (those not embedded in products regulated under Annex I), the primary accountability mechanism is internal compliance, backed by market surveillance.
Third-Party Assessment (Specific Cases)
Third-party conformity assessment under Annex VII is required in two main scenarios:
1. AI systems covered by existing EU harmonization legislation listed in Annex I, Section A. When a high-risk AI system is a safety component of a product (or is itself a product) covered by legislation listed in Section A of Annex I, the conformity assessment follows the procedures already established by that legislation. For example, an AI system embedded in a medical device follows the Medical Devices Regulation's conformity assessment procedures, which typically involve a notified body.
2. Biometric systems. High-risk AI systems listed in Annex III, point 1, covering biometric identification and categorization of natural persons, must follow the third-party conformity assessment procedure set out in Annex VII where the provider has not applied harmonized standards or common specifications in full (Article 43(1)). This reflects the particularly sensitive nature of biometric processing and the severe consequences of errors.
Even where internal control is permitted, providers must still apply harmonized standards or common specifications where available. The choice of internal control does not reduce the substantive compliance requirements — it only changes who verifies compliance.
Harmonized Standards and the Internal Control Path
Article 43(1) plays a critical role here. If harmonized standards (or common specifications) covering the relevant requirements have been published, and the provider has applied them in full, the provider of an Annex III, point 1 system may choose the internal control procedure instead of third-party assessment.
However, as of early 2026, the development of harmonized standards for the EU AI Act is still underway. CEN/CENELEC Joint Technical Committee 21 (JTC 21) is developing these standards, but most are not yet finalized or published in the Official Journal. Until harmonized standards are available, providers face greater uncertainty about whether their internal assessment will be accepted by market surveillance authorities.
The Internal Control Procedure (Annex VI)
When internal control applies, the provider conducts the conformity assessment by verifying that the quality management system and the technical documentation meet the requirements of the regulation. The procedure includes the following steps:
Step 1: Verify the Quality Management System
The provider must verify that its quality management system complies with Article 17. This system must cover:
- A strategy for regulatory compliance, including compliance with conformity assessment procedures and management of modifications to the high-risk AI system
- Techniques, procedures, and systematic actions for the design, development, and examination of the AI system
- Techniques, procedures, and systematic actions for testing before, during, and after development
- Technical specifications, including standards, to be applied and, where relevant harmonized standards are not applied in full, the means used to ensure compliance
- Systems and procedures for data management, including data acquisition, collection, analysis, labeling, storage, filtration, mining, aggregation, retention, and any other operation regarding data performed before and for the purpose of placing the system on the market
- The risk management system described in Article 9
- Post-market monitoring, including a post-market monitoring plan
- Procedures related to reporting of serious incidents
- Communication with competent authorities, notified bodies, and other operators
- Systems and procedures for record-keeping of all relevant documentation and information
- Resource management, including security-of-supply related measures
- An accountability framework
Step 2: Verify the Technical Documentation
The provider must verify that the technical documentation complies with the requirements of Article 11 and Annex IV. This includes confirming that the documentation is complete, current, and covers all elements specified in Annex IV — from the general system description through risk management, data governance, testing and validation, to the post-market monitoring plan.
Step 3: Verify Compliance with All Requirements
The provider must assess and confirm that the AI system complies with all requirements in Chapter III, Section 2:
- Risk management (Article 9)
- Data and data governance (Article 10)
- Technical documentation (Article 11)
- Record-keeping (Article 12)
- Transparency and provision of information (Article 13)
- Human oversight (Article 14)
- Accuracy, robustness, and cybersecurity (Article 15)
Step 4: Issue the EU Declaration of Conformity
Once the provider is satisfied that all requirements are met, it draws up the EU declaration of conformity in accordance with Article 47 and affixes the CE marking in accordance with Article 48.
Internal control requires rigorous self-assessment. "We checked our own work" will not satisfy regulators if the assessment was superficial. Document every verification step, maintain evidence, and consider engaging external expertise to support the process, even if a notified body is not legally required.
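The four verification steps above can be sketched as a simple evidence-tracking routine. This is a hypothetical illustration only: the element names and the `internal_control` helper are our own shorthand for the Article 17 and Chapter III, Section 2 requirements, not terms defined by the regulation.

```python
# Illustrative sketch of the Annex VI internal-control workflow.
# All names below are assumptions for this example, not regulatory terms.

QMS_ELEMENTS = [                       # Step 1: Article 17 QMS elements
    "regulatory compliance strategy",
    "design and development procedures",
    "testing procedures",
    "technical specifications",
    "data management",
    "risk management system (Art. 9)",
    "post-market monitoring",
    "serious incident reporting",
    "communication with authorities",
    "record-keeping",
    "resource management",
    "accountability framework",
]

SECTION_2_REQUIREMENTS = {             # Step 3: Chapter III, Section 2
    "Art. 9": "risk management",
    "Art. 10": "data and data governance",
    "Art. 11": "technical documentation",
    "Art. 12": "record-keeping",
    "Art. 13": "transparency",
    "Art. 14": "human oversight",
    "Art. 15": "accuracy, robustness, cybersecurity",
}

def internal_control(evidence: dict) -> bool:
    """Return True only if every verification step has documented evidence."""
    steps = (
        [f"QMS: {e}" for e in QMS_ELEMENTS]                         # Step 1
        + ["technical documentation (Annex IV)"]                    # Step 2
        + [f"{a}: {n}" for a, n in SECTION_2_REQUIREMENTS.items()]  # Step 3
    )
    missing = [s for s in steps if not evidence.get(s)]
    for s in missing:
        print(f"missing evidence: {s}")
    # Step 4 (declaration of conformity + CE marking) proceeds only if True
    return not missing
```

The point of the sketch is the gate at the end: the declaration of conformity is only drawn up once every preceding step has documented evidence behind it.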
The Third-Party Procedure (Annex VII)
When third-party assessment is required, a notified body examines the AI system and its provider's quality management system. The procedure under Annex VII has two main components.
Quality Management System Assessment
The notified body evaluates the provider's quality management system to determine whether it ensures compliance with the requirements of the regulation. This assessment covers the same elements as the internal control procedure, but with external verification.
The notified body must be given access to the provider's premises and must be able to carry out audits. Its assessment decision must be communicated to the provider and must include the conclusions of the audit, any conditions attached to the certification, and any corrective actions required.
Technical Documentation Assessment
The notified body also examines the technical documentation to verify that the AI system complies with the relevant requirements. This may include:
- Reviewing the risk management documentation
- Examining the data governance measures
- Evaluating the testing and validation results
- Verifying the human oversight mechanisms
- Assessing the accuracy, robustness, and cybersecurity measures
Certification
If the notified body concludes that the quality management system and the AI system comply with the regulation, it issues a certificate. The certificate is valid for the period it indicates, up to a maximum of five years for AI systems covered by Annex I and four years for those covered by Annex III, and can be renewed. The notified body may impose conditions on the certificate or require corrective actions.
The certificate can be suspended or withdrawn if the notified body finds that the provider no longer meets the requirements, or if corrective actions are not implemented within the prescribed timeframe.
The EU Declaration of Conformity
Article 47 sets out the requirements for the EU declaration of conformity. This is a formal document in which the provider states that the AI system meets all applicable requirements. It must include:
- The name and type of the AI system, and any additional unambiguous reference allowing identification
- The name and address of the provider, and where applicable, their authorized representative
- A statement that the EU declaration of conformity is issued under the sole responsibility of the provider
- A statement that the AI system is in conformity with the regulation, and where applicable, with any other relevant Union legislation
- References to any harmonized standards or common specifications used
- Where applicable, the name and identification number of the notified body, and a reference to the certificate issued
- The place and date of issue, and the identity and signature of the person authorized to sign on behalf of the provider
The declaration must be kept up to date and made available to market surveillance authorities on request, for a period of 10 years after the system has been placed on the market.
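As a practical aid, the required content of the declaration can be modeled as a structured record with a completeness check. The class and field names below are illustrative assumptions for this sketch; Article 47 prescribes the content of the declaration, not a data schema.

```python
# Hypothetical model of the Article 47 declaration's required content.
# Field names are our own; the regulation does not define a schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EUDeclarationOfConformity:
    system_name: str
    system_type: str
    provider_name: str
    provider_address: str
    place_of_issue: str
    date_of_issue: str
    signatory: str
    harmonised_standards: List[str] = field(default_factory=list)
    notified_body: Optional[str] = None        # only where Annex VII applied
    certificate_reference: Optional[str] = None
    authorised_representative: Optional[str] = None

    def missing_fields(self) -> List[str]:
        """Return the mandatory fields that are still empty."""
        mandatory = ["system_name", "system_type", "provider_name",
                     "provider_address", "place_of_issue", "date_of_issue",
                     "signatory"]
        return [f for f in mandatory if not getattr(self, f)]
```

Drafting the declaration as structured data like this makes the "keep it up to date" obligation easier to operationalize: regenerate and re-sign the document whenever any field changes.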
CE Marking
Article 48 requires that the CE marking be affixed to the high-risk AI system, or where that is not possible, to its packaging or accompanying documentation. The CE marking indicates that the AI system complies with the requirements of the regulation and has undergone the applicable conformity assessment procedure.
The CE marking must be visible, legible, and indelible. For AI systems provided digitally, a digital CE marking may be used, provided it is easily accessible through the interface.
The CE marking is already familiar in many product sectors. For AI systems, it carries the same legal significance: it is the provider's declaration that the system conforms to all applicable EU requirements. Misuse of the CE marking — affixing it to a non-conforming system — is subject to penalties.
Notified Bodies: Who They Are and How They Work
Notified bodies are organizations designated by EU Member States to carry out third-party conformity assessments. Articles 28 to 39 govern notifying authorities and notified bodies; Article 31 in particular sets out the requirements a conformity assessment body must meet in order to be notified.
Requirements for Notified Bodies
To be designated, a notified body must demonstrate:
- Independence and impartiality — it must not have conflicts of interest with the providers it assesses
- Technical competence — its staff must have expertise in AI technology and the requirements of the regulation
- Adequate resources — it must have the financial and organizational resources to carry out assessments
- Confidentiality — it must protect commercially sensitive information obtained during assessments
The Current Landscape
As of early 2026, EU Member States are in the process of designating notified bodies for the AI Act. Designation takes time: candidate bodies must demonstrate that they meet the requirements, typically through accreditation by their national accreditation body (the national member of the European co-operation for Accreditation, EA). Organizations expecting to need third-party assessment should monitor the development of notified body capacity and begin engaging early.
Preparing for Conformity Assessment: A Practical Checklist
Whether you are conducting internal control or preparing for third-party assessment, the following steps will help you prepare:
1. Complete your technical documentation. Ensure your Annex IV documentation is complete, current, and organized. This is the single most important preparation step — both internal control and third-party assessment depend heavily on documentation quality.
2. Implement a quality management system. If you do not already have one, establish a quality management system that meets the requirements of Article 17. If you have an existing QMS (for example, ISO 9001 certified), assess whether it covers all the elements required by the AI Act and supplement it where necessary.
3. Conduct risk management. Ensure your risk management system under Article 9 is fully implemented and documented. This includes identification of risks, evaluation of risk measures, and documentation of residual risks.
4. Test and validate. Conduct thorough testing and validation of your AI system and document the results. Ensure your testing covers accuracy, robustness, cybersecurity, and bias detection.
5. Prepare your declaration of conformity. Draft the EU declaration of conformity in advance, so that it is ready to be issued as soon as the conformity assessment is complete.
6. Engage with harmonized standards. Monitor the development of harmonized standards under the AI Act and apply them where available. Using harmonized standards provides a presumption of conformity with the requirements they cover.
7. Plan for post-market monitoring. Your post-market monitoring plan must be in place before the conformity assessment is completed. This plan is both a requirement of the technical documentation and an element that notified bodies will evaluate.
Modifications and Re-Assessment
Article 43(4) addresses what happens when a high-risk AI system that has already undergone conformity assessment is substantially modified. A substantial modification requires a new conformity assessment. The regulation defines a substantial modification as a change that affects the compliance of the system with the requirements of Chapter III, Section 2, or that results in a modification to the intended purpose for which the system has been assessed. For AI systems that continue to learn after being placed on the market, Article 43(4) adds a carve-out: changes to the system and its performance that the provider pre-determined at the moment of the initial conformity assessment, and documented in the technical documentation, do not constitute a substantial modification.
This means providers must establish clear change management procedures to evaluate whether modifications to their AI system trigger a re-assessment obligation. Minor updates — such as security patches or performance improvements that do not change the system's risk profile — generally do not require re-assessment. But changes to training data, model architecture, or intended purpose likely will.
The distinction between substantial and non-substantial modifications is fact-specific. When in doubt, err on the side of treating a modification as substantial. Placing a non-conforming system on the market carries far greater consequences than conducting an additional assessment.
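One way to operationalize this change-management advice is a triage helper that sorts proposed modifications into "reassess", "document", or "escalate" buckets. The trigger categories below are illustrative assumptions drawn from the examples in the text; the legal classification always requires case-by-case analysis.

```python
# Hedged change-triage sketch. Category names are our own assumptions;
# the substantial/non-substantial boundary is fact-specific under the Act.

SUBSTANTIAL_TRIGGERS = {
    "intended_purpose",    # change to the purpose the system was assessed for
    "model_architecture",  # change to the underlying model design
    "training_data",       # change to the data the model was trained on
}
NON_SUBSTANTIAL_HINTS = {
    "security_patch",      # no change to the system's risk profile
    "ui_change",
}

def needs_reassessment(change_types: set) -> str:
    """Classify a set of change categories for conformity re-assessment."""
    if change_types & SUBSTANTIAL_TRIGGERS:
        return "substantial: new conformity assessment required"
    if change_types <= NON_SUBSTANTIAL_HINTS:
        return "likely non-substantial: document the rationale"
    # Anything unrecognized gets escalated, erring toward substantial.
    return "unclear: escalate for legal review (err toward substantial)"
```

A helper like this does not replace legal judgment; its value is forcing every change through an explicit, logged decision point.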
Penalties
Placing a high-risk AI system on the EU market without completing the required conformity assessment is a serious violation. Administrative fines for non-compliance with the conformity assessment obligations can reach up to 15 million EUR or 3% of total worldwide annual turnover for the preceding financial year, whichever is higher. In addition, market surveillance authorities can order the withdrawal or recall of non-conforming systems.
Conclusion
Conformity assessment is the gateway to the EU market for high-risk AI systems. Whether your system follows the internal control pathway or requires third-party assessment, the underlying work is the same: comprehensive technical documentation, a robust quality management system, thorough risk management, and rigorous testing and validation.
Organizations should not view conformity assessment as a one-time hurdle. It is an ongoing obligation — substantial modifications trigger re-assessment, certificates have limited validity, and market surveillance authorities can request documentation and evidence of compliance at any time. Building the systems and processes to support conformity assessment on an ongoing basis is an investment in sustainable market access.