Advanced

AI Act Relationship & Dual Marking

A practical guide to the AI Act relationship and dual marking for AI conformity-assessment practitioners.

What This Lesson Covers

AI Act Relationship & Dual Marking is a key topic within MDR/IVDR for AI Medical Devices. In this lesson you will learn the underlying conformity-assessment discipline, the controlling regulations and standards, how to apply the procedures to real AI systems, and the open questions practitioners are actively working through. By the end you will be able to engage with the AI Act relationship and dual marking in real AI conformity-assessment work with confidence.

This lesson belongs to the Sectoral Conformity Regimes category of the AI Conformity Assessment track. AI conformity assessment sits at the intersection of product regulation, accredited certification, sectoral regulation, and AI engineering. Understanding the sectoral conformity regimes (medical devices, automotive, aviation, financial services) that operate alongside the AI Act is what lets you build a conformity programme that survives notified-body assessment, certification body audit, and market surveillance scrutiny.

Why It Matters

This topic covers bringing AI medical devices into conformity with EU MDR (Regulation (EU) 2017/745) and IVDR (Regulation (EU) 2017/746). You will learn the classification rules under MDR Annex VIII (where AI sits), mapping to the general safety and performance requirements (GSPRs), clinical evaluation and post-market clinical follow-up (PMCF), the relationship between MDR/IVDR and the AI Act (high-risk AI in medical devices), and the dual marking and combined notified-body assessment that is now required.
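The dual-marking trigger can be sketched as a decision rule. This is a deliberate simplification of the AI Act's Article 6(1) route (an AI system that is, or is a safety component of, a product covered by sectoral legislation such as MDR/IVDR, where that product requires third-party conformity assessment, is treated as high-risk); the function name and parameters are illustrative, not regulatory text:

```python
# Hedged sketch of the dual-marking trigger. Simplification of the
# AI Act Art. 6(1) route: AI system + MDR/IVDR coverage + mandatory
# notified-body assessment => high-risk AI, combined assessment applies.
def needs_dual_marking(is_ai_system: bool,
                       covered_by_mdr_ivdr: bool,
                       requires_nb_assessment: bool) -> bool:
    """True when the combined MDR/IVDR + AI Act assessment applies."""
    return is_ai_system and covered_by_mdr_ivdr and requires_nb_assessment

# A Class IIa AI diagnostic aid (NB assessment required under MDR):
print(needs_dual_marking(True, True, True))   # True -> dual marking
# A self-certified Class I device (no NB involvement):
print(needs_dual_marking(True, True, False))  # False under this simplification
```

Real classification turns on the device's intended purpose and the applicable Annex VIII rules, so treat this as a first-pass screen, not a determination.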

The reason the AI Act relationship and dual marking deserve dedicated attention is that AI conformity assessment is a young, rapidly maturing discipline. The EU AI Act conformity-assessment provisions become enforceable in stages between 2025 and 2027, the first ISO/IEC 42001 certificates were issued in 2024, and CEN-CENELEC JTC 21 is producing harmonised standards on a rolling basis. Practitioners who can reason from first principles will navigate the next standard or the next interpretation far more effectively than those who only know today's rules.

💡
Mental model: Treat every conformity assessment as a chain — classification, criteria, route, evidence, declaration, marking, registration, post-market. Each link must be defensible to a sophisticated reviewer (notified body, accreditation body, market surveillance authority, court, peer reviewer). Master the chain and you can run any AI conformity-assessment regime that lands tomorrow.
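The chain in the mental model above can be expressed as a small checklist structure; a minimal sketch (the status-dictionary shape and function name are assumptions for illustration):

```python
# The conformity chain from the mental model: each link must be
# defensible before the next one matters.
CONFORMITY_CHAIN = [
    "classification", "criteria", "route", "evidence",
    "declaration", "marking", "registration", "post-market",
]

def weakest_link(status):
    """Return the first chain link not yet defensible, else None.

    `status` maps link name -> bool (defensible to a sophisticated
    reviewer). Missing links are treated as not yet defensible.
    """
    for link in CONFORMITY_CHAIN:
        if not status.get(link, False):
            return link
    return None

print(weakest_link({"classification": True}))  # -> criteria
```

The point of the ordering is that effort spent downstream (say, on marking) is wasted while an upstream link (say, criteria) is still open.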

How It Works in Practice

Below is a practical conformity-assessment pattern for the AI Act relationship and dual marking. Read through it once, then think about how you would apply it to a real high-risk AI system in your portfolio.

# Sectoral conformity pattern
SECTORAL_STEPS = [
    'Confirm the sectoral regime applies (FDA, EASA, UNECE, MDR, SR 11-7)',
    'Map the sectoral lifecycle to the AI Act lifecycle',
    'Build the joint evidence package (no duplicate work)',
    'Engage the sectoral assessor (FDA reviewer, NB, supervisor)',
    'Issue sectoral approval / clearance / supervisory non-objection',
    'Operate sectoral post-market obligations alongside AI Act PMS',
]
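The step list can be driven as a simple progress tracker; a minimal sketch (the list is redefined here so the snippet runs standalone, and the tracker shape is an assumption, not part of any regime):

```python
# Standalone copy of the sectoral step list from above.
SECTORAL_STEPS = [
    'Confirm the sectoral regime applies (FDA, EASA, UNECE, MDR, SR 11-7)',
    'Map the sectoral lifecycle to the AI Act lifecycle',
    'Build the joint evidence package (no duplicate work)',
    'Engage the sectoral assessor (FDA reviewer, NB, supervisor)',
    'Issue sectoral approval / clearance / supervisory non-objection',
    'Operate sectoral post-market obligations alongside AI Act PMS',
]

# Mark steps done in order and surface the next outstanding one.
progress = {step: False for step in SECTORAL_STEPS}
progress[SECTORAL_STEPS[0]] = True  # regime applicability confirmed

next_step = next((s for s, done in progress.items() if not done), None)
print(next_step)  # -> Map the sectoral lifecycle to the AI Act lifecycle
```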

Step-by-Step Analytical Approach

  1. Establish the criteria — What regime, standard, or scheme will this assessment measure against (EU AI Act Annex III + harmonised standards, ISO/IEC 42001, FDA SaMD + GMLP, sectoral regulation)? Document the criteria up front; conformity without explicit criteria is opinion, not assurance.
  2. Confirm the assessment route — Map the regime to the route (first-party, second-party, notified-body / certification-body assessment). For high-risk AI specifically, prefer the route that is mandated and defensible rather than the cheapest.
  3. Plan the evidence — Map every essential requirement to the evidence that will demonstrate compliance (technical documentation section, eval report, model card, datasheet, audit log). Use the MDR/IVDR for AI Medical Devices pattern from this topic.
  4. Collect sufficient appropriate evidence — Multiple sources, time-stamped, hash-pinned, secured. The bar is what a sophisticated reviewer (notified body, certification body, regulator) would expect to support the conformity conclusion.
  5. Form the declaration — Compare evidence to criteria; identify nonconformities; resolve before declaration; produce the DoC, certificate, or attestation; affix marking; register where required.
  6. Operate post-market — Maintain monitoring, handle incidents, manage substantial modifications, refresh evidence at every release, and prepare for surveillance.
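Step 3 (plan the evidence) is the one most often left implicit, so here is a minimal sketch of a requirement-to-evidence map. The requirement labels and artefact names are illustrative assumptions, not quotations from any regulation:

```python
# Hedged sketch of step 3: map each essential requirement to the
# artefacts expected to demonstrate it. Labels are illustrative only.
EVIDENCE_PLAN = {
    "risk management (AI Act Art. 9 / MDR GSPR risk requirements)": [
        "risk management file", "FMEA records"],
    "data and data governance (AI Act Art. 10)": [
        "dataset datasheet", "data quality report"],
    "accuracy and robustness (AI Act Art. 15)": [
        "eval report", "model card"],
}

def uncovered(plan):
    """Requirements with no evidence planned yet - gaps to close
    before the assessor sees the file."""
    return [req for req, artefacts in plan.items() if not artefacts]

print(uncovered(EVIDENCE_PLAN))  # -> []
```

A gap reported by `uncovered` at planning time is far cheaper than the same gap raised as a nonconformity at step 5.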

When This Topic Applies (and When It Does Not)

AI Act Relationship & Dual Marking applies when:

  • You are placing or putting into service a high-risk AI system in the EU and need to demonstrate AI Act conformity
  • You are pursuing or maintaining ISO/IEC 42001 certification
  • You operate in a regulated sector (medical devices, automotive, aviation, financial services) and need sectoral approval for an AI-enabled product
  • You are responding to a customer or procurement requirement that asks for a conformity declaration, certificate, or attestation
  • You are consuming third-party conformity artefacts (DoCs, certificates, attestations, technical documentation) and need to assess their quality

It does not apply (or applies lightly) when:

  • The AI system is genuinely outside any conformity-assessment regime (most internal-use, low-risk AI)
  • The work is design-stage advisory rather than independent conformity assessment
  • The AI system is a research prototype not placed on market

Common pitfall: Practitioners often treat conformity assessment as a documentation exercise — producing the technical file, signing the DoC, and considering the work done. Conformity assessment is a living obligation: post-market monitoring, serious incident reporting, substantial modification management, and surveillance audits will catch you if the underlying engineering does not actually meet the essential requirements. Build the evidence as you build the AI; do not retro-fit it for the assessor.

Practitioner Checklist

  • Are the criteria for this conformity assessment explicit, written, and agreed with the relevant assessor?
  • Is the assessment route correct (first-party / NB / certification body) for the regime and risk level?
  • Is evidence preserved with integrity (timestamp, hash, immutable storage, 10-year retention)?
  • Are nonconformities tracked and closed with verified effectiveness, not self-attestation?
  • Do you have a written post-market monitoring plan, with named owners and feedback channels?
  • Is there a substantial-modification process that triggers re-assessment when needed?
  • Are surveillance audits and recertification on the calendar with named accountable owners?
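The evidence-integrity item in the checklist (timestamp, hash, immutable storage) can be sketched with the standard library; a minimal illustration, with storage and retention assumed to be handled elsewhere:

```python
# Minimal sketch of evidence pinning: a SHA-256 digest plus a UTC
# timestamp lets a later reviewer confirm an artefact has not changed.
import hashlib
from datetime import datetime, timezone

def pin_evidence(content: bytes) -> dict:
    """Return an integrity record for one evidence artefact."""
    return {
        "sha256": hashlib.sha256(content).hexdigest(),
        "pinned_at": datetime.now(timezone.utc).isoformat(),
    }

record = pin_evidence(b"eval report v1.2")
print(record["sha256"][:12])  # short prefix for display
```

In practice the record would also carry the artefact's identifier and be written to append-only storage; the function shown here only covers the hash-and-timestamp core.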

Disclaimer

This educational content is provided for general informational purposes only. It does not constitute legal, regulatory, or professional advice; it does not create a professional engagement; and it should not be relied on for any specific conformity assessment, certification, or compliance matter. AI conformity-assessment regimes vary by jurisdiction and change rapidly. Consult qualified regulatory affairs, quality, and legal professionals for advice on your specific situation.

Next Steps

The other lessons in MDR/IVDR for AI Medical Devices build directly on this one. Once you are comfortable with the AI Act relationship and dual marking, the natural next step is to combine it with the patterns in the surrounding lessons — that is where doctrinal mastery turns into a working conformity programme. AI conformity assessment is most useful as an integrated discipline covering classification, route selection, technical documentation, declaration, marking, registration, post-market monitoring, and substantial-modification handling.