
EU AI Regulatory Sandboxes: How to Apply

5 min read · Updated 2 May 2026

The EU AI Act mandates that member states establish AI regulatory sandboxes — controlled environments where companies can develop, test, and validate AI systems under regulatory supervision while working toward full compliance. For startups and scale-ups building novel AI systems, sandboxes offer a pathway to market that is faster, more certain, and more collaborative than navigating compliance alone.

This article explains what sandboxes offer, who qualifies, and how to apply.


What Is an AI Regulatory Sandbox?

A regulatory sandbox is a supervised programme run by a national regulator (or consortium of regulators) that allows companies to:

  • Develop and test AI systems with direct input from the regulator
  • Receive informal guidance on how the system is likely to be classified and what compliance steps are required
  • Test the system in a controlled environment without full enforcement exposure
  • Identify and address compliance gaps before market launch

The sandbox does not suspend compliance requirements — it provides a structured pathway to meet them, with regulatory guidance along the way. Companies participating in sandboxes still need to comply with the AI Act; they get help and protected space to do it.


Why the EU AI Act Created Sandboxes

The AI Act acknowledges that it is difficult for innovative AI companies — especially startups — to interpret complex regulatory requirements for genuinely novel systems. The Act's risk framework (prohibited practices, high-risk systems, and limited-risk transparency obligations) was designed around categories of AI that already exist. But AI development moves faster than legislation.

Sandboxes provide:

  • Regulatory certainty — companies know what compliance looks like before investing in it
  • Faster innovation — no waiting for guidance documents that may take years to arrive
  • Reduced compliance risk — participants acting in good faith within the sandbox framework are shielded from enforcement fines during the process
  • Better regulation — regulators learn from real systems, improving future guidance

Who Can Apply

Article 57 of the AI Act sets eligibility criteria. Sandboxes are open to:

  • SMEs and startups as a priority (the Act explicitly requires regulators to give priority to smaller companies)
  • Providers of high-risk AI systems (Annex III)
  • Providers of GPAI models with systemic risk
  • Any company developing novel AI systems where classification is unclear

Companies from outside the EU can participate in sandboxes if they are planning EU market access and have an EU-based partner or representative.


What You Need to Apply

Applications to sandboxes typically require:

System description:

  • What the AI system does and its intended purpose
  • The deployment context and affected individuals
  • Initial risk self-assessment (even if incomplete)
  • What compliance challenge you are seeking guidance on

Company information:

  • Legal entity details
  • Team background and relevant expertise
  • Funding status (relevant for SME priority access)

Development status:

  • Current development stage (concept, prototype, MVP)
  • Expected timeline to market
  • What validation or testing you have already conducted

Applications are assessed by the national authority on: innovation potential, public benefit, genuine regulatory uncertainty, and whether the sandbox will help the company reach compliance faster.


Member State Sandbox Programmes

Sandbox programmes are being established at the national level. As of 2026, the most active programmes include:

Spain — Spain's AESIA (Agencia Española de Supervisión de la Inteligencia Artificial) launched one of the EU's first AI regulatory sandboxes. Strong focus on generative AI and healthcare AI applications.

Germany — The German AI sandbox programme is in development, coordinated between BNetzA and BaFin for financial and telecoms AI. Particular interest in DORA-relevant fintech AI.

Netherlands — The Dutch DPA (AP) has sandbox provisions for AI systems with significant data protection implications, aligning AI Act and GDPR compliance.

Denmark and Sweden — Nordic sandboxes are operational, with particular interest in public sector AI and AI in regulated professions.

EU-Level — The European AI Office can run its own sandbox programmes for GPAI models and cross-border AI systems. The Commission coordinates sandboxes with specific focus on AI in health, climate, and transport.


What Participating in a Sandbox Gives You

Legal protection during the sandbox period: Regulatory enforcement actions for AI Act compliance are generally paused for sandbox participants who are operating in good faith within the sandbox framework.

Informal regulatory guidance: Direct access to the regulator's view on your classification, documentation approach, and compliance pathway. This is not a formal ruling — it is guidance — but it significantly reduces compliance uncertainty.

CE marking pathway clarity: For systems requiring conformity assessment, sandbox participation can clarify whether self-assessment or notified body assessment applies, and what the notified body will look for.

Credibility with customers and investors: Sandbox participation signals that a company's AI development is being conducted with regulatory awareness — a meaningful differentiator in enterprise sales.


Limitations

Sandboxes are not:

  • A way to delay compliance — you still need to comply before market deployment
  • A guarantee of approval — the sandbox informs compliance, it does not pre-approve the system
  • Available for prohibited AI practices — sandboxes do not create exceptions to the Article 5 prohibitions

The sandbox period has a defined duration — typically 12–24 months. At the end, you must be compliant to deploy.


How to Find and Contact Your National Sandbox

To apply, identify the relevant national competent authority for AI in your member state. The European AI Office maintains a directory. Typical contact route:

  1. Check the national AI strategy or digital/innovation ministry website for sandbox announcements
  2. Email the designated national authority for AI Act implementation
  3. Prepare a concise one-page overview of your system and what guidance you are seeking

Applications are competitive in high-demand sandboxes. Submit early.

ComplyOne classifies your AI systems against the EU AI Act risk tiers and generates the required documentation automatically.

Run your AI Act risk assessment →