UK proposes AI Growth Lab: a new regulatory sandbox for real-world AI testing
The question
What does the UK’s proposed AI Growth Lab mean for AI developers?
The key takeaway
The UK Government has opened a call for evidence on the AI Growth Lab, a proposed large-scale regulatory sandbox that would allow supervised, real-world testing of AI systems under temporary, targeted regulatory modifications. The initiative aims to accelerate responsible AI innovation while generating evidence to inform long-term regulatory reform.
The background
As the UK continues to favour a sector-led approach to AI oversight - rather than adopting an EU-style horizontal framework - many developers report that outdated or rigid rules in areas such as healthcare, transport, planning, financial services, and autonomous systems are slowing deployment. While sectoral regulators have trialled advisory or experimental sandboxes, these initiatives often cannot override statutory requirements, limiting their usefulness for testing higher-risk or regulated AI applications.
With business adoption of AI still relatively low, the Government is exploring whether supervised, temporary easing of regulatory constraints could unlock innovation and productivity growth. International peers are pursuing similar models, including the EU’s mandated AI sandboxes under the AI Act and proposed US federal sandbox legislation.
The development
The Government’s proposal would establish an AI Growth Lab to enable real-world testing of AI systems within time-limited “sandbox pilots”. These pilots would be jointly overseen by sector regulators and the Government, and would operate under bespoke licences setting out what is permitted and under what conditions.
Key features include:
- targeted statutory modifications - participating organisations could test AI systems under temporary adjustments or exemptions to selected legal requirements, where justified and safe to do so;
- strong safeguards and continuous oversight - every participant would operate under a licence defining permitted activities, risk thresholds, reporting duties, monitoring obligations, and audit requirements. The Lab would have powers to suspend or terminate a pilot immediately if risks emerge or conditions are breached;
- non-modifiable “red-line” protections - certain legal requirements would remain off-limits for modification, including consumer protection, health and safety, fundamental rights, intellectual property, and worker protections;
- a pathway to permanent reform - if sandbox pilots show that a modified rule is safe, proportionate and innovation-enabling, the Lab may recommend integrating the change into the wider regulatory framework through streamlined legislation or regulatory updates.
Operating model options
Two models are being considered:
- a centralised, cross-sector Lab operated by Government; or
- regulator-led Labs tailored to specific sectors or cross-cutting use cases.
Why is this important?
For AI developers - especially in highly regulated sectors - the AI Growth Lab could create new opportunities to deploy and refine technologies that existing rules currently restrict.
Equally important, participants would have a structured channel to shape the future regulatory landscape. Successful pilots could feed directly into revised legislation, regulatory guidance, or new statutory exemptions. If implemented effectively, the Growth Lab could become a significant driver of responsible AI adoption across the UK economy.
Any practical tips?
The AI Growth Lab remains at the proposal stage. The Government’s call for evidence is open until 2 January 2026.
Organisations should consider:
- responding to the consultation, particularly if regulatory constraints materially impede deployment of your AI systems;
- identifying specific legal barriers facing your technology and whether temporary modification could enable meaningful testing;
- assessing which operating model - centralised or regulator-led - would best support your sector;
- preparing case studies or evidence demonstrating how real-world testing could accelerate safe innovation.
To submit a response, visit:
Winter 2025