Intellectual property
Written by Joshy Thomas and Ciara Cullen
Key developments in 2025
The training of generative AI models on data sets comprising IP works scraped from publicly available websites, and liability for AI-generated outputs, have continued to receive significant attention this year, creating a degree of uncertainty for both AI developers and IP rights holders.
Content creators such as news providers, musicians, authors and visual content agencies allege that their work is being unlawfully used to train AI models. The High Court judgment in Getty Images (US) Inc v Stability AI Ltd, the most prominent case making these kinds of allegations in the UK, was handed down in November 2025. The court's findings left many questions unanswered. A primary copyright claim – whether training an AI model on copyright works, without consent, infringes copyright in the UK – was dropped during the trial on territorial grounds. The secondary copyright infringement claim, concerning the importation of the AI model into the UK, failed on the basis that the AI image generator model (as opposed to the data it was trained on) did not store copyright works and was therefore not an infringing copy of any of Getty's works. Getty Images has been granted permission to appeal this finding. The trade mark infringement claim succeeded, but was highly fact-specific: some outputs from earlier models were found to contain Getty's watermark, infringing Getty's registered trade mark. Later models largely filtered out these watermarks, so this finding is less likely to be replicated as filtering technology improves.
A similar case in Germany was decided differently, and as disputes in this area are likely to turn heavily on their particular facts and evidence, it is likely to remain fertile ground for litigation.
What to look out for in 2026
UK policy on copyright and AI, and AI regulation more generally, is expected to be a key area of interest and focus. This has been a pressing issue since 2022, when the UK IPO signalled its intention to introduce a new copyright and database exception that would allow text and data mining (TDM) for any purpose, including commercial use. The proposal was unpopular with the creative industries and was subsequently withdrawn pending an assessment of the implications for key stakeholders. A working group of key stakeholders tried and failed to agree an effective voluntary code of conduct to resolve the main issues: labelling and metadata for the outputs of generative AI, transparency of inputs, and licensing and permissions.
In 2025, it was widely anticipated that resolving these issues would require formal government intervention. However, despite another year of intense lobbying by the creative industries, it is now thought that an AI Bill may remain elusive, with the government focusing instead on boosting the capabilities of key regulators such as Ofcom, the CMA and the ICO, and on using existing regulation such as data protection, competition, equality and online safety legislation.
The Data (Use and Access) Act 2025, which came into force this year, does not set out a copyright and AI regime. However, it requires the government, by March 2026, to publish an economic impact assessment considering each of the policy options described in the Copyright and AI consultation, and to publish a report on the use of copyright works in the development of AI systems.
Uncertainty and fluidity in this area make it challenging for developers, deployers and users to correctly allocate risk, and for insurers to assess liability, worst-case damages and claim frequency. Coverage disputes may increase (does AI sit inside existing wording?). In the UK, policy direction is still being worked through and continues to shift. In light of this, insurers finding themselves in the role of quasi-enforcers through policy conditions may increasingly look for evidence that policyholders have guardrails in place, such as licensing and provenance checks, human review, logging, external AI supplier controls, and incident playbooks.