Take 10 - 15 May 2026
Welcome back to RPC's Media and Communications law update, where we recap the key media judgments and developments of the last few months.
"Article 10.1: Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers."
Ofcom launches consultation for two draft Codes regulating video-on-demand services
Ofcom has published its draft Tier 1 Standards Code (the Code) regulating content on Tier 1 services (on-demand programme services whose average number of monthly UK users exceeds 500,000), which include Netflix, Amazon Prime Video and Disney+. Whilst the Code supplements existing rules that apply to on-demand services, it marks the first time that on-demand content will be regulated in a similar way to linear broadcasting. The Code broadly mirrors the layout of the current broadcasting code but caters for differences between linear and on-demand broadcasting. For example, since scheduling and the 9pm watershed do not protect children in the same way on on-demand services, the Code requires on-demand service providers to introduce "appropriate measures" to protect under-eighteens from harm, including age ratings, content warnings, parental controls and age assurance mechanisms. Ofcom has also published a new Accessibility Code requiring Tier 1 service providers to ensure programmes are accessible to disabled people, for example by introducing quotas for programmes featuring subtitles and audio description. The consultation on both codes is open for responses until 5pm on 7 August 2026.
Thorne & Anor v Protheroe-Beynon – Interim injunction extended following Defendant's harassment of opposing lawyers
The High Court extended interim injunctions against Defendant David Protheroe-Beynon, a litigant in separate family proceedings (the Proceedings), to restrain his alleged harassment of two legal representatives of the opposing party in the Proceedings, whom he blamed for the course the Proceedings were taking. The Defendant had accused the Claimants, in correspondence and on TikTok, of acting unprofessionally in the Proceedings. Whilst the inter partes correspondence could not be relied on as contributing to the course of conduct amounting to harassment (Iqbal applied), for the first claimant it was considered good evidence of the Defendant's attitude towards her and of his likely future conduct if not restrained. The Defendant was evidently determined to "destroy" her professionally through his public allegations, and knew the likely psychological impact this would have on her. The Court considered that the trial judge would find that his TikTok videos (at least those threatened) would meet the seriousness threshold required for harassment and that a reasonable person in the Defendant's position would know this. As for the second claimant, the Defendant's threatening correspondence to her was not inter partes correspondence and could therefore be considered as part of the course of conduct, and the Defendant's TikTok videos accused her of being unprofessional in stark terms. Aidan Eardley KC held that the law of harassment was much more concerned with the "tone, manner and frequency with which allegations are published" than with the truth of the allegations, and considered it likely that, if not restrained, the Defendant would continue to act in a "highly offensive and threatening way" towards the Claimants.
The injunctions were extended since the Defendant's conduct and threatened conduct was causing the Claimants distress and an injunction would not prevent the Defendant from participating in the Proceedings or restrict his right to make complaints to regulators or the police.
Transparency and Open Justice Board publishes report on progress and next steps
On 30 April 2026, Mr Justice Nicklin, Chair of the Transparency and Open Justice Board, published a summary of the Board's current work and key areas of focus for 2026-2027. He also discussed some of the Board's work in a speech on open justice given on 13 May. The Board has now started to implement the key objectives agreed in July 2025. Compliance with these key objectives will be monitored through a Change Impact Assessment programme which will, using a traffic light system, assess where the current system aligns with the key objectives and where further work is needed across the various jurisdictions. This programme will be the Board's main focus throughout 2026. The Board is also continuing to progress several practical transparency initiatives, including: relaunching the Access to Core Documents Pilot in the Court of Appeal; running a new pilot scheme to improve public access to documents filed via CE-File in the Commercial Court, London Circuit Commercial Court and the Financial List; extending the broadcasting of proceedings to the Administrative Court; working to improve publication of First-tier Tribunal Immigration and Asylum Chamber decisions; and introducing regional Open Justice Champions within HMCTS to act as key points of contact for court staff, the judiciary, the media and the public to improve support regarding access to hearings.
Sands v Bond: Northern Irish High Court sets aside judgment in £1.8m libel case which ran for two years without notice to the defendants
The High Court in Belfast has set aside a record £300,000 judgment plus costs previously granted in favour of Neil and Donna Sands against the operator of commentary website Tattle Life, along with a £1.8m worldwide freezing order, declaring that the proceedings were never validly served on the defendants and have expired. The Sands had brought claims in defamation, privacy, harassment and data protection against several defendants, including the operator of Tattle Life. The proceedings ran for over two years through several hearings in the defendants' absence and resulted in the Sands obtaining a £1.8m worldwide freezing order, indemnity costs and record libel damages in Northern Ireland of £300,000, including aggravated damages. Mr Bond only became aware of the proceedings after they had concluded and his accounts had been frozen. The plaintiffs had identified Mr Bond as the likely operator of Tattle Life before issuing proceedings in early 2023, but proceeded against "persons unknown" without telling the Court of their findings. The Court found "egregious, repeated" breaches of the duty of full and frank disclosure by the plaintiffs and held that, had the true position been disclosed, it would not have permitted substituted service by email, given default judgment to the plaintiffs or awarded damages. The Court also held that service was ineffective: permission had been given to serve via email, but the plaintiffs had sought to serve documents via a file-sharing link and were aware that the documents had not been downloaded. None of this was brought to the attention of the Court, which was instead told that the defendant was ignoring the proceedings. The judgment illustrates the cardinal duty of full and frank disclosure in without-notice applications, the importance of safeguards around "persons unknown" claims and the severity of the consequences for plaintiffs who mislead the court to gain a litigation advantage.
RPC represents Bond alongside Northern Irish firm Mills Selig.
Meta launches challenge against Ofcom over fees and potential fines under the Online Safety Act 2023
Meta is challenging Ofcom in the High Court over its calculation of fees and penalties for breaches of the Online Safety Act 2023 (OSA). Under the OSA, Ofcom can impose penalties on companies earning over £250m a year of up to 10% of their qualifying worldwide revenue (QWR) or £18m, whichever is greater (Schedule 13, paragraph 4(1) OSA). Ofcom recovers the costs of running the online safety regime through fees payable by firms with a QWR of over £250m. Platforms may therefore pay penalties and fees calculated by reference to their global turnover, despite those earnings not being wholly generated by UK-based services. For a platform such as Meta, which last year reported global revenue of $201bn, an Ofcom fine could equate to $20.1bn. Meta argues that Ofcom's methodology for calculating these fees and penalties is disproportionate and unlawful, producing fines larger than any previously imposed by a UK regulator and leaving a few companies, including Meta, bearing most of Ofcom's operating costs despite the wide range of online services regulated under the OSA. Meta also disputes the calculation of penalties where several companies in the same group are found jointly liable for OSA infringements. Ofcom has said it will "robustly defend" its position. The next hearing is due to take place in June, with a full hearing anticipated in October. Separately, Ofcom continues to impose significant fines under the OSA: on 13 May, it published a decision to fine the provider of an online suicide forum £950,000 for non-compliance with illegal content duties under sections 9, 10, 20, 21, 23 and 102(8) of the OSA.
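The penalty cap described above is simple arithmetic: the statutory maximum is the greater of 10% of QWR or £18m. A minimal sketch, using the figures quoted in the text and treating Meta's reported $201bn global revenue as if it were its QWR purely for illustration (the function name and the currency mixing are ours, not Ofcom's methodology):

```python
def max_osa_penalty(qwr: int) -> int:
    """Statutory maximum penalty under Schedule 13, paragraph 4(1) OSA:
    the greater of 10% of qualifying worldwide revenue or £18m.
    10% is computed with integer division to keep the figure exact."""
    return max(qwr // 10, 18_000_000)

# Illustration from the text: $201bn reported global revenue
# gives a theoretical maximum of $20.1bn.
print(max_osa_penalty(201_000_000_000))

# For a smaller firm whose 10% falls below the floor, the £18m figure applies.
print(max_osa_penalty(100_000_000))
```

The "whichever is greater" floor means the £18m figure only bites for companies with a QWR below £180m; above that, the 10% limb always controls.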
ICO guidance for public authorities on AI-generated FOI requests
According to guidance published by the ICO on 6 May 2026, freedom of information (FOI) requests to public authorities are increasingly drafted using AI, which is in some cases producing complex, lengthy and unclear requests. In news which will be welcome to those making FOI requests, the ICO confirmed that a public authority cannot refuse a request just because AI has been used to draft it, or because it contains inaccuracies such as hallucinated case law. An FOI request must still be responded to provided it meets the requirements of s.8 FOI Act 2000, i.e. it is in writing, gives the name and correspondence address of the requester and describes the information requested. If the request is so broad that it exceeds the cost limit for responding under s.12 FOI Act 2000, the public authority "must provide appropriate advice and assistance" (emphasis added) to help the requester reformulate their request. Regular readers may recall the government was reportedly considering reducing this cost cap – see our previous Take 10 here. However, the guidance clarifies that a request may be refused under s.14 FOI Act 2000 if it constitutes a vexatious or repeated request – for example, where AI is used to make repeated requests for substantially similar information or matters already dealt with, or with the intention of disrupting the public authority's work.
High Court overturns the Office for Students' decision to fine University £585,000 for curtailing free speech and academic freedom
The High Court ruled (press summary here) that the Office for Students (OfS)'s decision to fine the University of Sussex £585,000 in March 2025 for allegedly failing to uphold the principles of freedom of speech and academic freedom in its governing documents was unlawful. The OfS investigated the university in relation to its "Trans and Non-Binary Equality Policy Statement" (the Policy) and decided that the university had breached two registration conditions, on the basis that the Policy was a governing document which failed to uphold freedom of speech and academic freedom (Condition E1) and because the university had failed to follow proper procedures in approving certain versions of the Policy (Condition E2). The Court held that the OfS did not have jurisdiction to find the university in breach of Condition E1 as the Policy was not a "governing document" per s.14(1) Higher Education and Research Act 2017 (HERA). Even if it had had jurisdiction, the OfS had misinterpreted the meaning of "freedom of speech within the law" because it had i) assumed that lawful speech could not be restricted; ii) failed to read the Policy as a whole; and iii) failed to read the Policy alongside other key documents which together protected freedom of speech. The High Court also found the OfS had manifestly erred in law in its interpretation of s.2 HERA, which protects academic freedom by ensuring that academic staff do not put their jobs at risk because of their views, since the Policy had not placed academics at such risk. Whilst the OfS had jurisdiction to find a breach of Condition E2, it erred in law by failing to consider whether the alleged breaches had been remedied prior to its decision. In any event, the Court held the OfS decision was vitiated by bias since the OfS had unlawfully predetermined it.
The Crime and Policing Act and Children's Wellbeing and Schools Act expand online safety
Following our previous reporting on the Crime and Policing Bill's journey through Parliament, it received Royal Assent on 29 April 2026. The Crime and Policing Act 2026 (CPA) aims to tackle criminal and anti-social behaviour by expanding powers for the police and partner agencies such as Ofcom. When it enters into force later this year, the CPA will significantly expand on the Online Safety Act 2023 (OSA). For example, s.248 of the CPA (which introduces a new s.216A to the OSA) allows the Secretary of State to amend the OSA to bring any "internet service capable of generating AI-content" within the scope of the OSA's illegal content rules, no matter what proportion of the service's content is AI-generated. Section 99 of the CPA adds a new s.66I to the Sexual Offences Act 2003 to criminalise the making or supply of intimate image generators (i.e. "nudification" tools), with potential personal liability for tech executives for breaches unless they can show they took "all reasonable steps to prevent" the use of the service for this purpose. Meanwhile, ss.200–203 of the CPA contain provisions granting anonymity to authorised firearms officers, which conflict with the principle of open justice and have caused concern amongst reporters. See our previous reporting for more detail. Separately, the Children's Wellbeing and Schools Act 2026, which also received Royal Assent on 29 April 2026, grants the Secretary of State new powers under a new s.214A of the OSA to introduce regulations requiring online service providers to prevent or restrict children from accessing certain online services.
Ofcom issues new advice on accuracy standards for detecting terrorism and CSEA content
On 8 May, Ofcom published advice on minimum standards of accuracy for technology used to identify terrorism and child sexual exploitation and abuse (CSEA) content. Under s.121 of the OSA, Ofcom has the power to issue a Technology Notice requiring a service provider to use or develop "accredited technology" approved by Ofcom for the detection and prevention of terrorism and CSEA content on its platform. Ofcom has also published guidance on how it proposes to use these powers. The Secretary of State is now expected to approve and publish minimum accuracy standards based on Ofcom's advice, after which Ofcom will establish an accreditation process for technologies meeting those standards.
US AI platform, Objection, launches a private AI tribunal for media disputes
Founded by businessman Aron D'Souza and backed by Peter Thiel, US AI platform Objection purports to offer a paid-for, private "AI tribunal" service to help the public more easily challenge media reporting. Any individual who believes they have been the subject of unfair media reporting can file an objection with the platform for an initial price of $2,000. According to Objection, investigators including "former FBI, NSA, and CIA professionals" will examine the story and send their findings to an AI tribunal for determination. An AI "jury" will then assess the veracity of the claims made and issue a decision. During the investigation, journalists will be given the right to reply and to provide evidence in their defence, and the full record of the case, including the decision and all documents, communications and responses, is published online. The platform will also rate journalists against various criteria, including their use of anonymous sources, emotive and political language and "clickbait" writing. Unlike the decision of a UK court, the platform's decisions are not legally binding on those who have not signed up to it; nevertheless, the platform poses a potential chilling effect on journalists' right to free speech.
Quote of the fortnight
"This was like a SLAPP on steroids. A wealthy couple were able to shut down unwanted speech about themselves by securing a court judgment without even the most basic legal process. Anyone who worries about the powerful or wealthy using high-powered lawyers to shut down free speech should be terrified by this case."
Lord Young of Acton of the Free Speech Union, commenting on the Sands v Bond case
Stay connected and subscribe to our latest insights and views
Subscribe Here