Commission presses platforms on appeals, notice and action, and transparency under the DSA

Published on 02 January 2026

The question

What mechanisms has the European Commission established under the Digital Services Act (DSA) to ensure users can appeal content moderation decisions?

The key takeaway

The Commission’s preliminary findings signal that, under the DSA, accessible, manipulation-free reporting and robust appeal mechanisms are core obligations. Companies should simplify user journeys and strengthen transparency now to avoid fines of up to 6% of global annual revenue for shortfalls.

The background

Under the DSA, online platforms operating within the European Union are legally required to establish accessible mechanisms enabling users to contest content moderation decisions, such as the removal of content or the deletion of accounts. This regulatory obligation is designed to promote greater transparency and safeguard users impacted by enforcement actions.

In connection with formal investigations initiated in 2024 into Meta and TikTok’s compliance with the DSA, the European Commission has issued preliminary findings indicating that both companies may have failed to meet the DSA’s standards for transparency and user protection. These findings underscore the critical importance of robust appeal mechanisms and reinforce the necessity for platforms to uphold their obligations under the DSA.

Additionally, the Commission has identified that Facebook, Instagram, and TikTok have imposed unduly burdensome processes for researchers seeking access to public data. These restrictions have resulted in researchers obtaining only partial or unreliable datasets, thereby hindering independent scrutiny and oversight.

Meta and TikTok now have the opportunity to review the Commission’s investigation files and respond in writing to the preliminary findings. They may also address any identified breaches during this process. The European Board for Digital Services will be consulted as part of the ongoing investigation.

It is important to emphasise that these preliminary findings are part of ongoing formal investigations into both companies, and the Commission continues to assess other potential breaches of the DSA. For further details on the DSA’s specific requirements, please see our winter 2024 Snapshots edition here.

The development

The Commission’s preliminary findings highlight concerns regarding the lack of an accessible ‘Notice and Action’ mechanism for users to report illegal content, including child sexual abuse material and terrorist content. The investigation revealed that the current reporting processes on major platforms impose unnecessary steps on users, which may discourage them from flagging harmful material. Additionally, the use of dark patterns and deceptive interface designs was found to confuse and dissuade users, rendering the existing ‘Notice and Action’ mechanisms largely ineffective.

Meta and TikTok’s responses will be pivotal in clarifying the exact standards platforms are required to meet under the DSA in relation to content moderation. Moreover, the Commission may issue a non-compliance decision resulting in significant penalties, including fines of up to 6% of the companies’ global annual revenue, as well as periodic penalty payments to compel compliance.

Why is this important?

These preliminary findings matter because they provide an early indication of the Commission’s enforcement priorities under the DSA: usable reporting and appeal pathways are substantive obligations, not design choices. By signalling potential non-compliance decisions, with fines of up to 6% of global annual revenue and periodic penalty payments, the Commission emphasises the cost of dark patterns, friction in reporting, and restricted researcher access. Clarity on these standards will drive product and governance changes across platforms, strengthen transparency and scrutiny, and accelerate accountable content moderation in the EU and beyond.

Any practical tips?

For online platforms, the near‑term priority is to streamline notice‑and‑action and appeals by mapping the user journey, removing unnecessary steps and manipulative design, using plain language, and setting clear timelines for receiving, reviewing, and resolving reports. They should also strengthen oversight and record‑keeping by assigning clear ownership of DSA compliance, keeping reliable moderation logs, regularly reassessing risks and mitigations, and training staff and partners to avoid manipulative design.

Finally, companies should enable transparency and researcher access by establishing a straightforward process for approved access to public data under the DSA, improving transparency reports with the required facts and figures, and maintaining secure, privacy‑respecting systems that can respond promptly to regulators.

