Coimisiún na Meán’s Investigations Team has today (02.12.2025) commenced two formal investigations into the TikTok and LinkedIn platforms under the EU Digital Services Act (DSA). The investigations will assess whether these platforms have contravened Articles 16(1), 16(2)(c) and 25 of the DSA.
In September 2024, Coimisiún na Meán’s Platform Supervision Division commenced a review of online providers’ compliance with Article 16 of the DSA. Article 16 concerns the ‘Notice and Action’ mechanisms which providers are required to have in place to allow people to report content that they suspect to be illegal.
As part of the review, concerns arose in relation to potential ‘dark patterns’, or deceptive interface designs, in the illegal content reporting mechanisms: specifically, that the mechanisms were liable to confuse or deceive people into believing that they were reporting content as illegal when they were in fact reporting it as a violation of the provider’s Terms and Conditions. If correct, this would mean that the illegal content reporting mechanisms are not effective in preventing the dissemination of illegal content and that people’s rights under the DSA might be undermined.
The investigations will look into:
- Whether the illegal content reporting mechanisms implemented by TikTok and LinkedIn are easy to access and user-friendly – Article 16(1)
- Whether the illegal content reporting mechanisms provided by TikTok and LinkedIn allow people to report suspected child sexual abuse material anonymously – Article 16(2)(c)
- Whether the design of the illegal content reporting mechanisms provided by TikTok and LinkedIn deceives people or deters them from reporting content as illegal – Article 25
John Evans, Digital Services Commissioner at Coimisiún na Meán said: “The Digital Services Act has marked a step change for online safety in Ireland, and across the EU, providing people who use online providers with greater rights, and placing new obligations on providers to keep people safe online.
At the core of the DSA is the right of people to report content that they suspect to be illegal, and the requirement on providers to have reporting mechanisms that are easy to access and user-friendly for reporting such content.
Providers are also obliged to not design, organise or operate their interfaces in a way which could deceive or manipulate people, or which materially distorts or impairs the ability of people to make informed decisions.
Following the opening of a review last year into the compliance of a number of online providers with Article 16 of the DSA, Coimisiún na Meán has requested information from several of those providers based in Ireland to ensure that people can exercise their right to report content that they suspect to be illegal through an accessible and user-friendly reporting mechanism.
This review has resulted in several actions which we can announce today, including the opening of an investigation into two online platforms – TikTok and LinkedIn. In the case of these platforms, there is reason to suspect that their illegal content reporting mechanisms are not easy to access or user-friendly, do not allow people to report child sexual abuse material anonymously, as required by the DSA, and that the design of their interfaces may deter people from reporting content as illegal.
A number of other providers have made significant changes to their illegal content reporting mechanisms following engagement with Coimisiún na Meán. An Coimisiún is currently assessing the effectiveness of these changes.
Our message is clear: we expect providers to comply with their obligations under the DSA and to engage with us when making changes to their reporting mechanisms. Where concerns exist of significant non-compliance by any provider, we can use our regulatory tools, up to and including investigations, to ensure providers meet their obligations to keep people safe online.
We have requested further information from several other providers to assess their compliance with Articles 16 and 25 of the DSA, and at this point we are not ruling out further regulatory action, if needed, to ensure compliance with the DSA.
For people in Ireland and across the EU who use online providers that are based in Ireland, we would encourage you to report suspected illegal content to the provider where you see it. If you can’t find an easy way to do this, or if you’re not happy with a platform’s response, our Contact Centre can provide advice and support and can escalate issues to our Complaints Team if required. For the largest platforms, we collaborate closely with the European Commission to ensure the DSA produces good outcomes for European citizens.”
Coimisiún na Meán, as Ireland’s Digital Services Coordinator, is responsible for the application of the DSA, and for supervising providers established in Ireland for their compliance with the DSA. Coimisiún na Meán also supervises compliance with the Online Safety Code and the EU Terrorist Content Online Regulation under its Online Safety Framework.
The investigations will be conducted pursuant to Part 8B of the Broadcasting Act 2009, as amended. If a provider is found to be in violation of the DSA, Coimisiún na Meán can apply an administrative financial sanction, including a fine of up to 6% of turnover. During an investigation concerning a possible breach of the DSA, Coimisiún na Meán and the provider can enter into a binding Commitment Agreement, in which the provider agrees to take measures that, in An Coimisiún’s view, address any issue relating to the provider’s compliance.