Harmful content on all platforms
Platforms have a legal obligation under the Online Safety Framework to set rules about acceptable content and to include these rules in their terms and conditions or community rules. They must also enforce these rules. All platforms provide mechanisms for users to report content they consider to be in breach of the rules.
Some social media platforms allow users to report misinformation or disinformation. However, misinformation and disinformation are not necessarily illegal, so the obligations relating to illegal content will not apply unless the content is illegal for some other reason.
The largest social media platforms have a duty to assess a range of risks that their services may pose, including risks to civic discourse and electoral processes, to public health, and to public security. They must also implement measures to mitigate those risks. This process will cover risks posed by some types of misinformation and disinformation.
The European Commission is tasked with assessing the adequacy of risk assessments and mitigation measures, including those that target misinformation and disinformation.
Neither Coimisiún na Meán nor any other public body has the authority to require content to be taken down solely on the basis that the content is misinformation or disinformation.
For detailed information on complaints mechanisms that are provided for under the Digital Services Act, please visit our dedicated complaints guidance page.
Harmful content on video-sharing platforms
Video-sharing platforms based in Ireland are required to protect users from harmful video content, including video content which:
- may harm the physical, mental, or moral development of children, or
- contains incitement to hatred or violence.
They are also required to protect users from harmful audiovisual commercial communications (video advertising), ensuring that:
- video advertisements are readily recognisable,
- video advertisements don’t use subliminal techniques,
- video advertisements don’t harm human dignity, include or promote discrimination, encourage harm to health or safety, or encourage harm to the environment,
- video advertisements don’t advertise cigarettes, other tobacco products, electronic cigarettes and refill containers, or prescription-only medicines and medicinal products,
- video advertisements for alcohol are not aimed at children and don’t encourage excessive consumption of alcohol, and
- video advertisements don’t cause physical, mental, or moral harm to children.
Video-sharing platforms are required to have reporting and flagging systems in place so that users can report or flag these kinds of harmful video content. Platforms must also inform users of the action they have taken once video content has been reported or flagged.
Video-sharing platforms must also provide complaints-handling systems so that users can complain about how the platform has implemented its reporting and flagging systems.
If you are concerned about video content or video advertising you have seen on a video-sharing platform, you should report or flag the content to the platform first.
You can contact Coimisiún na Meán’s Contact Centre (+353 1 963 7755 or usersupport@cnam.ie) with information about harmful video content or video advertising. We can use this information to better understand how video-sharing platforms are complying with their obligations under the Online Safety Framework.