By Melike Pala
BRUSSELS (AA) - The European Commission and the Board of Digital Services Coordinators on Wednesday published the first-ever Digital Services Act (DSA) systemic risk report, outlining key threats emerging on major online platforms and search engines across the EU.
The report identified a broad range of systemic risks, including the spread of illegal content, threats to fundamental rights, and growing concerns linked to mental health and the protection of minors.
It also reviewed the initial mitigation measures taken by very large online platforms (VLOPs) and very large online search engines (VLOSEs) under the DSA's transparency rules.
According to the findings, platforms face recurrent risks related to public health misinformation, harmful or illegal goods sold online, and large-scale coordinated disinformation campaigns.
Regulators also flagged the misuse of generative AI, including its role in producing manipulated media or child sexual abuse material (CSAM).
A significant portion of the report focused on risks to minors, such as exposure to CSAM, grooming, sextortion, cyberbullying, and harmful social media challenges.
Civil society groups also raised concerns about the commercial exploitation of child influencers and the lack of accessible reporting tools for young users.
The report highlighted mental health impacts, including addiction-like use of social media, exposure to self-harm and suicide-related content, and anxiety linked to continuous consumption of distressing news.
Excessive screen time and unrealistic body standards were also cited as contributing to physical and psychological harm.
The European Commission said this first edition would serve as a reference point for monitoring systemic risks in the EU and improving transparency and accountability.