Sunday, March 9, 2025

THE PLATFORMS AND THE DIGITAL SERVICES ACT - DSA

 Filenews 9 March 2025  by Michalis Paraskevas



On 19/10/2022, Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) was adopted.

The Digital Services Act (DSA) was established to create a safer and more responsible online environment. The DSA introduces a single set of rules for digital service providers, such as online platforms, marketplaces, social networks and search engines, to protect users and enhance platforms' transparency and accountability.

Previous legislation, such as the 2000 E-Commerce Directive, no longer responded to the modern challenges of the internet, necessitating a revision and modernisation of the legal framework.

The increasing spread of illegal and harmful content online, as well as disinformation, put citizens' fundamental rights, such as freedom of expression and privacy, at risk.

The lack of transparency and accountability by large online platforms had led to a decline in user trust, necessitating enhanced transparency and accountability.

The dominance of some large platforms in the market created inequalities and barriers for smaller companies, limiting competition and innovation.

Online platforms are now required to have clear procedures in place to swiftly remove illegal content, products or services when they receive notices.

Platforms must provide users with clear information about the ads displayed, including who is responsible for displaying them and the main parameters that determine their appearance.

Targeted advertising based on sensitive user data, such as race, religion or sexual orientation, as well as targeted advertising to minors, is prohibited.

Platforms with more than 45 million monthly active users in the EU (around 10% of the EU population) are subject to stricter obligations, such as carrying out annual risk assessments on the dissemination of illegal content and taking measures to limit these risks.

Researchers and law enforcement authorities can request access to data from platforms to understand and address systemic risks, such as the spread of disinformation.

The very large ones – VLOPs

The DSA imposes stricter rules on major online platforms, such as Facebook, Instagram, TikTok and Twitter (now X), with the aim of quickly removing illegal content, curbing disinformation and better protecting users, especially minors.

Meta (Facebook, Instagram) reported that, between June and December 2022, Facebook had around 255 million monthly active users in the EU, while Instagram had around 250 million.

This classifies them as "very large online platforms" (VLOPs) under the DSA, obliging them to comply with the new requirements. Meta has acknowledged the seriousness of celebrity-related scams and announced that it is testing new techniques to detect these misleading and illegal ads.

TikTok said that, from August 2022 to January 2023, it had an average of 125 million monthly active users in the EU, also making it a VLOP. TikTok has pledged to comply with the DSA's requirements, though specific details about the measures it has taken are not available.

Twitter (X) reported an average of nearly 101 million monthly active users in the EU over the 45 days preceding its report, also ranking it as a VLOP. However, there have been reports that Twitter is not fully complying with its legal obligations under the DSA's "notice and action" framework, rejecting requests to remove misleading content.

In cases of non-compliance, these platforms may face significant penalties, including fines that can reach up to 6% of their global annual turnover.

What does the DSA impose on platforms?

The DSA aims to tackle the manipulation of users through algorithms, as happened in the Cambridge Analytica scandal, in which Facebook was implicated.

It imposes increased transparency and accountability obligations on online platforms, especially very large online platforms (VLOPs), for the management of illegal and harmful content.

Platforms are required to provide clear information on how their algorithms work, including the criteria used to recommend and moderate content.

VLOPs need to identify, analyse and mitigate systemic risks arising from their operation, such as the spread of disinformation and election manipulation.

The DSA allows researchers and regulators to access platform data to assess their compliance with regulations and understand the impact of their algorithms.

In addition, the European Union has taken other legislative initiatives to protect citizens from manipulation through algorithms, such as the Artificial Intelligence Act (AI Act) and the Regulation on the transparency and targeting of political advertising.

Overall, the DSA, together with other EU legislative initiatives, seeks to limit manipulation of users through algorithms by enhancing transparency, accountability and protection of fundamental rights in the digital environment.

Advocates-Legal Consultants