On 17 February, the Digital Services Act (DSA), the EU's landmark rulebook that aims to make the online environment safer, fairer and more transparent, starts applying to all online intermediaries in the EU.
Under the DSA, EU users are better protected against illegal goods and content and have their rights upheld on online platforms where they connect with other users, share information, or buy products.
New responsibilities for platforms and empowered users
All online platforms with users in the EU, with the exception of small and micro enterprises employing fewer than 50 persons and with an annual turnover below €10 million, must implement measures to:
- Counter illegal content, goods, and services: online platforms must give users the means to flag illegal content, including goods and services. Moreover, online platforms will have to cooperate with 'trusted flaggers', specialised entities whose notices must be given priority.
- Protect minors: including a complete ban on targeting minors with ads based on profiling or on their personal data.
- Empower users with information about the advertisements they see, such as why the ads are being shown to them and who paid for them.
- Ban advertisements that target users based on sensitive data, such as political or religious beliefs, sexual preferences, etc.
- Provide a statement of reasons to any user affected by a content moderation decision (e.g. content removal or account suspension) and upload the statement to the DSA Transparency Database.
- Provide users with access to a complaint mechanism to challenge content moderation decisions.
- Publish a report on their content moderation procedures at least once a year.
- Provide users with clear terms and conditions, including the main parameters on which their content recommender systems are based.
- Designate a point of contact for authorities, as well as users.
In addition to online platforms, the Digital Services Act also applies to hosting services (e.g. cloud services) and to other online intermediaries, such as internet service providers and domain name systems, background services which connect users to requested website addresses. Hosting services and other online intermediaries are subject to a subset of obligations under the DSA.
Since the end of August 2023, the DSA has already applied to the 19 Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) designated in April 2023, each with more than 45 million average monthly users in the EU. Three further platforms designated as VLOPs in December 2023 have until the end of April 2024 to comply with the most stringent obligations under the DSA; they will, however, have to comply with the general DSA obligations from tomorrow.
Digital Services Coordinators in Member States
Platforms not designated as VLOPs or VLOSEs will be supervised at Member State level by an independent regulator acting as the national Digital Services Coordinator (DSC). The DSCs will be responsible for ensuring that these platforms play by the rules, supervising and enforcing the DSA for the platforms established on their territory.
In practice, the Digital Services Coordinators will:
- Be the first port of call for complaints by users about infringements of the DSA by any platform, including VLOPs and VLOSEs. The Digital Services Coordinator will, where appropriate, transmit the complaint to the Digital Services Coordinator of the platform's Member State of establishment, accompanied by an opinion.
- Certify existing out-of-court appeal mechanisms for users to address complaints and challenge content moderation decisions.
- Assess and award trusted flagger status to suitable applicants: independent entities that have demonstrated expertise in detecting, identifying, and notifying illegal content online.
- Process researchers' requests for access to VLOP and VLOSE data for specific research. The DSCs will vet the researchers and request access to the data on their behalf.
- Be equipped with strong investigation and enforcement powers to ensure compliance with the DSA by the providers established in their territory. They will be able to order inspections following a suspected infringement of the DSA, impose fines on online platforms that fail to comply, and impose interim measures in cases of serious harm to the public sphere.
The European Board for Digital Services
The Digital Services Coordinators and the Commission will form an independent advisory group, the European Board for Digital Services, to ensure that the DSA is applied consistently, and that users across the EU enjoy the same rights, regardless of where the online platforms are established.
The Board will be consulted on the enforcement of the DSA, advise on emerging issues related to the DSA, and contribute to guidelines and analyses. It will also assist in the supervision of Very Large Online Platforms and Very Large Online Search Engines and will issue yearly reports on the most prominent systemic risks and best practices for mitigating them.
The Board will meet for the first time on 19 February 2024.
Next Steps
In March 2024, the Commission intends to adopt Guidelines on risk mitigation measures for electoral processes. A public consultation on the data access delegated act is expected in April with adoption by July and entry into force in October 2024. In May, the Commission plans to adopt an Implementing Act on transparency report templates. More details on the tentative calendar are included in the annex.