Today, the countdown has started for 19 designated very large online platforms and search engines (having more than 45 million monthly active users in the EU) to fully comply with the special obligations that the Digital Services Act imposes on them.
The deadline is clear: 25 August 2023.
Four months from today's designation, they will no longer be able to act as if they were "too big to care".
The 19 Very Large Online Platforms (VLOPs) or Search Engines (VLOSEs) are:
- AliExpress
- Amazon Store
- AppStore
- Bing
- Booking
- Facebook
- Google Maps
- Google Play
- Google Search
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
We consider that these 19 online platforms and search engines have become systemically relevant and bear special responsibilities to make the Internet a safe and trustworthy space, in four areas in particular:
I. Greater protection, control and choice for their users
These very large platforms and search engines will have to address any risk they pose to society, including risks to public health and to physical and mental well-being.
They will no longer be able to hide behind lengthy and obscure terms and conditions, but will have to empower users with plain-language summaries of such documents, in all EU languages, and give them the choice to opt out of recommender systems based on profiling.
Any type of advertising (be it display ads or search ads) based on sensitive data, such as data revealing racial or ethnic origin or political opinions, will be banned.
And they will have specific risk mitigation obligations for the use of generative AI services such as ChatGPT or MidJourney. For example, AI-generated content, such as synthetic videos or deep-fake images, will have to be clearly marked when it is displayed in response to a query on a search engine and creates risks such as disinformation.
II. Stronger protection for minors online
The 19 systemic platforms and search engines will have to redesign their systems to ensure a high level of privacy, security, and safety of minors.
They will need to include age verification and parental control tools.
Bing and Google Search, as very large online search engines, will be required to block harmful material that may surface in response to children's search queries. Any type of targeted advertising aimed at children will be prohibited.
III. More diligent and trustworthy content moderation, less illegal content and less disinformation online
Curbing the spread of illegal content, tackling disinformation and protecting freedom of speech will no longer be just a civic responsibility, but a legal obligation.
Very large online platforms and search engines will be obliged to adapt their recommender systems to prevent the algorithmic amplification of disinformation.
I cannot overemphasise the importance of this point, as evidenced by current events. Malicious actors are actively exploiting online platforms to distort the information environment.
They do this especially in the run-up to elections, as is the case, for example, in Slovakia, where elections will be held in September and there are concerns about hybrid warfare being waged on social media.
Looking at this example, I am particularly concerned by Facebook's content moderation system, as the platform plays an important role in shaping opinion in Slovak society.
Now that Facebook has been designated as a very large online platform, Meta needs to investigate its systems carefully and fix them where needed.
IV. More transparency and accountability
The designated platforms and search engines will have to identify advertisements clearly and explain to their users why they are seeing an ad and who is promoting it. They will also have to explain how their recommender systems work and what data they collect.
They will be subject to yearly independent audits – which they should expect to be very tough – as well as rigorous supervision by the Commission. Non-compliance can lead to sanctions, including fines of up to 6% of their group's global turnover and, as a last resort, a temporary ban from the EU in case of repeated serious breaches threatening the life or safety of persons.
We will do our utmost, and count also on independent auditors, to cast a wide and tight net and catch every point of failure in a platform's compliance.
Organising "stress tests"
The 19 platforms and search engines are facing significant obligations within a strict timeline.
We are here to support their compliance efforts, including what I call "stress tests" – voluntary mock exercises to check readiness to comply with the new obligations ahead of the 25 August deadline.
At the end of June, at the invitation of Elon Musk, my team and I will carry out a stress test at Twitter's HQ in San Francisco.
We are also committed to running a stress test with TikTok, which has expressed interest. I look forward to an invitation to ByteDance's HQ to better understand the origins of TikTok and the other innovations that ByteDance is developing.
I am ready to talk to Mark Zuckerberg as well to take stock of progress made by Meta, in particular on fighting disinformation.
And of course, we continue to stand ready to undertake a stress test with other interested very large online platforms and search engines.
In fact, I encourage them to do so to ensure they are "ready for take-off" on 25 August.