The EU Commission released its list of 19 Very Large Online Platforms & Search Engines, overwhelmingly U.S. companies, that are the targets of its Digital Services Act
The Apple AppStore, five Alphabet subsidiaries, two Meta platform units, two Microsoft businesses and Twitter are among the 19 companies subject to landmark EU online content rules, as announced today by EU industry chief Thierry Breton.
Considering that nearly every company on the list is American, it's as clear as day that the EU's Digital Services Act is targeting U.S. companies, and that Brussels will enforce it aggressively to assist European tech companies going forward. The following information was issued by the European Commission earlier today.
The full list of 'Very Large Online Platforms' (VLOPs) is presented below:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
The two 'Very Large Online Search Engines' (VLOSEs) listed are Google Search and Bing.
The Next Steps for Designated Platforms and Search Engines
Following their designation, the companies will now have to comply, within four months, with the full set of new obligations under the DSA. These aim at empowering and protecting users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools. This includes:
More user empowerment:
- Users will get clear information on why they are recommended certain information and will have the right to opt-out from recommendation systems based on profiling;
- Users will be able to report illegal content easily and platforms have to process such reports diligently;
- Advertisements cannot be displayed based on the sensitive data of the user (such as ethnic origin, political opinions or sexual orientation);
- Platforms need to label all ads and inform users about who is promoting them;
- Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.
Strong protection of minors:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments, including for negative effects on mental health, will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.
More diligent content moderation, less disinformation:
- Platforms and search engines need to take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
- Platforms need to have clear terms and conditions and enforce them diligently and non-arbitrarily;
- Platforms need to have a mechanism for users to flag illegal content and act upon notifications expeditiously;
- Platforms need to analyse their specific risks, and put in place mitigation measures – for instance, to address the spread of disinformation and inauthentic use of their service.
More transparency and accountability:
- Platforms need to ensure that their risk assessments and their compliance with all the DSA obligations are externally and independently audited;
- They will have to give access to publicly available data to researchers; later on, a special mechanism for vetted researchers will be established;
- They will need to publish repositories of all the ads served on their interface;
- Platforms need to publish transparency reports on content moderation decisions and risk management.
Within four months of notification of the designation decisions, the designated platforms and search engines need to adapt their systems, resources, and processes for compliance; set up an independent system of compliance; and carry out, and report to the Commission, their first annual risk assessment.
Platforms will have to identify, analyze and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom. Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated. The risk mitigation plans of designated platforms and search engines will be subject to an independent audit and oversight by the Commission.
A New Supervisory Architecture
The DSA will be enforced through a pan-European supervisory architecture. While the Commission is the competent authority for supervising the designated platforms and search engines, it will work in close cooperation with the Digital Services Coordinators in the supervisory framework established by the DSA. These national authorities, which are responsible as well for the supervision of smaller platforms and search engines, need to be established by EU Member States by 17 February 2024. That same date is also the deadline by which all other platforms must comply with their obligations under the DSA and provide their users with protection and safeguards laid down in the DSA.
To enforce the DSA, the Commission is also bolstering its expertise with in-house and external multidisciplinary knowledge and recently launched the European Centre for Algorithmic Transparency (ECAT). It will provide support with assessments as to whether the functioning of algorithmic systems is in line with the risk management obligations. The Commission is also setting up a digital enforcement ecosystem, bringing together expertise from all relevant sectors.
Access to Data for Researchers
Today, the Commission also launched a call for evidence on the provisions in the DSA related to data access for researchers. These are designed to better monitor platform providers' actions to tackle illegal content, such as illegal hate speech, as well as other societal risks such as the spread of disinformation, and risks that may affect the users' mental health. Vetted researchers will have the possibility to access the data of any VLOP or VLOSE to conduct research on systemic risks in the EU. This means that they could for example analyze platforms' decisions on what users see and engage with online, having access to previously undisclosed data. In view of the feedback received, the Commission will present a delegated act to design an easy, practical and clear process for data access while containing adequate safeguards against abuse. The consultation will last until 25 May.
On 15 December 2020, the Commission put forward the proposal for the DSA together with the proposal for the Digital Markets Act (DMA) as a comprehensive framework to ensure a safer, fairer digital space for all. Following the political agreement reached by the EU co-legislators in April 2022, the DSA entered into force on 16 November 2022.
The DSA applies to all digital services that connect consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users' rights online, and places digital platforms under a unique new transparency and accountability framework. Designed as a single, uniform set of rules for the EU, these rules will give users new protections and businesses legal certainty across the whole single market. The DSA is a first-of-its-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries.
In the Reuters report, EU industry chief Thierry Breton was quoted as saying on Tuesday that "We consider these 19 online platforms and search engines have become systemically relevant and have special responsibilities to make the internet safer.
The companies will have to do more to tackle disinformation, give more protection and choice to users and ensure stronger protection for children, or risk fines of as much as 6% of their global turnover."
Breton said he was checking to see whether another four to five companies fall under the DSA, with a decision expected in the next few weeks.
Breton singled out Facebook’s content moderation system for criticism because of its role in building opinions on key issues.
“Now that Facebook has been designated as a very large online platform, Meta needs to carefully investigate the system and fix it where needed ASAP,” he said.
Twitter and TikTok also showed up high on Breton’s radar.
“At the invitation of Elon Musk, my team and I will carry out a stress test live at Twitter’s headquarters in San Francisco,” he said.
“We are also committed to a stress test with TikTok which has expressed also interest. So I look forward to an invitation to ByteDance’s headquarters to understand better the origin of TikTok,” Breton said.