[news release] Today [16 Nov] a landmark new set of EU rules for a safer and more accountable online environment enters into force with the Digital Services Act (DSA). The DSA applies to all digital services that connect consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users’ rights online, and places digital platforms under a unique new transparency and accountability framework. Designed as a single, uniform set of rules for the EU, these rules will give users new protections and businesses legal certainty across the whole single market. The DSA is a first-of-its-kind regulatory toolbox globally and sets an international benchmark for the regulation of online intermediaries.
New responsibilities for digital services
The DSA introduces a comprehensive new set of rules for online intermediary services on how they have to design their services and procedures. The new rules include new responsibilities to limit the spread of illegal content and illegal products online, increase the protection of minors, and give users more choice and better information. The obligations of different online players match their role, size and impact in the online ecosystem; an overview is available here.
All online intermediaries will have to comply with wide-ranging new transparency obligations to increase accountability and oversight, for example with new flagging mechanisms for illegal content. But a special regime is introduced for platforms with more than 45 million users: such very large online platforms or search engines face further obligations, including wide-ranging annual assessments of the risks of online harms on their services – for example with regard to exposure to illegal goods or content or the dissemination of disinformation. Under the DSA, these platforms will have to put suitable risk mitigation measures in place and will be subject to independent auditing of their services and mitigation measures.
Smaller platforms and start-ups will benefit from a reduced set of obligations, special exemptions from certain rules, and, most crucially, increased legal clarity and certainty for operating across the EU’s whole single market.
Enhanced safeguards for fundamental rights online
The new rules ensure that users’ fundamental rights in the EU are protected in the online environment as well. New protections for the freedom of expression will limit arbitrary content moderation decisions by platforms, and offer new ways for users to take informed action against a platform when their content is moderated: for example, users of online platforms will now have multiple means of challenging content moderation decisions, including when these decisions are based on platforms’ terms and conditions. Users can complain directly to the platform, choose an out-of-court dispute settlement body, or seek redress before the courts.
New rules also require platforms’ terms to be presented in a clear and concise manner and to respect users’ fundamental rights.
Very large online platforms and search engines will in addition have to undertake a comprehensive assessment of risks to fundamental rights, including the freedom of expression, the protection of personal data, and freedom and pluralism of the media online as well as the rights of the child.
New supervisory powers for the Commission
The DSA creates an unprecedented level of public oversight of online platforms across the Union, at both national and EU level. The Commission has powers to directly supervise very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e. services that individually reach more than 10% of the EU population, approximately 45 million people. Additionally, each Member State will have to designate a Digital Services Coordinator, who will supervise other entities in scope of the DSA, as well as VLOPs and VLOSEs for non-systemic issues. The national coordinators and the European Commission will cooperate through a European Board for Digital Services, an EU-wide cooperation mechanism between national regulators and the Commission.
The Commission is setting up a European Centre for Algorithmic Transparency (ECAT) to support its supervisory role with in-house and external multidisciplinary knowledge. The Centre will provide support with assessments of whether the functioning of algorithmic systems is in line with the risk management obligations that the DSA establishes for VLOPs and VLOSEs to ensure a safe, predictable and trusted online environment.
Following the entry into force of the DSA today, online platforms will have 3 months (until 17 February 2023) to report the number of their active end users on their websites. The Commission is also inviting all online platforms to notify the published numbers to it. Based on these user numbers, the Commission will assess whether a platform should be designated a very large online platform or search engine. Following such a designation decision by the Commission, the entity in question will have 4 months to comply with the obligations under the DSA, including carrying out the first annual risk assessment exercise and providing it to the Commission. EU Member States will need to empower their Digital Services Coordinators by 17 February 2024, the general date of entry into application of the DSA, when it becomes fully applicable to all entities in its scope.
On 15 December 2020, the Commission presented the proposal for the DSA together with the proposal for the Digital Markets Act (DMA) as a comprehensive framework to ensure a safer, fairer digital space for all. The DMA entered into force on 1 November 2022.
Digital services encompass a large category of online services, from simple websites to internet infrastructure services and online platforms. The rules specified in the DSA primarily concern online intermediaries and platforms, for example online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.
For More Information
With the Digital Services Act, we now have clear legislation. Online platforms are at the core of some of the key aspects of our daily lives, democracies, and economies. It’s only logical that we ensure that these platforms live up to their responsibilities in terms of reducing the amount of illegal content online and mitigating other online harms, as well as protecting the fundamental rights and safety of users.
– Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, 16/11/2022
This European Commission news release was sourced from: ec.europa.eu/commission/presscorner/detail/en/ip_22_6906