Understanding the Digital Services Act: A New Era for Online Safety and Transparency

The European Union has taken a significant leap forward in regulating the digital space with the full application of the Digital Services Act (DSA) to all online platforms operating within its jurisdiction. As of 17 February 2024, the DSA aims to foster a safer, fairer, and more transparent online environment for users. As an eLearning company committed to keeping our audience informed, TechClass presents an overview of the DSA, its implications for online platforms, and what it means for users and digital service providers.

A Brief Overview of the Digital Services Act

The DSA represents the EU's ambitious endeavor to regulate the sprawling digital landscape. It sets forth a comprehensive rulebook designed to protect users from illegal goods and content while ensuring their rights are upheld on online platforms. This landmark legislation targets all online intermediaries and platforms, with a few exceptions, and introduces a set of new responsibilities aimed at fostering a safer online environment.

Key Provisions and Responsibilities under the DSA

  • Counter Illegal Content: Platforms are required to enable users to flag illegal content effectively and to prioritize responses to notices from "trusted flaggers."
  • Protect Minors: Platforms are prohibited from targeting minors with advertisements based on profiling or their personal data.
  • Transparent Advertising: Users will be informed about the advertisements they see, including the rationale behind them and the advertisers' identities.
  • Ban on Sensitive Data Targeting: Advertisements cannot target users based on sensitive data, such as political beliefs or sexual orientation.
  • Right to Explanation: Platforms must provide reasons for content moderation decisions, such as content removal or account suspension, and record them in the DSA Transparency Database.
  • Access to Complaint Mechanisms: Users can challenge content moderation decisions through an accessible complaint mechanism.
  • Annual Reporting: Platforms must publish reports on their content moderation procedures at least once a year.
  • Clear Terms and Conditions: Platforms are required to disclose the main parameters of their content recommender systems in their terms and conditions.

Supervision and Enforcement

The DSA establishes a dual supervision model: the European Commission directly supervises Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), while national Digital Services Coordinators (DSCs) oversee all other platforms, ensuring compliance with DSA provisions. Additionally, the European Board for Digital Services will serve as an advisory group, ensuring consistent application of the DSA across the EU and advising on enforcement and guidelines.

Looking Forward

The DSA's application marks a pivotal moment in digital regulation, aiming to mitigate systemic risks and promote best practices across the EU. With the European Board for Digital Services meeting for the first time on 19 February 2024, the stage is set for a collaborative effort towards a safer and more transparent online ecosystem. Moreover, the upcoming guidelines on risk mitigation for electoral processes and the public consultation on the data access delegated act illustrate the EU's commitment to refining and expanding the DSA's scope.


The material in this blog post is not intended to serve as legal advice for you or your organization regarding compliance with the Digital Services Act (DSA). Its content is provided solely for educational purposes and offers background information to help you better understand the act. Please visit the official web pages for further information and insight.

To read more, visit the following page: https://ec.europa.eu/commission/presscorner/detail/en/IP_24_881
