Digital Services Act
This page provides an overview of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022, better known as the Digital Services Act (DSA). The official regulation text is lengthy; the summary below highlights its key provisions and lists the contact points and resources relevant to our services.
Regulation (EU) 2022/2065 of the European Parliament and of the Council
of 19 October 2022
on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act)
This Regulation lays down rules for the provision of digital services and focuses on creating a safer and more accountable digital space. It aims to ensure that digital services, especially online platforms, respect the European Union’s values and fundamental rights, such as freedom of expression and privacy, while safeguarding consumers and their data.
PART 1: General Provisions
- Scope and objectives: This Regulation applies to digital intermediary services that facilitate the dissemination of content online, including:
- Online platforms
- Search engines
- Hosting services
- Internet infrastructure providers
- Main objectives:
- To ensure the functioning of the internal market for digital services.
- To ensure that digital services offer high levels of protection for consumers and users.
- To address illegal content and services by ensuring effective removal and response mechanisms.
PART 2: Transparency and Accountability Obligations for Providers of Digital Services
- Obligations for Digital Service Providers: Digital service providers must:
- Set out clear and understandable terms and conditions for their services.
- Ensure transparent communication with users, including how their personal data is processed.
- Respond promptly to requests for removal of illegal content.
- Content Moderation and Removal: Providers are required to put systems in place for content moderation (an illustrative sketch of how notices and decisions might be recorded follows at the end of this part), with mechanisms for:
- Reporting and removing illegal content.
- Allowing users to challenge decisions made about content.
- Implementing preventive measures to avoid the spread of harmful content.
- Risk-Based Approach to Content Moderation: Providers must assess the risks their services pose, including:
- The risk of the dissemination of illegal content.
- The impact on public safety and health.
- The potential for the service to be used to spread disinformation and the extent of the resulting harm.
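The points above are legal obligations, not a technical specification. Purely as an illustration, and not as part of the regulation, the following TypeScript sketch shows one possible way a provider might record an illegal-content notice and the resulting decision with its statement of reasons; all type and field names are assumptions chosen for this example.

```typescript
// Illustrative only: one possible way to record an illegal-content notice and
// the resulting moderation decision. Names are assumptions, not DSA terms.

interface IllegalContentNotice {
  noticeId: string;
  contentUrl: string;          // exact location of the allegedly illegal content
  explanation: string;         // why the notifier considers the content illegal
  notifierEmail?: string;      // optional contact details of the notifier
  submittedAt: string;         // ISO 8601 timestamp
}

interface StatementOfReasons {
  noticeId: string;
  decision: "removed" | "disabled" | "restricted" | "no_action";
  ground: string;              // legal provision or terms-of-service clause relied on
  factsConsidered: string;     // facts and circumstances behind the decision
  automatedMeansUsed: boolean; // whether automated tools contributed to the decision
  redressOptions: string[];    // e.g. internal complaint handling, out-of-court settlement
  decidedAt: string;           // ISO 8601 timestamp
}
```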
PART 3: Enforcement Mechanisms
- Supervision: National authorities (Digital Services Coordinators) supervise compliance with the Digital Services Act, ensuring that companies adhere to the rules. The European Commission has direct supervisory powers over very large online platforms and very large online search engines and may intervene where national authorities are unable to enforce the rules effectively.
- Penalties for Non-Compliance: In case of violations, the Digital Services Act provides for sanctions, including fines of up to 6 % of a provider's annual worldwide turnover, graduated according to the severity of the infringement.
PART 4: User and Consumer Protection
- Rights of Recipients: The Digital Services Act mandates that users have the right to:
- Easy access to information about the services.
- Appeal decisions related to content removal or account suspension.
- Protection from discrimination and bias in the provision of digital services.
PART 5: Transparency for Online Platforms
- Advertising Transparency: Online platforms are required to provide clear information regarding the nature of the ads displayed (an illustrative sketch of such a record follows at the end of this part), including:
- That the content displayed is an advertisement.
- The natural or legal person on whose behalf the advertisement is presented and, where different, who paid for it.
- The main parameters used to determine the recipients to whom the advertisement is shown.
- Data Access for Research: The regulation establishes obligations for platforms to share data with researchers to help combat disinformation and other societal harms, while respecting privacy and competition law.
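The regulation specifies which information must be visible to users, not how a platform stores it. Assuming a platform keeps a per-advertisement transparency record, such a record might look like the following; this is an illustrative sketch only, and the field names are invented for this example rather than prescribed by the DSA.

```typescript
// Illustrative only: a possible shape for a per-advertisement transparency
// record. Field names are assumptions, not prescribed by the regulation.

interface TargetingParameter {
  name: string;                // e.g. "age range", "language", "interest category"
  value: string;
}

interface AdTransparencyRecord {
  adId: string;
  isAdvertisement: true;       // the content must be clearly identifiable as an ad
  presentedOnBehalfOf: string; // person on whose behalf the ad is presented
  paidFor: string;             // who paid for the advertisement, where different
  targetingParameters: TargetingParameter[]; // main parameters used to select recipients
  shownFrom: string;           // ISO 8601 timestamps for the display period
  shownUntil: string;
}
```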
PART 6: Content Moderation and Restrictions on Recipient Content
- Reporting Mechanisms: Providers must have a system in place to allow users to report illegal content, including:
- Abuse and fraud.
- Hate speech and harmful content.
- Misinformation and disinformation.
- Cooperation with Law Enforcement: Digital service providers must cooperate with authorities in case of a criminal investigation or the need to take down illegal content that poses harm to society.
PART 7: Specific Requirements for Very Large Online Platforms
- Risk Assessment and Mitigation: Very large online platforms (those with at least 45 million average monthly active users in the EU) must:
- Conduct annual risk assessments regarding content moderation and the spread of harmful content.
- Take appropriate actions to mitigate any identified risks.
PART 8: Final Provisions
- Review of the Regulation: The European Commission will review the Digital Services Act periodically to ensure its effectiveness and address emerging risks.
- Legal Recourse: Individuals and businesses affected by violations of the Digital Services Act have the right to seek legal redress.
Key References for Compliance and Reporting:
For any service provider operating under the Digital Services Act, it is important to provide users with clear contact points for support and to ensure that mechanisms for reporting illegal content are easily accessible. The key contact points and resources for our services are listed below.
- Support and complaints: W3DATA Support Form
- Terms and Conditions: W3DATA Terms and Conditions
- Language of communication: German, English
Content Moderation and Restrictions: At W3DATA, we carry out content moderation based on established industry practices for web hosting services. If illegal content is reported, we follow a structured "notice and take down" process, in line with Art. 6 para. 1 of the DSA. If you believe content has been wrongly removed, please use the response form provided by our Abuse Team.
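Purely as an illustration of the process described above, and assuming a simple internal model, the sketch below shows how a hosting provider might map a reviewer's assessment of a notice to the action taken and record the reason for it; all names, types and steps are invented for this example and do not describe W3DATA's actual systems.

```typescript
// Illustrative sketch of a "notice and take down" decision step for a hosting
// service. Names and steps are assumptions and do not describe W3DATA's
// internal systems.

interface AbuseNotice {
  noticeId: string;
  contentUrl: string;      // exact location of the reported content
  explanation: string;     // why the notifier considers it illegal
  notifierEmail?: string;
}

type Assessment = "manifestly_illegal" | "needs_legal_review" | "not_illegal";

interface TakedownDecision {
  noticeId: string;
  action: "access_disabled" | "escalated_for_legal_review" | "no_action";
  reason: string;          // basis for the decision, kept for the statement of reasons
  decidedAt: string;       // ISO 8601 timestamp
}

// Maps a human reviewer's assessment to the action taken and records why, so
// that both the notifier and the uploader can be informed and can contest it.
function decideOnNotice(notice: AbuseNotice, assessment: Assessment): TakedownDecision {
  const decidedAt = new Date().toISOString();
  switch (assessment) {
    case "manifestly_illegal":
      return { noticeId: notice.noticeId, action: "access_disabled",
               reason: "Content assessed as illegal; access disabled expeditiously.", decidedAt };
    case "needs_legal_review":
      return { noticeId: notice.noticeId, action: "escalated_for_legal_review",
               reason: "Assessment unclear; forwarded to legal review.", decidedAt };
    default:
      return { noticeId: notice.noticeId, action: "no_action",
               reason: "Content not assessed as illegal; no action taken.", decidedAt };
  }
}
```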
Please consult the full text of the Digital Services Act for further details. This summary and the links provided here are meant to help users navigate compliance with the regulation and access relevant contact points.