The Digital Services Act (DSA) aims to create a safe online environment for consumers and businesses, particularly on digital platforms. We explain the rights the DSA grants to users, both natural and legal persons, with regard to transparency and liability.
In a nutshell
- The DSA, as an EU regulation, is directly applicable in the Member States; certain articles have applied since its entry into force on 16 November 2022, the remainder since 17 February 2024.
- The primary goal of the DSA is to protect fundamental rights in relation to providers of digital platforms and search engines (so-called intermediary services).
- Both consumers and businesses are afforded protection.
Objectives of the DSA
The DSA aims to create a safe, predictable, and trustworthy online environment. The objectives of the Regulation are, in particular:
- effective protection of consumers and their fundamental rights;
- clear obligations for online platforms and social media;
- regulating the handling of illegal content and products, hate speech and disinformation;
- more transparency with better reporting and monitoring;
- innovation, growth and competitiveness in the EU’s internal market.
Scope and regulated entities
The DSA regulates providers of so-called intermediary services that are offered to users established or residing in the EU, regardless of whether the provider itself is based within or outside the EU.
Intermediary services include:
- Providers of internet access (such as Deutsche Telekom or Vodafone);
- Hosting services such as cloud computing or web hosting (e.g. Amazon Web Services, Microsoft Azure);
- Domain name registration services (in Germany, for example, the German Network Information Center – DENIC);
- Online marketplaces (such as Amazon, eBay, Etsy);
- App stores (such as those operated by Apple, Google or Microsoft);
- Collaborative economy platforms (e.g. Airbnb, Uber, Blablacar);
- Social networks (e.g. Instagram, LinkedIn, TikTok);
- Content-sharing platforms (such as SlideShare, Dropbox, Flickr);
- Online platforms for travel and accommodation (such as Booking.com, Expedia).
The Regulation also contains specific provisions for:
- Very large online platforms, with more than 45 million average monthly active users in the EU (10% of the EU’s roughly 450 million consumers); and
- Very large online search engines, which meet the same user threshold.
Depending on the nature of the service they provide, providers of intermediary services are categorised as follows, a distinction that is also crucial for determining their liability:
Mere Conduit
The provider transmits information supplied by the user via a communication network or provides access to such a network. The provider is not liable for the transmitted information, provided that it does not initiate the transmission, does not select the recipient, and does not select or modify the information transmitted. Mere conduit also covers the automatic, intermediate, and transient storage of the information transmitted, where this takes place solely for the purpose of carrying out the transmission.
Examples include providers of internet access and email services.
Caching
When providing an information society service that consists of transmitting information provided by a user through a communication network, the service provider is not liable for the automatic, temporary storage (caching) of that information, where the sole purpose is to make the onward transmission to other users more efficient or secure, provided that the provider does not modify the information or the conditions of access to it.
Examples include proxy servers (which act as intermediaries between clients and other servers) and search engines.
Hosting
The provider stores information supplied by the user of the service on a lasting basis and at the user’s request. The provider is not liable for the stored information, provided it has no actual knowledge of illegal content or, upon obtaining such knowledge, acts expeditiously to remove or disable access to it.
This includes the provision of server space for the sharing of content or hosting of websites.
Transparency rights for businesses under the DSA
Providers of intermediary services must offer users a range of safeguards, enabling them, among other things:
- to understand how their information is processed,
- to communicate directly with the provider, and
- to know whom to contact if they wish to report unlawful behaviour by the provider itself or by other users of the service.
To this end, the DSA contains a number of provisions. Below, we present those we consider most relevant.
Art. 12 DSA: contact points for users of the service
Users of an intermediary service must be able to communicate directly, promptly, and electronically with the service provider via designated contact points, which the provider must make available.
Art. 14 DSA: terms and conditions
The terms and conditions of intermediary services must include information on the following:
- The policies, procedures, measures and tools used for content moderation, including any automated decision-making and human review involved;
- The procedural rules of the provider’s internal complaint-handling system.
The terms and conditions must be drafted in clear, plain, intelligible, user-friendly, and unambiguous language.
Art. 15 DSA: transparency reporting obligations
Providers of intermediary services must publish, at least once per year, clear and easily understandable reports on the content moderation activities they have carried out.
Providers of very large online platforms or very large online search engines must publish these reports every six months, as required under Art. 42 DSA.
Art. 16 DSA: notice and action mechanism
Hosting service providers must establish easily accessible and user-friendly mechanisms that allow individuals or entities to notify them of specific items of information hosted on their services which they consider to be illegal content.
Art. 17 DSA: statement of reasons for measures taken
When a hosting service provider assesses the alleged illegality of content or its incompatibility with the provider’s terms and conditions and subsequently applies restrictions, it must provide the affected users with a clear and specific statement of reasons for the measures taken.
Such restrictions may include the removal of content, the suspension or termination of the user’s account, or the suspension or cessation of monetary payments.
Art. 20 DSA: internal complaint-handling system
If a user of the service is subject to a decision by the provider of an online platform to remove or restrict access to content on the ground that it is illegal or incompatible with the platform’s terms and conditions, the user has the right to contest that decision through the platform’s internal complaint-handling system.
Art. 21 DSA: out-of-court dispute settlement
Users have the right, in all cases, to choose any certified out-of-court dispute settlement body to resolve disputes related to restrictions or complaints.
Information about this option must be made easily accessible on the provider’s online interface and presented in a clear and user-friendly manner.
Art. 26 DSA: advertising on online platforms
Providers of online platforms that display advertising on their online interfaces must ensure that users are able to clearly, accurately, unambiguously, and in real time identify:
- that the content is an advertisement (including through visible labelling),
- the person or entity on whose behalf the advertisement is presented, and
- the person or entity who paid for the advertisement.
Art. 27 DSA: transparency of recommender systems
Providers of online platforms that use recommender systems must set out in their terms and conditions, in clear and understandable language, the main parameters used in those systems, as well as any options available to users to modify or influence those parameters.
Where possible, users must also be given the option to adjust or disable certain types of recommendations.
Art. 30 DSA: compliance by design
Providers of online platforms that enable consumers to conclude distance contracts with traders must ensure that their online interface is designed and organised in a way that allows those traders to comply with their obligations under applicable EU law, particularly regarding pre-contractual information, product conformity, and product safety information.
Art. 53 DSA: right to lodge a complaint
Users have the right to lodge a complaint with the Digital Services Coordinator of the Member State in which they are resident or established, against a provider of intermediary services for an alleged infringement of this Regulation.
Art. 54 DSA: right to compensation
Users have the right, in accordance with Union and national law, to seek compensation from providers of intermediary services for any damage or loss suffered as a result of infringements by those providers of their obligations under the DSA.
Conclusion
The DSA primarily applies to providers of intermediary services and introduces additional obligations for very large online platforms and online search engines.
At the same time, it provides fundamental safeguards for users, both consumers and businesses. Being aware of these rights enables them to protect themselves effectively against potential interference by major players in the information society and against the unlawful abuse of their (often dominant) market position.