
Digital Services Act

02/23/2024

Author

Stefan Adametz

Partner

The Digital Services Act ("DSA"), also known as the "Digital Services Act", has been in full force and effect since February 17, 2024. While the provisions of the DSA have already been applicable to the 19 largest online platforms and search engines (e.g. Amazon, Apple, Zalando, Google, etc.) since August 25, 2023, the scope of application has now been extended to all online platforms across Europe. 

What is the DSA?

The DSA is a (directly applicable) European regulation (Regulation (EU) 2022/2065) that imposes certain obligations on digital services that act as intermediaries and bring users together with goods, services and content/information online. The aim of this EU regulation is to create an efficient and clear transparency and accountability framework for online platforms that ensures the protection of users and their fundamental rights on the internet. The regulation lays down special due diligence obligations and new rules for intermediary services and online platforms.

Who does the DSA apply to?

The DSA applies to access and hosting providers as well as communication and distribution platforms (marketplaces), social networks, app stores, content sharing platforms and online search engines. In principle, the DSA therefore covers all companies that offer digital services or enable the transmission, storage and provision of user content. Micro and small enterprises (companies with fewer than 50 employees and an annual turnover of no more than EUR 10 million) are largely exempt from the additional obligations for online platforms.

The DSA includes the following regulations, among others:

  1. "Notice and take down" of illegal content: On the one hand, service providers must remove illegal content by order of the authorities, but at the same time they must also provide easily accessible reporting and remedial procedures for reporting illegal content. They must follow up on these reports and take the necessary measures (e.g. deletion). However, action against content must now be justified. Reports from (particularly) trustworthy whistleblowers (so-called "trusted flaggers") must be processed by the online platforms as a matter of priority, i.e. without delay, and a quick decision must be made. The status of a "trusted flagger" will primarily be granted to organizations/NGOs that protect the rights of users 
  2. Rules for advertising on online platforms: In future, advertising must be identified as such clearly, precisely, unambiguously and in real time. Information about the advertiser must also be provided. In addition, information on the most important parameters used to determine the target groups must be made easily accessible.
  3. Prohibition of online advertising based on profiling where particularly sensitive data (origin, health data, political opinions, religious beliefs) is used. Personal data of children and young people may no longer be used for profiling-based advertising at all.
  4. Inadmissibility of "dark patterns": "Dark patterns" are misleading interface designs that are intended to influence customers to make decisions that they would not have made without these "dark patterns" (for example, by highlighting certain choices such as a "green consent button" or repeatedly asking them to make choices or by exerting pressure, for example by showing that there are only a few products left). The DSA now includes an explicit ban on misleading interface design: this means that "dark patterns" aimed at tempting or pressuring users into certain behaviors or decisions are now prohibited.
  5. Expansion of consumer protection on online marketplaces: Platforms must ensure that only traders that are traceable (i.e. that have provided their name, address, telephone number and email address as well as further information allowing clear identification) offer products on their marketplaces. The platform also has a duty of verification: it must check, using publicly accessible data sources, whether the information received is reliable and complete. Information must also be provided, for example, on the entrepreneurial or non-entrepreneurial status of the third-party provider.
  6. In Austria, KommAustria will be the independent media authority responsible for coordinating all matters relating to the supervision and enforcement of the DSA. This was laid down in the Coordinator for Digital Services Act (KDD-G), which also entered into force on February 17, 2024 and applies to all digital services established in Austria. The previous Communication Platforms Act (KoPl-G), on the other hand, has expired. KommAustria will also decide who is certified as a trusted flagger. In addition, an out-of-court dispute resolution body is to be set up at RTR.
  7. The DSA also brings improved user rights: In addition to measures to combat unlawful online content, platforms must also set up a complaints system through which users can challenge the platform's decisions. What is new here is that both the user who reported the content and the user whose post is removed must be notified of the decision. Both can appeal against it.
  8. General terms and conditions must be made available to users in a clear and comprehensible manner. This also applies to the terms of use of platforms. In terms of content, the terms and conditions must at least contain information on the moderation of content and the internal complaints system.
  9. Fines of up to 6% of annual worldwide turnover can be imposed for violations.
  10. Users are also entitled to compensation if breaches of the regulation lead to damage or losses.
  11. The liability privileges of the previous E-Commerce Directive are largely retained. This means that providers and platforms are not obliged to proactively check the legality of content. They are only liable for illegal content if the illegality is obvious or if they have become aware of it and fail to act swiftly to block or remove the content. An exception to this exemption from liability applies where users act under the authority or control of the provider (e.g. where the platform sets the prices), or where an online platform is presented in such a way that an average consumer would assume that the product is provided by the platform itself or by a user under its control. In other words, the platform acts as if it were the provider.

Conclusion

The DSA is intended not only to protect users from online hate, disinformation and unfair advertising practices, but also to curb the trade in illegal goods and allow infringements of copyright or personality rights to be punished more quickly. Users and entrepreneurs will also be able to take action against incorrect reviews more quickly than before. Information and content from influencers or from companies' social media presences must likewise comply with the rules (e.g. advertising labeling) in order not to risk deletion, for example. For companies that act as platform or service providers (and do not fall under the exception), there is therefore a considerable need for adaptation, just as there is for those who create the content.
