
The EU's Digital Services Act: How Will It Impact Marketplaces?


The Digital Services Act ("DSA" or the "Act") was published on 27 October 2022 in the Official Journal of the European Union. The DSA aims to modernize the e-Commerce Directive's rules on illegal content, transparent advertising, and disinformation. Together with the Digital Markets Act ("DMA") and the EU Regulation on Platform-to-Business Relations ("P2B Regulation"), the DSA completes the last remaining piece of the puzzle, and the EU now stands ready to reshape its digital rulebook.

Scope of the Act

The DSA will require digital platforms to oversee their communities fairly and effectively. It applies not only to organizations that offer digital services but also to those that operate as online intermediaries or platforms. The DSA covers a wide range of digital services, especially those that host or handle user content, such as e-commerce services, websites, and internet infrastructure providers. The Act is expected to introduce new protection rules for online consumers and providers and to create opportunities for digital enterprises.

The DSA sets forth several principles:

  1. It brings fairness, transparency, and accountability to content moderation.
  2. It enforces online advertising rules for service providers.
  3. It establishes notice-and-action procedures for illegal content.
  4. It contains several provisions on receiving, storing, verifying, and publishing information about traders that use the services the DSA regulates.

“With the DSA, we are about to reorganize the digital space in our internal market, both for societal and economic reasons. A new framework that can become a reference for democracies worldwide,” said Thierry Breton, the EU Commissioner for Internal Market, during a speech on January 19.[1]

Since the turn of the century, Internet use has grown tremendously, facilitating the growth of new industries, creating new markets, increasing the flow of new ideas, and stimulating innovation. In recent years, in response to challenges posed by misinformation and disinformation, hate speech, child sexual abuse material, and more, EU policymakers have taken a very tough stance, and the DSA entered into force to address these harms.

Under the DSA, online platforms and search engines with at least 45 million monthly active users in the EU will be designated by the EU Commission, in dedicated proceedings, as “very large online platforms” (VLOPs). The DSA imposes additional obligations on Big Tech companies that also act as marketplace gatekeepers, such as Amazon, Apple, Alphabet, Meta, and Microsoft. Overall, these measures aim to create safer digital spaces that protect users' rights and to establish a level playing field for innovation, growth, and competitiveness. Service providers will also have to take on responsibilities such as moderating content, ensuring transparency, and removing prohibited content quickly, and they will be audited against these legal requirements.

The DSA also includes several concrete provisions to curb harmful online advertising practices, including a ban on dark patterns[2] when obtaining consent from users (Article 8 of the DSA). “Dark patterns” are deliberately misleading interface designs that steer users toward wrong or harmful choices. They can be found on many types of websites and are used by many kinds of organizations, relying on elements such as deceptive and trick questions, clickbait, disguised ads, misdirection, hidden costs and obstructed choices, and graphic cues such as color and shading that nudge users. The European Parliament introduced this explicit prohibition via amendments to the proposal in January 2022. As a result, the DSA now contains the most advanced provisions on misleading and manipulative interface designs to date. France's data protection authority, which began imposing heavier penalties based on the DSA draft, had already fined Facebook and Google over $200 million for dark patterns.[3]

The new legal framework also requires companies to ensure online ad transparency by identifying the content of an ad, who sponsors it, and the parameters used to select the specific user who receives it. While companies already track some of this information, these provisions place a new burden on critical platform infrastructure.

Who Must Implement the Act?

The Act exempts small enterprises from certain obligations. However, companies from other countries serving European consumers may exceed the thresholds of these exemptions and become responsible for following many of the new rules, and many companies carrying out digital activities in Europe will not be exempt at all. To comply with the DSA, many enterprises will therefore need to take action.

Transparency Notice

Compared to previous European Commission legislation, the DSA establishes some of its strongest powers in the area of content moderation transparency. It requires the disclosure of additional information, such as the automated tools used to make moderation decisions and the types of measures taken.

Content Removal

The DSA's transparency reporting provisions on content moderation (Articles 13 and 17) require platforms to disclose the average time needed to act on allegedly illegal content. As research demonstrates, if social media platforms moderate content on the assumption that regulators prefer swifter takedowns for all types of content, or that speed will weigh in a platform's favor when a moderation decision is challenged, the result may be a chilling effect on free speech online.[4]

Data Accessibility

Another significant transparency improvement proposed by the DSA is an obligation to give researchers access to data. The Act envisages databases of content-removal decisions at two levels: (a) Article 15(4) creates a requirement for all hosting services, such as web storage or cloud services, to publish the measures taken to remove content and the reasons for doing so; and (b) this information is to be collected in a public database published by the European Commission annually.

In the past, platforms such as Facebook and Instagram have shut down research, citing privacy concerns for their users. A law that sets the conditions for sharing such data with authorized researchers will enable more effective, higher-quality research on the content removal practices of online platforms.


Digital platforms are facing increased regulatory scrutiny across the globe, and the DSA may not be the last regulation setting standards for marketplaces and communities. Organizations that effectively integrate and adapt to the new rules stand to gain by improving operations, building trust in their platforms, and growing their business.

The DSA focuses on consumer protection, at the potential risk of constraining freedom of speech and reducing market diversity. Meanwhile, new EU-based startups will have to spend their limited time, money, and resources adapting to the changing regulatory landscape.

The Act aims to protect consumers and provide a more coherent legal framework for companies by harmonizing national-level legislation, but some of its provisions are likely to raise barriers for enterprises pursuing innovation. Service providers should proceed carefully and remain aware of their responsibilities under the DSA.