Professor Steve Peers, Royal Holloway University of London
Photo credit: Animated Heaven, via Wikimedia Commons
Introduction
The EU’s Digital Services Act (DSA) sets out rules for regulating online platforms and search engines. This post summarises what the Act does and links to key resources. It draws upon (and updates) a blog post on the Commission’s first non-compliance decision under the Act. This post will be updated.
Overview of the Digital Services Act
The DSA contains rules that govern online platforms generally, regardless of size, but its most prominent rules establish a special, more demanding regulatory regime for the biggest platforms, defined as ‘very large online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs). The Act gives the EU Commission power to designate such platforms and search engines (on the basis that at least 10% of the EU population – 45 million people – use them monthly) and to enforce the provisions of the DSA against them.
The Commission’s list of designated VLOPs and VLOSEs includes US companies (including Meta, X, Google, LinkedIn), and also Chinese companies (AliExpress, TikTok, Temu, Shein), EU companies (Booking.com, Zalando, and two porn sites), and a Canadian site, Pornhub. Overall, nearly half of the companies designated as operating VLOPs and VLOSEs are non-American (although some of the American companies operate more than one platform).
For VLOPs, enforcement of the DSA involves a number of measures, including requests for information, the opening of an investigation into a possible breach of the Act, a preliminary finding of a breach, and a final decision finding a breach – which can result in a fine (of up to 6% of worldwide annual turnover) and orders to change practices. A VLOP or VLOSE can also avoid a fine by agreeing binding commitments with the Commission to change its practices (in effect, a settlement) before the Commission reaches a final decision. If a finding of breach is not complied with, the Commission can impose periodic penalty payments – up to 5% of average daily worldwide turnover per day.
The Act imposes a very high threshold before a ban can be imposed against a platform – essentially requiring a refusal to remove illegal content, with additional safeguards including the involvement of a court.
The case law has not yet fleshed out the relationship between the DSA and Member States’ laws on overlapping issues, or clarified whether there can be private enforcement of the DSA (ie individuals challenging the VLOPs and VLOSEs in court for breach of the Act, rather than the Commission enforcing it) in parallel.
Substantively, the Act’s requirements on VLOPs and VLOSEs (in its Articles 33-43) start with risk assessment: they must ‘diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services’. Systemic risks are further defined as including ‘dissemination of illegal content through their services’, ‘negative effects’ upon various human rights, ‘actual or foreseeable negative effects on civic discourse and electoral processes, and public security’, and ‘actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being’.
Very large platforms and search engines are also obliged to (as further defined): mitigate these risks; comply with a decision requiring a response to a crisis; perform independent audits; offer a recommender system not based on profiling, at least as an option; make public a repository of advertising data; provide access to their data to researchers; explain their algorithms to regulators; establish independent compliance bodies; provide further public data on their operations; and pay an annual supervisory fee to the EU Commission.
The DSA in the EU courts
Challenges to designation
Amazon, Zalando and several porn sites have challenged their designation as VLOPs.
- Zalando lost its challenge in the EU General Court in September, but has appealed to the EU’s Court of Justice (appeal pending)
- Amazon also lost its challenge in the EU General Court against designation as a VLOP, and has appealed to the CJEU (the Commission has cross-appealed)
- Amazon had won an interim measures ruling in this case – delaying its obligation to publish information about its advertisers – but that interim measure was overturned by the Court of Justice, following a successful appeal by the Commission
- The porn companies’ legal challenges to their designations as VLOPs are still pending (see the summary of the arguments made by Pornhub, XNXX and XVideos; a challenge by Stripchat is also still pending even though the Commission has dropped its designation as a VLOP)
- The porn companies’ applications for interim measures as regards publishing advertisers’ information have been dismissed (see the General Court orders re Pornhub and XVideos, and the failed appeals to the Court of Justice as regards Pornhub and XVideos)
Summary of the Amazon judgment
Amazon argued that the entire system of special rules for VLOPs in the DSA was invalid, because it violated a number of human rights set out in the EU Charter of Fundamental Rights. All of these arguments were rejected by the EU General Court (whose judgment is now subject to appeal).
First of all, the Court rejected the argument that the VLOP regime breached the freedom to conduct a business (Article 16 of the Charter). In the Court’s view, although the regime interfered with the freedom to conduct a business, because it imposed significant costs on VLOPs and also had a considerable impact on their organisation or required complex technical solutions, that freedom was not absolute, and the interference with it was justified. According to Article 52(1) of the Charter, limitations on Charter rights have to be prescribed by law, have public interest objectives, respect the essence of the right and be proportionate. Here the limits were admittedly prescribed by law (being set out in the Act) and respected the essence of the right (as Amazon could still carry out its core business); Amazon instead argued mainly that the limits were disproportionate, as online shops did not present systemic risks, the objectives could be satisfied by less onerous means, and the costs were significant.

However, the Court believed that there was a systemic risk of illegal content in online marketplaces; other means of designating VLOPs were not necessarily more proportionate; making advertising repositories open to the public was justified in the interests of consumer protection; and the arguments about economic impact made by Amazon as regards recommender systems, researchers’ access to data and advertiser repositories were unconvincing.
Secondly, Amazon’s argument that its right to property was infringed (Article 17 of the Charter) was dismissed at the outset, as it had not identified any of its property rights that were affected by the DSA: an administrative burden did not constitute interference with a property right. Thirdly, the Court rejected the argument that the VLOP regime breached the general right to equal treatment (Article 20 of the Charter), by treating larger companies differently from smaller ones, on the grounds that larger companies presented bigger risks.
Fourthly, Amazon’s arguments about freedom of expression (Article 11 of the Charter) were rejected too. This argument was only made as regards applying the DSA rules on recommender systems to Amazon. On this point, the Court reiterated that the Charter freedom of expression rules must be interpreted consistently with the freedom of expression set out in Article 10 of the European Convention on Human Rights (ECHR), referring also to the case law of the European Court of Human Rights (ECtHR) – ie the US First Amendment does not apply to the regulation of a company doing business in the European Union. The Court did not see how the freedom of expression of third-party sellers might be affected by the DSA rules, but it accepted that Amazon’s freedom of expression was limited by having to offer a recommender system not based on profiling.
However, limitations of the right could be justified: the limitation here was prescribed by law; it did not affect the essence of the right (as Amazon could still offer a profiling-based recommender system as an option); it had an objective of general interest (consumer protection); and it was proportionate by only requiring the offer of one non-profiling based recommender system as an option – taking account of ECtHR case law that allows more interference with commercial expression than political expression.
Finally, Amazon complained about a breach of the right to privacy (Article 7 of the Charter). This was a remarkable thing for a company with a business model based on surveillance of its customers to argue, but the Court considered its arguments seriously nonetheless. Again it followed the ECtHR case law on the corresponding rule (Article 8 ECHR), under which businesses can invoke the right to privacy. Here the argument concerned the DSA rules on ad repositories and researchers’ access to data. Again the EU court agreed that the DSA interfered with the right, but ruled that it could be justified: it was prescribed by law, did not infringe the essence of the right, and complied with the principle of proportionality, particularly because of the limits built into the obligations (for instance, no obligation to disclose the personal data of advertising recipients, or about the success of advertising; controls on which researchers can access the data).
Regulation fees
The Commission’s decisions on fees for regulation (for 2023) have also been challenged. These challenges were all successful in the EU General Court (see the judgments as regards TikTok and Meta), although the Commission has appealed both the TikTok and Meta judgments to the Court of Justice (appeals pending).
In the meantime, TikTok, Meta and Google have brought a further round of legal challenges (all still pending) to the regulation fees imposed for 2024.
Non-compliance decision
X, X.AI and Elon Musk have challenged the December 2025 non-compliance decision against X.
Infringement actions
The Commission is suing Spain for non-enforcement of its obligations to apply the DSA at national level.
Enforcement of the DSA
Non-compliance decisions
So far the EU Commission has adopted one final non-compliance decision, against X in December 2025, following its preliminary findings of July 2024.
This decision imposes a fine to enforce the DSA for the first time: €120 million for three breaches of the Act by X. It concerns three issues: deceptive practices as regards X’s ‘blue ticks’,* researchers’ access to data, and the repository of advertisers.
The Commission has not yet made a final decision – or even a preliminary finding – as regards other issues involved in its opening of proceedings against X, namely the dissemination of illegal content and the effectiveness of rules against disinformation. In January 2026, the Commission opened proceedings against X as regards its recommender systems and ‘nudification’ apps.
Other enforcement actions
Other Commission enforcement actions under the DSA include:
- The Commission has accepted binding commitments from AliExpress on various issues, but at the same time also adopted a preliminary finding that its risk assessment as regards illegal products was insufficient;
- It has opened proceedings against porn sites for inadequate protection of children;
- It has adopted a preliminary finding that Meta (Facebook and Instagram) is in breach as regards researchers’ access to data, and as regards flagging illegal content and allowing for appeals against content moderation decisions; an investigation as regards deceptive advertising, political data, and misinformation on Meta is still underway;
- It has adopted a preliminary finding that Temu has breached the DSA as regards illegal products, and an investigation continues as regards other issues;
- It has accepted a commitment from TikTok to suspend the ‘TikTok Lite’ programme, which was apparently designed to (further) encourage social media addiction by children, having earlier threatened to impose interim measures under the DSA in this case. A new decision, following a preliminary finding, accepts further commitments regarding information on advertisers. The Commission has also adopted a preliminary finding against TikTok as regards researchers’ access to data, a preliminary finding of breach as regards addictive design, and further investigations against TikTok are still underway; and
- It has begun an investigation of Shein for illegal content (child sex dolls), recommender systems and addictive design.