Steve Peers, Professor of Law, Royal Holloway University
of London
Photo credit: Animated Heaven, via Wikimedia
Commons
Introduction
The EU’s Digital
Services Act (DSA) was conceived before Elon Musk bought Twitter (soon
renaming it X); but they were literally born simultaneously, with the DSA being
published in the EU’s Official Journal on the same day that Musk completed his
takeover. Since then, Musk’s behaviour running X (see my review
of a book on the takeover and the aftermath) has exemplified many of the
reasons why the EU (and other jurisdictions) contemplated regulating social
media in the first place: in particular arguments about the legality of its
content and the fairness of its algorithms.
A Twitter user coined the
phrase ‘today’s main character’ to describe a poster who becomes the centre
of attention for a day – usually due to an absurd or obnoxious post that prompts
many negative responses. For the DSA, X has been its main character since its
creation, with much of the public debate about the potential use of the Act focussing
on how it might apply to the controversial social network.
This debate has now come to a
head. Last week, following its preliminary
findings back in July 2024, the EU Commission adopted a final
decision imposing a fine to enforce the DSA for the first time: €120 million
for three breaches of the Act by X. This initial decision is likely to impact
upon the broader debate over the Act’s implementation, and – due to Musk’s influence
in the current Trump administration – also play a role in the fast-deteriorating
relations between the EU and the US.
This blog post first provides an
overview of the DSA, then examines the legal issues arising from this specific
enforcement decision, and concludes with an assessment of the broader context
of this decision: the enforcement of the DSA more generally, and the relations
between the EU and the USA.
Background: overview of the Digital
Services Act
Adoption of the DSA
Although the critics of the EU Commission
fining X are quick to argue that the EU is undemocratic, EU legislation needs
the support of elected Member State governments and elected Members of the
European Parliament (MEPs) to be adopted. In fact, the Act received unanimous
support from Member States and a large majority of MEPs.
In any event, even without the
Act, Member States would likely regulate social media – perhaps more quickly
and more stringently than the EU has applied the Act in some cases. And even if
the whole EU ceased to exist, as Elon Musk and Russian government mouthpieces demand, those
countries would still be regulating Big Tech, with national equivalents of the Digital Markets Act
and the GDPR, for instance.
Indeed, despite leaving the EU, the UK has its own national versions of all
three laws: the Online
Safety Act, the Digital Markets,
Competition and Consumers Act, and the UK GDPR, which sits alongside the
Data (Use
and Access) Act. While UK regulators may be famously timid about enforcing
these laws, Australia – a long way from the EU – was not dissuaded
from banning under-16 year olds from social media.
But until Musk and his sympathisers
manage to destroy the EU, we have the DSA. It contains rules that govern online
platforms generally, regardless of size, but its most prominent rules concern a
special regulatory regime for the biggest platforms, defined as ‘very large
online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs),
which subjects them to greater regulation. The Act gives the EU Commission
power to designate such platforms and search engines (on the basis that at least
45 million people in the EU – roughly 10% of its population – use them monthly) and to
enforce the provisions of the DSA against
them.
While some claim that the DSA was
adopted only to punish US tech firms, the list
of designated VLOPs and VLOSEs also includes Chinese companies (AliExpress,
TikTok, Temu, Shein), EU companies (Booking.com, Zalando, and two porn sites),
and a Canadian site, Pornhub. Overall, nearly half of the companies designated
as operating VLOPs and VLOSEs are non-American (although some of the American companies
operate more than one platform).
Content of the DSA
For VLOPs, enforcement of the DSA
involves a number of measures, including requests for information, the opening of
an investigation into a possible breach of the Act, a preliminary finding of a
breach, and a final decision finding a breach – which can result in a fine (of up
to 6% of worldwide annual turnover) and orders to change practices. A VLOP or
VLOSE can also avoid a fine by agreeing binding commitments to change its
practices with the Commission (in effect, a settlement) before it reaches a
final decision. If a finding of breach is not complied with, the Commission can
impose periodic penalty payments – up to 5% of average daily worldwide turnover for each day of non-compliance.
While many critics of X excitedly
demand that the EU Commission ban it, the Act imposes a very high threshold
before a ban can be imposed – essentially a refusal to remove illegal content,
with additional safeguards including involvement of a court. The case law has
not yet fleshed out the relationship between the DSA and Member States’ laws on
overlapping issues, or clarified whether there can be private enforcement of
the DSA (ie individuals challenging the VLOPs and VLOSEs in court for breach of
the Act, rather than the Commission enforcing it) in parallel.
Substantively, the Act’s requirements
on VLOPs and VLOSEs (in its Articles 33-43) start with risk assessment: they
must ‘diligently identify, analyse
and assess any systemic risks in the Union stemming from the design or
functioning of their service and its related systems, including algorithmic
systems, or from the use made of their services’. Systemic risks are further
defined as including ‘dissemination of illegal content through their services’,
‘negative effects’ upon various human rights, ‘actual or foreseeable negative
effects on civic discourse and electoral processes, and public security’, and ‘actual
or foreseeable negative effects in relation to gender-based violence, the
protection of public health and minors and serious negative consequences to the
person’s physical and mental well-being’.
Very large platforms and search
engines are also obliged to (as further defined): mitigate these risks; comply
with a decision requiring a response to a crisis; perform independent audits;
offer a recommender system not based on profiling, at least as an option; make
public a repository of advertising data; provide access to their data to
researchers; explain their algorithms to regulators; establish independent
compliance bodies; provide further public data on their operations; and pay an
annual supervisory fee to the EU Commission.
The DSA in the EU courts
Even before the first fine was
imposed to enforce the DSA last week, its application in practice had already been frequently
litigated. First of all, Amazon, Zalando and several porn sites have challenged
their designation as VLOPs. Zalando lost
its challenge in the EU General Court in September, but has appealed
to the EU’s Court of Justice (appeal pending). More recently Amazon
also lost its challenge in the EU General Court against designation as a VLOP,
but it still has time to appeal that judgment to the Court of Justice (Amazon
had won an interim
measures ruling in this case – delaying its obligation to publish
information about its advertisers – but that interim measure was overturned
by the Court of Justice, following a successful appeal by the Commission).
The porn companies’ legal
challenges to their designations as VLOPs are still pending (see the summary of
the arguments made by Pornhub,
XNXX
and XVideos;
a challenge by Stripchat
is also still pending even though the Commission has dropped
its designation as a VLOP); their applications for interim measures as regards
publishing advertisers’ information have been dismissed (see the General Court
orders re Pornhub
and XVideos,
and the failed appeals to the Court of Justice as regards Pornhub
and XVideos).
Of these cases, the recent Amazon
judgment has broad implications for the DSA as a whole, considered further below.
Secondly, the Commission’s decisions
on fees for regulation (for 2023) have also been challenged. These challenges were
all successful in the EU General Court (see the judgments as regards TikTok
and Meta),
although the Commission has appealed both the TikTok
and Meta
judgments to the Court of Justice (appeals pending). In the meantime, TikTok,
Meta
and Google
have brought a further round of legal challenges (all still pending) to the
regulation fees imposed for 2024.
We can also now expect X to
challenge the enforcement decision against it. (If it also requests interim
measures, at least that aspect of the case will be decided soon.)
Other enforcement of the DSA
In addition to the new decision
enforcing the DSA against X, other Commission enforcement
actions under the DSA have been adopted or are pending against VLOPs. Leaving
aside requests for information (such as the one recently
sent to Shein as regards reports of sales of child-like sex dolls):
- The Commission has accepted binding commitments from AliExpress on various issues, but at the same time also adopted a preliminary finding that its risk assessment as regards illegal products was insufficient;
- It has opened proceedings against porn sites for inadequate protection of children;
- It has adopted a preliminary finding that Meta (Facebook and Instagram) is in breach as regards researchers’ access to data, and as regards flagging illegal content and allowing for appeals against content moderation decisions; an investigation as regards deceptive advertising, political data, and misinformation on Meta is still underway; and
- It has adopted a preliminary finding that Temu has breached the DSA as regards illegal products, and an investigation continues as regards other issues.
Finally, the Commission has been
particularly active as regards TikTok. It has accepted
a commitment to suspend the ‘TikTok Lite’ programme, which was apparently
designed to (further) encourage social media addiction by children, having used
the threat of issuing an intention to impose interim measures under the DSA earlier
on in this case. A new decision, following a preliminary
finding, accepts further
commitments regarding information on advertisers – also a great irritant to
Amazon and the porn companies, as can be seen in the litigation summarised above,
as well as an issue in the X case, discussed below. TikTok has deadlines to
implement the various commitments it has made, and there are specific powers to
monitor whether it is complying with them under the DSA. The Commission has
also adopted a preliminary
finding against TikTok as regards researchers’ access to data, and further investigations
against TikTok are still underway.
Overall, it can be seen that to
date the majority of enforcement actions under the DSA have been initiated against
companies that are not American. Also, to date all the offers of binding
commitments that have been accepted, in place of fines and enforcement orders,
have come from Chinese companies. The possibility of negotiating binding commitments
instead of an enforcement order is, however, open to a VLOP based anywhere.
The non-compliance decision
against X
What did the decision address?
First and foremost, the non-compliance
decision against X only concerns certain issues, namely deceptive practices as
regards X’s ‘blue ticks’,* researchers’ access to data, and the repository of
advertisers. The Commission complaint about ‘blue ticks’ is that they are a ‘deceptive
practice’ banned by the DSA (note that this rule applies to platforms generally,
not just VLOPs), in that they purport to indicate that an account has been
verified, when it has not been. Under Musk, X has earned revenue from the blue
ticks by selling them to anyone willing to pay for them, although the sale of
the ticks, and the monetisation programme (ie giving money to X users whose
posts lead to large numbers of reactions) are apparently not the subject of the
non-compliance decision as such. The preference given to blue ticks in the X
algorithm is not the subject of the decision as such either.
(*Disclosure: I applied for and obtained
a ‘blue tick’ from Twitter prior to Musk’s purchase, when a proper verification
system applied. I did not pay for the tick under Musk, and it was initially
removed as a result. However, it was reinstated involuntarily – not at my
request, and without my paying for it, or monetising my posts – as part of a
process of reducing the social opprobrium of having a blue tick under Musk, in
which the ticks were reinstated for some accounts. I initially hid the reinstated
tick, but the facility to do that was removed. It remains there today; I have
not used X since August 2024, due to my objection to Musk encouraging violent
racial conflict in the UK, except for a handful of posts encouraging others to
leave the platform. I have retained my account there to reduce the risk of
anyone impersonating me, which has happened several times.)
The Commission has not yet made a
final decision – or even a preliminary finding – as regards other issues involved
in its opening
of proceedings against X, namely the dissemination of illegal content and the
effectiveness of rules against disinformation.
How can the decision be
enforced?
X now has 60 days to inform the
Commission about the measures it will take to comply with the non-compliance decision as
regards blue ticks. It has 90 days to submit an action plan to address the other
two issues, and the Commission must respond to the action plan two months after
that. In the event of non-compliance with the decision, as noted above the DSA
gives the Commission the power to impose much higher fines. The method of calculation
of last week’s fine is not explained in the press release. (The non-compliance
decision itself may explain the calculation, but like most DSA decisions of the
Commission, it has unfortunately not been made public; Article 80 of the DSA
requires the main content of this decision to be published, though.)
If X challenges the decision in
the EU courts, it can request an interim measures ruling suspending all or part
of the decision; the EU General Court will decide on that (subject to appeal to
the Court of Justice), as it has done in several DSA cases already, as detailed
above. The final judgment of the EU courts can annul the Commission’s non-compliance
decision in whole or part, and the DSA (Article 81) gives the EU courts
unlimited jurisdiction to cancel, increase or reduce the fine. As for the
collection of the fine (and any further fines that might be imposed on X for
continued breach of the DSA), Article
299 TFEU sets out the process of enforcing fines imposed by EU bodies; although
if X removes all its assets from the EU to the US, it might try to prevent collection
by using US law that blocks the enforcement of foreign judgments on ‘free speech’
grounds (perhaps the SPEECH
Act, although that concerns defamation; other routes may be available, or fresh
routes adopted in light of the Commission decision).
This brings us neatly to the
question of whether the non-compliance decision is arguably invalid on ‘free
speech’ (or other) grounds.
Is the decision legal?
What are the legal issues as regards
last week’s non-compliance decision? As noted above, the recent judgment in the
Amazon case addresses two of the issues in the non-compliance decision (advertising
repositories and access to data), while also addressing broader criticisms of
the Act, some of which may be relevant if X challenges the finding as regards ‘deceptive
practices’, or takes this opportunity to challenge the legality of the Act more
generally (as Amazon did when challenging the legality of its designation as a
VLOP; on such challenges, see Article
277 TFEU).
Amazon’s legal challenge to its
VLOP designation did not advance the obviously untenable argument that fewer
than 10% of the EU population uses Amazon monthly (by contrast, Zalando and the
porn sites are arguing about the calculation of the numbers). Rather,
Amazon argued that the entire system of special rules for VLOPs in the DSA was invalid,
because it violated a number of human rights set out in the EU
Charter of Fundamental Rights. All of these arguments were rejected by the
EU General Court.
First of all, the Court rejected
the argument that the VLOP regime breached the freedom to conduct a business
(Article 16 of the Charter). In the Court’s view, although the regime interfered
with the freedom to conduct a business, because it imposed significant costs on
VLOPs and also had a considerable impact on their organisation or required complex
technical solutions, that freedom was not absolute, and the interference with
it was justified. According to Article 52(1) of the Charter, limitations on Charter
rights have to be prescribed by law, have public interest objectives, respect
the essence of the right and be proportionate. Here the limits were admittedly prescribed
by law (being set out in the Act) and respected the essence of the right (as
Amazon could still carry out its core business); Amazon instead argued mainly that
the limits were disproportionate, as online shops did not present systemic
risks, the objectives could be satisfied by less onerous means, and the costs
were significant. However, the Court believed that there was a systemic risk of
illegal content in online marketplaces; other means of designating VLOPs were
not necessarily more proportionate; making advertising repositories open to the
public was justified in the interests of consumer protection; and the arguments
about economic impact made by Amazon as regards recommender systems, researchers’
access to data and advertiser repositories were unconvincing.
Secondly, Amazon’s argument that
its right to property was infringed (Article 17 of the Charter)
was dismissed at the outset, as it had not identified any of its property
rights that were affected by the DSA: an administrative burden did not constitute
interference with a property right. Thirdly, the Court rejected the argument
that the VLOP regime breached the general right to equal treatment
(Article 20 of the Charter), by treating larger companies differently from smaller
ones, on the grounds that larger companies presented bigger risks.
Fourthly, Amazon’s arguments
about freedom of expression (Article 11 of the Charter) were
rejected too. This argument was only made as regards applying the DSA rules on recommender
systems to Amazon. On this point, the Court reiterated that the Charter freedom
of expression rules must be interpreted consistently with the freedom of expression
set out in Article 10 of the European Convention on Human Rights (ECHR),
referring also to the case law of the European Court of Human Rights (ECtHR).
The Court did not see how the freedom of expression of third-party sellers
might be affected by the DSA rules, but it accepted that Amazon’s freedom of expression
was limited by having to offer a recommender system not based on profiling.
However, limitations of the right
could be justified: the limitation here was prescribed by law; it did not
affect the essence of the right (as Amazon could still offer a profiling-based
recommender system as an option); it had an objective of general interest
(consumer protection); and it was proportionate by only requiring the offer of
one non-profiling-based recommender system as an option – taking account of
ECtHR case law that allows more interference with commercial expression than political
expression.
Finally, Amazon complained about
a breach of the right to privacy (Article 7 of the Charter). This
was a remarkable thing for a company with a business model based on
surveillance of its customers to argue about, but the Court considered its
arguments seriously nonetheless. Again it followed the ECtHR case law on the
corresponding rule (Article 8 ECHR), which states that businesses could invoke
the right to privacy. Here the argument concerned the DSA rules on ad repositories
and researchers’ access to data. Again the EU court agreed that the DSA
interfered with the right, but ruled that it could be justified: it was prescribed
by law, did not infringe the essence of the right, and complied with the principle
of proportionality, particularly because of the limits built into the
obligations (for instance, no obligation to disclose the personal data of advertising
recipients, or about the success of advertising; controls on which researchers
can access the data).
How does this judgment (noting
that Amazon could still appeal it to the Court of Justice) apply to a legal
challenge that X might make to last week’s non-compliance decision? First of
all, the judgment in principle disposes of many arguments that X might make
about two aspects of the non-compliance decision, as regards ad repositories
and researchers’ access to data – although X might try different arguments, or
contend that the nuances of its case are different.
While the main
US response to the EU Commission’s decision has been to claim that the EU
is engaged in censorship, note that Amazon did not even argue that the DSA
rules on ad repositories or researchers’ access to data infringed freedom of expression,
and remember that, as regards the dissemination of illegal content and the
effectiveness of rules against disinformation, X is still only under investigation. Obviously
a freedom of expression argument might be made in respect of those issues, but,
as noted above, X has not been subjected to a final decision or even a
preliminary finding in respect of them.
Furthermore, according to the
Amazon judgment, a VLOP challenging a Commission decision under the DSA can
only challenge the validity of those parts of the DSA that are the legal basis for
the decision made against them: so X cannot, at this point, specifically attack
the validity of the DSA rules on risk assessment or risk mitigation, since
there is no decision that it has breached them yet. X can attack the validity of the DSA
system for VLOPs generally, which includes the rules on risk
assessment and risk mitigation. Although Amazon has already tried this and
failed, X might try to argue its case differently; but it looks like a long
shot, given that a non-compliance decision is inherently more narrowly focussed
than designation as a VLOP.
Another key point to remember in
this debate is that, as the Amazon judgment confirms, the human rights
standards applied by the EU courts are those of the EU Charter, interpreted
(where relevant) in light of the corresponding ECHR rights, and the ECtHR case
law on those rights. The ECHR approach to rights differs in some respects from
that of the US courts, arguably providing greater protection for the right to
privacy (although not enough for Amazon to win its arguments on this point),
but lesser protection for the right to free speech (allowing more leeway for
interference with the right). But that is the nature of doing business in
another jurisdiction. US law may take the view that (hypothetical) X user ‘ZyklonB1488’,
regularly posting ‘Next year in Auschwitz!’ at Jewish people, has the right to
set out his stall in the marketplace of ideas. But other legal systems may legitimately
take the view that he does not.
Applying this to the sole
remaining issue in the Commission’s non-compliance decision – the deceptiveness
of X’s blue tick system – this is not directly connected to the content
of what blue tick holders (still less anyone else) may post on X. Any effect on
freedom of expression of last week’s decision is therefore marginal – although again,
free speech arguments would be stronger as regards future decisions the
Commission might make in respect of X as regards other issues still
under investigation (or Meta – subject to some broadly similar investigations,
as summarised above), especially because ‘illegal content’ is the one breach of
the DSA that might (subject to many conditions and safeguards) lead to a
ban on the whole platform. And to the extent that the non-compliance decision
on blue ticks does interfere with freedom of expression, there is a
strong argument that the interference is justified both on the ground of
consumer protection (cf the scams featuring impersonations
of consumer advocate Martin Lewis) and (as Article 52 of the Charter also provides
for) on the ground of ‘the need to protect the rights and freedoms of others’
(ie anyone being impersonated, including myself!).
Context: enforcing the DSA
Last week’s decision is a definitive
sign that the Commission is willing to enforce the DSA, even to the extent of
adopting non-compliance decisions. The world is full of ‘light-touch’
regulators – perhaps one of Britain’s more unappealing exports. Usually, the
Commission is not seen as such; but its obvious stalling on taking a final
decision regarding X, for 17 months since its preliminary findings, may have
given the impression that – on the DSA, at least – the lion had turned pussycat.
The non-compliance decision should
be viewed alongside the Amazon judgment, of which it likely also takes
account. VLOPs now know not only that the Commission is willing to act to enforce
the DSA, but also that the EU courts (subject to possible appeal) back up at
least some key provisions of the Act. Also, the recent judgment may explain
TikTok’s simultaneous willingness to agree on its compliance with the ad
repository rules; and the Commission’s willingness (again) to accept
commitments, combined with the recent judgment, shows VLOPs that it may be less
hassle to negotiate commitments with the Commission, rather than embark upon
court action that is unlikely to succeed. The context also includes a dog that did not bark:
the Commission did not propose any amendment to the DSA (or the Digital Markets
Act) in its recent proposal for an ‘omnibus’ bonfire
of some provisions of EU tech laws.
Having said that, it is striking
that the Commission is moving forward with non-compliance decisions and preliminary
findings on issues other than those relating most closely to content on
social media networks (cf the ongoing investigations into Meta and X), which
raise not only the more difficult legal issues (given their greater impact upon
freedom of expression) but also have the greater political impact (given the
subject-matter, and the closeness of both zillionaire owners to the US government).
And this brings us nicely to the impact of the decision upon US/EU relations.
Context: EU-USA relations
Coincidentally, the
non-compliance decision was released the day after the US government published
a foreign
policy review that was intrinsically hostile to the EU, and hyperpartisan in
its support of right wing populist parties in Member States. In that context,
the decision against X is just a drop in the rapidly-widening Atlantic Ocean.
Famously, US diplomat Dean Acheson was ‘present
at the creation’ of the post-war alliance; the Trump administration’s goal seems
to be to preside over its destruction.
Yet, as noted already, supporters
of Trump are nevertheless enraged by the decision, despite its limited impact.
Even though, as explained above, the DSA was approved by elected governments
and MEPs, does not solely apply to US companies and is not solely enforced
against US companies, and the recent decision has at best a marginal impact
upon freedom of expression, the response is the same: “They’re eating our free
speech!”
Of course, it’s hard to take
concerns about free speech from the Trump administration seriously: these are
folks who want to expel
legal migrants for criticism of a foreign government, and whose leader,
between naps, frequently insults
and threatens
journalists who are insufficiently North Korean in their adoration of him. If
these people are genuine free speech defenders, then I’m Alexander Hamilton.
As hypocritical and inaccurate as
the Trumpian reactions to the decision are, they were presumably anticipated by
the Commission before it took its decision. Even if the EU courts rule in the
Commission’s favour in the event of a legal challenge, its MAGA critics will
likely remain just as irrational (“They’re eating the snails!”). Yet the
Commission took the decision anyway.
The choice to go ahead with the
decision regardless can be understood either as a calculated risk that the US will
not punish the EU for it – at least no more than it was inclined to punish the
EU anyway, for various other reasons – or as a judgment that even if the US does
punish the EU for the decision, it is worth exercising its regulatory powers anyway.
Perhaps this is a response to the perception that the Commission had seemed
unwilling to stand up to Trump to date. Or maybe the assumption is that Trump
is unlikely to pay much attention to this matter for long, particularly if the
EU can devise a way to distract him: something like a shiny gold award for ‘best
European’, for ending the war between Narnia and Freedonia, may work.
Whatever happens, the Commission’s
decision was certainly a gamble, in the current context of fraught EU/US
relations, with far broader trade and security issues at stake. Time will tell
whether this assertion of regulatory strength is worth it in light of the
reaction it may trigger.