Lorna Woods, Professor
Emerita, University of Essex
This Grand Chamber judgment of
the Court of Justice in X v
Russmedia Digital and Inform Media Press (Case C-492/23) handed down on
2 December 2025 concerns the scope of data protection rights and intermediary
immunity in the context of the non-consensual use of someone’s image. The judgment identifies:
- when someone has
responsibilities under the GDPR,
- the relationship between those
regulatory obligations and intermediary immunity, and
- the steps a data controller
could take to satisfy those GDPR obligations.
It has been described as
reshaping the obligations of online operators in the EU, while others have
questioned how far the points in the judgment may be generalised to other
situations.
Judgment
The Facts
Russmedia owns an online
marketplace on which advertisements may be published. An unidentified user
posted an advertisement falsely representing X as offering sexual services. The
advert included X’s photographs (though there is no suggestion that these were
intimate images) and phone number, all without her consent. Once notified,
Russmedia removed the advert within an hour but the advertisement had been
shared across several third party websites and remained accessible. X sued in
the national courts in respect of her image rights, rights to reputation and
data protection rights. The Romanian courts struggled with the question of
whether Russmedia could claim the benefit of intermediary immunity (under the
e-Commerce Directive (Directive
2000/31), provisions now replaced by the Digital
Services Act (DSA)) and the extent of the obligations under the GDPR.
Is Russmedia subject to
Obligations under the GDPR?
Obligations under GDPR arise when
(1) personal data are (2) processed by (3) a data controller.
The CJEU commenced its analysis
by noting that the information contained in the advert about X was personal data
for the purposes of the GDPR and moreover that claims about a person’s sex life
(implied in the advert) constituted “sensitive” personal data as protected by
Article 9 GDPR, and that remained the case whether or not the claim was
true. Classification of the data as
special category data means that there is a higher threshold to show lawful
processing of those data.
The Court further noted that “the
operation of loading personal data on a webpage constitutes processing” for the
purposes of the GDPR (para 54) and therefore covered the publication of the
advert.
While this puts the advert within
the scope of the GDPR, its obligations apply to data controllers and
processors, so the question was whether Russmedia, which had no control over
the content of the advert, was a controller or joint controller. The Court
reiterated previous jurisprudence to say (para 58) that:
any natural
or legal person who exerts influence over the processing of such data, for his
or her own purposes, and who participates, as a result, in the determination of
the purposes and means of that processing, may be regarded as a controller in
respect of such processing.
It noted also that there may be
more than one entity which is a controller in respect of processing – this is
the idea of joint controllers, although they may not have equal responsibility
depending on the facts (para 63). Joint
decision making is not necessary for there to be joint controllers.
While the test for “controller”
requires that the person processing the data does so for their own purposes,
the Court added that this could include the situation “where the operator of an
online marketplace publishes the personal data concerned for commercial or advertising
purposes which go beyond the mere provision of a service which he or she
provides to the user advertiser” (para
66). The Court in this case pointed to
the fact that the terms of use give Russmedia “considerable freedom to exploit
the information published on that
marketplace” including “the right to use published content, distribute it,
transmit it, reproduce it, modify it, translate it, transfer it to partners and
remove it at any time” (para 67). Russmedia is therefore not publishing solely
on behalf of the user placing the advert. The Court also noted that Russmedia
makes the data in the advert accessible, allows the placing of anonymous adverts
and sets the parameters for the dissemination of adverts (likely to contain
personal data).
As a result of the finding that the
advert-publishing platform was a joint controller, the GDPR obligations bite in
relation to the advert, and the platform must be able to demonstrate that the advert is
published lawfully, which includes the requirement for consent for sensitive
data (paras 84 and 93) and the requirement for accuracy. The CJEU notes that once published online and
accessible to any Internet user, such data may be copied and reproduced on
other websites, so that it may be difficult, if not impossible, for the data
subject to obtain their effective deletion from the Internet. This adds to the seriousness of the risks
facing the data subject.
The GDPR also requires the
implementation of technical and organisational measures – and this should be
considered in the design of the service so that such data controllers can
identify adverts containing sensitive data before they are published and to
verify that such sensitive data is published in compliance with the principles
of the GDPR (para 106). Further, the
controller must ensure that there are safety measures in place so that adverts
containing sensitive data are not copied and unlawfully published elsewhere
(para 122).
Are the GDPR Obligations
Affected by Intermediary Immunity?
While the immunity provisions in the
e-Commerce Directive are far-reaching, that Directive specified that it did not
apply to questions covered by the
Data Protection Directive (the legislation in force at the time the e-Commerce Directive
was drafted), an exclusion which extended to the immunities; the Court concluded that this
meant the e-Commerce Directive could not interfere with the GDPR. The Court also
specified that the GDPR requirements here cannot be classified as general
monitoring (which is prohibited by the e-Commerce Directive (and now the DSA)).
Conclusions, Implications and
Questions
The ruling in this case does not
match existing industry practice. It is not a bolt out of the blue, however,
but builds on existing jurisprudence (eg Fashion ID (Case
C-40/17)). While the obligations
required of Russmedia in this case may indicate, to some, a landmark shift in
the Court’s approach, the judgment does rely on the specific facts in the case
and, specifically, the point that “sensitive” data, which effectively requires
explicit consent, is in issue. In principle, this could be relevant to other
forms of sensitive content, notably non-consensual intimate images (NCII).
Certainly, it re-emphasises data protection as a route for victims’ redress, if
not preventing harm in the first place.
The ruling clarifies that a range
of activities typically carried out by platforms (structuring, categorizing,
and monetizing user content) can amount to determining “the purposes and means
of processing personal data”, the test for responsibility as a controller under
the GDPR (Article 4 GDPR). In taking this approach, it differed from the
Opinion of its Advocate-General (AG’s Opinion, para 120). The Court noted that the definition of
controller in the GDPR is broad – and this is to support the protection of
individuals’ fundamental rights to privacy and data protection. Once a body is
a controller, that body must be able to demonstrate compliance with the data
protection principles, and take appropriate technical and organisational
measures to ensure data processing is carried out in accordance with the
GDPR.
Here, some of the points that the
Court relied on to determine that Russmedia was a joint controller could well
be relevant to other services and not just online marketplaces. For example,
many sites have broad terms of service similar to those the Court highlighted
here; other services also allow anonymous posting and a key feature of many
services is the making available of that content for advertising revenue
purposes, as well as controlling how content is promoted. (Note the decision of
the court in YouTube
and Cyando (Joined Cases C-682/18 and C-683/18), which suggested that
automated content curation did not mean that a service is not neutral, is not
directly relevant here as it relates to the conditions for maintaining
intermediary immunity – and see Russmedia, AG’s Opinion, para 155.) It is unclear how many of these criteria need
to be present for a service to constitute a controller in relation to the
personal data in third party content it publishes (though the Court seems to
list them as alternatives, suggesting any of them would suffice), or whether
less far-reaching terms of service may be sufficient to stop a platform being a
joint controller. Where these conditions
are satisfied, the ruling’s impact need not be limited to adverts but could extend to organic
content containing third-party personal data too.
The Court confirmed that the
clear wording of the e-Commerce Directive, which excluded the Data Protection
Directive (the predecessor legislation to the GDPR) from its scope, means that
an intermediary cannot escape its own data protection responsibilities; this does not,
however, affect immunity from liability in respect of unlawful content. Note that this decision was based on the
wording of the e-Commerce Directive. This language has not been carried over to
the DSA, which is expressed to operate without prejudice to, inter alia, the
GDPR. It is not clear if or how this would change the Court’s interpretation.
Immunity provisions from the e-Commerce Directive have been carried across to
the DSA (albeit with a “carve out” in respect of consumer law in Article 6(3)
DSA). While the EDPB has published guidance
on the interplay of the GDPR and the DSA, it has looked at the question of the
impact of the DSA requirements on data protection rather than the impact of
data protection on the DSA.
The judgment suggests that
services should design checks into their services to ensure compliance with the
data protection obligations including pre-publication checks as to whether
sensitive data is included and to check the identity of the person posting the
material. Of course, while some sorts of posts (eg NCII) clearly constitute
sensitive personal data, the outer edges of this category might not be clear
cut. The Court here noted that the category should be interpreted broadly (para
52). It could be that some of the obligations could be passed on to the user
uploading the advert through terms of service, though this might be capable of
being abused by some users. Further, the
CJEU expects the site to prevent third party scraping so far as is possible –
the judgment does not introduce strict liability in this regard. What technical measures would be sufficient
in practice remains uncertain. This is
very different from the reactive response required to maintain immunity under
the e-Commerce Directive – and which has been the dominant framing until now.
Assuming the position on immunity does not change, services may have to
implement new systems, probably including automated tools, which may ultimately
affect the choice of business model for some services.
There are questions about how
this ruling impacts the DSA. How does a pre-check system differ from general
monitoring, which is prohibited under Article 8 DSA (though
specific monitoring is not)? The CJEU stated that systems to ensure GDPR
compliance could not be classified as “general monitoring” (para 132) – but did
not explain this statement any further. There is an argument to say that all
content will need to be scanned to identify that which contains sensitive
personal data – and by contrast to checking against a database of known CSAM
images, for example, which might be considered specific monitoring, this is a
more open ended obligation. It is unclear whether there are other routes to
pre-check which do not involve content scanning. The requirements to check whether the person
posting the personal data is the person to whom the data relates (or is
otherwise lawfully processing) may make, for example, anonymity difficult to
maintain and it is unclear what level of identity verification would be
acceptable. There are also questions
about how this system of pre-checks affects the neutrality of the platform and
consequently the possibility for the platform to claim immunity (in respect of
other claims relating to the content) under Article 6 DSA.
The position in the UK may be
slightly different, however. Section 6(1)
of the European Union (Withdrawal) Act 2018 provides that decisions of the CJEU
post-dating 31 December 2020 do not bind UK courts, although they may have
regard to such judgments. The provisions
which would have removed the status of binding precedent
from decisions of the CJEU made on or before that date have not been
brought into force (though they remain on the statute book), as the Labour
Government revoked
the relevant commencement regulations.
Furthermore, old case law from the Northern Irish courts (pre-dating
Brexit), CG
v. Facebook, suggested that the e-Commerce Directive (the relevant law
at the time) could apply to data protection claims.