Lorna Woods, Professor of Internet Law, University of Essex
Introduction
The ubiquity of social media platforms and their significance in disseminating information (true or false) to potentially wide groups of people were highly unlikely to have been in the minds of the European legislators when they agreed, in 2000, the e-Commerce Directive (Directive 2000/31/EC) (ECD). Facebook itself was launched only in 2004. Despite the changing times and technological capabilities, the Commission has decided not to revise the ECD, and specifically its safe harbour provisions for intermediaries, in its current digital single market programme.
Although the ECD seems set to remain unchanged, the application of the
safe harbour provisions raises many difficult questions which have not yet been
fully answered at EU level by the Court of Justice.
CG v. Facebook ([2016] NICA 54), a decision of the Northern Irish Court of Appeal, illustrates some of these difficulties
and certainly raises questions about the proper interpretation of the ECD and its
relationship with the Data Protection Directive.
Intermediary Immunity - Legal Framework
The ECD provides immunity from
liability for certain ‘information society service providers’ (ISS providers)
on certain conditions. To gain immunity, the ISS provider must:
- be an ISS provider within the terms of the ECD; and
- satisfy one of the following:
  - the provider is a ‘mere conduit’ (Art. 12 ECD);
  - it provides caching services (Art. 13 ECD); or
  - it provides hosting services (Art. 14 ECD).
Each one of these three
categories provides for a different level of immunity, which seems connected
with the level of knowledge the ISS provider is assumed to have of the
problematic content. Here Article 14, which deals with hosting, is the relevant
provision. It provides:
1. Where an
information society service is provided that consists of the storage of
information provided by a recipient of the service, Member States shall ensure
that the service provider is not liable for the information stored at the
request of a recipient of the service, on condition that:
(a) the
provider does not have actual knowledge of illegal activity or information and,
as regards claims for damages, is not aware of facts or circumstances from
which the illegal activity or information is apparent; or
(b) the
provider, upon obtaining such knowledge or awareness, acts expeditiously to
remove or to disable access to the information.
2. Paragraph 1
shall not apply when the recipient of the service is acting under the authority
or the control of the provider.
3. This
Article shall not affect the possibility for a court or administrative
authority, in accordance with Member States' legal systems, of requiring the
service provider to terminate or prevent an infringement, nor does it affect
the possibility for Member States of establishing procedures governing the
removal or disabling of access to information.
The recitals to the ECD give more
detail as to the scope of services protected by Article 14 and there is a
certain amount of case law on this point, notably Google Adwords (Case C-236/08) and the Grand Chamber decision in L’Oreal v. eBay (Case C-324/09). Recital 42 has been pointed to by the Court in
these cases as relevant for understanding the sorts of activities protected by
the immunity. Recital 42 refers to services of ‘a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of nor control over the information which is transmitted or stored’.
The ECJ in Google Adwords referred to this as being ‘neutral’ (paras 113-114). The
Grand Chamber in its subsequent L’Oreal
decision suggested that advice in optimising presentation would mean a provider
was no longer neutral (para 114).
The provision protects relevant
ISS providers from liability in relation to illegal content, provided they have
no knowledge (actual or constructive) of the illegal activity or information,
and that if they have such knowledge, they have acted expeditiously to remove
it. In L'Oreal v eBay the Court of
Justice provided a standard or test by which one can measure whether or not a
website operator could be said to have acquired an 'awareness' of an illegal
activity or illegal information in connection with its services, that is, whether
"a diligent economic operator would have identified the illegality and
acted expeditiously". The CJEU also held that an awareness of
illegal activities or information may become apparent as the result of an
investigation by the operator itself or where the operator receives
notification of such activity. Article
14 does not protect ISS providers from injunctions, or the costs associated
with any such injunctions (see Recital 45).
Additionally, Article 15
specifies that, for those falling within Articles 12-14, Member States cannot
impose a ‘general obligation’ to monitor content to determine whether it
is illegal. There has been a considerable amount of dispute as to the
relationship between this provision and the scope of immunity, especially given
the requirements in L’Oreal. Recital 40 notes that ‘service providers have
a duty to act, under certain circumstances, with a view to preventing or
stopping illegal activities’ and that the immunity provisions ‘should not
preclude the development and effective operation, by the different interested
parties, of technical systems of protection and identification and of technical
surveillance instruments made possible by digital technology’. The Recitals
also state:
(47) Member
States are prevented from imposing a monitoring obligation on service providers
only with respect to obligations of a general nature; this does not concern
monitoring obligations in a specific case and, in particular, does not affect
orders by national authorities in accordance with national legislation.
(48) This
Directive does not affect the possibility for Member States of requiring
service providers, who host information provided by recipients of their
service, to apply duties of care, which can reasonably be expected from them
and which are specified by national law, in order to detect and prevent certain
types of illegal activities.
The distinction between general monitoring and specific monitoring has yet to be fully elaborated, and is an issue much discussed in the context of intellectual property enforcement, especially as regards keeping pirated copies of materials down after they have been taken down in the first place.
Facts of CG
McCloskey opened a Facebook page
in August 2012 entitled ‘Keeping Our Kids Safe from Predators’ in which he
published details of individuals who had criminal convictions relating to
sexual offences involving children. This
page was not subject to any privacy settings.
One individual who was so named brought action against Facebook and an
interim injunction was issued requiring Facebook to remove the page and related
comments, on the basis that the comments responding to the posting were
threatening, intimidatory, inflammatory, provocative, reckless and
irresponsible. This was the XY litigation. Immediately after the page was
removed, McCloskey set up a new page, Predators 2. CG was identified on this
page on 22 April 2013; his photograph was published and there were discussions
about where he lived. Comments included abusive and violent language, including support for those who would commit violence against CG and for the exclusion of CG from the community in which he lived.
The disclosure of CG’s residence was contrary to the position taken by
the Public Protection Arrangements in Northern Ireland (PPANI), which took the
view that such disclosure interferes with the rehabilitation process.
On 26th April 2013, CG’s
solicitors wrote to Facebook and its solicitors in Northern Ireland, claiming
the material was defamatory and that CG’s life was at risk. A hard copy of the Predators 2 page was enclosed. Facebook’s response was that CG should use the online reporting tool, but CG expressed a desire not to have to engage with Facebook. By 22 May 2013 Facebook had removed all postings on Predators 2, but on 28 May, CG issued proceedings. Subsequently, CG’s solicitors wrote to Facebook complaining that the photograph had been shared 1622 times and that other Facebook users had included comments threatening violence. They identified the main URL, but not all such instances; Facebook then requested these. This information was provided on 3rd and 4th December and the material was removed on 4th or 5th December. A further reposting of the photograph by RS occurred on 23 December, stating that this was what a “pedo” looked like. A letter of claim was sent to Facebook on 8th January 2014, identifying the relevant URLs, and the page was taken down
on 22 January 2014. While CG accepted
that the defamation claim was without merit, it was accepted that he was
extremely concerned about potential violence as well as the effect on his
family.
Judgment at First Instance
The trial judge had to deal with
claims against McCloskey, as well as claims against Facebook. The trial judge, having reviewed the
evidence, concluded that McCloskey’s conduct constituted harassment of CG. The
case against Facebook was based on the tort of misuse of private information.
To find that there had been such misuse, there had to be a reasonable
expectation of privacy in relation to the relevant information, a determination which should take into account all the circumstances (relying on JR38 [2015] UKSC 42 and Murray v. Express Newspapers [2008] EWCA Civ 446). The judge also accepted the submission that the Data Protection Act, and specifically the category of ‘sensitive data’, provided a useful touchstone as to what information could be seen as private (see Green Corns Ltd v. Claverley Group Ltd [2005] EWHC 958 (QB)). The judge concluded
that the use of a photograph or name in conjunction with information which
could identify where CG lived and any information about his family members were
private information. The judge considered that Facebook was put on notice of
the problematic nature of the material by the XY litigation (which mentioned
the Predators 2 page) and that simple searches would reveal the page, as it had
an almost identical name with identical purposes. The trial judge concluded
that it was apparent on the face of the posts that consideration of the
lawfulness of the posts was needed. As regards the
Electronic
Commerce (EC Directive) Regulations 2002, which implement the ECD in the
UK, the judge rejected the contention that there was an obligation to give
Facebook notice in a particular form. So, neither the ECD nor the 2002
Regulations protected Facebook from the claim of misuse of private information.
A further claim under the Data Protection Act was added late in the day. The judge concluded that, in the absence of relevant discovery, CG had not established this claim. Facebook appealed. CG also appealed as regards the data protection point, but did not pursue it.
Court of Appeal Judgment
The Court noted that there was
agreement that McCloskey’s behaviour was unreasonable conduct sufficient to
give rise to criminal liability (R v Curtis [2010] EWCA Crim 123), and that the 2002 Regulations do not cover injunctions. The
Court agreed that this was an appropriate case in which to make an order to take
down the material to protect CG from continued intimidation [para 40]. The
Court noted that the tort of misuse of private information and harassment,
while complementary, are not the same and that a finding of harassment did not
automatically mean that there had been a misuse of private information.
As regards the tort, the Court
noted that there was no dispute between the parties that this case was about an
intrusion, but that the tort would come into play only if there was a
reasonable expectation of privacy in the information, which is a fact sensitive
determination. The Court of Appeal noted
the public interest in knowing about criminal convictions; it also disagreed
with the trial court judge about the reading across of the categories of
sensitive information in the DPA. It held:
‘The fact that the information is regulated for that purpose does not necessarily make it private’ [para 45].
Reviewing the material, the Court held that the context of harassment was determinative of the finding that CG had a reasonable expectation of privacy in the material [para 49]. By contrast,
RS was protected by principles of open justice which allow citizens ‘to
communicate the decisions of the criminal justice systems to others’ and
therefore CG did not have a reasonable expectation of privacy in relation to
that posting [para 51].
The Court then considered whether
Facebook could rely on the safe harbour provisions of the ECD and the 2002
Regulations. It held that the 2002 Regulations need to be understood in the
light of Art 15 ECD even though it is not formally implemented in the UK.
According to the Court, Article 15 ‘clearly’ applied to Facebook [para 52].
While not expressly stated, the Court’s approach is based on the assumption
that Article 14 (safe harbour provisions for those providing hosting services)
and Regulation 19 of the 2002 Regulations, which implement it, also apply.
The Court then considered the
issue of notice. Facebook argued that CG had not given proper notice, on the
basis that CG had not used Facebook’s online submission process. The Court of
Appeal agreed with the trial court’s dismissal of this argument, stating,
‘[a]ctual knowledge is sufficient however acquired’ [para 58]. Facebook
challenged the approach taken at first instance, that Facebook had the
resources to find the material and assess it [High Court, para 61]. It was also argued that the way the High Court approached the question of constructive knowledge implied a monitoring obligation: the trial judge had relied on the XY litigation; on that litigation plus the letters of CG’s solicitors; and on the litigation together with some elementary investigation of the profile. The Court of Appeal agreed with these
concerns. It stated the question as
being:
Whether
Facebook had actual knowledge of the misuse of private information … or
knowledge of facts and circumstances which made it apparent that the publication
of the information was private
before commenting that
[t]he task
would, of course, have been different if there had been a viable claim in
harassment made against Facebook [para 62].
It did not elaborate the basis or
extent of the difference.
The Court concluded that the XY
litigation did not fix Facebook with sufficient notice; it could only do so if Facebook were subject to a monitoring obligation. In any event, knowledge of a propensity to harass did not fix Facebook with notice about the private information. As regards the correspondence, the Court held that this too was insufficient to fix Facebook with notice. While it referred to the problematic content, it did not refer to misuse of private information. ‘The correspondence did not,
therefore, provide actual notice of the basis of claim which is now advanced’
[para 64]. The Court also considered that there was nothing in the letters to
indicate that the information was private. So, while ‘the omission of the
correct form of legal characterisation of the claim ought not to be
determinative of the knowledge and facts and circumstances which fix social
networking sites such as Facebook with liability’, it is necessary to identify
‘a substantive complaint in respect of which the relevant unlawful activity is
apparent’.
Here, since there was no indication in the letter of claim that the
address was the issue, the Court did not ‘consider that the correspondence
raised any question of privacy in respect of the material published’. [para 69]
By contrast, in the letter of 26th November, CG referred to the general
identification of where CG was living and the threat from paramilitaries. This
was sufficient to establish knowledge of facts and circumstances in relation to
that particular post. Referring to the Court of Justice in L’Oreal, the Court
noted that Facebook is obliged to act as a diligent economic operator. This
point was not argued; Facebook was found to be liable in respect of that post for the period 26th November to 4/5 December.
The burden of proof is in the first instance on the claimant to show knowledge; thereafter the ISS provider must prove that it did not have such knowledge.
As regards the DPA, it was agreed that the Predators 2 page contained personal data and sensitive personal data; the issue was whether Facebook Ireland could be seen as subject to the UK DPA. The ECJ rulings in Google Spain (Case C-131/12) and Weltimmo (Case C-230/14) were argued before the Court. The Court did not accept the submission that Google Spain was limited to its particular facts, noting the concern that the protection offered by the Data Protection Directive would be undermined if data controllers outside the EU were excluded. The Court here noted that Weltimmo in fact built on the approach in Google Spain. It concluded that
Facebook is a data controller established in the UK for the purposes of the
DPA. Although the Court accepted that
the ECD does not cover data protection, and this is reflected in Regulation 3
of the 2002 Regulations, the Court held at para 95:
‘The starting point has to be the matter covered by the e-Commerce Directive which is the exemption for information society services from the liability to pay damages in certain circumstances … We do not consider that this is a question relating to information society services covered by the earlier Data Protection Directive and accordingly do not accept that the scope of the exemption from damages is affected by those Directives.’
Comment
This case is one of a number
coming through the Northern Irish court system regarding different types of
problematic content and the responsibility of social media platforms to take
action against such content. Shortly before this case was decided, the High Court handed down its decision in J20 v Facebook Ireland Ltd ([2016] NIQB 98). Other cases are working their way through the system: AY v Facebook (Ireland) Ltd ([2016] NIQB 76), concerning naked images of a school girl on a ‘shame page’; MM v BC, RS and Facebook ([2016] NIQB 60), concerning revenge porn; and Galloway v Frazer and Google t/a YouTube ([2016] NIQB 7), concerning defamatory and harassing videos. While this case is
based in the particular cultural and legal context of Northern Ireland, and
raises questions on the meaning of private information, it also leads to questions about the interpretation of EU laws, notably the ECD and DPD.
The first point to note is that
the Court does not directly address the question of the applicability of Articles 14 and 15 ECD, beyond stating that Article 15 clearly applies. Article
15 is dependent on the ISS provider providing services that fall within one of
Article 12, 13 or 14 ECD, with Article 14 being relevant here. So the question
is whether Article 14 ECD (and consequently Regulation 19 of the 2002
Regulations) applies here. While the text of Article 14 ECD refers to ‘the
storage of information provided by a recipient of the service’, the case law
makes it clear that not any storage will do. Rather, the service provider must be neutral, technical and passive as regards the content. In this regard, the services Facebook provides regarding information of interest to Facebook users (the News Feed and content recommendation algorithms, as well as Ad Match services) may mean that the question of neutrality and passivity here is at least worthy of investigation, in that Facebook may promote certain content (in the terms of L’Oreal, para 114). Of course in
Netlog (Case C-360/10), the Court of Justice held that a social media platform could benefit from Article 14, but this does not mean that all will; much will depend on the facts (see e.g. the Commission 2012 Working Paper on trust in the digital single market (SEC(2011) 1641 final), accompanying COM(2011) 942 final).
Assuming Article 14 (and its UK
equivalent, Regulation 19) applies, the next question is whether Facebook was
on notice. The ECD is silent on the
nature of any formalities, leaving it to Member States and industry (via
self-regulation per Recital 40) to fill in the detail. In its 2012 Working Paper, the Commission
acknowledged that there were diverging views as to what notice required,
ranging from those who argued that nothing less than a court order should be
accepted (seemingly thereby focussing on just actual knowledge) through to
those who suggested that general awareness of the use of the site for illegal
content was sufficient (which covers constructive knowledge) (pp. 33-34). It seems there are three main issues here:
- whether notice has to be given in any particular format;
- whether notice has to identify the illegality or whether identifying the problematic content will do; and
- the relationship between constructive notice and Article 15, also bearing in mind the obligations of the diligent economic operator.
Facebook argued of course that a
person complaining about content should use the tools provided by Facebook and
provide rather precise information. The
Court, rightly, held that to require a particular format to be used would run counter to the aim (particularly with reference to the 2002 Regulations) of facilitating the ability of users to make complaints. The position of the Court with regard to the need to provide URLs is less clear. The need to provide specific URLs makes it difficult for claimants, especially those who seek orders for content to be taken down and to stay down (seen particularly in the field of intellectual property enforcement, for example even in L’Oreal). In this case, CG had provided specific URLs where the Court found Facebook liable, but the Court is silent on whether the lack of specific URLs was a determinative factor in the other instances. It is submitted that, provided sufficient
identifying information about the content is provided, precise URLs should not
be required especially for a diligent economic operator (discussed below).
The Court focussed on the
question of whether CG sufficiently identified the reason why the content is
illegal. In this, the Court observes that the omission of the correct legal
characterisation is not determinative; to have held to the contrary would
undermine the ability of claimants without lawyers to have material taken down.
The Court moves on to suggest that the relevant unlawful activity has to be
apparent. It does not consider to whom such unlawfulness must be apparent, or
indeed the prior question of whether the ECD requires just notification of
content or activity perceived as illegal by the complainant, rather than a
justification of why the complainant thinks that. While on the facts of this
case there are concerns that CG referred to causes of action that were clearly
wrong (e.g. defamation), it is arguable that the Court’s position needs further
refinement. Certainly the Court’s approach on this aspect seems generous to
Facebook in terms of what it needs to be told.
In this regard a number of
comments can be made. While an operator would need to make an assessment about the legitimacy of a take-down request, that is a separate issue from the fact of being notified that someone thinks some content is problematic. Further, there may be a world of difference between what a man on the street might so recognise and what the diligent economic operator should recognise, and in the detail required for that. Indeed, in
L’Oreal, the ECJ held:
although such a notification admittedly cannot automatically preclude the exemption from liability provided for in Article 14 of Directive 2000/31, given that notifications of allegedly illegal activities or information may turn out to be insufficiently precise or inadequately substantiated, the fact remains that such notification represents, as a general rule, a factor of which the national court must take account when determining, in the light of the information so transmitted to the operator, whether the latter was actually aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality (paras 121-122).
This suggests that a diligent
economic operator may not just rely on what a complainant said, but may have to
take steps to fill in the blanks. As the
Commission reported in 2012, it has been suggested by some that the degree to
which it is obvious that the activity or information is illegal should play a
role in this assessment. Some content is
more obviously problematic than others. This position is not incompatible with
the approach of the Court here: the problem for CG is that an address is not usually that problematic in privacy terms; it was the context (not apparent on the face of it) that made it so [para 69]. This distinction may have relevance for the AY
litigation, if not the revenge porn case – depending on the nature of the
images.
The final point of concern
relates to general monitoring. The rejection by the Court of the possibility of becoming aware of a particular type of content (as from the XY litigation) and thereby being on notice deserves further examination. This depends on
what is meant by ‘general monitoring’ as opposed to a ‘specific’ monitoring obligation,
accepted by recital 47 ECD, and recognised by the Commission in its 2012
Working Paper (p. 26). It is unfortunate
that the Court did not give this more attention. While case law has made clear
that filtering of all content, for example, constitutes general monitoring (SABAM v Scarlet (Case C-70/10)), it has been argued, principally in the context of IP enforcement, that searching for a recurrence of a particular instance of content is not. Such a broad view of general
monitoring as the Court here adopted also seems to decrease the space in which
the diligent economic operator acts, raising questions about the meaning of L’Oreal. Note also that the Commission in its recent
review noted ‘there are important areas such as incitement to terrorism, child
sexual abuse and hate speech on which all types of online platforms must be
encouraged to take more effective voluntary action to curtail exposure to
illegal or harmful content’ (COM/2016/0288 final). This suggests that the
Commission may expect such platforms to be proactive and not merely reactive.
Perhaps the most significant
point, and one on which a reference should perhaps have been made, is the
relationship between the ECD and DPD, a point not yet dealt with in English law (see Mosley v Google [2015] EWHC 59 (QB)). The Court accepted
fairly readily that Facebook (Ireland) falls under the UK DPA, but then insists that, despite the fact that data protection is excluded from the field of application of the ECD, Facebook pages and comments fell within the “matter covered by the e-Commerce Directive”, which provides a “tailored solution for the liability of [ISS providers] in the particular circumstances” set out in the ECD. It did not explain why, beyond asserting that the ECD safe harbour provisions do ‘not interfere with any of the principles in relation to the processing of personal data, the protection individuals ... or the free movement of data’ [para 95]. In this assessment, the Court overlooked the fact that under the DPD a remedy must be provided to individuals, so as to make their rights effective, and that the protection awarded to data subjects should not vary depending on the mechanism used for that processing. Furthermore, Recital 14 to the ECD elaborates
that
The protection
of individuals with regard to the processing of personal data is solely
governed by Directive 95/46/EC … the implementation and application of this
Directive should be made in full compliance with the principles relating to the
protection of personal data.
Whilst a Member State was free to provide more far-reaching protection to intermediaries, this freedom reaches
its limit when it conflicts with another harmonised area of EU law, such as
data protection. The Court’s position on this point, and especially its
reasoning, in the light of the terms of both directives, is not convincing.
In sum, the outcome (liability for Facebook on one aspect of the content posted) looks, on the face of it, like a narrowing of immunity. The reality
points in a different direction. While there are a number of problematic issues
with which the court had to deal, the impact of this judgment lies in the
statements of general principle which the Court made. Significantly, these fell
into areas ultimately governed by EU law, rather than purely domestic
matters. It is far from certain that
those issues are clearly determined at EU level, nor that the Court’s
assessment here is free from doubt.