Lorna Woods, Professor of Internet Law, University of Essex
There is an increasing incidence of the use of video-surveillance, and with it a need to find a framework in which the conflicting rights of those who are subject to such surveillance may be balanced. While there is increasing case law surrounding state surveillance, the position as regards private actors – beyond the fact that data protection laws may well apply – is less well-developed. The recent case of TK deals with such a balance, but the ruling of the Court of Justice is far from ground-breaking.
The Facts
Three video cameras were
installed in the common areas of a block of flats to deal with issues of
vandalism and burglaries. The owner of
one of the flats in that block objected and sought to have the cameras
removed. National law permitted the use
of video-surveillance without the data subject’s consent for specified uses
including the prevention and countering of crime and the safety and protection
of persons, property and assets. The
national court referred a number of questions concerning the national law’s
compatibility with the Data
Protection Directive (Directive 95/46) (DPD) and provisions of the EU
Charter.
The Judgment
The ECJ determined that the questions the national court referred should be understood together as asking whether Article 6(1)(c) and Article 7(f) DPD, read in the light of Articles 7 and 8 of the EU Charter, preclude national provisions which authorise the installation of a video surveillance system in the common parts of a residential building, for the purposes of pursuing the legitimate interests of ensuring the safety and protection of individuals and property, without the data subject’s consent.
These provisions of the DPD
concern respectively the principle of data minimisation (Article 6(1)(c)),
which requires that personal data must be ‘adequate, relevant and not excessive
in relation to the purposes for which they are collected and/or further
processed’, and the permissibility of processing personal data where ‘necessary
for the purposes of the legitimate interests pursued by the [data] controller
[ie, the person who decides on the purposes and means of data processing] or by
the third party or parties to whom the data are disclosed, except where such
interests are overridden by the interests for fundamental rights and freedoms of
the data subject’ (Article 7(f)). The subsequent EU data protection law, the GDPR, has not significantly amended these
rules, so there is no reason to believe that the judgment would have been different
under the GDPR.
The Court confirmed the position
in Ryneš
(Case C-212/13), a 2014 judgment concerning the use of a home security camera
(discussed here),
that the system of surveillance cameras constitutes the automatic processing of
personal data, but left it to the national court to assess whether the
characteristics identified in Ryneš were satisfied here. It also reiterated that processing must
fall within one of the six cases identified in Article 7 DPD. Considering
Article 7(f), the Court – citing Rīgas
satiksme (Case C-13/16), a 2017 judgment concerning personal data in
the context of a dispute over liability for an accident – identified a three-stage test:
- the pursuit of a legitimate interest by the data controller or by the third party or parties to whom the data are disclosed;
- the need to process personal data for the purposes of the legitimate interests pursued; and
- the fundamental rights and freedoms of the person concerned by the data protection do not take precedence over the legitimate interest pursued.
Consent of the data subject is
not required. While a legitimate
interest has been claimed, the referring court was uncertain as to whether that
interest must be ‘proven’ and be ‘present and effective at the time of the data
processing’. The Court agreed that hypothetical interests would not satisfy Article 7(f), but in this case the requirement of a present and effective interest was satisfied given the instances of theft and vandalism prior to the installation of the CCTV.
Considering the second element of the test, this, in line with existing case law, must be interpreted strictly: the legitimate aims cannot reasonably be as effectively achieved by other means less restrictive of fundamental rights. It must also be understood in the light of the principle of data minimisation in Article 6(1)(c) DPD. While previous means of dealing with the vandalism and thefts had been ineffective, and the CCTV relates only to the common parts of the building, the Court noted that the proportionality assessment must also take into account the specific methods of installing and operating that device – for example, limiting the hours during which the CCTV operates, or obscuring some images.
The final element constitutes a balance between the opposing interests, and this will be fact-specific, bearing in mind the significance of the data subject’s rights and specifically the seriousness of any infringement [56]. Member States may not prescribe, for
certain categories of data, the outcome of any such balancing (see Breyer
(Case C-582/14), para 62 – a 2016 judgment about IP addresses as personal data,
discussed here).
The Court also distinguished between personal data available from public
sources and that from non-public sources, with the latter constituting a more
serious infringement. It ‘implies that information relating to the data
subject’s private life will thereafter be known by the data controller and, as
the case may be, by the third party or parties to whom the data are
disclosed’ [55]. The Court also
identified the following factors:
- the nature of the personal data at issue, in particular the potentially sensitive nature of those data;
- the nature and specific methods of processing the data at issue, in particular the number of persons having access to those data and the methods of accessing them; and
- the importance of the legitimate interests pursued.
These factors are for the
national court to balance and to determine the legitimacy of the
processing. The DPD does not per se
preclude such a system.
Comment
The Court did not strike the
national regime down. This should not be
read as a ruling in favour of those who seek to place their neighbours under
surveillance. In the end, the Court leaves it to the national court to decide
on the facts. This acceptance of the
boundary between the competence of the national courts (in re facts and
national law) and the Court of Justice (in re EU law) does not mean that the
national law is automatically acceptable from the perspective of EU law.
Crucially, the national law in issue itself contained a balancing requirement,
so that individual instances in which CCTV was to be deployed should be
assessed in light of the impact on the data subjects’ rights. A national rule that did not contain such a requirement
could be seen to fall foul of the position in Breyer, noted by the Court here, of specifying the outcome of a
conflict between interests.
As noted above, this chamber
judgment, based on the DPD, presumably will apply also to the relevant
provisions of the GDPR (Article 6(1)(f)), so it may have future relevance. What
can be learned? The Court’s approach is
to try to base its reasoning in existing case law. Of note is the reiteration of the point that ‘images allowing natural persons to be identified’ can be personal data [35]. As a small point of detail, the phraseology is slightly different from that in Ryneš: in Ryneš the Court explained its reasoning by suggesting that “…the image of a person recorded by a camera constitutes personal data … inasmuch as it makes it possible to identify the person concerned...” (Ryneš, para 22). Whether there is any significance in this difference is unclear. Arguably, in Ryneš the identification point seems to explain why an image is
personal data; in TK it could be read
as limiting the circumstances in which images could be personal data. It is not as clear as the individuation
argument accepted in the recent English High Court decision in Bridges,
concerning automatic facial recognition.
The main body of the judgment is focused on the legitimate interest ground, analysing it according to a three-stage test. In this the Court relies on the inter-connectedness of the requirements in Article 6 (concerning data quality) and Article 7 DPD (legitimacy of processing). The judgment sheds some light on when the legitimate interests of the controller are real or not; in this instance there had been instances of vandalism and theft, but the Court expressly notes that it ‘cannot, however, be necessarily required, at the time of examining all the circumstances of the case, that the safety of property and individuals was previously compromised’ [44]. This does not, however, give us much guidance as to how far away from hypothetical such a case would need to be in order to be ‘present and effective’.
The second element focuses on
‘necessity’, which the Court reminds us should be ‘strictly necessary’. Yet, the Court’s explanation of this
requirement seems to adopt a lower standard, that of reasonableness. It states the requirement thus: that the objective
‘cannot reasonably be as effectively achieved by
other means less restrictive of the fundamental freedoms’ [47]. The Court also equates this element to a
proportionality principle, understood in the light of the data minimisation
principle. The Court states that proportionality has been taken into account because a previous system (using access cards to enter the common areas) had been tried and failed. This, however, seems to refer to necessity.
The Court then does consider the
operation of the CCTV system, and whether constant surveillance is required,
implicitly at this point considering data minimisation. It is suggested that this issue has relevance for the balancing of rights, though it is not a factor listed by the Court as relevant for the third element of the test.
The third stage lists some familiar considerations but does not go
beyond their identification, giving the national court little guidance as to
what these considerations might mean, how they might inter-relate and their
respective importance. It may be that
the Court did not want to have to get into the difficult consideration of
private space, especially shared private space. In this context, note that Ryneš considered the impact of private surveillance on public space, but for the assessment of a different question: the extent of the household exemption. Nonetheless, while the Court notes the relevance of Articles 7 and 8 of the Charter, it gives no separate consideration to the issue of the right to a
private life and data protection seen as a fundamental right. For this, the judgment might be legitimately
criticised.
Barnard & Peers: chapter 9
Photo credit: Deacon Insurance