
Tuesday, 14 December 2021

Is the Passenger Name Record Directive Valid? Opinion on the pending CJEU case

 



Douwe Korff, comparative and international lawyer specialising in human rights and data protection


In Case C-817/19, Belgium’s Constitutional Court has asked the EU Court of Justice whether the PNR Directive (2016/681) is compatible with the Charter of Fundamental Rights. An Advocate-General’s opinion in this case is expected in the New Year.

In my opinion, the appropriate tests to be applied to mass surveillance measures such as are carried out under the PNR Directive (and were carried out under the Data Retention Directive, and are still carried out under the national data retention laws of the EU Member States that continue to apply in spite of the CJEU case-law) are:

Have the entities that apply the mass surveillance measure – i.e., in the case of the PNR Directive (and the DRD), the European Commission and the EU Member States — produced reliable, verifiable evidence:

-          that those measures have actually, demonstrably contributed significantly to the stated purpose of the measures, i.e., in relation to the PNR Directive, to the fight against PNR-relevant crimes (and in relation to the DRD, to the fight against “serious crime as defined by national law”); and

-          that those measures have demonstrably not seriously negatively affected the interests and fundamental rights of the persons to whom they were applied?

If the mass surveillance measures do not demonstrably pass both these tests, they are fundamentally incompatible with European human rights and fundamental rights law and the Charter of Fundamental Rights. This means that the entities that apply the measures must justify them on the basis of hard, verifiable, peer-reviewable data.

The European Commission and the Dutch Minister of Justice concluded that, overall, the PNR Directive and the Dutch PNR law respectively had been “effective”: the Commission because the EU Member States said so, the Dutch Minister because PNR data were quite widely used and the competent authorities said so. This conclusion is fundamentally flawed, given that it was reached in the absence of any real supporting data. Rather, my analyses show that:

-          Full PNR data are disproportionate to the purpose of basic identity checks;

-          The necessity of the PNR checks against Interpol’s Stolen and Lost Travel Document database is questionable;

-          The matches against unspecified national databases and “repositories” are not based on foreseeable legal rules and are therefore not based on “law”;

-          The necessity and proportionality of matches against various simple, supposedly “suspicious” elements (tickets bought from a “suspicious” travel agent; “suspicious” travel route; etc.) is highly questionable; and

-          The matches against more complex “pre-determined criteria” and profiles are inherently and irredeemably flawed and lead to tens, perhaps hundreds of thousands of innocent travellers being wrongly labelled as persons who “may be” involved in terrorism or serious crime, and are therefore unsuited (D: ungeeignet) to the purpose of fighting terrorism and serious crime.

The hope must be that the Court will stand up for the rights of individuals, enforce the Charter of Fundamental Rights, and declare the PNR Directive (like the Data Retention Directive) to be fundamentally in breach of the Charter.

For my full 149-page opinion, and an executive summary of it, see here.

 

Reblogged from the Data Protection and Digital Competition blog

Photo credit: Konstantin von Wedelstaedt, via wikicommons




Friday, 17 September 2021

On Flights, Rock Concerts and the Needle in a Haystack: A report from the Court of Justice of the European Union’s oral hearing on the PNR Directive


 



 

Christian Thönnes, research assistant at the Department of Public Law of the Max Planck Institute for the Study of Crime, Security and Law in Freiburg.


 

13 July 2021 was a potentially fateful day for the balance of privacy and security in the European Union. The Court of Justice of the European Union (CJEU) held an oral hearing in its preliminary ruling procedure C-817/19. Following a legal challenge brought by the Belgian NGO Ligue des Droits Humains, the Constitutional Court of Belgium had submitted ten preliminary questions to the CJEU (arrêt n°135/2019 du 17 octobre 2019). These questions concern the interpretation of Directive (EU) 2016/681 of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime (in short: PNR Directive) and its compatibility with EU primary law. The hearing offered important insights into the court’s thinking. The judge-rapporteur, Judge von Danwitz, in particular asked the EU Commission many critical questions, which do not appear to portend good news for proponents of this unprecedented surveillance measure. Having worked on a similar case for the Berlin-based strategic litigation NGO Gesellschaft für Freiheitsrechte (GFF), I attended the oral hearing. I am providing this entry as a brief summary and analysis.

 

This report hopefully encapsulates the high stakes of the case in question: the PNR preliminary ruling procedure could result in a landmark decision, laying the doctrinal groundwork not only for the mass retention of travel data in the name of defending security generally, but more specifically for the deployment of self-learning algorithms to pre-emptively detect presumptively suspicious movement patterns. As such, the CJEU has to deal with one of the first EU-wide, large-scale use cases of predictive policing. If the court were to essentially approve of this paradigm shift, a radical expansion of technology-driven surveillance to all sorts of ordinary human behavior, regardless of individual prior suspicion or imminent threat, could ensue. In its national transposition law, the Belgian parliament, for instance, decided to expand the PNR regime’s scope of application beyond aviation to international trains, buses and ferries (Doc. parl., Chambre, 2015-2016, DOC 54-2069/001, p. 7).

 

The PNR Directive: An unprecedented tool of technology-driven mass surveillance

 

The PNR Directive obliges EU Member States to require air carriers to transmit a set of data for each passenger to national security authorities, so-called “Passenger Information Units” (PIUs). PNR (passenger name record) datasets contain unverified information provided by passengers to the airlines or to a travel agency in order to facilitate the processing of each flight. The exact content of these PNR datasets depends on the commercial needs of each airline. The data categories to be transmitted are defined in the PNR Directive’s Annex I. They encompass clearly defined data items, for example, date of birth, details of accompanying persons, travel itineraries, or baggage information, but also loosely defined items such as “general remarks” made by the aviation staff.

 

After their reception by the PIUs, PNR datasets are then automatically compared against databases “relevant for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime” (Art. 6 § 3 letter a), as well as against “pre-determined criteria” (Art. 6 § 3 letter b). The latter are used “to identify ‘unknown’ suspects” (EU Commission PNR Directive Proposal, SEC(2011) 132, p. 12). What exactly these pre-determined criteria are is not really defined in the PNR Directive’s text. Art. 6 § 4 only lays out that they ought to be “targeted, proportionate and specific”, as well as not based on “a person's race or ethnic origin, political opinions, religion or philosophical beliefs, trade union membership, health, sexual life or sexual orientation”. Their general purpose, however, is to extrapolate suspicious patterns from passengers’ flight behaviors. Research (Korff, Passenger Name Records, data mining & data protection: the need for strong safeguards, p. 4; Rademacher, Predictive Policing im deutschen Polizeirecht, AöR 2017, 366, 410-415; Sommerer, Personenbezogenes Predictive Policing, pp. 96-98), policy papers and the GFF generally discuss “pre-determined criteria” as a likely product of self-learning algorithms.

 

According to Art. 6 § 5, once a “hit” occurs – meaning an automatically generated match between a PNR dataset and either a database or pre-determined criteria – it is “reviewed by non-automated means”. Verified hits can then be transmitted to other national competent law enforcement authorities on a “case-by-case basis” (Art. 6 § 6), to other Member States (Art. 9) or, under additional conditions, to third countries (Art. 11). These authorities can then decide to take further action under their respective national law. The PNR Directive itself only requires the collection of PNR data for all flights entering and exiting the EU but recognizes Member States’ option to extend its scope to intra-EU flights. Almost all Member States proceeded to do just that. PNR data are stored for six months in raw form and, after that, for another four-and-a-half years in a (reversibly) pseudonymized form (Art. 12).

 

EU Member States scramble to defend the PNR Directive’s proportionality

 

As always, the hearing opened with statements by the parties, Member States and EU institutions. This turned out to be a rather one-sided affair, given that the plaintiff’s representative faced a united front of eleven delegations from Member States and EU institutions, all in favor of the PNR Directive. They all defended against the charge that the PNR Directive was disproportionate, mainly by providing anecdotal evidence of its usefulness and by characterizing the relevant interferences with fundamental rights as not particularly severe.

 

First of all, many Member States pointed to the CJEU’s recent Opinion 1/15 of 26 July 2017 on the Draft PNR Agreement between Canada and the European Union. In that opinion, they argued, the CJEU had not generally prohibited the mass retention of PNR data; neither did it, in principle, object to a five-year retention period. And were PNR data not much safer, they suggested, within the realms of the PNR Directive than in Canada, as data stored on European servers remain protected by the GDPR and other legal regimes? Member States’ representatives failed to mention, however, that the CJEU considered the Draft PNR Agreement to be incompatible with Articles 7, 8 and 21, and Article 52 § 1 CFR. It did so due to the Draft Agreement’s lack of clear and precise criteria, substantive and procedural, for the automated processing of PNR data. It also criticized the lack of a required link, such as new circumstances of threat, between the Agreement’s stated objective (averting terrorist threats) and the prolonged retention and processing of PNR data (n° 232). As pointed out above, the PNR Directive perpetuates this dissatisfying state of vagueness (see its Art. 6 § 4) and even extends it beyond the Draft Agreement’s scope of serious cross-border crimes to any crime of some (but not necessarily exceptional) gravity, such as fraud (Annex II, Number 7) or the facilitation of unauthorised entry and residence (Number 11).

 

Second, Member States pointed to evidence of the PNR Directive’s effectiveness in combatting crime. The Belgian government, for example, mentioned the interception of human trafficking victims at Belgian airports. The French government lauded the PNR Directive for enabling it to stop and arrest a person running an illegal prostitution ring who was en route to Bangkok, accompanied by several minors. The Polish government talked about detecting illegal imports of cigarettes from Poland to Germany, and about Ukrainian nationals attempting to use flights for illegal immigration. It was striking, however, that all this evidence remained purely anecdotal. Just like the EU Commission in its Evaluation Report, Member States never provided any detailed statistical evidence on the PNR Directive’s contribution to the prevention, detection, investigation or prosecution of terrorism or serious crime. It also remained unclear, throughout the hearing, whether and, if so, how many of the cases provided by Member States could have been detected by previously existing data processing methods.

 

Third, most Member States contended that the PNR Directive did not result in a particularly severe interference with the right to respect for private and family life (Article 7 CFR) and the right to the protection of personal data (Article 8 CFR). The PNR Directive, they opined, prohibited the processing of sensitive personal data (see Article 13 § 4 and Recital 37). Some Member States, like Germany, Ireland, Spain, and Cyprus, leaned into that claim by forcefully asserting that all pre-determined criteria in use were assembled by humans, not by self-learning algorithms. Their opening statements were characterized by emphatic renouncements of such algorithms as processing tools. The Dutch government even went so far as to claim that the PNR Directive prohibited the use of artificial intelligence or self-learning algorithms for the creation of pre-determined criteria. This assertion, however, appears implausible, to say the least. In academic literature, the PNR Directive has been seen as a blueprint for the use of self-learning algorithms (see references above). The Directive contains no explicit prohibition of artificial intelligence. The European Parliamentary Research Service mentions the PNR Directive in its report on the use of “Artificial Intelligence at EU borders” (pages 18-19). And artificial intelligence only works when fed with gigantic amounts of data, which is why Article 6 § 2 letter c of the PNR Directive allows “analysing PNR data for the purpose of updating or creating new criteria to be used in the assessments carried out” [through pre-determined criteria].

 

Judge von Danwitz’ questions

 

Much more interesting than the (quite repetitive) jubilant opening statements was the oral hearing’s second part: questions by the court. The majority of this part was characterized by an exchange between judge-rapporteur Professor von Danwitz and the EU Commission’s representative. Judge von Danwitz structured his questioning into four topics: The statistical reliability (or fallibility) of the PNR system (1), the severity of fundamental rights interferences produced by the PNR Directive (2), discriminatory effects of the PNR system (3), and the overall proportionality of the system (4).

 

(1) Judge von Danwitz began by referencing concerningly high false-positive rates mentioned in the Member States’ respective statements. In its Evaluation Report, the EU Commission writes on page 28 that, in 2019, “0.59% of all passengers whose data have been collected have been identified through automated processing as requiring further examination”. Only 0.11% of all data, however, were verified by humans and then transferred to law enforcement authorities. This suggests, as Judge von Danwitz emphasized, a false-positive rate of more than 81%. Moreover, it remains uncertain whether the remaining 19% of datasets were legitimately processed, thus making a definitive assessment of the full false-positive rate impossible. Referencing the COVID pandemic in questioning the PNR system’s suitability for its intended purpose, Judge von Danwitz quipped: “If a PCR test operated with a sensitivity of 19%, I doubt it would be welcomed with open arms” (the original words were spoken in French; translations are my own). While these numbers may intuitively feel small, consider this: when the German transposition law was adopted, the German government expected roughly 170 million yearly affected flight passengers (Gesetzesbegründung, BT-Drs. 18/11501, p. 23). This would mean that, in Germany alone, 187,000 people could be subjected to false automated suspicion every year.
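The arithmetic behind these figures can be reproduced in a few lines; the only inputs are the two percentages from the Evaluation Report and the German government’s 170 million passenger estimate cited above:

```python
# Shares reported in the EU Commission's Evaluation Report for 2019
flagged_share = 0.0059    # 0.59% of passengers flagged by automated processing
confirmed_share = 0.0011  # 0.11% confirmed by human review and forwarded

# Share of automated "hits" that did not survive human review
false_positive_rate = (flagged_share - confirmed_share) / flagged_share
print(f"false-positive rate: {false_positive_rate:.1%}")  # → 81.4%

# German government's estimate of yearly affected flight passengers;
# the 187,000 figure in the text corresponds to 0.11% of 170 million
passengers = 170_000_000
print(f"passengers at the 0.11% share: {passengers * confirmed_share:,.0f}")  # → 187,000
```

This confirms the “more than 81%” rate the judge-rapporteur cited, and shows how the 187,000 figure for Germany follows directly from the 0.11% share.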

 

In fact, these EU numbers are no anomaly. Similar false-positive rates have been reported by Member States: in a GFF case before the Administrative Court of Wiesbaden (docket number 6 K 806/19.WI), the Bundeskriminalamt, which functions as the German PIU, reported that 31,617,068 processed PNR datasets yielded 237,643 automatic matches. After human review, only 910 matches remained, a false-positive rate of 99.6%. Out of these 910 matches, 396 investigations came to nothing because the affected flight passengers were not in fact the wanted persons, leading to even more serious false suspicions. This rate, mind you, only pertains to database matches; one is left to imagine the error rate of the much more volatile matching procedure against pre-determined criteria.

 

As Judge von Danwitz pointed out, there is a statistical reason for such high rates: the base rate fallacy. The base rate fallacy denotes the phenomenon that, when one looks for very rare incidents in very large datasets, even extremely sophisticated detection tools will likely yield more false positives than true positives. Out of all EU flight passengers, extremely few will be actual terrorists or serious offenders; European law enforcement is looking for the proverbial needle in a haystack. Adding more hay to the stack will not allow them to find more needles, as needles remain just as rare and elusive.
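A toy calculation makes the base rate fallacy concrete. All numbers here are illustrative assumptions, not figures from the hearing:

```python
# Base rate fallacy with made-up numbers: suppose 100 genuine suspects
# among 100 million passengers, screened by a tool that is 99% accurate
# in both directions (far better than any real-world screener).
passengers = 100_000_000
suspects = 100                 # the "needles" in the haystack (assumed)
sensitivity = 0.99             # P(flagged | suspect) (assumed)
specificity = 0.99             # P(not flagged | innocent) (assumed)

true_positives = suspects * sensitivity                        # 99 suspects caught
false_positives = (passengers - suspects) * (1 - specificity)  # ~1 million innocents flagged

# Probability that a flagged passenger is actually a suspect
precision = true_positives / (true_positives + false_positives)
print(f"share of flags that are real suspects: {precision:.4%}")
```

Even with this unrealistically generous 99% accuracy, fewer than one in ten thousand flags would point to a real suspect; the roughly one million falsely flagged innocents swamp the 99 genuine hits, and adding more passengers to the dataset only worsens the ratio.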

 

Confronted with this criticism, the EU Commission pointed to the legislator’s limited responsibility for mathematical limits: “There are mathematical limits and errors, but the legislator is not required to conduct mathematical demonstrations.”

 

(2) Judge von Danwitz then proceeded to refute the Member States’ and EU institutions’ claim that, since the PNR Directive does not result in the processing of particularly sensitive data, it does not constitute particularly severe interferences with Articles 7 and 8 CFR. In so doing, he drew an explicit comparison with telecommunications data which were the subject of Digital Rights Ireland, another landmark CJEU decision on mass data retention. While acknowledging that telecommunications data may per se contain more sensitive information than passenger data, he pointed out that, when determining the severity of interferences, one must also take into account the scale and method of processing: Firstly, while Directive 2006/24/EC intended that the majority of telecommunications data be accessible but remain unscrutinized, the PNR Directive provides for the automated analysis of every single PNR dataset. Secondly, Judge von Danwitz emphasized that the deployment of data mining through self-learning algorithms intensified the severity of the interference (thus also casting aside the renouncements put forth by many Member States).

 

The EU Commission agreed with Judge von Danwitz’ analysis in principle (“Oui, certainement, la gravité de l’ingérence est déterminée dans la manière où le Data Mining se fait.”; “Yes, certainly, the severity of the interference is determined by the way the data mining is done.”) but contended that there were different degrees of data mining (“Il y a Data Mining et Data Mining.”; “There is data mining and there is data mining.”). They claimed that the safeguards included in the PNR Directive rendered the data mining at hand rather minor.

 

(3) Judge von Danwitz then turned to the lack of clear criteria for the individual review of automated matches by non-automated means, as per Article 6 § 5. He pointed out that this vagueness opened room for discrimination. When the EU Commission responded that discrimination was prohibited under Article 6 § 4, Judge von Danwitz replied that indirect discrimination certainly remained possible, which the EU Commission admitted (“un risque de discrimination indirecte existe toujours”; “a risk of indirect discrimination always exists”). Judge von Danwitz then asked why the EU legislator did not include more provisions in the text in order to mitigate the risk of indirect discrimination, given the extremely high false-positive rate of over 80%. The EU Commission responded that there were limits to the specificity that can reasonably be expected from any legislator (“tout législateur a des limites lorsque l’on doit réglementer une activité minutieuse et détaillée”; “every legislator has limits when one must regulate a meticulous and detailed activity”). Responding to this, the European Data Protection Supervisor proposed a reversal of the burden of proof as a possible solution, but this was rejected by the EU Commission because, according to them, that would presuppose that “everyone is automatically a victim”.

 

(4) Judge von Danwitz’ fourth line of questioning turned on the overall proportionality of the PNR system. His questions mainly focused on the (lack of a) link between the occasion for mass data retention – taking a flight – and the PNR Directive’s stated purpose – combatting terrorist offences and serious crime. The lack of clear criteria buttressing this link was one of the CJEU’s main sources of concern in its Opinion 1/15 (see n°217). The EU Commission responded that criminals specifically use the convenience of international air travel to orchestrate their crimes. Judge von Danwitz proceeded by pointing out that the notion that locations and behaviors suitable for crime should be subjected to mass surveillance was stretchable to an almost unlimited extent. “Why not rock concerts?”, he asked. “Why not museum visits?”. Surprisingly, the EU Commission basically agreed with him, saying that yes indeed, rock concerts could be prone to drug-related offenses (“I don’t have any police experience, but I could imagine that there could be much drug-related crime occurring at rock concerts.”).

 

Lingering doubts regarding the PNR system’s proportionality

 

By no means can this report paint a full picture of what was said during the oral hearing or of everything to be considered when assessing the mass surveillance of flight passengers. For example, other questions raised by Advocate General Pitruzzella concerned the vagueness around which databases would be “relevant” and could therefore be used for comparison (von Danwitz asked whether Facebook databases could be used), whether the unanimous extension of the PNR Directive’s scope to intra-EU flights was warranted or excessive, and whether the five-year retention period was disproportionate. Advocate General Pitruzzella also inquired about external oversight of pre-determined criteria and whether false positives were systematically used in order to improve algorithms (to which some Member States replied in the affirmative).

 

In my opinion, though, the hearing shone a light on the PNR Directive’s manifold constitutional weaknesses. The EU Commission and Member States did not succeed in dispelling my lingering doubts about its underdeterminacy, its questionable suitability and effectiveness, and its unchecked potential to produce large-scale discrimination. But chief among these weaknesses is the Directive’s sheer excessiveness: it is simply disproportionate to take ordinary human behavior and use it as an opportunity to unleash an unprecedented degree of technology-fueled surveillance upon hundreds of millions of European flight passengers, and to sift through mountains of useless, potentially discriminatory data, just to – maybe, kind of – detect a handful of criminals.

 

Photo credit: Juke Schweizer, via Wikimedia Commons

Friday, 4 August 2017

Transferring personal data outside the EU: Clarification from the ECJ?



Lorna Woods, Professor of Internet Law, University of Essex

Opinion 1/15 EU/Canada PNR Agreement, 26th July 2017

Facts

Canadian law required airlines, in the interests of the fight against serious crime and terrorism, to provide certain information about passengers (API/PNR data), an obligation which in turn required airlines, under EU data protection rules, to transfer data outside the EU.  The PNR data includes the names of air passengers, the dates of intended travel, the travel itinerary, and information relating to payment and baggage. The PNR data may reveal travel habits, relationships between two individuals, and information on the financial situation or the dietary habits of individuals. To regularise the transfer of data, and to support police cooperation, the EU negotiated an agreement with Canada specifying the data to be transferred and the purposes for which the data could be used, as well as some processing safeguard provisions (e.g. use of sensitive data, security obligations, oversight requirements, access by passengers).  The data was permitted to be retained for five years, albeit in a depersonalised form.  Further disclosure of the data beyond Canada and the Member States was permitted in limited circumstances.  The European Parliament requested an opinion from the Court of Justice under Article 218(11) TFEU as to whether the agreement satisfied fundamental human rights standards and whether the appropriate Treaty base had been used for the agreement.

Opinion

The Court noted that the agreement fell within the EU’s constitutional framework, and must therefore comply with its constitutional principles, including (though this point was not made express), respect for fundamental human rights (whether as a general principle or by virtue of the EU Charter – the EUCFR).

After dealing with questions of admissibility, the Court addressed the question of the appropriate Treaty base. It re-stated existing principles (elaborated, for example, in Case C-263/14 Parliament v Council, judgment of 14 June 2016, EU:C:2016:435) with regard to the choice of Treaty base generally: the choice must rest on objective factors (including the aim and the content of the measure) which are amenable to judicial review.  In this context the Court found that the proposed agreement has two objectives: safeguarding public security; and safeguarding personal data [opinion, para 90].  The Court concluded that the two objectives were inextricably linked: while the driver for the need for PNR data was the protection of public security, the transfer of data would be lawful only if data protection rules were respected [para 94].  Therefore, the agreement should be based on both Article 16(2) (data protection) and Article 87(2)(a) TFEU (police cooperation).  It held, however, that Article 82(1)(d) TFEU (judicial cooperation) could not be used, partly because judicial authorities were not included in the agreement.

Looking at the issue of data protection, the Court re-stated the question as being ‘on the compatibility of the envisaged agreement with, in particular, the right to respect for private life and the right to the protection of personal data’ [para 119].  It then commented that although both Article 16 TFEU and Article 8 EUCFR enshrine the right to data protection, in its analysis it would refer to Article 8 only, because that provision lays down in a more specific manner the conditions for data processing.  The agreement refers to the processing of data concerning identified individuals, and therefore may affect the fundamental right to respect for private life guaranteed in Article 7 EUCFR as well as the right to the protection of personal data in Article 8 EUCFR. The Court re-iterated a number of principles regarding the scope of the right to private life:

‘the communication of personal data to a third party, such as a public authority, constitutes an interference with the fundamental right enshrined in Article 7 of the Charter, whatever the subsequent use of the information communicated. The same is true of the retention of personal data and access to that data with a view to its use by public authorities. In this connection, it does not matter whether the information in question relating to private life is sensitive or whether the persons concerned have been inconvenienced in any way on account of that interference’ [para 124].

The transfer of PNR data and its retention and any use constituted an interference with both Article 7 [para 125] and Article 8 EUCFR [para 126]. In assessing the seriousness of the interference, the Court flagged ‘the systematic and continuous’ nature of the PNR system, the insight into private life of individuals, the fact that the system is used as an intelligence tool and the length of time for which the data is available.

Interferences with these rights may be justified.  Nonetheless, there are constraints on any justification: Article 8(2)  of the EU Charter specifies that processing must be ‘for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law’; and, according to Article 52(1) of the EU Charter, any limitation must be provided for by law and respect the essence of those rights and freedoms. Further, limitations must be necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others. 

Following WebMindLicenses (Case C‑419/14, judgment of 17 December 2015, EU:C:2015:832, para 81), the law that permits the interference should also set down the extent of that interference. Proportionality requires that any derogation from and limitation on the protection of personal data should apply only insofar as is strictly necessary. To this end and to prevent the risk of abuse, the legislation must set down ‘clear and precise rules governing the scope and application of the measure in question and imposing minimum safeguards’, specifically ‘indicat[ing] in what circumstances and under which conditions a measure providing for the processing of such data may be adopted’ [para 141], especially when automated processing is involved.

The Court considered whether there was a legitimate basis for the processing, noting that although passengers may be said to consent to the processing of PNR data, this consent related to a different purpose. The transfer of the PNR data is not conditional on the specific consent of the passengers and must therefore be grounded on some other basis, within the terms of Article 8(2) EUCFR. The Court rejected the Parliament’s submission that the meaning of ‘law’ be restricted to ‘legislative act’ internally. The Court, following the reasoning of the Advocate General, found that in this regard the international agreement was the external equivalent of the legislative act.

In line with its previous jurisprudence, the Court accepted that public security is an objective of public interest capable of justifying even serious interferences with Articles 7 and 8 EUCFR. It also noted that everybody has the right to security of the person (Art. 6 EUCFR), though this point was taken no further. The Court considered that PNR data revealed only limited aspects of a person’s private life, so that the essence of the right was not adversely affected [para 151]. In principle, limitation may then be possible. The Court accepted that PNR data transfer was appropriate, but not that the test of necessity was satisfied. It agreed with the Advocate General that the categories of data to be transferred were not sufficiently precise, specifically ‘available frequent flyer and benefit information (free tickets, upgrades, etc.)’, ‘all available contact information (including originator information)’ and ‘general remarks including Other Supplementary Information (OSI), Special Service Information (SSI) and Special Service Request (SSR) information’. Although the agreement required the Canadian authorities to delete any data transferred to them which fell outside these categories, this obligation did not compensate for the lack of precision regarding the scope of these categories.

The Court noted that the agreement identified a category of ‘sensitive data’; it was therefore to be presumed that sensitive data would be transferred under the agreement. The Court then reasoned:

any measure based on the premiss that one or more of the characteristics set out in Article 2(e) of the envisaged agreement may be relevant, in itself or in themselves and regardless of the individual conduct of the traveller concerned, having regard to the purpose for which PNR data is to be processed, namely combating terrorism and serious transnational crime, would infringe the rights guaranteed in Articles 7 and 8 of the Charter, read in conjunction with Article 21 thereof [para 165]

Additionally, any transfer of sensitive data would require a ‘precise and particularly solid’ reason beyond that of public security and prevention of terrorism. This justification was lacking. The transfer of sensitive data and the framework for the use of those data would be incompatible with the EU Charter [para 167].

While the agreement tried to limit the impact of automated decision-making, the Court found it problematic because of the need for reliable models on which the automated decisions would be based. These models, in the view of the Court, must produce results that identify persons under a ‘reasonable suspicion’ of participation in terrorist offences or serious transnational crime and should be non-discriminatory. Models and databases should also be kept up-to-date and accurate and be subject to review for bias. Because of the risk of error, all positive automated decisions should be individually checked.

In terms of the purposes for processing the data, the definitions of terrorist offences and serious transnational crime were sufficiently clear. There were, however, other provisions allowing case-by-case assessment. These provisions (Article 3(5)(a) and (b) of the treaty) were found to be too vague. By contrast, the Court determined that the authorities who would receive the data were sufficiently identified. Further, it accepted that the transfer of the data of all passengers, whether or not they were identified as posing a risk, does not exceed what is necessary, as passengers must comply with Canadian law and ‘the identification, by means of PNR data, of passengers liable to present a risk to public security forms part of border control’ [para 188].

Relying on its recent judgment in Tele2/Watson (Joined Cases C‑203/15 and C‑698/15, EU:C:2016:970), which I discussed here, the Court reiterated that there must be a connection between the data retained and the objective pursued for the duration of the time the data are held, which brought into question the use of the PNR data after passengers had disembarked in Canada.  Further, the use of the data must be restricted in accordance with those purposes. However,

where there is objective evidence from which it may be inferred that the PNR data of one or more air passengers might make an effective contribution to combating terrorist offences and serious transnational crime, the use of that data does not exceed the limits of what is strictly necessary [para 201].

Following verification of passenger data and permission to enter Canadian territory, the use of PNR data during passengers’ stay must be based on new justifying circumstances. The Court expected that this should be subject to prior review by an independent body. The Court held that the agreement did not meet the required standards. Similar points were made, even more strongly, in relation to the use of PNR data after the passengers had left Canada. In general, this was not strictly necessary, as there would no longer be a connection between the data and the objective pursued by the PNR Agreement such as to justify the retention of their data. PNR data may be stored in Canada, however, when particular passengers present a risk of terrorism or serious transnational crime. Moreover, given the average lifespan of international serious crime networks and the duration and complexity of investigations relating to them, the Court did not hold that the retention of data for five years went beyond the limits of necessity [para 209].

The agreement allows PNR data to be disclosed by the Canadian authority to other Canadian government authorities and to government authorities of third countries. The recipient country must satisfy EU data protection standards; an international agreement between the third country and the EU or an adequacy decision would be required. There is a further, unlimited and ill-defined possibility of disclosure to individuals ‘subject to reasonable legal requirements and limitations ... with due regard for the legitimate interests of the individual concerned’. This provision did not satisfy the necessity test.

To ensure that individuals’ rights to access their data and to have data rectified are protected, in line with Tele2/Watson, passengers must be notified of the transfer of their PNR data to Canada and of its use as soon as that information is no longer liable to jeopardise the investigations being carried out by the government authorities referred to in the envisaged agreement. In this respect, the agreement is deficient. While passengers are told that the data will be used for security checks and border control, they are not told whether their data has been used by the Canadian Competent Authority beyond use for those checks. While the Court accepted that the agreement provided passengers with a possible remedy, the agreement was deficient in that it did not guarantee in a sufficiently clear and precise manner that the oversight of compliance would be carried out by an independent authority, as required by Article 8(3) EUCFR.

Comment

There are lots of issues in this judgment, of interest from a range of perspectives, but its length and complexity mean it is not an easy read. Because of these characteristics, a blog post – even a lengthy one – can hardly do justice to all the issues, especially as in some instances it is hardly clear what the Court’s position is.

On the whole the Court follows the approach of its Advocate General, Mengozzi, on a number of points specifically referring back to his Opinion. There is, as seems increasingly to be the trend, heavy reliance on existing case law, and it is notable that the Court refers repeatedly to its ruling in Tele2/Watson. This may be a judicial attempt to suggest that Tele2/Watson was not an aberration and to reinforce its status as good law, if that were in any doubt. It also operates to create a body of surveillance law rulings that are hopefully consistent in underpinning principles and approach, and certainly some of the points in earlier case law are reiterated as regards the importance of ex ante review by independent bodies, rights of redress and the right of individuals to know that they have been subject to surveillance.

The case is of interest not only as regards mass surveillance but more generally in relation to Article 16(2) TFEU. It is also the first time an opinion has been given on a draft agreement considering its compatibility with human rights standards as well as the appropriate Treaty base. In this respect the judgment may be a little disappointing; certainly on Article 16, the Court did not go into the same level of detail as the AG’s opinion [AG114-AG120]. Instead it equated Article 16 TFEU to Article 8 EUCFR, and based its analysis on the latter provision.

As a general point, it is evident that the Court has adopted a detailed level of review of the PNR agreement. The outcome of the case has widely been recognised as having implications, as – for example – discussed earlier on this blog. Certainly, as the Advocate General noted, there is a possible impact on other PNR agreements [AG para 4], which relate to the same sorts of data shared for the same objectives. The EDPS made this point too, in the context of the EU PNR Directive:

Since the functioning of the EU PNR and the EU-Canada schemes are similar, the answer of the Court may have a significant impact on the validity of all other PNR instruments …. [Opinion 2/15, para 18]

There are other forms of data sharing agreement, for example SWIFT, the Umbrella Agreement, and the Privacy Shield (and other adequacy decisions), the last of which is coming under pressure in any event (DRI v Commission (T-670/16) and La Quadrature du Net and Others v Commission (T-738/16)). Note that in this context, there is not just the question of the safeguards for the protection of rights but also that of the Treaty base. The Court found that Article 16 must be used and that – because there was no role for judicial authorities, still less their cooperation – the use of Article 82(1)(d) is wrong. It has, however, been used as regards other PNR agreements, for example. This means that the basis for those agreements is thrown into doubt.

While the Court agreed with its Advocate General in suggesting that a double Treaty base was necessary given the inextricable linkage, there is some room to question this assumption. It could also be argued that there is a dominant purpose, as the primary purpose of the PNR agreement is to protect personal data, albeit with a different objective in view, that of public security. In the background, however, is the position of the UK, Ireland and Denmark and their respective ‘opt-outs’ in the field. A finding of a joint Treaty base made possible the argument of the Court that:

since the decision on the conclusion of the envisaged agreement must be based on both Article 16 and Article 87 TFEU and falls, therefore, within the scope of Chapter 5 of Title V of Part Three of the FEU Treaty in so far as it must be founded on Article 87 TFEU, the Kingdom of Denmark will not be bound, in accordance with Articles 2 and 2a of Protocol No 22, by the provisions of that decision, nor, consequently, by the envisaged agreement. Furthermore, the Kingdom of Denmark will not take part in the adoption of that decision, in accordance with Article 1 of that protocol. [para 113, see also para 115]

The position would, however, have been different had the agreement been found to be predominantly about data protection and therefore based on Article 16 TFEU alone.

Looking at the substantive issues, the Court clearly accepted the need for PNR to challenge the threat from terrorism, noting in particular that Article 6 of the Charter (the “right to liberty and security of person”) can justify the processing of personal data. While it accepted that this resulted in the systematic transfer of data on large numbers of people, we see no comments about mass surveillance. Yet, is this not similar to the ‘general and indiscriminate’ collection and analysis rejected by the Court in Tele2/Watson [para 97], which cannot be seen as automatically justified even in the context of the fight against terrorism [para 103 and 119]? Certainly, the EDPS took the view in its opinion on the EU PNR Directive that “the non-targeted and bulk collection and processing of data of the PNR scheme amount to a measure of general surveillance” [Opinion 1/15, para 63]. It may be that the difference is in the nature of the data; even if this is so, the Court does not make this argument. Indeed, it makes no argument but rather weakly accepts the need for the data. On this point, it should be noted that “the usefulness of large-scale profiling on the basis of passenger data must be questioned thoroughly, based on both scientific elements and recent studies” [Art. 29 WP Opinion 7/2010, p. 4]. In this respect, Opinion 1/15 is not as strong a stand as Tele2/Watson [c.f. para 105-106]; it seems that the Court was less emphatic about the significance of surveillance even than the Advocate General [AG 176].

In terms of justification, while the Court accepts that the transfer of data and its analysis may give rise to intrusion, it suggests that the essence of the right has not been affected. In this it follows the approach in the communications data cases.  It is unclear, however, what the essence of the right is; it seems that no matter how detailed a picture of an individual can be drawn from the analysis of data, the essence of the right remains intact.  If the implication is that where the essence of the right is affected then no justification for the intrusion could be made, a narrow view of essence is understandable.  This does not, however, answer the question of what the essence is and, indeed, whether the essence of the right is the same for Article 7 as for Article 8.  In this case, the Court has once again referred to both articles, without delineating the boundaries between them, but then proceeded to base its analysis mainly on Article 8.

In terms of the relationship between provisions, it is also unclear how Art 8(2) and Art 52 relate to one another. The Court bundles the requirements of these two provisions together, but they serve different purposes: Article 8(2) further elaborates the scope of the right; Article 52 deals with the limitations of Charter rights. Despite this, it seems that some of the findings regarding Article 52 will apply in the context of other rights. For example, in considering that an international agreement constitutes law for the purposes of the EUCFR, the Court took a broader approach to the meaning of ‘law’ than the Parliament had argued for. This, however, seems a sensible approach, avoiding undue formality.

One further point about the approach to interpreting exceptions to the rights and Article 52 can be made. It seems that the Court has not followed the Advocate General who had suggested that strict necessity should be understood in the light of achieving a fair balance [AG207].
 
Some specific points are worth highlighting. The Court held that sensitive data (information that reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, or information about a person’s health or sex life) should not be transferred. It is not clear how these categories of data should be interpreted, especially as regards proxies for sensitive data (e.g. food preferences may give rise to inferences about a person’s religious beliefs).

One innovation in the PNR context is the distinction the Court introduced between use of PNR data on entry, use while the traveller is in Canada, and use after the person has left, which perhaps mitigates the Court’s acceptance of undifferentiated surveillance of travellers.  The Court’s view of the acceptability of use in relation to this last category is the most stringent.  While the Court accepts the link between the processing of PNR data on arrival, after departure the Court expects that link to be proven, and absent such proof, there is no justification for the retention of data. Does this mean that on departure PNR data of persons who are not suspected of terrorism or transnational crime should be deleted at the point of their departure? Such a requirement surely gives rise to practical problems and would seem to limit the Court’s earlier acceptance of the use of general PNR data to verify/update computer models [para 198].

One of the weaknesses of the Court’s caselaw so far has been a failure to consider investigatory techniques, and whether all are equally acceptable. Here we see the Court beginning to consider the use of automated intelligence techniques. While the Court does not go into detail on all the issues to which predictive policing and big data might give rise, it does note that models must be accurate. It also refers to Article 21 EUCFR (discrimination). Since this section is phrased in general terms, it has potentially wide-reaching application, perhaps even beyond the public sector.

The Court’s judgment has further implications as regards the sharing of PNR and other security data with other countries besides Canada, most notably in the context of EU/UK relations after Brexit. Negotiators now have a clearer indication of what it will take for an agreement between the EU and a non-EU state to satisfy the requirements of the Charter, in the ECJ’s view. Time will tell what impact this ruling will have on the progress of those talks.

Barnard & Peers: chapter 25
JHA4: chapter II:9

Photo credit: ctvnews.ca

Thursday, 8 January 2015

Does the EU need more anti-terrorist legislation?


 

Steve Peers

In the wake of the appalling attacks in Paris two days ago, it only took 24 hours for the EU Commission to state that it would propose a new wave of EU anti-terrorist measures in a month’s time. It’s not yet known what the content of this law will be; but the very idea of new legislation is a profound mistake.

Of course, it was right for the EU institutions to express sympathy for the victims of the attack, and solidarity as regards defence of free speech. Equally, it would not be problematic to use existing EU anti-terrorism laws if necessary, in order (for instance) to surrender the suspects in this crime on the basis of a European Arrest Warrant (EAW), in the event that they fled to another Member State.  The question is whether the EU needs more such laws.

For the EU has already reacted to prior terrorist offences, first to 9/11 and then to the atrocities in Madrid and London in 2004 and 2005. The result is a huge body of anti-terrorism law, catalogued here by the SECILE project. This comprises not only measures specifically concerning terrorism (such as substantive criminal law measures, adopted in 2002 and amended in 2008), but many other measures which make it easier to cooperate as regards terrorism as well as other criminal offences, such as the EAW, the laws on exchange of police information and transmission of evidence across borders, and so on.

Moreover, there are proposals already under discussion which would apply to terrorism issues (among others), such as a new law on Europol, the EU’s police intelligence agency (discussed here), and proposed EU legislation on the transfer of airlines’ passenger name records (PNR).  

So what new laws is the Commission likely to propose? It may suggest a new version of the data retention Directive, the previous version of which was struck down by the Court of Justice of the European Union (CJEU) last spring, in the Digital Rights judgment (discussed here). Other ideas under discussion, according to leaked documents (see here and here), are new laws strengthening mandatory checks at borders.

Are any of these laws really necessary? Member States can already adopt laws on retention of communications data, pursuant to the EU’s e-privacy directive. As the European Parliament’s legal service has confirmed (see its advice here), if Member States adopt such measures, they will be subject to the constraints of the Digital Rights judgment, which bans mass surveillance carried out in the absence of safeguards to protect privacy. Equally, Member States are free to establish their own PNR systems, in the absence of any EU-wide measure (besides EU treaties with the USA, Canada and Australia on PNR). The question of whether mass surveillance is as such compatible with human rights has already been sent to the CJEU by the European Parliament, which has asked the Court to rule on this issue in the context of the EU/Canada PNR treaty (see discussion here).

It would be possible to adopt new laws calling for systematic border checks in specific cases. In practice, this would likely mean checks on Muslims who are returning after travel to places like Syria. It is questionable whether asking detailed further questions at the external borders will, by itself, really do a lot to prevent terrorism. After all, in the Paris attacks, it unfortunately proved impossible to prevent an apparent terrorist attack despite extensive anti-terrorist legislation on the books, and bodyguards protecting the staff of a known terrorist target.

There’s also a question of principle here. The Paris attacks were directed at free speech: the foundation of liberal democracy. Of course efforts should be stepped up to prevent such attacks from happening again; but existing laws already allow for targeted intelligence gathering and sharing. The Commission’s immediate response reeks of panic. And the direct attack on fundamental democratic principles this week in Paris is precisely the wrong context in which to consider new legislation curtailing other fundamental freedoms.

 

Barnard & Peers: chapter 25