Friday, 17 September 2021

On Flights, Rock Concerts and the Needle in a Haystack: A report from the Court of Justice of the European Union’s oral hearing on the PNR Directive


Christian Thönnes, research assistant at the Department of Public Law of the Max Planck Institute for the Study of Crime, Security and Law in Freiburg.


13 July 2021 was a potentially fateful day for the balance of privacy and security in the European Union. The Court of Justice of the European Union (CJEU) held an oral hearing in its preliminary ruling procedure C-817/19. Following a legal challenge brought by the Belgian NGO Ligue des Droits Humains, the Constitutional Court of Belgium had submitted ten preliminary questions to the CJEU (arrêt n° 135/2019 du 17 octobre 2019). These questions concern the interpretation of Directive (EU) 2016/681 of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime (in short: the PNR Directive) and its compatibility with EU primary law. The hearing offered important insights into the court’s thinking. The judge-rapporteur, Judge von Danwitz, in particular asked the EU Commission many critical questions which do not appear to portend good news for proponents of this unprecedented surveillance measure. Having worked on a similar case for the Berlin-based strategic litigation NGO Gesellschaft für Freiheitsrechte (GFF), I attended the oral hearing. This entry provides a brief summary and analysis.

 

This report hopefully encapsulates the high stakes of the case in question: the PNR preliminary ruling procedure could result in a landmark decision, laying the doctrinal groundwork not only for the mass retention of travel data in the name of security generally, but more specifically for the deployment of self-learning algorithms to pre-emptively detect presumptively suspicious movement patterns. As such, the CJEU has to deal with one of the first EU-wide, large-scale use cases of predictive policing. If the court were to essentially approve of this paradigm shift, a radical expansion of technology-driven surveillance to all sorts of ordinary human behavior, regardless of individual prior suspicion or imminent threat, could ensue. In its national transposition law, the Belgian parliament, for instance, decided to expand the PNR Directive’s scope of application beyond aviation to international trains, buses and ferries (Doc. parl., Chambre, 2015-2016, DOC 54-2069/001, p. 7).

 

The PNR Directive: An unprecedented tool of technology-driven mass surveillance

 

The PNR Directive obliges EU Member States to require air carriers to transmit a set of data for each passenger to national security authorities, the so-called “Passenger Information Units” (PIUs). PNR (passenger name record) datasets contain unverified information provided by passengers to the airlines or to a travel agency in order to facilitate the processing of each flight. The exact content of these PNR datasets depends on the commercial needs of each airline. The data categories to be transmitted are defined in the PNR Directive’s Annex I. They encompass clearly-defined data items, for example, date of birth, details of accompanying persons, travel itineraries, or baggage information, but also loosely-defined items such as “general remarks” made by the aviation staff.
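To make the breadth of these categories more concrete, here is a minimal sketch of what a single PNR record, reduced to the items named above, might look like as a data structure. Annex I prescribes data categories, not a technical format, so all field names and types below are purely hypothetical:

```python
from dataclasses import dataclass, field
from typing import Optional

# Purely illustrative: Annex I lists data categories, not a schema.
@dataclass
class PNRRecord:
    name: str                                 # as supplied at booking, unverified
    date_of_birth: str
    itinerary: list[str]                      # e.g. ["BRU", "BKK"]
    accompanying_persons: list[str] = field(default_factory=list)
    baggage_info: Optional[str] = None        # e.g. "2 checked bags, 40 kg"
    general_remarks: Optional[str] = None     # free-text notes by airline staff
```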

 

After their reception by the PIUs, PNR datasets are automatically compared against databases “relevant for the purposes of preventing, detecting, investigating and prosecuting terrorist offences and serious crime” (Art. 6 § 3 letter a), as well as against “pre-determined criteria” (Art. 6 § 3 letter b). The latter are used “to identify ‘unknown’ suspects” (EU Commission PNR Directive Proposal, SEC(2011) 132, p. 12). What exactly these pre-determined criteria are is not defined in the PNR Directive’s text. Art. 6 § 4 only lays out that they ought to be “targeted, proportionate and specific”, as well as not based on “a person's race or ethnic origin, political opinions, religion or philosophical beliefs, trade union membership, health, sexual life or sexual orientation”. Their general purpose, however, is to extrapolate suspicious patterns from passengers’ flight behaviors. Research (Korff, Passenger Name Records, data mining & data protection: the need for strong safeguards, p. 4; Rademacher, Predictive Policing im deutschen Polizeirecht, AöR 2017, 366, 410-415; Sommerer, Personenbezogenes Predictive Policing, pp. 96-98), policy papers and the GFF generally discuss “pre-determined criteria” as a likely product of self-learning algorithms.
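In code, the two-pronged screening of Art. 6 § 3 might be pictured roughly as follows, reusing the illustrative PNRRecord above. The watchlist contents and the example criterion are entirely invented; the Directive specifies neither:

```python
# Hypothetical sketch of the two screening prongs of Art. 6 § 3.
WATCHLIST = {"Wanted Person A", "Wanted Person B"}  # invented stand-in for
                                                    # the "relevant" databases

def matches_database(record: PNRRecord) -> bool:
    # Letter a: comparison against databases deemed "relevant"
    return record.name in WATCHLIST

def matches_predetermined_criteria(record: PNRRecord) -> bool:
    # Letter b: a crude, made-up pattern rule of the kind that critics fear
    # could one day be generated by self-learning algorithms
    return len(record.itinerary) > 3 and not record.accompanying_persons

def automated_hit(record: PNRRecord) -> bool:
    # Any hit must subsequently be "reviewed by non-automated means"
    # (Art. 6 § 5, discussed below)
    return matches_database(record) or matches_predetermined_criteria(record)
```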

 

According to Art. 6 § 5, once a “hit” occurs – meaning an automatically generated match between a PNR dataset and either a database or pre-determined criteria – it is “reviewed by non-automated means”. Verified hits can then be transmitted to other national competent law enforcement authorities on a “case-by-case basis” (Art. 6 § 6), to other Member States (Art. 9) or, under additional conditions, to third countries (Art. 11). These authorities can then decide to take further action under their respective national law. The PNR Directive itself only requires the collection of PNR data for all flights entering and exiting the EU, but recognizes Member States’ option to extend its scope to intra-EU flights. Almost all Member States proceeded to do just that. PNR data are stored for six months in raw form and, after that, for another four-and-a-half years in a (reversibly) pseudonymized form (Art. 12).

 

EU Member States scramble to defend the PNR Directive’s proportionality

 

As always, the hearing began with opening statements by the parties, Member States and EU institutions. This turned out to be a rather one-sided affair, given that the plaintiff’s representative faced a united front of eleven delegations from Member States and EU institutions, all in favor of the PNR Directive. They all defended against the charge that the PNR Directive was disproportionate mainly by providing anecdotal evidence of its usefulness and by characterizing the relevant interferences with fundamental rights as not particularly severe.

 

First of all, many Member States pointed to the CJEU’s recent Opinion 1/15 of 26 July 2017 on the Draft PNR Agreement between Canada and the European Union. In that opinion, they argued, the CJEU had not generally prohibited the mass retention of PNR data; neither did it, in principle, object to a five-year retention period. And, they suggested, were PNR data not much safer within the realm of the PNR Directive than in Canada, given that data stored on European servers remain protected by the GDPR and other legal regimes? Member States’ representatives failed to mention, however, that the CJEU considered the Draft PNR Agreement to be incompatible with Articles 7, 8, 21 and 52 § 1 CFR. It did so due to the Draft Agreement’s lack of clear and precise criteria, substantive and procedural, for the automated processing of PNR data. It also criticized the lack of a required link, such as new circumstances of threat, between the Agreement’s stated objective (averting terrorist threats) and the prolonged retention and processing of PNR data (n° 232). As pointed out above, the PNR Directive perpetuates this unsatisfactory state of vagueness (see its Art. 6 § 4) and even extends its scope beyond the Draft Agreement’s serious cross-border crimes to any crime of some (but not necessarily exceptional) gravity, such as fraud (Annex II, number 7) or the facilitation of unauthorised entry and residence (number 11).

 

Second, Member States pointed to evidence of the PNR Directive’s effectiveness in combatting crime. The Belgian government, for example, mentioned the interception of human trafficking victims at Belgian airports. The French government lauded the PNR Directive for enabling it to stop and arrest a person running an illegal prostitution ring who was en route to Bangkok, accompanied by several minors. The Polish government talked about detecting illegal imports of cigarettes from Poland to Germany, and about Ukrainian nationals attempting to use flights for illegal immigration. It was striking, however, that all this evidence remained purely anecdotal. Just like the EU Commission in its Evaluation Report, Member States never provided any detailed statistical evidence on the PNR Directive’s contribution to the prevention, detection, investigation or prosecution of terrorism or serious crime. It also remained unclear, throughout the hearing, whether and, if so, how many of the cases cited by Member States could have been detected by previously existing data processing methods.

 

Third, most Member States contended that the PNR Directive did not result in a particularly severe interference with the right to respect for private and family life (Article 7 CFR) and the right to the protection of personal data (Article 8 CFR). The PNR Directive, they opined, prohibited the processing of sensitive personal data (see Article 13 § 4 and Recital 37). Some Member States, like Germany, Ireland, Spain, and Cyprus, leaned into that claim by forcefully asserting that all pre-determined criteria in use were assembled by humans, not by self-learning algorithms. Their opening statements were characterized by emphatic renouncements of such algorithms as processing tools. The Dutch government even went so far as to claim that the PNR Directive prohibited the use of artificial intelligence or self-learning algorithms for the creation of pre-determined criteria. This assertion, however, appears implausible, to say the least. In academic literature, the PNR Directive has been seen as a blueprint for the use of self-learning algorithms (see references above). The Directive contains no explicit prohibition of artificial intelligence. The European Parliamentary Research Service mentions the PNR Directive in its report on the use of “Artificial Intelligence at EU borders” (pages 18-19). Artificial intelligence only works when fed with gigantic amounts of data. That is why Article 6 § 2 letter c of the PNR Directive allows “analysing PNR data for the purpose of updating or creating new criteria to be used in the assessments carried out” [through pre-determined criteria].

 

Judge von Danwitz’ questions

 

Much more interesting than the (quite repetitive) jubilant opening statements was the oral hearing’s second part: questions by the court. The majority of this part consisted of an exchange between judge-rapporteur von Danwitz and the EU Commission’s representative. Judge von Danwitz structured his questioning into four topics: the statistical reliability (or fallibility) of the PNR system (1), the severity of the fundamental rights interferences produced by the PNR Directive (2), the discriminatory effects of the PNR system (3), and the overall proportionality of the system (4).

 

(1) Judge von Danwitz began by referencing the concerningly high false-positive rates mentioned in the Member States’ respective statements. In its Evaluation Report, the EU Commission writes on page 28 that, in 2019, “0.59% of all passengers whose data have been collected have been identified through automated processing as requiring further examination”. Only 0.11% of all data, however, were verified by humans and then transferred to law enforcement authorities. This suggests, as Judge von Danwitz emphasized, a false-positive rate of more than 81%. Moreover, it remains uncertain whether the remaining 19% of datasets were legitimately processed, making a definitive assessment of the full false-positive rate impossible. Referencing the COVID pandemic in questioning the PNR system’s suitability for its intended purpose, Judge von Danwitz quipped: “If a PCR test operated with a sensitivity of 19%, I doubt it would be welcomed with open arms” (the original words were spoken in French; translations are my own). While these numbers may intuitively feel small, consider this: when the German transposition law was adopted, the German government expected roughly 170 million affected flight passengers per year (Gesetzesbegründung, BT-Drs. 18/11501, p. 23). This would mean that, in Germany alone, 187,000 people could be subjected to false automated suspicion every year.
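The arithmetic behind these figures is easy to retrace; here is a short sketch, using only the percentages from the Evaluation Report and the passenger estimate from the German legislative materials cited above:

```python
# Figures as cited above: Evaluation Report, p. 28 (for 2019).
flagged_rate = 0.0059      # 0.59% of passengers flagged by automated processing
transferred_rate = 0.0011  # 0.11% confirmed by humans and passed on

# Share of automated matches discarded after human review:
discarded = 1 - transferred_rate / flagged_rate
print(f"{discarded:.1%}")  # -> 81.4%, the "more than 81%" mentioned above

# Extrapolation to the ~170 million yearly passengers expected for Germany:
print(f"{170_000_000 * transferred_rate:,.0f}")  # -> 187,000
```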

 

In fact, these EU-wide numbers are no anomaly. Similar false-positive rates have been reported by Member States: in a GFF case before the Administrative Court of Wiesbaden (docket number 6 K 806/19.WI), the Bundeskriminalamt, which functions as the German PIU, reported that 31,617,068 processed PNR datasets yielded 237,643 automatic matches. After human review, only 910 matches remained, which amounts to a false-positive rate of 99.6%. Out of these 910 matches, 396 investigations came to nothing because the affected flight passengers were not in fact the wanted persons – leading to even more serious false suspicions. This rate, mind you, only pertains to database matches – one is left to imagine the error rate of the much more volatile matching procedure against pre-determined criteria.

 

As Judge von Danwitz pointed out, there is a statistical reason for such high rates: the base rate fallacy. The base rate fallacy denotes the phenomenon that, when one looks for very rare incidents in very large datasets, even extremely sophisticated detection tools will likely yield more false positives than true positives. Out of all EU flight passengers, extremely few will be actual terrorists or serious offenders – European law enforcement is looking for the proverbial needle in a haystack. Adding more hay to the stack will not allow them to find more needles – needles remain just as rare and elusive.
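A quick Bayes computation makes the point concrete. The base rate, sensitivity and specificity below are invented for illustration, and deliberately far more flattering than anything reported above; even then, nearly every hit is a false positive:

```python
# Illustrative base rate fallacy computation; all three inputs are assumptions.
base_rate = 1 / 100_000  # assume 1 in 100,000 passengers is actually a target
sensitivity = 0.99       # assume the tool flags 99% of actual targets
specificity = 0.999      # assume it clears 99.9% of innocent passengers

true_pos = base_rate * sensitivity
false_pos = (1 - base_rate) * (1 - specificity)

# Probability that a flagged passenger is actually a target:
ppv = true_pos / (true_pos + false_pos)
print(f"{ppv:.1%}")  # -> about 1.0%: roughly 100 false positives per true hit
```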

 

Confronted with this criticism, the EU Commission pointed to the limits of the legislator’s responsibility for mathematical shortcomings: "There are mathematical limits and errors, but the legislator is not required to conduct mathematical demonstrations."

 

(2) Judge von Danwitz then proceeded to refute the Member States’ and EU institutions’ claim that, since the PNR Directive does not result in the processing of particularly sensitive data, it does not constitute a particularly severe interference with Articles 7 and 8 CFR. In so doing, he drew an explicit comparison with telecommunications data, which were the subject of Digital Rights Ireland, another landmark CJEU decision on mass data retention. While acknowledging that telecommunications data may per se contain more sensitive information than passenger data, he pointed out that, when determining the severity of interferences, one must also take into account the scale and method of processing: firstly, while Directive 2006/24/EC intended that the majority of telecommunications data be accessible but remain unscrutinized, the PNR Directive provides for the automated analysis of every single PNR dataset. Secondly, Judge von Danwitz emphasized that the deployment of data mining through self-learning algorithms intensified the severity of the interference (thus also casting aside the renouncements put forth by many Member States).

 

The EU Commission agreed with Judge von Danwitz’ analysis in principle (“Yes, certainly, the severity of the interference is determined by the manner in which the data mining is carried out.”) but contended that there were different degrees of data mining (“There is data mining, and then there is data mining.”). It claimed that the safeguards included in the PNR Directive rendered the data mining at hand rather minor.

 

(3) Judge von Danwitz then turned to the lack of clear criteria for the individual review of automated matches by non-automated means, as per Article 6 § 5. He pointed out that this vagueness opened room for discrimination. When the EU Commission responded that discrimination was prohibited under Article 6 § 4, Judge von Danwitz replied that indirect discrimination certainly remained possible – which the EU Commission admitted (“a risk of indirect discrimination always exists”). Judge von Danwitz then asked why the EU legislator did not include more provisions in the text to mitigate the risk of indirect discrimination – given the extremely high false-positive rate of over 80%. The EU Commission responded that there were limits to the specificity that can reasonably be expected from any legislator (“every legislator reaches its limits when it has to regulate a meticulous and detailed activity”). Responding to this, the European Data Protection Supervisor proposed a reversal of the burden of proof as a possible solution, but this was rejected by the EU Commission because, according to them, it would presuppose that “everyone is automatically a victim”.

 

(4) Judge von Danwitz’ fourth line of questioning turned on the overall proportionality of the PNR system. His questions mainly focused on the (lack of a) link between the occasion for mass data retention – taking a flight – and the PNR Directive’s stated purpose – combatting terrorist offences and serious crime. The lack of clear criteria buttressing this link was one of the CJEU’s main sources of concern in its Opinion 1/15 (n° 217). The EU Commission responded that criminals specifically use the convenience of international air travel to orchestrate their crimes. Judge von Danwitz proceeded to point out that the notion that locations and behaviors suitable for crime should be subjected to mass surveillance could be stretched almost without limit. “Why not rock concerts?”, he asked. “Why not museum visits?”. Surprisingly, the EU Commission basically agreed with him, saying that yes indeed, rock concerts could be prone to drug-related offenses (“I don’t have any police experience, but I could imagine that there could be much drug-related crime occurring at rock concerts.”).

 

Lingering doubts regarding the PNR system’s proportionality

 

By no means can this report paint a full picture of what was said during the oral hearing or of everything that must be considered when assessing the mass surveillance of flight passengers. For example, other questions raised by Advocate General Pitruzzella concerned the vagueness around which databases would be “relevant” and could therefore be used for comparison (Judge von Danwitz asked whether Facebook databases could be used), whether the unanimous extension of the PNR Directive’s scope to intra-EU flights was warranted or excessive, and whether the five-year retention period was disproportionate. Advocate General Pitruzzella also inquired about external oversight of pre-determined criteria and whether false positives were systematically used to improve algorithms (to which some Member States replied in the affirmative).

 

In my opinion, though, the hearing shone a light on the PNR Directive’s manifold constitutional weaknesses. The EU Commission and Member States did not succeed in dispelling my lingering doubts about its underdeterminacy, its questionable suitability and effectiveness, and its unchecked potential to produce large-scale discrimination. But chief among these weaknesses is the Directive’s sheer excessiveness: it is simply disproportionate to take ordinary human behavior and use it as an occasion to unleash an unprecedented degree of technology-fueled surveillance upon hundreds of millions of European flight passengers, sifting through mountains of useless, potentially discriminatory data, just to – maybe, kind of – detect a handful of criminals.

 

Photo credit: Juke Schweizer, via Wikimedia Commons
