Wednesday 28 April 2021

The Ola & Uber judgments: for the first time a court recognises a GDPR right to an explanation for algorithmic decision-making

Raphaël Gellert, Marvin van Bekkum, and Frederik Zuiderveen Borgesius

- Dr. Gellert is assistant professor of law at the iHub, Radboud University, The Netherlands (Twitter: @gellertraphael)

- Marvin van Bekkum is a PhD candidate at the iHub, Radboud University, The Netherlands

- Prof. Dr. Zuiderveen Borgesius is professor of ICT and law at the iHub, Radboud University, The Netherlands (frederikzb@cs.ru.nl)

 

In March 2021, the Amsterdam District Court ruled in two cases regarding Uber (the ‘Uber employment’ and ‘Uber deactivation’ cases) and one case regarding Ola (see also the unofficial English translations of the judgments). Ola offers a service comparable to Uber’s. Both companies offer an app that links (taxi) drivers to passengers.

In the Ola judgment, the Court requires Ola to explain the logic behind a fully automated decision in the sense of article 22 of the General Data Protection Regulation (GDPR). This is the first time that a court in the Netherlands recognises such a right. To the best of our knowledge, it is also the first time that a court anywhere in Europe recognises such a right.

In this blog post, we sketch the background of the three cases, summarise the relevant parts of the judgments, and comment on them. We focus only on the parts of the judgments concerning fully automated decisions and a right to an explanation.

Background of the case: the GDPR and a right to an explanation

The GDPR contains a specific provision that in principle prohibits a fully automated decision with ‘legal effects’ for the data subject (the individual), or that ‘similarly significantly affects him or her’ (article 22 GDPR). An example of such a fully automated decision is automated credit scoring. The main rule, in essence, is that people may not be subjected to certain types of completely automated decisions with far-reaching effects, unless an exception applies.

The prohibition does not apply if the individual has consented to such decisions, or if such decisions are ‘necessary’ for entering into, or performing, a contract between the individual and the company.

If such an exception applies, the automated decision is allowed. Article 15 GDPR grants people the right to learn ‘meaningful information about the logic involved’ in such fully automated decisions. Some scholars speak of a ‘right to an explanation’ of AI-driven decisions. Article 15 also grants people a right to access their data. In short, people can ask an organisation what data the organisation has about them, for which purpose, etc.

Summary of the Ola and Uber judgments: access to data

We start by summarising the similarities between the Ola and Uber employment cases. In both judgments, drivers wanted to prove that they were in an employment relationship with Ola and Uber respectively. To prove this, the drivers requested access to their data under article 15 GDPR (par. 2.5 Ola judgment, par. 2.7-2.8 Uber employment case).

The drivers further argued that the degree of algorithmic and automated management control is important for proving an employment relationship (par. 3.6 of the Ola judgment). A key element of the judgments was therefore whether such ‘algorithms and automated decision-making’ fall under the scope of article 22 GDPR. If article 22 applied, the drivers would also be able to access ‘meaningful information about the logic involved’ in these algorithms (par. 3.1 Ola judgment, par. 3.1 Uber employment judgment).

In both cases, the issue at stake was whether the algorithms and automated decision-making had ‘legal effects’ or did ‘similarly significantly affect’ the drivers in the sense of article 22 GDPR.

In the Uber employment case, the Court examined the algorithm-mediated matching of passengers and drivers. In the Court’s view, the drivers did not sufficiently substantiate that this matching had a ‘legal’ or ‘similarly significant’ effect in the sense of article 22 GDPR (par. 4.66 and 4.67 Uber employment judgment).

In the Ola case, the Court looked at various algorithms and automated decision-making processes, such as those pertaining to the drivers’ earning profile, the system for detecting irregularities, and the system for assigning trips. In the Court’s view, the drivers did not prove that these systems had a ‘legal’ or ‘similarly significant effect’, even though the systems had some effect on the drivers’ behaviour (par. 4.47-4.50 Ola judgment).

The situation is different for Ola’s automated system of ‘penalties and deductions’ (par. 4.51 of the Ola judgment). If a ride was considered invalid, Ola’s computer systems would impose a monetary penalty on the driver. The Court considered that such penalties ‘similarly significantly affect’ the drivers. The penalties were significant because they affected the rights of the drivers under their agreement with Ola. Therefore, the Court required Ola to explain the logic behind such decisions on the basis of article 15 GDPR.

In the words of the Court, ‘Ola must communicate the main assessment criteria and their role in the automated decision to [the drivers], so that they can understand the criteria on the basis of which the decisions were taken and they are able to check the correctness and lawfulness of the data processing’ (par. 4.52 of the judgment).

Summary of the Uber deactivation judgment

In the Uber deactivation case, the drivers contested the removal of their Uber licences pursuant to an automated decision (par. 2.4, 3.1, 3.2 Uber deactivation judgment). As part of this challenge, the drivers also requested access to meaningful information about the logic involved in the automated decision pursuant to article 15 GDPR (par. 3.1 Uber deactivation judgment).

In contrast to the other cases, the discussion here mostly concerned whether the decision was ‘solely’ (i.e. fully) automated in the sense of article 22 GDPR. Uber explained that an ‘Operational Risk team’ takes the decision to end the licences on the basis of a potential fraud signal it receives from Uber’s automated algorithm (par. 4.19 Uber deactivation judgment).

Under Dutch civil procedure law, a statement by one party that is not contested by the opposing party is considered proven. The Court accepted Uber’s explanation because the drivers did not contest it. The Court therefore concluded that there were no fully automated decisions (par. 4.24 Uber deactivation judgment). Consequently, the Court denied the drivers access to meaningful information about the algorithm pursuant to article 15 GDPR (par. 4.26 Uber deactivation judgment).

Comments

In the Ola case, for the first time, a Court requires an organisation to explain the logic behind a fully automated decision in the sense of the GDPR. Many scholars (including us) thought that the GDPR provisions on automated decision-making and a right to an explanation would remain a dead letter. The predecessors of those provisions (in the 1995 Data Protection Directive) have not been applied much either.

This recent Ola judgment shows, however, that courts can actually apply these GDPR provisions in practice. The judgment thus gives organisations an extra reason to take the GDPR provisions on automated decision-making seriously: organisations that use fully automated decision-making that seriously affects people must be able to explain the logic behind such decisions.

In the Ola judgment, the Court elaborates on the term ‘meaningful information’. The Court builds on the ‘Guidelines on Automated individual decision-making and Profiling’, adopted by the Article 29 Working Party, the predecessor of the European Data Protection Board.

The Court interprets ‘meaningful information about the logic involved’ in such a way that the most important assessment criteria and their role must be communicated to the data subject. Based on that information, the data subject should be able to understand which criteria the decision is based on. The data subject should also be able to verify the correctness and lawfulness of the data processing on the basis of the information provided (par. 4.41 Ola judgment).

If a decision is automated in the sense of article 22 GDPR and an exception applies that allows that automated decision, then another requirement follows. Article 22(3) GDPR states that the organisation must ‘implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.’

Roughly summarised, the organisation must ensure that the person subject to a fully automated decision can ask a human to reconsider that decision. For instance, if a bank uses a computer to decide whether a customer gets a mortgage, the customer must be able to ask a bank employee to reconsider the decision. Because the Court ruled that Ola used automated decision-making in some cases, Ola probably needs to implement a system that allows human intervention.

GDPR is about more than privacy

The Ola judgment illustrates that the GDPR does not only aim to protect privacy. Rather, the GDPR aims for fairness in general in situations where organisations use personal data. For instance, the GDPR also aims to mitigate the risk of discrimination. Indeed, in the Uber employment case, the Court stated that the GDPR is key to avoiding ‘the discriminatory consequences of profiling’ (par. 3.3 Uber employment case). In the cases at hand, the drivers used the GDPR to contest the unfairness of a licence-removal decision and to expose the power that platform economy apps have over drivers.

The Dutch judgments discussed above are all from Courts of first instance. Hence, parties may still appeal the judgments.

Open questions

There are still many open questions about the GDPR’s provisions regarding fully automated decisions and a right to an explanation. For instance, article 22 GDPR applies to decisions ‘based solely on automated processing’. It is debatable to what extent the provision applies to decisions that are largely, rather than ‘solely’, based on automated processing. In the Uber deactivation case, a whole team took the decisions, so the case was clear-cut: the decisions were not solely automated.

Arguably, article 22 does not apply if a bank employee denies a loan on the basis of a recommendation by an AI system. On the other hand, the European Data Protection Board says (at p. 21) that a decision still counts as a fully automated decision if employees merely rubberstamp the automated output. It would be useful if case law made clearer where the border lies between fully automated decisions on the one hand, and partly automated decisions that remain outside the scope of article 22 GDPR on the other.

More clarity is also needed on what constitutes a sufficient explanation under the GDPR. For many AI-driven decisions, it is difficult to explain the underlying logic. Explaining a decision can be especially difficult when an AI system arrives at that decision after analysing large amounts of data.


Photo credit: Ilya Plekahnov, via Wikimedia Commons
