
Friday, 14 July 2023

Is the UK data protection authority giving big tech giants a free pass?


 


Asress Adimi Gikay (PhD), Senior Lecturer in AI, Disruptive Innovation and Law (Brunel University London)

Photo credit: howtostartablogonline.net 

In the online space, it is hard to find an emptier promise than “we value your privacy.” Businesses promise to preserve our data privacy rights, but in reality there is neither enough carrot nor enough stick to make them respect data protection rules. This holds true even in the European Union (EU), where the most comprehensive data protection legislation—the General Data Protection Regulation (GDPR)—has failed to satisfactorily deliver on its promise to protect the fundamental rights of citizens. As businesses openly flout data privacy laws, regulators either struggle to enforce the law adequately or wilfully ignore infractions.

The UK’s data protection authority—the Information Commissioner's Office (ICO)—has succumbed most fully to its ambition of promoting innovation and economic growth while simultaneously protecting data protection rights. Unfortunately, the drive to appeal to businesses has reduced data privacy rights to mere buzzwords, not just in the business world but also within the ICO itself.

As a result, the authority's enforcement record defies its primary objective of protecting the public's data privacy rights, displaying an inexplicable leniency towards corporations. I argue that this indefensible record underscores the ICO’s insistence on operating with a failed enforcement policy.

The ICO’s enforcement track record—the numbers don’t lie

During the 2021–2022 fiscal year, the ICO reported receiving 35,558 data privacy violation complaints. The complaints were diverse, including companies refusing to delete individuals’ personal data or processing their data without consent. In some cases, organizations infringed individuals’ right to access their own personal data, contrary to what data protection legislation requires.

Similarly, in the 2022–2023 financial year, a total of 27,130 complaints were filed with the ICO, excluding data from the most recent financial quarter, which the authority had yet to report. Out of the 62,688 complaints filed over the two years, the authority levied only 59 monetary penalties. This means that only approximately 0.094% of complaints led to real consequences—organizations being sanctioned for breaching data protection rules.
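That penalty rate follows directly from the figures above; a quick check of the arithmetic:

```python
# Complaints reported by the ICO in the two financial years cited above.
complaints_2021_22 = 35_558
complaints_2022_23 = 27_130  # excludes the most recent, still-unreported quarter

total_complaints = complaints_2021_22 + complaints_2022_23
monetary_penalties = 59

# Share of complaints that ended in a monetary penalty.
penalty_rate = monetary_penalties / total_complaints * 100

print(total_complaints)        # 62688
print(f"{penalty_rate:.3f}%")  # 0.094%
```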

The ICO closed most complaints on the grounds of insufficient information to proceed or lack of evidence of an infraction. It resolved numerous other cases through discussions with the infringing companies. In such cases, the authority acknowledges an infringement by the organization but does nothing more concrete than what it describes as “informal action taken.”

Because the ICO does not disclose comprehensive details about these cases, beyond summaries that serve mainly statistical purposes, the public tends to perceive the authority as prioritizing business interests over safeguarding data privacy rights. Notably, this public perception aligns with the available evidence.

The broader context

The enforcement of the GDPR has been unsatisfactory across the EU since the implementation of what has been described as a breakthrough law that promised to empower people in the digital world by giving them more control over their personal data. Yet even by that forgiving standard, the ICO's enforcement record stands out. Between 2018 and 2022, it levied around 50 monetary penalties, while the German and Italian authorities imposed 606 and 228 penalties respectively between 2018 and 2021.

The ICO is generally passive compared to its European counterparts. In a notable case, the French authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), fined Meta and Google €60 million and €150 million respectively in 2021 for their illegal use of cookies. Despite engaging in similar unlawful data collection practices in the UK, the companies changed their cookie-based data collection practices there only as a by-product of complying with the French ruling. They faced no threat of sanction in the UK.

The ICO's consistently poor enforcement record clearly undermines public confidence in the authority. In its 2022 annual report, the authority itself acknowledged receiving the lowest score for complaint resolution in a 2021 customer survey it commissioned. On the independent review site Trustpilot, the authority is rated 1.1 out of 5, based on self-initiated reviews by members of the public, some of whom claim that the ICO prioritizes business interests over protecting privacy rights.

Unfit enforcement policy—a corporate free pass

The lack of adequate data protection law enforcement in the EU has been explained by resource constraints. For example, a report by the Dutch ombudsman highlighted that the relevant authority in the country had 9,800 unresolved privacy complaints at the end of 2020. And according to the Irish Council for Civil Liberties, “almost all (98%) major GDPR cases referred to Ireland remain unresolved”—in part due to lack of budget and sufficient specialist staff.

However, the ICO is a comparatively well-resourced authority, and it can impose substantial fines that could finance its operations. So resource constraints are unlikely to explain its inadequate enforcement record. The ICO’s enforcement policy is largely to blame.

The authority’s risk-based approach favours softer routes to compliance, reserving enforcement action for violations likely to pose the highest risk of harm to the public. Enforcement action includes requiring an offending organization to end violations and comply with the relevant rules through an enforcement notice, and issuing monetary penalties. The ICO considers several factors in determining whether a penalty is appropriate, including the intentional or repeated nature of the breach, the degree of harm to the public, and the number of people impacted.

In practice, however, the authority exercises discretion even in cases of intentional and repeated violations impacting millions of people. For example, numerous companies illegally collect consumers’ personal data using cookies.

By tracking a user's browsing behaviour, third-party cookies, known as tracking cookies, typically gather enough information to identify the person behind a device. Besides visits to particular web pages, they can record a person’s search queries, goods or services purchased, IP address and location.

From this, it is possible to infer a person's name, nationality, language, religion, sexual orientation, health condition and other intimate details – most of which are considered special categories of personal data. These types of data cannot be processed without the individual's explicit consent, unless limited exceptions apply. While such data could be used, for example, for marketing health products, insurance companies could also use them to assess premiums, in a manner unknown and detrimental to the interests of the individual.
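To make the inference risk concrete, here is a deliberately simplified Python sketch of the mechanism (the sites, events and cookie IDs are invented for illustration, not drawn from any real tracker): events logged on unrelated websites can be joined into a single profile using nothing but the cookie identifier placed on the device.

```python
from collections import defaultdict

# Hypothetical event log a third-party tracker might accumulate:
# each entry is (cookie_id, site, event). The cookie ID is the
# only "identity" the tracker needs to link events.
events = [
    ("cookie-7f3a", "news.example",   "read: article on diabetes treatments"),
    ("cookie-7f3a", "shop.example",   "purchased: blood glucose monitor"),
    ("cookie-7f3a", "search.example", "query: insulin prices"),
    ("cookie-91bc", "travel.example", "searched: flights to Rome"),
]

# Group events by cookie ID: visits on unrelated sites collapse
# into one cross-site profile per device.
profiles = defaultdict(list)
for cookie_id, site, event in events:
    profiles[cookie_id].append(f"{site}: {event}")

# From cookie-7f3a's profile alone, a health condition can plausibly
# be inferred -- special category data under the GDPR.
for line in profiles["cookie-7f3a"]:
    print(line)
```

The point of the sketch is that no name or email address is involved; the persistent identifier alone is enough to assemble the kind of profile described above.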

To its credit, the ICO fined Easylife Ltd £1.35m (later reduced to £250,000) for using personal data without consent to profile medical conditions and target individuals with health-related products. But the authority does not seem to recognise that it takes only a small step to move from inferring personal data from browsing behaviour using cookies to profiling health conditions.

Collecting data through cookies without consent is illegal and poses a potentially serious harm to the public, as companies could process special categories of data in a detrimental manner. Unfortunately, companies openly violate cookie legislation in the UK with impunity.

The ICO also shows unwarranted leniency towards tech companies that repeatedly violate data protection rules. In one fiscal year (2022/2023), the ICO found evidence of Google UK’s infringement or potential infringement of the law more than 25 times, in separate complaints. Yet the authority claims merely to have taken informal action, essentially advising the company to improve its compliance.

Google UK's infractions include refusing or delaying the deletion of personal data requested by individuals exercising their right to be forgotten. Meta Platforms (formerly Facebook Inc.) received 20 compliance suggestions after evidence of its infringement or potential infringement was found, while Microsoft and Twitter each received the same soft compliance advice eight times in the same year.

In all these cases, taxpayers go through the stressful process of demonstrating that their data protection rights were violated, providing evidence of infringement by big tech companies. Yet the ICO consistently chooses leniency towards companies that evidently do not mind being told repeatedly that their data protection practices are non-compliant. The authority has essentially transformed itself into a legal advisory office for tech companies, neglecting its role as an overseer.

Data protection law inherently creates hurdles for individuals seeking compensation for privacy rights violations. In 2021, the UK's highest court ruled that without evidence of material damage or distress, mere loss of control over personal data is not compensable under the GDPR. This effectively forces individuals to wait for a recognized harm to occur as a result of a violation of their data privacy, rather than preventing it. The ICO, which should deter privacy violations, is unfortunately impotent as well.

The need for policy change

The ICO's enforcement policy heavily relies on collaboration with regulated entities rather than utilizing effective sanctions to deter repeat violations. This approach aims to support the digital economy by avoiding excessive enforcement of data protection rights and fostering data innovation. In theory, it should attract businesses to the UK, create jobs, and stimulate economic growth. However, the policy is currently being misapplied to serve the interest of big tech companies.

The companies that repeatedly violate data protection laws do not contribute to digital innovation exclusively in the UK, and most are not strategically positioned to provide job opportunities in the country. But the UK remains a crucial consumer market for them. Sanctioning them is therefore unlikely to drive them away or alter their broader business decisions. Faced with firm and measured enforcement action, these companies would have no choice but to adhere to the rule of law, since the market they operate in is one they cannot afford to lose.

The ICO’s failure to effectively enforce data privacy laws risks eroding public trust. It could also discourage data innovation, as the public might refuse to provide data for research and innovation, which could in turn negatively affect the digital economy. 

Sunday, 23 January 2022

Consent and Cookies in EU Data Privacy Law— Two Clicks are Too Many

 


 


 

Dr Asress Adimi Gikay (PhD, SJD, LLM), Lecturer in AI, Disruptive Innovation and Law, Brunel Law School (Brunel University London); Twitter: @DrAsressGikay

 

Consent and Data Protection in the European Union

European Union data protection law rests on the conception that data protection is a fundamental right, a principle the General Data Protection Regulation (GDPR) upholds. Personal data processing therefore requires compliance with stringent legal requirements. The GDPR prescribes that consent be specific, informed, unambiguous and freely given, requiring affirmative action by the individual. Over the years, companies have circumvented the consent requirement through various tactics.

In December 2021, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), imposed sanctions on Facebook (€60 million) and Google (€150 million) for the illegal use of cookies, in breach of the consent requirement. If Facebook and Google do not comply with the decisions within three months of the decision date, they will pay €100,000 for each day of non-compliance. Because the decisions were made under the ePrivacy Directive, they are not subject to the GDPR's one-stop-shop cooperation mechanism; the French decisions therefore bind the companies concerned only in France and would probably (if at all) affect cookie practices in other industries in France only. Meanwhile, websites across the EU and UK remain non-compliant with the consent requirement in their use of cookies.

First Rule of Cookies— Consent— has Always Been Tricky

Although data protection law aims to give individuals control over their personal information through consent, researchers have argued that several challenges undermine this informational control. The sophistication of privacy policies and the complexity of data collection systems, coupled with individuals’ limited cognitive capacity to process information, leave people with insufficient control. In many cases, data collection consent forms or privacy policies are contracts of adhesion in which the data subjects (individuals) have no power to bargain, notwithstanding the requirement that consent be decoupled from the provision of goods and services and not be imposed on the individual. Even if privacy agreements were negotiable, individuals do not have the time to scrutinize them adequately, given information overload and the difficulty of understanding technical jargon.

In a 2020 Eurobarometer survey conducted in EU Member States, 37% of participants responded that they do not read online privacy policies, while 47% and 13% read them partially and fully, respectively. Those who read privacy policies partially or not at all indicated that privacy policies are too long (66%) or unclear and difficult to understand (31%). Some responded that it is sufficient to know that the entity they deal with has a privacy policy (17%). While some believed they would be protected by law anyway (15%), others believed websites would not honour privacy terms (10%). The survey highlights that only a small minority of individuals interacting over the internet read and scrutinize privacy policies. The majority are not adequately protected by the consent requirement, even before the added challenge of cookie technology.

Second Rule of Cookies—No Preselected Tick Boxes

As data collection in the traditional setting, where the individual supplies information and consents to its processing, has become more tightly regulated, companies have turned to a more efficient method of data collection and analysis: deploying cookies. Cookies are small text files that websites place on the user's device (terminal equipment) as the user browses, allowing the website to recognize the device and collect information about the user's browsing behaviour. While cookies serve multiple purposes, including the proper functioning of websites, they notably analyze browsing behaviour to provide personalized advertising (marketing cookies). As cookies can collect personal data, their use must comply with personal data protection law—the ePrivacy Directive and the GDPR.
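As a rough illustration of the mechanics (the domain names are hypothetical), Python's standard library can generate the response headers involved. The `Domain` attribute is what scopes a cookie to a tracker's domain, so the same identifier accompanies requests to that tracker from any site that embeds its content:

```python
from http.cookies import SimpleCookie

# A first-party cookie, e.g. keeping a user logged in to the site itself.
session = SimpleCookie()
session["session_id"] = "abc123"
session["session_id"]["path"] = "/"

# A third-party (tracking) cookie set for an ad network's domain, so the
# same identifier travels with requests to that network from any
# embedding site.
tracker = SimpleCookie()
tracker["uid"] = "7f3a"
tracker["uid"]["domain"] = ".tracker.example"
tracker["uid"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year

print(session.output())  # Set-Cookie: session_id=abc123; Path=/
print(tracker.output())
```

A browser that receives the second header will keep re-sending `uid=7f3a` to `.tracker.example` for a year, which is what allows browsing on unrelated sites to be linked to one device.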

Although the primary law governing cookies is the ePrivacy Directive, the consent requirement under that Directive is governed by the GDPR. Despite the requirements of both instruments, companies have applied questionable procedures to place cookies on the devices of millions of citizens. Most web-based data controllers used to present preselected tick boxes that, by default, made individuals accept cookies from the relevant website as well as third-party websites. That practice persisted until 2019, when the Court of Justice of the European Union (CJEU) handed down its judgment in the Planet49 case, holding that consent obtained through preselected checkboxes, which require positive action from the individual to opt out of cookie-based tracking, is invalid. The judgment was meant to address the rampant tracking of individuals' behaviour for marketing purposes: a preselected checkbox contravenes the GDPR's consent rules, which require consent to be manifested by affirmative action. Yet the CJEU's judgment has not changed cookie-based data collection, as most websites merely switched to different tricks.

Third Rule of Cookies—Two Clicks are Too Many

In December 2021, the CNIL imposed sanctions on Facebook and Google for the illegal use of cookies. According to the decision, FACEBOOK FRANCE made refusing cookies more difficult than accepting them. A user who wishes to log into FACEBOOK FRANCE is shown a pop-up window (“Accept Facebook cookies in this browser”) with two buttons: “Manage Data Settings” and “Accept Cookies.” Users who click “Accept Cookies” consent to cookies being stored on their computers, whereas those who want to refuse must take further steps. They have to click “Manage Data Settings” to reach a second window, which in turn has two buttons: “Accept Cookies” and “Reject Cookies.” The various cookie options in this second window are not preselected, so clicking “Accept” at this stage without further action does not amount to consenting to cookie use; those who wish to accept some cookies can activate the relevant slide button and accept them. The CNIL held that users should not be taken to a second window to refuse cookies when they can accept cookies in the first window with one click—two clicks are too many. In essence, the decision establishes that rejecting cookies should be as easy as accepting them.

The CNIL made a similar decision against Google’s cookie practices. Facebook, for its part, submitted a screenshot of an expected update to its cookie procedure for Europe, including France; the anticipated change was implemented as of January 2022. The update renamed “Manage Data Settings” and “Accept all” to “Other options” and “Allow all cookies” respectively. In the second window (once the user clicks “Other options”), the new button is entitled “Allow essential cookies only” and appears next to “Allow all cookies.” The CNIL found these anticipated changes insignificant as regards the validity of cookie consent.

Facebook argued that the GDPR does not require accepting and rejecting cookies to be equally easy for consent to be valid; the CNIL rejected this argument, clarifying that the GDPR requires consent to be given freely. If accepting cookies is easier than rejecting them, individuals are nudged into consenting rather than making a free choice. This is consistent with a 2020 study, cited in the decision, finding that 93.1% of users who could refuse cookies only via a second window accepted the cookies without ever opening it. Fatigued by constant requests for consent, individuals accept cookies without attempting to change their settings, and companies capitalize on this to collect data illegally from our devices.

What Happens in the Other EU Member States & the UK?

Because the CNIL's decision was taken under the ePrivacy Directive, it is not subject to the GDPR's one-stop-shop mechanism. It is thus binding on Facebook and Google only in France. Until all EU Member States, as well as the UK, take similar steps, both companies are unlikely to change their cookie practices in other countries. Many other companies still use dubious cookie policies: the majority of websites let the user reject cookies only with a second click, i.e., in a second window, while cookies can be accepted with one click.

Companies with this type of cookie setting include social media giants such as Twitter and Instagram, news sites such as the New York Times and the Washington Post, and brick-and-mortar companies such as Barclays UK. Even public institutions, including universities, have similar data collection and analysis practices. All of these have cookie settings that do not comply with the GDPR/ePrivacy Directive as interpreted by the French DPA. It is only a matter of time before other DPAs follow in the footsteps of the CNIL.

 

Photo credit: Eran Sandler, via wikimedia commons

Wednesday, 16 June 2021

Who has jurisdiction over Facebook Ireland? The CJEU rules on the GDPR 'one stop shop'

 



 

Lorna Woods, Professor of Internet Law, University of Essex

 

Introduction

 

This recent CJEU judgment concerns the one stop shop in the GDPR and the way that very large corporations that have operations in most if not all Member States are regulated.  Facebook has its European headquarters in Ireland so that the Irish Data Protection Commissioner (DPC) is ‘lead authority’ – that is, the DPC has primary responsibility for regulating Facebook under the GDPR.  There have been some concerns about how this one stop shop has been working, especially since some of the larger companies have tended to establish themselves in the same, small Member State. The one stop shop mechanism relies on trust between the Member States, but different Member States have varying degrees of enthusiasm for the enforcement of data protection and also have different levels of money to throw at the issue. As is the case with other one-stop shop mechanisms in other legislation, there are exceptions or ways for other affected regulators to be involved. This case is about the space left to those other regulators.

 

Facts

 

In 2015 the Belgian Privacy Commissioner (subsequently the Data Protection Authority) sought an injunction in the Belgian courts against Facebook Belgium, with the objective of ending alleged infringements of data protection law by Facebook through the collection and use of information on the browsing behaviour of Belgian internet users, whether or not they were Facebook account holders, by means of various technologies such as cookies, plug-ins (like or share buttons) or pixels. The matter ended up in the Hof van beroep te Brussel (an appeal court), which was uncertain as to the effect of the GDPR's one stop shop on the competence of the Belgian Data Protection Authority to bring an action against Facebook Belgium. While Article 55(1) GDPR establishes the principle that each national regulatory authority is competent to carry out its role as regards its own national territory, Article 56(1) states:

 

the supervisory authority of the main establishment or of the single establishment of the controller or processor shall be competent to act as lead supervisory authority for the cross-border processing carried out by that controller or processor.

 

Judgment

 

The central question concerned the circumstances in which, given the one stop shop established by Article 56(1) GDPR, a supervisory authority could take action in relation to specific instances of processing. In this, the Court emphasised two underpinning considerations: that a high level of data protection should apply across the EU; and that the one stop shop depended on the process for cooperation laid down in Article 60.

 

While Article 60 envisages that it is the responsibility of the lead authority to adopt decisions in relation to cross-border processing, and that position is the general rule, there are exceptions found in Article 56(2) (matters only affecting the authority's own territory) and Article 66 (urgency procedure). The Court noted, however, that the exercise of these provisions “must be compatible with the need for sincere and effective cooperation with the lead supervisory authority” [para 60] – an obligation that applies also to the lead authority, so that it cannot eschew dialogue with the other authorities [para 63]. Specifically, any relevant and reasoned objection made by one of the other supervisory authorities has the effect of blocking, at least temporarily, the adoption of the draft decision of the lead supervisory authority.

 

In terms of the protection of fundamental rights, the Court held that this allocation of responsibilities is compatible with the Charter, noting that:

 

the use of the ‘one-stop shop’ mechanism cannot under any circumstances have the consequence that a national supervisory authority, in particular the lead supervisory authority, does not assume the responsibility incumbent on it under Regulation 2016/679 to contribute to providing effective protection of natural persons from infringements of their fundamental rights as recalled in the preceding paragraph of the present judgment, as otherwise that consequence might encourage the practice of forum shopping, particularly by data controllers, designed to circumvent those fundamental rights and the practical application of the provisions of that regulation that give effect to those rights [para 68].

 

The Court noted that legal action by a non-lead regulatory authority could not be completely excluded – for example, where the lead supervisory authority has not responded to a request for information (Article 61(8) GDPR), where there is an urgent need for the adoption of final measures (Article 66(2) GDPR), or where the matter is referred for consideration by the European Data Protection Board (EDPB) (Article 64(2) GDPR). In this instance, the Belgian DPA had asked the DPC to respond to its request for mutual assistance as expeditiously as possible, but no response was given.

 

The Court also addressed the question of whether the data controller must have a ‘main establishment’ in the territory of that other regulator, concluding that there was no such prerequisite [para 84]. A third question asked whether the non-lead supervisory authority would be limited as to which body to sue – that is, whether it can take action against the main establishment of the controller or against the establishment located in its own Member State. In the national proceedings in this case, the litigation was brought against Facebook Belgium, although the headquarters of the Facebook group is situated in Ireland and Facebook Ireland is the sole controller with respect to the collection and processing of personal data throughout the European Union. Facebook Belgium was set up to sell advertising in Belgium but also to lobby the EU institutions. The Court determined that the non-lead regulatory authority may take action with respect to the main establishment of the controller located in that authority’s own Member State, but also with respect to another establishment of that controller, provided that the object of the legal proceedings is data processing carried out in the context of the activities of that establishment and that the authority is competent to exercise that power [para 96].

 

A fourth question addressed the impact of the change in regime from the Data Protection Directive (which did not have a one stop shop) and the GDPR. The Court distinguished between actions brought before the date the GDPR became applicable and actions after that date. As regards the first situation, such legal action may be continued (on the basis of the Directive); for other actions the GDPR rules apply – and this allows such a regulatory authority to take action where one of the exceptions applies.

 

The Court held that Article 58(5) GDPR (on the power of data protection authorities to bring legal proceedings) has direct effect, so that the relevant authorities may rely on the provision even when it has not been specifically implemented in the national legal system.

 

Comment

 

This seems to be a balanced judgment in which the Court aims to reconcile competing pressures. It has re-emphasised the one stop shop, but is aware of the unevenness of resources and alive to the risk of forum shopping against that background. One of the key elements of this judgment is the Court’s emphasis on the obligation to cooperate, which applies to lead authority and other authorities alike. While the lead regulator must be given the chance to act, it cannot choose to ignore the importunate demands of other national regulators – whether for lack of resources or for other reasons (e.g. a different assessment of what is important). The significance of this comes down to concerns about the effectiveness of the DPC, especially bearing in mind the size of the companies under its jurisdiction. Against this background, the judgment will probably be welcomed by privacy advocates; it seems far less likely to be welcomed by data controllers, at least those based in Ireland. What is potentially problematic from the controller's perspective is the greater unpredictability of the data protection regime: less a matter of fragmenting standards (especially if a decision is referred to the EDPB) than of where enforcement actions may start, an agenda that may not rest entirely in the hands of the lead authority.

 

Photo credit: Niamfrifruli, via Wikimedia Commons