Wednesday 16 December 2020

Overview of Digital Services Act

Professor Lorna Woods, University of Essex

 

The following is a summary of the proposal for the Digital Services Act, based on the leaked version of the document. 

Background 

The legal regime for online services has been unchanged since the e-Commerce Directive and as such it reflects the technology, services and thinking of more than twenty years ago.  The Commission committed to updating these horizontal rules in its Communication, Shaping Europe’s Digital Future (Feb 2020).  A number of European Parliament resolutions, while calling for revisions, also emphasised that some core principles from the e-Commerce Directive still remain valid.  So the Digital Services Act (DSA), which is proposed as a regulation not a directive, does not repeal the e-Commerce Directive but builds on it, including the internal market principle found in Art 3 e-Commerce Directive.  It is therefore envisaged that there be one Member State with regulatory responsibility for a service provider, that is, the Member State in which the main establishment of the provider of intermediary services is located (Art 40).  By contrast with the Audiovisual Media Services Directive (AVMSD), this is a very simple provision.  The proposal does, however, delete the immunity provisions (Arts 12-15 e-Commerce Directive) and replace them with a new structure which claims to recognise the impact of some very large actors not only on the economy but also on European society.  Note that the e-Commerce Directive allowed Member States to take their own regulatory action within the limits imposed by EU law; it does not seem that this proposal completely harmonises the field either. 

The Regulation does not deal with other information society services (which remain regulated by the e-Commerce Directive); its relationship with sector-specific rules (notably the provisions relating to video-sharing platforms in the AVMSD) will need to be considered.  As lex specialis, those rules will apply in preference to the more general rules here – though the general rules could cover any gaps in that regime.  The rules do not displace the general consumer protection rules either and the DSA is without prejudice to the operation of the GDPR. 

Regulated Entities 

The DSA distinguishes between four levels of actors, all based on the definition of ‘information society service’ (and presumably to be interpreted in the light of the Uber judgment): 

-          providers of intermediary services (defined Art 2(f));

-          hosting providers (defined Art 2(h));

-          online platforms; and

-          very large online platform providers (defined Article 25). 

Online platforms could include social networks and online marketplaces (recital 13). Where the function is ancillary to another service, that service would not be caught by these rules; notably, press comment sections would not be included.  There is a boundary issue between closed groups, which could in principle fall within the regime, and information shared within groups consisting of a finite number of pre-determined persons.  Interpersonal communication services within the meaning of the European Electronic Communications Code (EECC) (Directive 2018/1972) – for example, emails or private messaging services – fall outside the scope of the proposed Regulation (rec 14). 

Each is subject to a different set of rules; this is in the interests of proportionality – though it is questionable whether this differentiation recognises that some small platforms are used as the basis for extremism and other potentially dangerous activities. Note also that the ‘very large platforms’ are defined by reference to use by population at the EU level, potentially overlooking national platforms of significance. This also means that the definition of very large platform is under EU-level control. 

Chapter II: Retention of Immunity for Content of Third Parties 

The baseline position seems to be a retention of the position under the e-Commerce Directive; that is, immunity from liability for content transmitted (and not immunity from all rules) provided that the service provider “acts expeditiously to remove or to disable access to the information”, and a restatement that there is no general obligation to monitor – though this is linked to the information the intermediaries transmit or store (Article 7).  Whether we read these provisions in the light of the jurisprudence on the e-Commerce Directive provisions remains to be seen; the recitals which refer to technical and automatic processing of information (recital 18) suggest that this might be the case. In passing, it is unclear how this might interact with the proposed exception to the e-Privacy Directive as regards the fight against child sexual exploitation material (COM(2020) 568 final). This is the retention of the ‘conditional immunity’ approach from the previous regime. The fact that the immunity does not apply to all rules is made explicit as regards hosting service providers; Art 5(3) states that immunity “shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders ….”.  Article 6 seems to constitute a form of ‘Good Samaritan’ clause.  Another addition is the obligation to act on orders relating to a specific item of illegal content. “Illegal content” is a defined term, meaning 

any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law. (art 2(g)) 

The recitals give us more detail, though this is not an exhaustive definition: 

-          illegal hate speech or terrorist content and unlawful discriminatory content; or

-          content that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law (rec 12). 

There are also provisions regarding the provision by the intermediary service operator of information about specific recipients of the service.  These two provisions seem to be part of the procedural tidying up of weaknesses in the operation of the e-Commerce Directive. 

Chapter III: General Rules 

There are some specific obligations with regard to all intermediary service providers: 

-          establishment of single point of contact (SPoC);

-          establishment of legal representative within EU for providers which are not established in the EU;

-          providers are to implement their terms and conditions in a ‘diligent, objective and proportionate manner’;

-          except for micro or small enterprises, publication, at least once a year, of clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period (the content of these reports is specified in the Regulation). 

Hosting providers, including online platforms, are subject to additional rules: 

-          reporting mechanisms for illegal content (containing specific elements) – these are called ‘notice and action mechanisms’ though this is not a defined term – the recitals envisage these applying to file storage and sharing services, web hosting services, advertising servers and paste bins (recital 40);

-          providing a statement of reasons in relation to a decision to remove/disable access to specific items of content; the information to be included in the statement of reasons is set out in Art 15(2);

-          decisions are to be made available in a database managed by the Commission. 

While there is a focus on problem content, the obligations themselves are not about defining such content but rather about the mechanisms that the platforms should have in place to deal with problems, a point recognised in the recitals (recital 26).  This shift can be seen in the more extensive obligations imposed on the online platforms, especially the very large online platforms and seems to be part of a more general policy shift across a range of countries. 

Chapter III: Rules Relating to Online Platforms 

Online platforms, other than micro and small enterprises, are subject to more specific rules found in Section 3. Online platforms are to provide the following: 

-          access to an effective, free internal complaint-handling system in relation to decisions to suspend an account or remove/disable content (it is not clear what the position is for someone who has complained about content but that content has not been removed)

-          an out of court dispute settlement system in relation to disputes against the platform (note this is without prejudice to other routes available to users);

-          a trusted flagger system, under which notices submitted by trusted flaggers are to be dealt with without delay;

-          trusted flaggers must meet certain criteria; the recitals note that Europol or the INHOPE organisations would meet these criteria (recital 46). 

They are subject to the following additional obligations: 

-          Online platforms are obliged to suspend, for a reasonable period of time and after having issued a prior warning:

-          the provision of their services to recipients of the service that frequently provide manifestly illegal content; and/or

-          the processing of notices and complaints by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded;

-          platforms are obliged to notify suspicions of criminal activity;

-          platforms are required to ensure that traders using their platforms are traceable (giving specified information). 

There are enhanced transparency reporting requirements, including transparency of online advertising (that the information is an advertisement, who placed it, and why the user is seeing that ad, including information relating to any profiling (recital 52)). 

Chapter III: Very Large Online Platforms

Section 4 deals with ‘very large online platforms’; that is, platforms with 45 million or more active users in the EU.  The proposal highlights the systemic risks posed by such platforms in relation to online safety, the shaping of public opinion and online trade. They are therefore subject to further obligations on top of those already set out: 

-          6 monthly assessment of ‘any significant systemic risks’ including:

-          dissemination of illegal content, for example the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products;

-          negative effects on the right to private and family life, freedom of expression, the prohibition of discrimination and certain rights of the child; and

-          intentional manipulation of the service ‘with an actual or foreseeable negative effect’ on ‘public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security’ (art 26(1));

-          in making this assessment the platforms should take into account the functioning of content moderation, recommender algorithm systems and the targeting of adverts, as well as the role of fake accounts, the use of bots, and other automated or partially automated behaviours;

-          implementation of ‘reasonable, proportionate and effective mitigation measures’;

-          these measures are to be independently audited – those doing the audit have to meet certain criteria including “proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards”; 

There are reporting requirements in relation to the risk assessment, risk mitigation and auditing processes. Further obligations include: 

-          explanation of the parameters used in recommender systems, and provision allowing users to modify such systems;

-          additional transparency as to online advertising by instituting a repository of adverts;

-          provision of data that is necessary to monitor compliance with this regulation;

-          provision of access to vetted researchers (i.e. those attached to academic institutions and ‘independent from commercial interests’; how this affects research institutions which have received significant grants from ‘Big Tech’ is unclear); and

-          a requirement to have a compliance officer (who must have certain professional experience/qualifications). 

Transversal Provisions concerning Due Diligence 

Section 5 deals with standards (Art 34) and codes of conduct (Art 35) to support the functioning of the systems and the due diligence requirement.  Civil society may be involved in the development of codes in addition to service providers and other interested parties. Article 36 puts an obligation on the Commission to “encourage and facilitate the drawing up of codes of conduct at Union level” in relation to transparency of online advertising. In a pandemic world it is unsurprising that the draft also makes provision for ‘crisis protocols’ dealing with “extraordinary circumstances affecting public security or public health” (Art 37).  The recitals suggest the following as constituting “extraordinary circumstances”: “any unforeseeable event, such as earthquakes, hurricanes, pandemics, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information” (recital 70). 

Institutions and Enforcement 

Chapter IV deals with the institutions for implementation and enforcement.  The system requires regulator(s) at national level; a Member State must designate one such responsible body as its Digital Services Coordinator (DSC). There are provisions in cross border cases for cooperation between DSCs.  A European Board for Digital Services (EBDS) is established to advise the respective DSCs and the Commission. There are questions of possible overlap between the remit of this body and that of ERGA – the equivalent body set up under the AVMSD – at least as regards the video sharing platform provisions. 

The DSC coordinates the national response and has specific tasks under the regulation (see specifically Art 41). While the DSC is to carry those tasks out in an ‘impartial, transparent and timely manner’ and to ‘act with complete independence’ (Art 39), it does not appear that the DSC needs to be independent in the same way as the national supervisory authority is required to be in the telecommunications, audiovisual and data protection fields.  The Regulation lists the minimum enforcement powers to be granted to the DSC, including the power to accept commitments, the power to order cessation of infringements and to adopt interim measures, and the power to impose fines. In extreme circumstances it is envisaged that the DSC may “request a competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place” (Art 41(2)(f)).  Penalties are not to exceed 6% of the provider’s annual turnover; penalties for failure to comply with information requests are capped at 1%.  Users are given the right to complain to the DSC. 
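To give a rough sense of the scale of those caps, the following is a minimal illustrative sketch in Python (the helper name and the use of a single annual turnover figure are assumptions made purely for the example; the 6% and 1% ceilings are those summarised above):

    # Illustrative sketch only: the 6% and 1% ceilings are taken from the draft as summarised above;
    # the function name and the single 'annual_turnover_eur' input are assumptions for illustration.
    def max_fine(annual_turnover_eur: float, information_request_failure: bool = False) -> float:
        cap = 0.01 if information_request_failure else 0.06
        return annual_turnover_eur * cap

    # A provider with EUR 10 billion annual turnover could face a fine of up to EUR 600 million,
    # or up to EUR 100 million for failing to comply with an information request.
    print(max_fine(10_000_000_000))        # 600000000.0
    print(max_fine(10_000_000_000, True))  # 100000000.0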

The EBDS is made up of the DSCs, represented by high level officials. In this there seems to be some similarity with existing EU structures (e.g. the EDPB under the GDPR). The EBDS may, for example, issue opinions, recommendations or advice and support the development of codes and guidelines, as well as supporting joint investigations. 

There are specific provisions relating to the supervision of very large platforms. First of all, the relevant national DSC (and the Irish regulator will clearly be one such) will be obliged to “take utmost account of any opinion and recommendation” under the enhanced supervision system set down in Art 50.  Further, there is a mechanism whereby the Commission, or the DSCs in destination states, may “recommend” that the DSC with jurisdiction investigate a suspected infringement of the DSA.  In implementing any decisions, the DSC with jurisdiction must communicate certain information to the EBDS/Commission, which may communicate their views when they are of the opinion that any action plan proposed is insufficient.  Significantly, the Commission may in some circumstances initiate action in relation to very large platforms. The Commission has power to request information, to interview and to conduct on-site investigations, and the Commission may issue interim measures or make commitments binding. The Commission also has the power to adopt a ‘non-compliance decision’ and impose fines.  Some of these provisions reflect the approach found in the competition enforcement processes at EU level, including the right of the very large platform to be heard and to have access to the file.  Again there are questions of overlap, in terms of all these powers, with the rules pertaining to video sharing platforms and the remit of the AVMSD – which does not have equivalent provisions. 

Conclusions 

This is just the start of what will be a long and, if the experience of the recently agreed terrorist content proposal is anything to go by, contentious process.  While the main themes of the proposal are on one level straightforward and seemingly broadly in line with the proposals of the UK Government, the devil is always in the detail and there will no doubt be some sensitivity about who gets to exert control (national or EU level), the differing views and sensitivities of some of the Member States, and concerns about freedom of expression and the right to private life, especially given the proposal seems to encompass some forms of content that are not illegal (e.g. misinformation/disinformation).

 

Photo credit: TodayTesting, via Wikimedia commons
