1 Introduction
The EU’s Digital Services Act (DSA) institutionalises
the tasks and responsibilities of ‘trusted flaggers’, key actors in the online
platform environment that have existed, with roles and functions of variable scope, since the early
2000s. The newly applicable regime fits with the rationale and aims pursued by
the DSA (Article 1): establishing a targeted set of uniform, effective and
proportionate mandatory rules at Union level to safeguard and improve the
functioning of the internal market (recital 4 in the preamble), with the
objective of ensuring a safe, predictable and trusted online environment, within
which fundamental rights are effectively protected and innovation is
facilitated (recital 9), and for which responsible and diligent behaviour by
providers of intermediary services is essential (recital 3). This article,
after retracing the main regulatory initiatives and practices at EU level that
paved the way for its adoption, looks at the DSA’s trusted flaggers regime and
at some open issues that remain to be tested in practice.
2 Trusted reporters: the precedents
paving the way to the DSA
The activity of flagging can be
generally understood as third parties reporting harmful or illegal content
to the intermediary service providers hosting that content, so that those providers can
moderate it. In general terms, trusted flagging refers to flaggers that enjoy “certain
privileges in flagging”, including “some
degree of priority in the processing of notices, as well as access to special
interfaces or points of contact to submit their flags”. This, in turn, poses
issues in terms of both the flaggers’ responsibility and their trustworthiness since,
as rightly noted, “not
everyone trusts the same flagger.”
In EU law, the notion of trusted
flaggers can be traced back to Directive 2000/31 (the ‘e-Commerce
Directive’), the foundational legal framework for online services in the EU.
The Directive exempted intermediaries from liability for illegal content they
managed if they fulfilled certain conditions: under Articles 12 (‘mere conduit’),
13 (‘caching’) and 14 (‘hosting’) – now replaced by Articles 4-6 DSA –
intermediary service providers were liable for the information stored at the
request of the recipient of the service if, once they became or were made aware of any
illegal content, they did not remove that content or disable access to it
“expeditiously” (see also recital 46). The Directive encouraged mechanisms and
procedures for removing and disabling access to illegal information to be
developed on the basis of voluntary agreements between all parties concerned
(recital 40).
This conditional liability regime
encouraged intermediary service providers to develop, as part of their own
content moderation policies, flagging systems that would allow them to process
notifications rapidly so as not to trigger liability. These systems were not
imposed as such by the Directive, but were adopted as a result of the liability
regime provided therein.
Following the provisions of
Article 16 of the Directive, which supports the drawing up of codes of conduct
at EU level, in 2016 the Commission launched the EU
Code of Conduct on countering illegal hate speech online, signed by the
Commission and several service providers, with others joining later on. The Code
is a voluntary commitment by signatories to, among other things, review the
majority of flagged content within 24 hours and
remove or disable access to content assessed as illegal, where necessary, as well
as to engage in partnerships with civil society organisations, enlarge the
geographical spread of such partnerships and enable them to fulfil
the role of a ‘trusted reporter’ or equivalent. Within the context of the
Code, trusted reporters are expected to provide high-quality notices, and signatories
are to make information about them available on their websites.
Subsequently, in 2017 the Commission
adopted the Communication
on tackling illegal content online, to provide guidance on the
responsibilities of online service providers in respect of illegal content
online. The Communication suggested criteria based on respect for fundamental
rights and democratic values, to be agreed by the industry at EU level
through self-regulatory mechanisms or within the EU standardisation framework.
It also recognised the need to strike a reasonable balance between ensuring a
high quality of notices coming from trusted flaggers, the scope of additional
measures that companies would take in relation to trusted flaggers and the
burden of ensuring these quality standards, including the possibility of
withdrawing trusted flagger status in cases of abuse.
Building on the progress made
through the voluntary arrangements, the Commission adopted Recommendation 2018/334 on
measures to effectively tackle illegal content online. The Recommendation establishes
that cooperation between hosting service providers and trusted flaggers should
be encouraged, in particular, by providing fast-track procedures to process
notices submitted by trusted flaggers, and that hosting service providers
should be encouraged to publish clear and objective conditions for determining
which individuals or entities they consider as trusted flaggers. Those
conditions should aim to ensure that the individuals or entities concerned have
the necessary
expertise and carry out their activities as trusted flaggers in a diligent
and objective manner, based on respect for the values on which the Union is
founded.
While the 2017 Communication and
2018 Recommendation are the foundation of the trusted flaggers regime
institutionalised by the DSA, further initiatives took place in the run-up to
its adoption.
In 2018, further to extensive
consultations with citizens and stakeholders, the Commission adopted a Communication on tackling online
disinformation,
which acknowledged once again the role of trusted flaggers in fostering the
credibility of information and shaping inclusive solutions. Platform operators
agreed on a voluntary basis to set self-regulatory standards to fight
disinformation and adopted a Code
of Practice on disinformation. The Commission’s
assessment in 2020 revealed significant shortcomings, including
inconsistent and incomplete application of the Code across platforms and Member
States and the lack of an appropriate monitoring mechanism. As a result, the
Commission issued in May 2021 its Guidance
on Strengthening the Code of Practice on Disinformation, containing
indications on the dedicated functionality for users to flag false and/or
misleading information (p. 7.6). The Guidance also aimed at developing the
existing Code of Practice towards a ‘Code of Conduct’ as foreseen in (now)
Article 45 DSA.
Further to the Guidance, in 2022
the Strengthened
Code of Practice on Disinformation was signed and presented by 34
signatories who had joined the revision process of the 2018 Code. For
signatories that are very large online platforms (VLOPs), the Code aims to become a
mitigation measure and a Code of Conduct recognised under the co-regulatory
framework of the DSA (recital 104).
Finally, in the context of
provisions/mechanisms defined before the DSA, it is worth mentioning Article 17
of Directive 2019/790 (the ‘Copyright Directive’),
which draws upon Article 14(1)(b) of the e-Commerce Directive on the liability
limitation for intermediaries and acknowledges the pivotal role of rightholders
when it comes to flagging unauthorised use of their protected works. Under Article
17(4), in fact, “[i]f no authorisation is granted, online content-sharing
service providers shall be liable for unauthorised acts of communication to the
public, including making available to the public, of copyright-protected works
and other subject matter, unless the service providers demonstrate that they
have: (a) made best efforts to obtain an authorisation, and (b) made, in
accordance with high industry standards of professional diligence, best efforts
to ensure the unavailability of specific works and other subject matter for
which the rightholders have provided the service providers with the relevant
and necessary information; and in any event (c) acted expeditiously, upon
receiving a sufficiently substantiated notice from the rightholders, to disable
access to, or to remove from their websites, the notified works or other
subject matter, and made best efforts to prevent their future uploads in
accordance with point (b)”.
3 Trusted flaggers under the
DSA
The DSA has given legislative
legitimacy to trusted flaggers, granting formal (and binding) recognition to
a practice that had thus far developed on a voluntary basis.
According to the DSA, a trusted
flagger is an entity that has been granted such status within a specific area
of expertise by the Digital Services Coordinator (DSC) of the Member State in
which it is established, because it meets certain legal requirements. Online
platform providers must process and decide upon - as a priority and without undue
delay - notices from trusted flaggers concerning the presence of illegal content
on their online platform. That requires that online platform providers take the
necessary technical and organisational measures with regard to their notice and
action mechanisms. Recital 61 sets out the rationale and scope of the regime: notices
of illegal content submitted by trusted flaggers, acting within their
designated area of expertise, are to be treated with priority by providers of online
platforms.
The regime is mainly outlined in
Article 22.
Eligibility requirements
Article 22(2) sets out the three
cumulative conditions to be met by an applicant wishing to be awarded the
status of trusted flagger: 1) expertise and competence in detecting,
identifying and notifying illegal content; 2) independence from any provider of
online platforms; and 3) diligence, accuracy and objectivity in how it
operates. Recital 61 clarifies that only entities - whether public in
nature, non-governmental organisations or private or semi-public bodies - can
be awarded the status, not individuals. Therefore, (private) entities
representing only individual interests, such as brands or copyright owners, are not
excluded from accessing the trusted flagger status. However, the DSA displays a
preference for industry associations representing their members’ interests applying
for the status of trusted flagger, which appears to be justified by the need to
ensure that the added value of the regime (the fast-track procedure) be
maintained, with the overall number of trusted flaggers awarded under the DSA
remaining limited. As clarified by recital 62, the rules on trusted flaggers
should not be understood to prevent providers of online platforms from giving
similar treatment to notices submitted by entities or individuals that have not
been awarded trusted flagger status, or from otherwise cooperating with other
entities, in accordance with the applicable law. The DSA does not prevent
online platforms from using mechanisms to act quickly and reliably against
content that violates their terms and conditions.
The award of the status
Under Article 22(2), the trusted
flagger status shall be awarded by the DSC of the Member State in which the
applicant is established. Unlike under the voluntary trusted flagger schemes,
which are a matter for individual providers of online platforms, the status
awarded by a DSC must be recognised by all providers falling within the scope
of the DSA (recital 61). Accordingly, DSCs shall communicate to the
Commission and to the European
Board for Digital Services details of the entities to which they have
awarded the status of trusted flagger (and whose status they have suspended or
revoked - Article 22(4)), and the Commission shall publish and keep up to date
such information in a publicly available database (Article 22(5)).
Under Article 49(3), Member
States were to designate their DSCs by 17 February 2024; the Commission makes
available the list of designated DSCs on its website.
The DSCs, who are responsible for all matters relating to the supervision and
enforcement of the DSA in their Member States, shall ensure its coordinated supervision
and enforcement throughout the EU. The European Board for Digital Services, among other tasks,
shall be consulted on the Commission’s guidelines on trusted flaggers, to be issued
“where necessary”, and for matters “dealing with applications for trusted
flaggers” (Article 22(8)).
The fast-track procedure
Article 22(1) requires providers
of online platforms to deal with notices submitted by trusted flaggers as a
priority and without undue delay. In doing so, it refers to the generally
applicable rules on notice and action mechanisms under Article 16. On the
priority to be granted to trusted flaggers’ notices, recital 42 invites providers
to designate a single electronic point of contact, which “can also be used by trusted
flaggers and by professional entities which are under a specific relationship
with the provider of intermediary services”. Recital 62 explains further that
the faster processing of trusted flaggers’ notices depends, amongst other things, on “actual
technical procedures” put in place by providers of online platforms. The
organisational and technical measures that are necessary to ensure a fast-track
procedure for processing trusted flaggers’ notices remain a matter for the
providers of online platforms.
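By way of illustration only - the DSA does not prescribe any particular implementation, and the measures remain a matter for each provider - a notice-handling pipeline giving trusted flaggers’ notices the mandated priority could be organised along the lines of the following minimal Python sketch, in which all names and priority levels are hypothetical:

```python
import heapq
import itertools
from dataclasses import dataclass, field

# Hypothetical priority classes: Article 22 only requires that trusted
# flaggers' notices be processed with priority and without undue delay.
PRIORITY_TRUSTED_FLAGGER = 0  # processed first
PRIORITY_STANDARD = 1         # ordinary Article 16 notices

@dataclass(order=True)
class Notice:
    priority: int
    seq: int  # tie-breaker preserving arrival order within a class
    content_id: str = field(compare=False)
    notifier: str = field(compare=False)

class NoticeQueue:
    """Toy queue in which trusted flaggers' notices jump ahead of
    standard notices, while arrival order is kept within each class."""

    def __init__(self, trusted_flaggers: set[str]):
        # In practice, the set of awarded entities could be drawn from
        # the Commission's public database under Article 22(5).
        self._trusted = trusted_flaggers
        self._heap: list[Notice] = []
        self._counter = itertools.count()

    def submit(self, notifier: str, content_id: str) -> None:
        priority = (PRIORITY_TRUSTED_FLAGGER if notifier in self._trusted
                    else PRIORITY_STANDARD)
        heapq.heappush(
            self._heap,
            Notice(priority, next(self._counter), content_id, notifier))

    def next_notice(self) -> Notice:
        return heapq.heappop(self._heap)

# Usage: the trusted flagger's notice is processed first even though
# it arrived after an ordinary user's notice.
queue = NoticeQueue(trusted_flaggers={"ExampleHotline"})
queue.submit("ordinary_user", "post/123")
queue.submit("ExampleHotline", "post/456")
assert queue.next_notice().notifier == "ExampleHotline"
```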
Activities and ongoing obligations
of trusted flaggers
Article 22(3) requires trusted
flaggers to regularly (at least once a year) publish detailed reports on the
notices they submitted, make them publicly available and send them to the
awarding DSC. The status of trusted flagger may be revoked or suspended if the
required conditions are not consistently upheld and/or the applicable
obligations are not correctly fulfilled by the entity. The status can only be
revoked by the awarding DSC following an investigation, opened either on the DSC’s own
initiative or on the basis of information received from third parties,
including providers of online platforms. Trusted flaggers are granted the
opportunity to react to, and where possible remedy, the findings of the investigation
(Article 22(6)).
On the other hand, if trusted
flaggers detect any violation of the DSA provisions by the platforms, they have
the right to lodge a complaint with the DSC of the Member State where they are
located or established, in accordance with Article 53. Such a right is granted not
only to trusted flaggers but to any recipient of the service, to ensure
effective enforcement of the DSA obligations (see also recital 118).
The role of the DSCs
With the DSA it becomes mandatory
for online platforms to ensure that notices submitted by the designated trusted
flaggers are given priority. While online platforms retain discretion to enter
into bilateral agreements with private entities or individuals they trust and whose
notices they want to process with priority (recital 61), they must
give priority to entities that have been awarded the trusted flagger status
by the DSCs. From the platforms’ perspective, the DSA ‘reduces’ their burden in
terms of decision-making responsibility by shifting it to the DSCs, but
‘increases’ their burden in terms of executive liability (for the implementation
of measures ensuring the mandated priority). From the reporters’ perspective, the
DSA imposes a set of (mostly) harmonised requirements to be awarded the status
by a DSC, once and for all platforms, and to maintain such status afterwards.
While the Commission’s guidelines
are in the pipeline, some DSCs have proposed and adopted guidelines to assist
potential applicants with the requirements for the award of the trusted flagger
status. Among others, the French ARCOM published
“Trusted flaggers: conditions and applications” on its website;
the Italian AGCOM published for consultation
its draft “Rules of Procedure for the award of the trusted flagger status under
Article 22 DSA”; the Irish Coimisiún na Meán
published the final version of
its “Application Form and Guidance to award the trusted flagger status under
Article 22 DSA”; as did the Austrian KommAustria,
the Danish KFST
and the Romanian ANCOM.
The national guidelines have been developed following exchanges with the other
authorities designated as DSCs (or about to be so) with a view to ensuring a
consistent and harmonised approach to the implementation of Article 22. Indeed,
the published guidelines are largely comparable.
4 Open issues
While the DSA’s regime is in its
early stages and no trusted flagger status has been awarded yet, some of its
merits have been acknowledged already: it has standardised existing practices,
harmonised eligibility criteria, complemented special regimes (such as the one
set out in Article 17 of the Copyright Directive), confirmed the cooperative
approach between stakeholders and, finally, formalised the role of trusted
flaggers as special entities in the context of notice and action procedures.
At the same time, the DSA’s regime
leaves some open issues on the table, which remain to be addressed in practice
for the system to work effectively. Chief among them is the respective role of
trusted flaggers and of other relevant actors in tackling illegal or harmful
content online, such as end users and reporters that reach bilateral agreements
with the platforms.
The role of trusted flaggers vis-à-vis
end users
While the DSA contains no
specific provision on the role of trusted flaggers vis-à-vis end users, some of
the national guidelines published
by the DSCs require that the applicant entity, as part of the condition
relating to due diligence in the flagging process, indicates whether it has
mechanisms in place to allow end users to report illegal content to it. In general,
applicants have to indicate how they select content to monitor (which may
include end users’ notices) and how they ensure that they do not unduly
concentrate their monitoring on any one side and apply appropriate standards of
assessment taking all legitimate rights and interests into account. In practice,
the organisation and management of the relationship with end users
(onboarding procedures, collection and processing of their notices, etc.) are
left to the trusted flaggers. For example, some organisations (such as those
belonging to the INHOPE network, which operate
in the current voluntary schemes) offer hotlines
through which the public can report to them, including anonymously, illegal
content found online.
Although it is clear from the DSA
that end users retain
the right to submit their notices directly to online platforms (Article 16), with
no duty to go through trusted flaggers, as well as the right to autonomously lodge
a complaint against platforms (Article 53) and to claim compensation for
damages (Article 54), it remains unclear whether, in practice, it will be more
convenient for end users to rely on specialised trusted flaggers for their
notices to be processed more expeditiously - in other words, whether the regime
provides sufficient incentives, at least for some end users, to go the trusted
flaggers’ way. It also remains unclear to what extent applicant entities
will actually be ‘required’ to put in place effective mechanisms allowing end
users to report illegal or harmful content to them - in other words, whether the
due diligence requirements will imply the trusted flaggers’ review of end users’
notices within their area of expertise.
From another perspective, in
connection with the reporting of illegal content, trusted flaggers may come
across infringements by the platforms, as may any recipient of online services. In
such cases, Article 53 provides the right to lodge a complaint with the competent
DSC, with no distinction being made between complaints lodged by trusted
flaggers and those lodged by end users. If ‘priority’ is to be understood as the
main feature of the privileged status granted to trusted flaggers when flagging
illegal content to platforms, a question arises as to whether they should be
granted a corresponding priority before the DSCs when they complain about
an infringement by online platforms. In this context, one may also wonder whether
lodging a complaint with the DSC on behalf of end users might fall within
the scope of action of trusted flaggers (for instance, to claim platforms’
abusive practices such as shadow banning, recital 55).
The role of trusted flaggers vis-à-vis
other reporters
The DSA requires online platforms
to put in place notice and action mechanisms that are “easy to access and
user-friendly” (Article 16) and to provide recipients of the service with an
internal complaint-handling system (Article 20). However, as noted above, these
provisions concern all recipients, with no difference in treatment for trusted
flaggers. Although trusted flaggers’ notices are granted priority by virtue of Article 22,
which leaves platforms free to choose the most suitable mechanisms, the DSA
says nothing about ‘how much priority’ should be guaranteed to them
with respect to notices filed not only by end users but also - and especially -
by other entities or individuals with whom platforms have agreements in place.
In this respect, guidance would
be welcome
as to the degree of precedence that platforms are expected to give trusted
flaggers’ notices over those of other trusted reporters, as would
clarification as to whether the nature of the content may influence such
precedence. From the trusted flaggers’ perspective, there should be a sufficiently
rewarding incentive to take on a role that comes with the price tag of ongoing
obligations.
5 Concluding remarks
While the role of trusted
flaggers is not new when it comes to tackling illegal content online, the tasks
newly entrusted to the DSCs in this context are. This results in a different
allocation of responsibilities for the actors involved, with the declared aims
of ensuring harmonisation of best practices across sectors and territories in
the EU and better protection for users online. Some open issues, such as the ones
put forward above, appear at this stage to be relevant, in particular for
ensuring that the trusted flaggers’ mechanism effectively works as an expeditious
remedy against harmful and illegal content online. It is expected that the
awaited Commission guidelines under Article 22(8) DSA will shed light
on those issues. In their absence, there is a risk that the cost-benefit
analysis - with the costs being certain and the benefits in terms of actual
priority uncertain - might make the “trusted flagger project” unattractive for
a potential applicant.