=Paper= {{Paper |id=Vol-3908/paper_8 |storemode=property |title=Governing Online Platforms After the Digital Services Act: an Analysis of the Commission Decision on Initiating Proceedings Against X |pdfUrl=https://ceur-ws.org/Vol-3908/paper_8.pdf |volume=Vol-3908 |authors=Matteo Fabbri |dblpUrl=https://dblp.org/rec/conf/ewaf/Fabbri24 }} ==Governing Online Platforms After the Digital Services Act: an Analysis of the Commission Decision on Initiating Proceedings Against X== https://ceur-ws.org/Vol-3908/paper_8.pdf
                                Governing Online Platforms After the Digital Services
                                Act: an Analysis of the Commission Decision on Initiating
                                Proceedings Against X*

                                Matteo Fabbri1

                                    1    IMT School for Advanced Studies, Lucca, Italy



Abstract
As of 17th February 2024, the Digital Services Act (DSA) applies to all online intermediaries in the EU: this means that every online platform is now liable for failing to comply with its requirements on, among other things, content moderation and recommendation, advertising transparency and systemic risk assessment. As regards very large online platforms (VLOPs) and search engines (VLOSEs), defined as those reaching at least 45 million monthly active users, i.e. roughly 10% of the EU population, the deadline for full compliance with the DSA was set in August 2023. The Commission has exclusive supervisory and enforcement power over the obligations concerning VLOPs and VLOSEs. Consequently, to verify any failure to comply with the new rules, the Commission has sent requests for information (RFIs) to various VLOPs, including X, Facebook, Instagram, AliExpress, TikTok and YouTube, starting from October 2023. The RFI is the first stage of an investigation under the DSA, which can be followed by access to the data and algorithms used by the platform, interviews with informed subjects and inspections at the platform’s premises. The information gathered through RFIs has already motivated the opening of two proceedings, against X and TikTok, in December 2023 and February 2024 respectively.
With the aim of clarifying the structure and implications of the DSA enforcement process, this contribution analyses the progression from an RFI to the initiation of proceedings against a platform: in particular, I aim to understand which type of evidence collected by the Commission through investigations can justify the decision to open an infringement procedure under the DSA. Given that the text of already submitted RFIs is not publicly available, nor is it possible to access official documents about further investigative steps (as they have not taken place yet), I compare the press release about a specific RFI with its first legal consequence, i.e. the Commission decision on initiating proceedings against X, published on December 18th 2023. This comparison sets the ground for understanding how the evidence collected by the Commission can justify the initiation of proceedings. The analysis relies on documents available through the European Commission website, including press releases, executive summaries, calls for applications and the Commission decision itself, which is the only legal document publicly disclosed so far. Moreover, I consider the definition of the respective roles of the institutional bodies responsible for the enforcement of the DSA, i.e. the European Centre for Algorithmic Transparency (ECAT) and the DSA enforcement team, whose work is closely intertwined.
The public availability of integral legal documents about the DSA enforcement procedures is essential for fostering scrutiny by researchers and platform users. Sharing information on who does what and how, i.e. which entity of the Commission is responsible for selecting specific evidence about the illegal activities of platforms and how such evidence can justify the opening of proceedings, is a necessary prerequisite for transparent and accountable governance. Given the unavailability of these documents and information, I argue that the DSA enforcement process is, at the current stage, neither sufficiently transparent nor accountable to the public, and may lead to unfair governance.



                                1. Introduction

As of 17th February 2024, the Digital Services Act (DSA) [1] applies “to all online intermediaries in
the EU” [2]: this means that every online platform is now liable for failing to comply with its
requirements on, among other things, content moderation and recommendation, advertising
transparency and systemic risk assessment. As regards very large online platforms (VLOPs)


                                ∗ EWAF’24: European Workshop on Algorithmic Fairness, July 01-03, 2024, Mainz, Germany
                                   matteo.fabbri@sns.it (M. Fabbri)
                                          © 2023 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073
and search engines (VLOSEs), defined as those reaching at least 45 million monthly active users, i.e.
roughly 10% of the EU population, the deadline for full compliance with the DSA was set in August 2023. The Commission
has the exclusive supervisory and enforcement power for the obligations concerning VLOPs and
VLOSEs. Consequently, to verify any failure to comply with the new rules, the Commission sent
requests for information (RFIs) to various VLOPs, including X, Facebook, Instagram, AliExpress,
TikTok and YouTube, starting from October 2023 [3]. The RFI is the first stage of an investigation
under the DSA, which can be followed by access to the data and algorithms used by the platform,
interviews with informed subjects and inspections at the platform’s premises. The information
gathered through RFIs has already motivated the opening of several infringement proceedings,
including against X, TikTok, AliExpress, Facebook and Instagram, between December 2023 and May 2024.
With the aim of clarifying the structure and implications of the DSA enforcement process, my
research focuses on analysing the progression from an RFI to the initiation of proceedings against a
platform: in particular, I aim to understand which type of evidence collected by the Commission
through investigations can justify the decision to open an infringement procedure under the DSA.
Given that the text of already submitted RFIs is not publicly available, nor is it possible to access
official documents about further investigative steps (as they may not have taken place yet), I will
compare the press release about a specific RFI with its first legal consequence, i.e. the Commission
decision on initiating proceedings against X (hereafter Commission decision), published on
December 18th 2023 [4]. This comparison will set the ground for understanding how the evidence
collected by the Commission can justify the initiation of proceedings. The analysis relies on
documents available through the European Commission website, including press releases, executive
summaries, call for applications and the Commission decision itself, which is the first legal document
publicly disclosed regarding the DSA enforcement process. In fact, it is worth noting that the official
text of the Commission decision on initiating proceedings against TikTok, which follows the one
against X and dates to February 19th 2024, has not been shared yet: its content can be inferred only
from the related press release [5] and a one-page summary note [6]. The only other Commission
decision on initiating proceedings whose text has been publicly disclosed is the one against
AliExpress, which dates to March 14th 2024.
The public availability of integral legal documents about the DSA enforcement procedures is
essential for fostering scrutiny by researchers and platform users. Moreover, sharing information on
who does what and how, i.e. which entity of the Commission is responsible for selecting specific
evidence about the illegal activities of platforms and how such evidence can justify the opening of
proceedings, is a necessary prerequisite for transparent and accountable governance. Given the
unavailability of these documents and information, I argue that the DSA enforcement process is, at
the current stage, neither sufficiently transparent nor accountable to civil society and researchers.


2. The content and consequences of a request for information

According to the related press release, the RFI sent to X [7] on October 12th 2023 concerned “the
assessment and mitigation of risks related to the dissemination of illegal content, disinformation,
gender-based violence, and any negative effects on the exercise of fundamental rights, rights of the
child, public security and mental well-being”. The Commission aimed at scrutinizing X’s “policies
and actions regarding notices on illegal content, complaint handling, risk assessment and measures
to mitigate the risks identified”. Following the submission of the RFI, the Commission decision
identified five areas of concern in which the platform is suspected to have infringed the DSA
provisions. A closer look at the document of the decision can help highlight the content and
consequences of a RFI to a VLOP.
Firstly, Twitter International Unlimited Company (TIUC), “the main establishment of the provider
of X in the European Union”, failed to “diligently assess certain systemic risks in the European Union
stemming from the design and functioning of X and its related systems, including algorithmic
systems, or from the use made of their services” [4]. In particular, the company did not “put in place
reasonable, proportionate and effective mitigation measures” for “the actual and foreseeable negative
effects on civic discourse and electoral process stemming from the design and functioning of X in
the European Union”: in fact, the current solutions “appear inadequate […] notably in the absence of
well-defined and objectively verifiable performance metrics” (ibidem). This failure is particularly
evident in the moderation of content in languages other than English or content pertaining to
specific local and regional contexts. Suspicions that “insufficient resources [are] dedicated to
mitigation measures” (ibidem) focus on the role of Community Notes, the collaborative feature that
allows users to “leave notes on any post and, if enough contributors from different points of view
rate that note as helpful, the note will be publicly shown on a post” [8].
Secondly, the company would systematically fail to process efficiently the “notices […] of allegedly
illegal content hosted on X”, to “take decisions in a diligent, non-arbitrary, and objective manner”
and to respond “without undue delay” [4]: X’s content moderation would therefore be insufficient
also when the input comes from users. Thirdly, the recent possibility of purchasing the blue
checkmark that once marked an account as verified is considered deceptive and manipulative for
users of X, who “are led to interpret […] checkmarks as an indication that they are interacting with
an account whose identity has been verified or is otherwise more trustworthy, when in fact no such
verification or confirmation of trustworthiness appear to have taken place” (ibidem). Fourthly, the
company did not abide by the transparency requirements for online advertising, “by not providing
searchable and reliable tools that allow multicriteria queries and application programming interfaces
to obtain all the information on such advertisements as required by Article 39(2)” of the DSA
(ibidem). Lastly, the VLOP seems “to have denied access to data that are publicly accessible on X’s
online interface to qualified researchers” (ibidem) by imposing costs for using the API and
prohibiting the scraping of publicly accessible data, in violation of Article 40(12) of the DSA.
As the excerpts from the Commission decision highlight, the proceedings against X were initiated as
a follow-up to the platform’s response to the RFI submitted on October 12th 2023: in particular, the
problematic issues regarding the handling of notices of illegal content, the measures to mitigate
systemic risks and the moderation policy, indicated in the press release about the RFI, correspond to
the main areas of concern addressed by the decision. The consequentiality between the RFI and the
decision is underlined also by points 4 and 5 of the latter: in fact, while point 4 refers to the
submission of the RFI and the response provided by X, point 5 states that, according to Article 66(1)
of the DSA, “the Commission may initiate proceedings in view of the possible adoption of decisions
[…] in respect of the relevant conduct by the provider of the very large online platform or of the very
large online search engine that the Commission suspects of having infringed any of the provisions”
(ibidem).

3. What to expect from the next steps

The areas of concern addressed by the Commission decision include issues that emerge both from
the bottom-up interaction with the platform and the top-down governance of its socio-technical
ecosystem: data access by researchers and users’ notices of illegal content pertain to the former,
while the identification of systemic risks and the measures adopted to mitigate them pertain to the
latter. The co-existence of top-down and bottom-up perspectives is not only motivated by the
necessity to address issues coming from different sources (e.g. complaints filed by recipients of the
service versus a letter from the Commission to the platform requesting clarifications); it is also
functional to investigate different aspects of the same area of concern (e.g. the use of Community
Notes to support content moderation, whose effectiveness is questioned both by users and by the
Commission, as reliance on this functionality may mask the lack of specialized personnel). The
structure and content of the decision indicate that, with the enforcement of the DSA, VLOPs and
VLOSEs may no longer be able to treat prima facie compliance as separate from the obligation to
provide reasonably prompt and reliable answers to the issues raised by individual users and
researchers, traditionally the disadvantaged side in interactions with platforms.
The investigation initiated by the Commission decision aims to establish whether the failures
outlined above “would constitute infringements of Articles 34(1), 34(2) and 35(1)” of the DSA as
regards the first area of concern (inadequate assessment and mitigation of systemic risks), Article
16(5) and 16(6) for what concerns the second one (handling of notices of illegal content), Article 25(1)
with respect to the third one (deceptive design of checkmarks), Article 39 with reference to the fourth
one (lack of tools to ensure transparency in ads) and Article 40 with regard to the fifth one (denied data
access to researchers) [4]. According to [9], the next steps of the investigation will include gathering
further “evidence, for example by sending additional requests for information, conducting interviews
or inspections”. Following the opening of formal infringement proceedings, the Commission will be
empowered “to take further enforcement steps, such as interim measures, and non-compliance
decisions” and “to accept any commitment made by X to remedy on the matters subject to the
proceeding” (ibidem).
Interestingly, the “DSA does not set any legal deadline” for the end of the proceedings, whose
duration will depend on several “factors, including the complexity of the case, the extent to which
the company concerned cooperate with the Commission and the exercise of the rights of defence”
[9]. The responsibility for carrying out the investigation lies solely with the Commission, whose decision
“relieves Digital Services Coordinators, or any other competent authority of EU Member States, of
their powers to supervise and enforce the DSA in relation to the suspected infringements of Articles
16(5), 16(6) and 25(1)” (ibidem). In this context, it is difficult to hypothesize a timeline for
the conclusion of the proceedings against X, as this investigation will be accompanied by similar
ones originating from the RFIs submitted to other VLOPs or VLOSEs.

4. The intertwined role of ECAT and DG Connect

Given the amount of investigative work on which the Commission will embark in the coming months,
it is useful to focus on the entities involved in this process: on the one side, the European Centre for
Algorithmic Transparency (ECAT), within the Joint Research Centre (JRC), and, on the other side,
the DSA enforcement team, within Directorate F (Platforms Policy and Enforcement) [10] of the DG
Connect. While the DSA enforcement team should have the responsibility of enforcing the
regulation, some crucial investigative procedures, like on-site inspections at the platforms’
premises, would be carried out by ECAT: therefore, it is unclear how the collaboration and division
of duties between the two institutional bodies will be managed.
On the one hand, the mission of ECAT is to “provide technical assistance and practical guidance for
the enforcement of the DSA” and “research the long-running impact of algorithmic systems to inform
policy-making and contribute to the public discussion” [11]: its activity will include inspecting and
testing the algorithmic systems used by VLOPs to understand their functioning, studying their
“short, mid and long-term societal impact” and developing “practical methodologies towards fair,
transparent and accountable algorithmic approaches, with a focus on recommender systems”. On the
other hand, the DSA enforcement team will be composed of “multi-disciplinary teams […]
co-operating with regulatory authorities in the Member States”, which “will engage with stakeholders
and gather knowledge and evidence to support the application of the DSA and to detect, investigate
and analyse potential infringements of the DSA” [10].
As can be seen, the information provided by the Commission is currently not sufficient to understand
how the work of ECAT and the DSA enforcement team intertwines specifically. Indeed, both these
entities focus on gathering knowledge and evidence about VLOPs and VLOSEs to support the
enforcement of the DSA, to the point that ECAT features an “Algorithm inspections & DSA
enforcement” [12] team, whose work overlaps even nominally with that carried out at DG Connect. In
particular, the profile of a technology specialist in the enforcement team at DG Connect is, if not
similar, at least complementary to that of an inspector at ECAT: [13] specifies that technology
specialists “will work in seamless cooperation with the European Centre for Algorithmic
Transparency (ECAT) and facilitate interactions with technical teams at very large online platforms
and search engines”.


5. Conclusion

The application of the DSA is proceeding at a fast pace: on December 20th 2023, the Commission
designated three pornographic platforms as VLOPs while introducing more stringent requirements
for them; on January 18th 2024, it sent RFIs to seventeen VLOPs and VLOSEs focusing on the
measures they have taken to ensure data access to eligible researchers; on February 19th 2024,
TikTok was notified of the initiation of proceedings, which have also been opened against AliExpress
on March 14th and against Facebook and Instagram on May 16th. While the DSA has applied to
every online platform since February 17th 2024, the main questions about the modalities and
timeline of its application and the actions that platforms will need to take to ensure compliance are
still open. In particular, it still needs to be clarified which regulatory mechanisms and investigative
evidence undergird the progression from a RFI to the initiation of proceedings against a platform.
Considering that there is no legal deadline for the end of the proceedings, neither platforms nor users
can have an estimate of their duration. Users are the main beneficiaries of the protection granted by
the DSA, but if they are not involved in shaping its enforcement, the new provisions may not have
tangible beneficial outcomes. This contribution shows the direction of the first enforcement
procedure under the DSA, thereby highlighting the type of evidence that European regulators see as
symptomatic of non-compliance. The outcome of the ongoing proceedings against VLOPs will set a
milestone for the future DSA enforcement strategies and their implications for users’ online
experience. The DSA has the potential to change the interaction between digital companies and
European citizens by enhancing public accountability and users’ empowerment. However, for this
potential to be realized, regulatory principles need to be translated into clear and transparent
enforcement practices, which should be publicly disclosed to European researchers and citizens.




References
   [1] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act). URL: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32022R2065
   [2] European Commission (2024). Digital Services Act starts applying to all online platforms in the EU. URL: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_881
   [3] European Commission (2024). Supervision of the designated very large online platforms and search engines under DSA. URL: https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses#ecl-inpage-lqfbha7w
   [4] European Commission (2023). Commission decision initiating proceedings pursuant to Article 66(1) of Regulation (EU) 2022/2065. URL: https://ec.europa.eu/newsroom/dae/redirection/document/101292
   [5] European Commission (2024). Commission opens formal proceedings against TikTok under the Digital Services Act. URL: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926
   [6] European Commission (2024). Summary of the TikTok decision. URL: https://ec.europa.eu/newsroom/dae/redirection/document/102958
   [7] European Commission (2023). The Commission sends request for information to X under the Digital Services Act. URL: https://ec.europa.eu/commission/presscorner/detail/en/IP_23_4953
   [8] X Help Center. About Community Notes on X. URL: https://help.twitter.com/en/using-x/community-notes
   [9] European Commission (2023). Commission opens formal proceedings against X under the Digital Services Act. URL: https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6709
   [10] European Commission (2024). Do you want to help enforce the Digital Services Act? Apply now to be part of the DSA enforcement team! URL: https://digital-strategy.ec.europa.eu/en/news/do-you-want-help-enforce-digital-services-act-apply-now-be-part-dsa-enforcement-team
   [11] European Centre for Algorithmic Transparency. About ECAT. URL: https://algorithmic-transparency.ec.europa.eu/about_en
   [12] European Centre for Algorithmic Transparency. Who we are. URL: https://algorithmic-transparency.ec.europa.eu/team_en
   [13] European Commission (2024). Job opportunities within the Digital Services Act Enforcement Team. URL: https://eu-careers.europa.eu/sites/default/files/eu_vacancies/2024-01/2024_2nd%20call_Job_opportunities%20within%20the%20DSA%20Enforcement%20Team%20final%20with%20privacy%20statement%20Final.pdf