<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Vulnerability under the Digital Services Act</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Elif Beyza Akkanat-Öztürk</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Istanbul University, Faculty of Law, Comparative Law Department</institution>
          ,
          <addr-line>İstanbul</addr-line>
          ,
          <country country="TR">Türkiye</country>
        </aff>
      </contrib-group>
      <fpage>0000</fpage>
      <lpage>0003</lpage>
      <abstract>
        <p>The Digital Services Act (DSA) introduces a paradigm shift in platform governance, placing transparency at the heart of regulatory efforts. Yet despite its promise to rebalance power asymmetries between users and platforms, this paper argues that the DSA's transparency obligations may create an unintended “transparency paradox,” where the excessive volume and technical nature of disclosures risk reinforcing, rather than reducing, digital vulnerability. Drawing on legal theory and empirical insights, this paper critically assesses the DSA's transparency regime, specifically considering users' cognitive constraints, interface design patterns, and informational inequality. It argues that formal compliance with transparency norms does not necessarily yield meaningful understanding or foster true user empowerment. Instead, it may inadvertently obscure the structural power dynamics embedded in platform design and data governance. Building on interdisciplinary research, the paper proposes a shift from mere data-dump transparency toward contextual, user-tested, and layered communication strategies. By reframing transparency as a substantive, user-centric principle, this study offers normative and practical recommendations for European Union (EU) digital regulation to better address digitally enhanced power asymmetries and thereby promote democratic information environments.</p>
      </abstract>
      <kwd-group>
        <kwd>Digital Services Act</kwd>
        <kwd>transparency paradox</kwd>
        <kwd>digital vulnerability</kwd>
        <kwd>user empowerment</kwd>
        <kwd>information asymmetry</kwd>
        <kwd>EU platform regulation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>The EU’s Digital Services Act (DSA) aspires to reshape digital platform regulation through an
ambitious set of transparency obligations. By requiring platforms to disclose the logic of
recommender systems, content moderation procedures, advertising practices, and systemic risk
assessments, the DSA seeks to recalibrate the relationship between users and intermediaries. At its
normative core lies the promise that more transparency will empower users, hold platforms
accountable, and mitigate power asymmetries in the digital ecosystem.</p>
      <p>
        However, as transparency becomes the dominant legal tool for reining in platform power,
emerging scholarship questions whether these obligations fulfil their empowerment function.
Instead of reducing asymmetries, mandatory disclosures may overwhelm users with legalistic or
opaque information, leaving them no better off—or worse, falsely reassured [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3, 15</xref>
        ]. This
phenomenon has been labelled the “transparency paradox”: a regulatory condition in which formal
transparency obscures rather than clarifies, and procedural openness masks structural dominance.
      </p>
      <p>
        This paradox is empirically observable in national contexts as well. A recent study by the Turkish
Competition Authority showed that 81.7% of surveyed users did not understand how free digital
services are funded, and 71.5% were unaware they were sharing personal data on such platforms [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>
        This paper interrogates the DSA’s transparency framework through the lens of digital
vulnerability, a concept that captures the socio-technical fragilities exacerbated by data-driven
platforms [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Drawing from interdisciplinary legal theory, consumer protection law, empirical
reports [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], and EU jurisprudence, it argues that the DSA’s model of transparency needs a substantive
overhaul. The goal is not merely to critique the shortcomings of current obligations, but to propose
a normative reframing: transparency should be seen not as the end itself, but as a means of
communicative justice.
      </p>
      <p>To this end, the paper is structured as follows. Section 2 outlines the normative role of
transparency in digital regulation and its evolution in EU law. Section 3 maps the architecture of
transparency obligations under the DSA. Section 4 introduces the informational crisis and the
transparency paradox as structural challenges. Section 5 expands on the concept of digital
vulnerability and its intersection with platform asymmetries. Section 6 contextualizes these issues
within broader content governance and algorithmic curation practices. Section 7 presents normative
and design-based recommendations for a user-centric transparency model. Finally, Section 8
concludes with reflections on transparency’s evolving regulatory role.</p>
    </sec>
    <sec id="sec-3">
      <title>2. Transparency as a Legal Tool in Platform Governance</title>
      <p>Transparency has become a foundational regulatory principle in the EU’s digital governance
framework. From the General Data Protection Regulation (GDPR) to the Platform-to-Business (P2B)
Regulation and now the Digital Services Act (DSA), it is widely presumed that transparency can
rebalance asymmetries between platform operators and end users, allowing individuals to
understand, challenge, or opt out of harmful digital practices. However, as the regulatory reliance on
transparency increases, so too does the risk of conflating disclosure with understanding, and
formality with fairness.</p>
      <p>
        At its core, transparency is intended to promote accountability and participation. In public law, it
is associated with open government and democratic legitimacy; in private law, particularly in
consumer protection and contract law, it underpins doctrines of informed consent and fairness [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
Within the DSA, transparency assumes a procedural form: users are to be informed about
recommender systems, content moderation logic, systemic risks, and advertising parameters. These
disclosures are presumed to foster user empowerment through informed digital choice. If users
remain unaware of how so-called “free” digital services are monetized, the very foundation of
consent in data exchanges—namely, an understanding of the transaction—is called into question
[
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>
        However, the instrumental role of transparency faces critical theoretical challenges. As
Ben-Shahar and Schneider famously argue, “mandatory disclosure is the most common and least effective
form of regulation” [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. When disclosures are excessively long, overly technical, or poorly timed (a common critique
of GDPR-mandated privacy notices), they risk becoming performative: a checkbox for legal
compliance rather than a vehicle for user understanding or control. Consistent with this,
the Turkish Competition Authority reported that nearly 80% of users never revise their privacy
settings after initial selection, a trend that reflects the influence of design nudges on user inertia [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
In the digital context, the opacity of algorithmic systems, behavioural targeting, and personalization
techniques render many transparency measures illusory. Scholars have termed this condition
translucency: the appearance of openness without genuine visibility [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>
        Moreover, the assumption that all users are equally positioned to benefit from transparency fails
to account for structural inequalities. As Mišćenić notes, digital environments create a dual
asymmetry: not only do users lack bargaining power, but they are also cognitively and
informationally disadvantaged [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. This leads to what Liu calls digitally enhanced power
asymmetries—platforms leverage scale, opacity, and data-driven insights to deepen their dominance,
while users are left with disclosures they cannot meaningfully process [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>
        Findings from Türkiye reinforce this diagnosis: over 70% of users believe their personal data is not
used for its declared purpose, while more than half express concern over unauthorized access or
resale [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>Thus, although transparency retains normative appeal, its deployment in digital regulation
demands closer scrutiny. It is not enough to disclose: the substance, structure, and timing of
transparency matter profoundly. The DSA presents an opportunity to rethink transparency not as
an end, but as a communicative, contextual, and user-sensitive legal obligation. This rethinking is
essential to bridge the widening gap between formal transparency and actual empowerment.</p>
    </sec>
    <sec id="sec-4">
      <title>3. The Transparency Architecture of the DSA</title>
      <p>The DSA introduces one of the most comprehensive transparency regimes in global platform
regulation. Its design reflects a legislative ambition to impose visibility on previously opaque systems
of algorithmic decision-making, content moderation, and systemic risk management. At the heart of
this architecture lies the assumption that disclosure of internal processes will empower users, foster
public accountability, and enable regulatory oversight.</p>
      <p>Among the core transparency obligations are the publication of content moderation reports (DSA
art. 15), the disclosure of the logic behind recommender systems (DSA art. 27), clear notice and
justification of content removals (DSA art. 17), and the obligation to perform risk assessments and
audits, especially for Very Large Online Platforms (VLOPs) (DSA art. 34-37). Furthermore, the DSA
mandates access to ad repositories and transparency around targeting parameters and revenue
sources (DSA art. 39). These measures are designed not only for end-users but also for regulators,
researchers, and civil society actors.</p>
      <p>
        Yet this ambitious framework is already revealing practical and conceptual shortcomings. Early
implementation experiences suggest that platforms often respond to obligations by issuing
voluminous, generic, or highly technical reports that, while formally compliant, fail to deliver real
insight [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. Algorithmic transparency, for instance, is frequently reduced to the disclosure of abstract
design principles or conditional logic, offering little substantive information about how personalized
feeds are curated or how data inputs shape outputs.
      </p>
      <p>
        Moreover, empirical research shows wide variation in how platforms approach these obligations.
While companies like Meta or TikTok have developed structured, navigable transparency reports,
smaller or less resourced platforms tend to offer minimal, often inaccessible information [
        <xref ref-type="bibr" rid="ref7 ref8">7, 8</xref>
        ]. Even
among VLOPs, there is no harmonization in format, terminology, or presentation, making
comparisons difficult and limiting the utility of transparency for public scrutiny. This regulatory gap
has recently been addressed through a dedicated implementing act by the European Commission
[
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>
        The fact that nearly half of Turkish users who read privacy notices do not understand them due
to length or complexity, a problem that persists even under frameworks like the GDPR, illustrates the
gap between formal transparency and functional comprehensibility, a distinction increasingly vital
in assessing regulatory efficacy [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>
        A further complication stems from the legalistic character of these disclosures. As Mišćenić
observes, information duties in the digital environment often mirror traditional consumer law
models—placing the burden on the user to read, interpret, and act upon standardized disclosures [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
This ignores the cognitive and behavioural realities of digital interaction, where users operate in
fragmented, time-pressured, and interface-optimized environments. The result is often an
information dump: legally exhaustive but practically incomprehensible.
      </p>
      <p>Ultimately, the DSA’s transparency architecture reflects a logic of formal accountability that may
miss its substantive goal. Compliance is assessed in terms of disclosure quantity and procedural
execution, rather than effectiveness or user impact. Without clear criteria for accessibility,
standardization, or usability, the DSA risks reinforcing the transparency paradox it seeks to
resolve.</p>
    </sec>
    <sec id="sec-5">
      <title>4. The Informational Crisis and the Transparency Paradox</title>
      <p>
        Despite the normative elegance of transparency, digital regulation increasingly suffers from what
scholars have termed an informational crisis—a systemic mismatch between the quantity of disclosed
information and the cognitive, temporal, and interpretive capacities of users [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>This crisis is not incidental but structural, arising from the regulatory tendency to equate
transparency with disclosure volume rather than with communicative efficacy.</p>
      <p>
        At the core of this paradox lies an epistemological assumption: that if information is disclosed, it is
thereby understood and can be acted upon. Yet as Sunstein [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] argues, too much information can be
as disabling as too little. Legal mandates that result in verbose, technical, or fragmented disclosures
often lead users to ignore, misinterpret, or feel overwhelmed by the information provided [16]. The
architecture of such transparency regimes reflects an intention-action gap, wherein regulators intend
to empower, but the structure of delivery leads to passivity or disengagement.
      </p>
      <p>
        Ben-Shahar and Schneider [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] provide a foundational critique of what they term the “failure of
mandated disclosure.” Their empirical and doctrinal analysis demonstrates that disclosure regulation
frequently overestimates the rationality and attentiveness of average users, especially when
interacting with complex contractual, algorithmic, or platform-based environments. In digital
contexts, this critique becomes even more acute: the pace, interface design, and asymmetries of
information architecture all contribute to what Liu [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] calls digitally enhanced power imbalances.
      </p>
      <p>
        Further insights from the Turkish Competition Authority’s empirical study reinforce this
argument, highlighting that 71.5% of users were unaware they had shared personal data while using
online platforms, and over 81% had no knowledge of how ad-financed services operate [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
Furthermore, privacy policies were often unread or not understood due to excessive length and
complexity, confirming the persistence of the transparency paradox even when disclosure is formally
fulfilled.
      </p>
      <p>These findings are further reinforced by other aspects of the same survey, which reveal that 70.2%
of users believe their data is not used in line with its intended purpose, and 55.4% express concerns
over unauthorized use or resale of their data. Additionally, 80% of users never revise their privacy
settings once selected, with a majority citing the length and complexity of privacy policies as
significant barriers to comprehension. These patterns underscore the existence of the “privacy
paradox” in Türkiye, where users’ stated concerns about data misuse are not reflected in their actual
digital behaviours—mirroring trends observed across other jurisdictions.</p>
      <p>
        These power imbalances are not just about the quantity of information withheld, but also about
the design of how information is presented and operationalized. Dark patterns, default settings,
nudges, and persuasive UI designs all interact with transparency to create what Mišćenić [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] describes
as a gap between digital fairness and digital formality. Users are nominally informed—via cookie
banners, standard terms, recommender disclosures—but their ability to comprehend and act remains
structurally constrained.
      </p>
      <p>
        The result is a transparency paradox: platforms are more transparent than ever in a procedural
sense, yet users are more vulnerable than ever in a substantive sense. Visualising this paradox,
Mišćenić highlights how traders’ terms and conditions often meet formal EU transparency
requirements but fail the test of intelligibility or real-world empowerment [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        This is exacerbated by the dynamic nature of digital environments—where terms are unilaterally
modified, disclosures are hidden behind hyperlinks, and standardization is absent.
Furthermore, the phenomenon of translucency, as developed by Rossi [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], captures a critical
regulatory pathology: when platforms disclose in such a way that visibility is simulated but opacity
remains. Legal language becomes a shield, not a window; data is disclosed but not explained.
      </p>
      <p>
        To resolve this paradox, transparency must be redefined. It must shift from a narrow focus on
legal compliance to a broader, interdisciplinary understanding of communicative justice. As Liu [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]
and Crea &amp; De Franceschi [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] argue, only a user-centric, vulnerability-aware model of transparency
can close the gap between rights and reality. The DSA offers a valuable, yet under-realized
opportunity to move in this direction.
      </p>
    </sec>
    <sec id="sec-6">
      <title>5. Digital Vulnerability and Power Asymmetries</title>
      <p>
        The concept of digital vulnerability has emerged as a critical lens through which to reassess
traditional assumptions in consumer protection and platform regulation. Unlike the classical notion
of vulnerability—rooted in age, education, or economic status—digital vulnerability is situational,
systemic, and interface-driven. It captures how users become exposed to harm not solely because of
inherent traits, but because of how digital environments are designed, structured, and governed
[
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>
        Michelle Liu’s [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] analysis offers a foundational critique of EU law’s assumptions about power
asymmetries. She argues that EU instruments, including the DSA, often rely on static models of user
weakness—typically equating it with being a “consumer” or a “data subject.” Yet, in the digital
context, these categorizations are insufficient. Platforms actively shape the user experience through
behavioural analytics, data-driven nudging, and algorithmic curation [16]. Power is not merely a
matter of information disparity, but of manipulability: users’ choices are not only uninformed, but
often pre-structured by design [15]. The perceived sense of being constantly tracked—identified by
Turkish users as a primary source of discomfort in online environments—adds a psychological
dimension to digital vulnerability that the DSA’s procedural obligations currently overlook [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>
        This structural manipulability manifests in several layers. First, users are embedded in interface
logics where agency is undermined through pre-selected defaults, dark patterns, and lack of exit
options [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Second, as Irina Domurath [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] explains, users suffer from hypo-autonomy—a condition
where their formal rights to choose are retained, but meaningful capacity to exercise those rights is
eroded. Digital vulnerability thus resides not in individual fragility but in relational
disempowerment: users are rendered fragile by architecture and governance, not by nature.
      </p>
      <p>
        A particularly acute form of digital vulnerability concerns cognitive asymmetry. As noted by
Goanta et al. [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], users cannot grasp the implications of personalized pricing, automated filtering,
or content prioritization, not because they are inattentive, but because these systems are
intentionally complex. Algorithmic opacity functions as a control mechanism, and transparency
obligations—without standardization or explanation—do little to alleviate this. Rather, they further
outsource the burden of vigilance to the already-disadvantaged party.
      </p>
      <p>The current legal framework only partially acknowledges these dynamics. While the DSA
addresses systemic risk and mandates independent audits for VLOPs, it does not embed digital
vulnerability as a guiding legal principle. Nor does it require design-based equity—the notion that
user interfaces should be calibrated to prevent exploitation and enable resilience. This omission
reflects a broader regulatory lag: legal regimes struggle to adapt to non-linear, feedback-driven
environments where vulnerability is generated in real-time and at scale.</p>
      <p>
        Crea and De Franceschi [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] call for a paradigm shift—one that places digital vulnerability at the
centre of private law reform. This requires new doctrinal tools, such as algorithmic fairness tests,
proactive interface standards, and dynamic assessments of user exposure. It also implies institutional
shifts: regulatory authorities must develop capacity to evaluate not just the legality of disclosures,
but their intelligibility and impact.
      </p>
      <p>In sum, digital vulnerability reframes transparency not as a formal gesture of openness, but as a
structural condition of participation. The DSA’s future success hinges on its ability to internalize this
perspective—not merely by refining obligations, but by transforming the underlying logic of user
protection from passive disclosure to active empowerment.</p>
    </sec>
    <sec id="sec-7">
      <title>6. Intersections with News, Algorithmic Filtering, and Content Regulation</title>
      <p>
Building on the concept of digital vulnerability, the DSA’s transparency obligations, while primarily
aimed at regulating platform operations, have far-reaching implications beyond consumer protection,
particularly in the domain of news distribution and democratic discourse, where they can exacerbate
or mitigate user fragilities. As digital platforms become the primary gateway to journalistic content,
the modalities of algorithmic curation, ranking, and visibility directly influence the plurality, quality,
and accessibility of information available to the public [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ].
      </p>
      <p>
        The ACCC-commissioned report, The Impact of Digital Platforms on News and Journalistic
Content [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], underscores this shift: platforms no longer act merely as intermediaries but as de facto
editors. Their recommender systems shape which stories users see, in what order, and how
frequently, introducing new layers of algorithmic gatekeeping. These curational processes are
typically opaque, even though they profoundly impact civic participation, public trust, and media
sustainability.</p>
      <p>Transparency, in this context, intersects with both epistemic justice and media pluralism. Users
are not simply consumers of digital services but democratic subjects who rely on credible
information to form opinions and make choices. The opacity of recommender systems and content
delivery mechanisms can thus be viewed as a democratic deficit. As the ACCC notes, news
consumers are often unaware of the criteria used to prioritize content or of the economic incentives
that shape platform–publisher relationships [8, Ch. 2–3].</p>
      <p>
        Moreover, the lack of standardization in transparency reporting across platforms contributes to
fragmented visibility. Urman and Makhortykh [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] document the inconsistency in how major
platforms report on their moderation and recommendation practices, making it nearly impossible to
assess systemic patterns or compare across services. This incoherence hampers not only user
comprehension but also academic and regulatory scrutiny. The result is a transparency regime that
discloses without informing and regulates without enabling democratic oversight.
      </p>
      <p>In addition, algorithmic filtering and content personalization often reinforce filter bubbles and
echo chambers, though empirical evidence remains mixed [8, Ch. 2.4]. Still, the DSA’s focus on
recommender transparency—especially for VLOPs—marks an important first step in mitigating these
effects. Yet without accompanying measures to translate disclosures into actionable knowledge (e.g.,
interface labels, user-choice toggles, or plain-language summaries), these obligations remain inert.</p>
      <p>The intersection of platform transparency with journalistic content also reveals deeper tensions
around informational integrity. As the ACCC highlights, digital monetization models incentivize
short, emotionally charged, and viral content—undermining editorial independence and long-form
journalism. This economic restructuring of content production, while not directly addressed by the
DSA, is reinforced by its narrow conception of transparency as a procedural duty rather than a
substantive safeguard of public goods.</p>
      <p>
        In this light, transparency must evolve to serve democratic ends. This involves not only clarifying
the mechanics of algorithmic distribution but also foregrounding the societal role of platforms in
shaping public discourse. As Pasquale [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and others argue, platforms have become information
infrastructures with quasi-public responsibilities. Legal frameworks like the DSA must therefore
expand their scope—not just to regulate market failures, but to sustain epistemic diversity and
democratic resilience.
      </p>
    </sec>
    <sec id="sec-9">
      <title>7. Toward Meaningful Transparency: Normative, Design, and Institutional Recommendations</title>
      <p>The preceding analysis reveals that the Digital Services Act’s transparency framework, while
ambitious, remains entangled in formalism and suffers from limited normative depth. To avoid
reinforcing the transparency paradox, the EU must move toward a concept of meaningful
transparency—one that is not merely procedural but enables actual understanding, decision-making,
and agency. This shift entails both legal reform and design-based intervention.</p>
      <sec id="sec-10-1">
        <title>7.1. From Disclosure to Communication: Layered and User-Tested Formats</title>
        <p>
          Transparency obligations should be guided not by quantity, but by communicative clarity, leveraging
modern IT tools for dynamic and interactive user engagement. As Ben-Shahar &amp; Schneider
recommend [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ], disclosures must be tailored to human cognitive limits—employing visuals,
summaries, and progressive layers of detail. Rather than static, long legalistic documents, platforms
should be required to implement layered and interactive transparency formats: a brief user-friendly
explanation followed by expandable technical detail, allowing users to actively explore information
tailored to their context.
        </p>
        <p>
This approach also demands rigorous usability testing, which can help bridge the intention-action
gap observed in the privacy paradox by ensuring information is not only visible but also
actionable and empowers users to make meaningful choices. Much like accessibility standards in
disability law, information disclosures should be empirically evaluated for intelligibility. The GDPR
mandates “clear and plain language” in Articles 12–14, yet offers no systematic mechanism to
verify comprehension. The DSA could fill this gap by introducing transparency impact assessments
—requiring platforms to demonstrate that disclosures are not just visible but graspable [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ].
        </p>
      </sec>
      <sec id="sec-10-2">
        <title>7.2. Standardization and Interoperability of Transparency Formats</title>
        <p>
          The lack of harmonized formats undermines the comparative and regulatory utility of transparency
obligations. A legally binding EU Transparency Standard—co-developed by the Commission,
academic experts, and civil society—could address this issue. Such a standard should specify layout,
terminology, data structure, and even colour coding for key disclosures [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ].
        </p>
        <p>This standardization would also facilitate regulatory benchmarking and civic auditing. Academic
institutions, journalists, and NGOs could compare platform behaviour more reliably, fostering
external accountability and public trust.</p>
      </sec>
      <sec id="sec-10-3">
        <title>7.3. Transparency as Design Governance</title>
        <p>Legal obligations should not be limited to content; they must extend to presentation and delivery,
enabling dynamic user control and contextual interaction. Transparency must be integrated into user
interface (UI) design: opt-out mechanisms should be prominent and persistent; recommender
settings should be explained through interactive prompts, and content moderation criteria should be
displayed contextually at the point of action (e.g., when a user is flagged or sanctioned), facilitating
informed and dynamic consent.</p>
        <p>
          Crea &amp; De Franceschi [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ] advocate for design-based fairness: legal principles embedded in visual
and interactive architecture, not just in policy documents. This would mean regulating transparency
not as an isolated duty but as part of broader governance-by-design, aligning with the growing call
for human-centric digital environments.
        </p>
      </sec>
      <sec id="sec-10-4">
        <title>7.4. Institutional Guidance and Transparency Metrics</title>
        <p>Lastly, regulators must develop tools to assess the effectiveness of transparency. The DSA foresees
audits and risk assessments but lacks qualitative transparency metrics—benchmarks for user
understanding, behavioural influence, or misinformation resilience. An EU Observatory for Digital
Transparency, potentially embedded within the European Board for Digital Services, could centralize
evaluations, produce annual reports, and issue interpretive guidelines. Much like the EDPB under
the GDPR, such a body would lend coherence to fragmented enforcement and close the gap between
formal compliance and real-world impact, ensuring regulatory efforts effectively promote user
agency and address issues like the privacy paradox.</p>
      </sec>
    </sec>
    <sec id="sec-11">
      <title>8. Concluding Remarks</title>
      <p>The Digital Services Act aspires to reshape digital governance through enhanced transparency. Yet,
as this paper has shown, the current architecture of transparency risks repeating an early fallacy: the
belief that more disclosure equals more empowerment. Despite unprecedented formal openness,
digital platforms continue to concentrate power, obscure control, and exploit vulnerability. The
paradox is evident: we live in a regime of transparent opacity.</p>
      <p>Transparency, as deployed in the DSA, has been conceptualized as a legal remedy. But the
structural and behavioural dimensions of digital interaction require us to treat it as a governance
mode—one that must be carefully calibrated, empirically evaluated, and ethically grounded.
Procedural openness is no substitute for communicative justice.</p>
      <p>This paper has argued for a shift toward meaningful transparency: layered, user-tested, and
embedded into interface design. It has highlighted the need for standardization, institutional
guidance, and a reconceptualization of transparency as a relational duty, not a mere information
dump. Legal scholars and regulators alike must resist the tendency to treat transparency as a
cure-all, and instead recognize its limits, conditions, and contexts.</p>
      <p>Ultimately, the true test of the DSA’s success will lie not in how much platforms disclose, but in
whether users can understand, challenge, and shape the digital environments they inhabit. As digital
vulnerability continues to evolve, so too must the legal imagination. Transparency may be a starting
point—but never the destination.</p>
    </sec>
    <sec id="sec-12">
      <title>Declaration on Generative AI</title>
      <p>During this language editing and sentence refinement process, suggestions generated by ChatGPT-4o
and NotebookLM were consulted. These technologies were used in a supportive capacity, without fully
adopting or directly incorporating all proposed changes.
https://www.rekabet.gov.tr/tr/Guncel/cevrim-ici-reklamcilik-sektoru-incelemesi-nRToUKUTLzU9c0fA1i1dK0
[15] McDonald, A. M., &amp; Cranor, L. F. (2008). The Cost of Reading Privacy Policies. ISJLP-I/S: A</p>
      <p>Journal of Law and Policy for the Information Society, 4(3), 543–565.
[16] Acquisti, A. (2024). The Economics of Privacy at a Crossroads. In A. Goldfarb &amp; C. E. Tucker
(Eds.), The Economics of Privacy (pp. 21–72). Chicago: University of Chicago Press.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Veale</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Borgesius</surname>
            ,
            <given-names>F. Z.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Demystifying the DSA's Transparency Tools</article-title>
          . European Law Blog.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Mišćenić</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Transparency and Fairness in the Digital Age: Between Formalism and User Empowerment</article-title>
          .
          <source>European Journal of Consumer Law</source>
          ,
          <volume>12</volume>
          (
          <issue>1</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>23</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Digitally Enhanced Power Asymmetries in EU Digital Law</article-title>
          .
          <source>Common Market Law Review</source>
          (forthcoming).
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>ACCC.</surname>
          </string-name>
          (
          <year>2018</year>
          ).
          <source>Digital Platforms Inquiry – Preliminary Report</source>
          . Australian Competition and Consumer Commission.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Ben-Shahar</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Schneider</surname>
            ,
            <given-names>C. E.</given-names>
          </string-name>
          (
          <year>2014</year>
          ).
          <article-title>More Than You Wanted to Know: The Failure of Mandated Disclosure</article-title>
          . Princeton University Press.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Rossi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Translucency in Platform Transparency: Between Visibility and Opacity</article-title>
          .
          <source>Internet Policy Review</source>
          ,
          <volume>12</volume>
          (
          <issue>4</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>15</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Urman</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Makhortykh</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Evaluating Platform Transparency Reports under the DSA</article-title>
          .
          <source>Digital Society</source>
          ,
          <volume>2</volume>
          (
          <issue>1</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>18</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Wilding</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , et al. (
          <year>2018</year>
          ).
          <article-title>The Impact of Digital Platforms on News and Journalistic Content</article-title>
          .
          <source>Report for the ACCC</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Domurath</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Hypo-Autonomy and the Limits of Digital Choice</article-title>
          .
          <source>Journal of Consumer Policy</source>
          (forthcoming).
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Goanta</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , et al. (
          <year>2024</year>
          ).
          <article-title>Digital Cognitive Asymmetries and Legal Responses</article-title>
          .
          <source>Technology and Regulation Review</source>
          ,
          <volume>3</volume>
          (
          <issue>1</issue>
          ),
          <fpage>22</fpage>
          -
          <lpage>38</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Crea</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>De Franceschi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Digital Vulnerability and the Law: Rethinking User Protection in Platform Environments</article-title>
          .
          <source>European Law Journal</source>
          (forthcoming).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Pasquale</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>New Laws of Robotics: Defending Human Expertise in the Age of AI</article-title>
          . Harvard University Press.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          European Commission (
          <year>2024</year>
          ).
          <article-title>Commission Implementing Regulation (EU) of 4 November 2024 laying down templates and reporting periods under Articles 15, 24 and 42 of Regulation (EU) 2022/2065</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          Rekabet Kurumu (Turkish Competition Authority) (
          <year>2024</year>
          ).
          <article-title>Çevrim İçi Reklamcılık Sektör İncelemesi Nihai Raporu</article-title>
          (
          <source>Final Report on the Online Advertising Sector Inquiry</source>
          ). Ankara, Türkiye. [Online]. Available: https://www.rekabet.gov.tr/tr/Guncel/cevrim-ici-reklamcilik-sektoru-incelemesi-nRToUKUTLzU9c0fA1i1dK0
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>McDonald</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Cranor</surname>
            ,
            <given-names>L. F.</given-names>
          </string-name>
          (
          <year>2008</year>
          ).
          <article-title>The Cost of Reading Privacy Policies</article-title>
          .
          <source>I/S: A Journal of Law and Policy for the Information Society</source>
          ,
          <volume>4</volume>
          (
          <issue>3</issue>
          ),
          <fpage>543</fpage>
          -
          <lpage>565</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Acquisti</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>The Economics of Privacy at a Crossroads</article-title>
          . In A. Goldfarb &amp; C. E. Tucker (Eds.),
          <source>The Economics of Privacy</source>
          (pp. 21–72). Chicago: University of Chicago Press.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>