<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Social Media Platform Structures and Their Implications</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Vijay Keswani</string-name>
        </contrib>
      </contrib-group>
      <abstract>
        <p>Polarization, the spread of misinformation, and the creation of echo chambers on social media platforms are all phenomena that currently plague online discourse. Providing an explanation for these phenomena requires an analysis of the broader social system that online content platforms enforce through their platform policies and designs. In this article, I discuss the problematic structures created by current social media platforms, which prioritize engagement-based relations, and demonstrate how these structures affect user interactions on the platforms. Parsing these structures provides insight into the platform policies that constrain and influence user actions online, and I argue that these structures should be considered when explaining any user's radical interactions with online communities. Finally, considering the role of platform structures in shaping user actions, I contend that online social media platforms, in their current form, are no longer just “neutral publishers”.</p>
      </abstract>
      <kwd-group>
        <kwd>social media</kwd>
        <kwd>social structures</kwd>
        <kwd>recommendation systems</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The early 1990s featured an active discussion on the role of online platforms in hosting and
shaping public discourse. At that time in the US, certain ISPs (CompuServe and Prodigy) faced
lawsuits over user content hosted on their platforms, content they had declined to regulate beyond
the necessary moderation of objectionable material. Following these legal cases, legislative rulings in many countries established
protection for online platforms and ISPs from liability arising from third-party content so that
they could serve the role of neutral publishers [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. One such example of a law is Section 230
of the Communications Decency Act in the US, which states that “no provider or user of an
interactive computer service shall be treated as the publisher or speaker of any information
provided by another information content provider.” Relatively more stringent laws in other
countries similarly protect platforms from liability due to illegal content unless they had “actual
knowledge” of it [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>In the case of social media platforms, these laws protect them from liability arising from
inappropriate or malicious user posts. If, for example, a user posts a threat or hate speech directed
at another user or group of users on Twitter, Twitter is not liable for this user’s action
or for follow-up actions arising from this interaction. While platforms are expected to moderate
content to broadly prevent “social harm” from their users’ activities, the mechanism and severity
of moderation are usually left up to the platforms themselves (except in extreme cases like
child sexual imagery). This legal immunity broadly ensures that platforms can serve as neutral
publishers and provide a “free” forum for their users.</p>
      <p>However, the neutrality of online platforms is arguable in the current age. Their goal is
not simply to provide a platform for users to create and share content, but also to generate revenue
for themselves. Revenue generation is directly related to the time users spend on the platform.
For example, one avenue for revenue generation is advertising: the more users
who view the advertised products, the higher the advertisers’ utility and the platform’s revenue
margins. Correspondingly, platforms have incentives to keep users “engaged” with their
offered services so as to (a) attract more users to the platform by harnessing offline social networks,
and (b) generate revenue by creating higher visibility for platform services and advertised
products.</p>
      <p>
        In this article, I contend that this goal of engagement has changed the role of online content
platforms and they are not just neutral publishers anymore. To argue this point, I first define
and assess the structure enforced by social media platforms and show that platform structures
constrain user actions and interactions on the platform. Using the concept of social structures [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ],
I consider the question – “what is social media content?” – and argue that we cannot separate
user content on any platform from the platform’s structures. Furthermore, I also claim that as
platform structures have evolved over the years (given advances in recommendation systems), so
has user behavior on online platforms.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. The structure of social media content</title>
      <p>
        Explaining actions requires recognizing the factors that influence these actions. In this regard,
social scientists and philosophers, when explaining social phenomena, often study the “structures”
that influence and limit individual agency [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Societal hierarchies, governmental policies, and
cultural and familial relations are common structures that exert control over individual actions
and, hence, hold explanatory relevance.
      </p>
      <p>
        Concretely, Haslanger [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] proposed the following definition for a structure: “Structures, broadly
understood, are complex entities with parts whose behavior is constrained by their relation to
other parts”. For example, the structure of government entities like legislatures or courts is often
fixed and pre-determined, independent of the individuals who occupy various positions when
the structure is instantiated, and the behavior of these individuals is often constrained by the
rules encoded in the structure (e.g., the Constitution constrains court decisions). Given this concept
of structure, I contend that online content platforms create particular structures which encode
specific engagement-based relations and that explaining any individual action on the platform
requires understanding these platform structures. Specifically, if we want to explain extreme
online user actions (e.g., hate speech, online radicalization, and online discrimination), we have
to understand the underlying structural aspects associated with them to provide a complete
explanation.<sup>1</sup>
      </p>
      <p>
        To understand platform structures, we first have to define what constitutes a social media post.
Any user-generated social media content is constituted not just by the text/media in the content,
but also by the user themselves, the rank given to this content in any other user’s feed, and
the constraints imposed by the platform on the content. There are two kinds of
constraints imposed by the platform and analyzing these constraints will give us a clearer picture
of the platform’s structure and the nature of a social media post. The first kind, call it constraint
(a), governs what content users can post on the platform. These constraints limit users to
posting content that is agreeable to platform policies and/or social norms. For instance, Twitter’s
character limits and various platforms’ misinformation or hate speech moderation would be a
form of this constraint. The second kind, constraint (b), is the constraint on what users can see on
the platform as part of their feed and is present in the form of techniques used by platforms to
keep users “engaged”. Engagement is indeed a primary goal of any online content platform both
as a way of popularizing the platform and for generating revenue [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. The engagement appeal
of any platform in the current age is its ability to attract users and user networks and get them
to stay within the platform engaging with user or corporation-generated content. One way this
engagement constraint is realized on the platform is by controlling the ranking of content on any
user’s content feed. By providing users with personalized recommendations and personalized
content feeds, the platform tries to maximize the user’s utility from the platform, which incentivizes the
user to spend more time on it.
      </p>
      <p>Within the limits of constraint (a), the users create their own posts and it is perhaps quite clear
that platforms are not liable for any harm caused by user content. Hate speech, online or offline
threats, or any other kind of offensive posts made by users are expected to be moderated by the
platform. This aspect of platform control over user posts is not under contention and is indeed
protected by legal doctrines.</p>
      <p>
        However, with respect to constraint (b), the techniques used for platform engagement – e.g.,
through personalized recommendations – also now play a role in structuring user content (both
through the rank assigned to one user’s post on another user’s feed and through the rules governing
the “degree of visibility” of user content). Any platform’s recommendation system fulfills one
simple goal – for a given user, show this user the content that they will “like”. What that last term
entails is highly debatable. But concretely, a personalized recommendation system ranks a given
set of content such that the ranking maximizes the combined utility of the user and the platform
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. The user’s utility is the relevance of the recommended content. The platform’s utility consists
of multiple objectives such as maximizing user satisfaction and user engagement. The two are
different because satisfaction does not necessarily imply engagement. For instance, with Google
Search, I will be satisfied if the search results direct me to the website that contains the answer to
my queries. However, this keeps me on the search platform for a very short time. On the other
hand, Google Search now often provides short snippets of answers for any search query. This
way they hope to keep the user engaged and satisfied.
      </p>
      <p>
        <sup>1</sup>A number of these structural aspects go even beyond the platform: social structures associated with polarization and
radicalization play a crucial role in explaining radical individual actions. These structures also manifest on, and can be
exacerbated by, online platforms. Such social structural aspects have been covered by various scholars in other works
[
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ]. What I argue is that the structure encoded by the platform itself also plays a role in explaining the space of
actions available to any user on any platform.
      </p>
      <p>
        Correspondingly, engagement is a goal beyond user satisfaction. On current social media
platforms, it is often observed that users tend to engage with other users who share their values
and preferences [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. With the goal of maximizing engagement, this often leads to the platform’s
recommendation system generating user feeds in a manner that shows them posts that primarily
align with their values and opinions. These posts can contain misinformation or extremist ideas,
but nevertheless, the platform’s recommendation algorithm will believe (maybe even correctly)
that such posts will keep the user engaged on the platform [
        <xref ref-type="bibr" rid="ref10 ref11">10, 11</xref>
        ]. Tying this back to the earlier
discussion on structures, this form of engagement constrains user interactions on social media
to posts that the platform believes are “suitable” for them while ensuring that the number of
such interactions is maximized to keep the user on the platform as long as possible. Hence, the
engagement-based relation between users and their feeds created by the platform constrains any
user’s interactions in a manner that maximizes their engagement.
      </p>
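      <p>As a minimal sketch of the ranking objective described above, consider the following code. It is purely illustrative, not any platform’s actual system; the scoring dictionaries and the weighting scheme are hypothetical assumptions.</p>

```python
# Illustrative sketch of a personalized feed ranker that combines user
# utility (predicted relevance) with platform utility (predicted engagement).
# All scores and the weighting scheme are hypothetical, for illustration only.

def rank_feed(posts, relevance, engagement, platform_weight=0.5):
    """Order posts by a combined user/platform score.

    posts: list of post ids
    relevance: post id -> predicted relevance to this user
    engagement: post id -> predicted engagement (e.g., clicks, dwell time)
    platform_weight: how heavily the platform's engagement objective counts
    """
    def combined_score(post):
        user_utility = relevance[post]
        platform_utility = engagement[post]
        return (1 - platform_weight) * user_utility + platform_weight * platform_utility

    # Higher combined score appears earlier in the feed.
    return sorted(posts, key=combined_score, reverse=True)

posts = ["p1", "p2", "p3"]
relevance = {"p1": 0.9, "p2": 0.4, "p3": 0.6}
engagement = {"p1": 0.1, "p2": 0.8, "p3": 0.7}

# With no platform objective, the most relevant post wins...
print(rank_feed(posts, relevance, engagement, platform_weight=0.0))  # ['p1', 'p3', 'p2']
# ...but when engagement dominates, the ranking shifts.
print(rank_feed(posts, relevance, engagement, platform_weight=1.0))  # ['p2', 'p3', 'p1']
```

      <p>Note how the same set of posts is ordered differently depending on how heavily the platform’s engagement objective is weighted; the user never sees the weighting itself, only the resulting feed.</p>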
      <p>
        While the above discussion considers only the platform’s use of recommendations for engagement,
it is worth noting that platform structures also affect the kinds of posts users
create. Given the focus on engagement, users manipulate platforms by creating posts that
often try to trick recommendation algorithms into assigning them higher visibility. This regularly
happens on social media websites like Twitter, Facebook, and Instagram [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and even occurs
on other online platforms like Uber and Lyft [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. With a focus on recommendation systems to
improve engagement and an increase in the use of social media for personal and professional
development, it is perhaps not surprising that high visibility is a user priority as well. Yet, what
this behavior does tell us is that user actions are influenced by platform structures and goals.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. The evolving structure of social media</title>
      <p>
        Structures change and evolve over time. As evident from the discussions of Haslanger [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], social
structures evolve and adapt to the changes in norms, values, and perceptions. Similarly, online
platform structures have also changed over time. Platforms that existed at the time when the
legal immunity for online services was enacted (for example, AOL and CompuServe) used
crude methods to create user feeds. Items on any user’s feed could very simply be sorted by
date of upload or some simple measure of popularity (like the number of user votes received
by each post) and the user feed could be filtered based on simple criteria. There was little
user-specific personalization. As such, these platforms with simplistic recommendation systems
indeed followed the crude definition of a “publisher” and functioned as neutral middle parties
that allowed content creators and consumers to connect to each other.
      </p>
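      <p>The crude feed generation described above can be sketched as follows. The post fields and sort keys here are hypothetical, meant only to illustrate how little personalization such a scheme involves: every user sees the same feed.</p>

```python
# Illustrative sketch of early, non-personalized feed generation:
# items are sorted by upload date or by a simple popularity measure,
# identically for every user. Field names are hypothetical.

from datetime import date

posts = [
    {"id": "a", "uploaded": date(2024, 1, 3), "votes": 5},
    {"id": "b", "uploaded": date(2024, 1, 1), "votes": 12},
    {"id": "c", "uploaded": date(2024, 1, 2), "votes": 7},
]

# Chronological feed: newest first.
by_date = sorted(posts, key=lambda p: p["uploaded"], reverse=True)

# Popularity feed: most-voted first.
by_votes = sorted(posts, key=lambda p: p["votes"], reverse=True)

print([p["id"] for p in by_date])   # ['a', 'c', 'b']
print([p["id"] for p in by_votes])  # ['b', 'c', 'a']
```

      <p>Neither ordering depends on who is viewing the feed, which is precisely what made these early platforms plausible candidates for the “neutral publisher” label.</p>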
      <p>This form of user feed generation is starkly different from the process followed to recommend
posts now. Almost all social media platforms create user-specific feeds, combining a variety of
factors like diversity and engagement to create a feed that would maximize both platform and
user utility. Considering the changes in recommendation systems and platform priorities and revenue
generation methods over the years, it would be fair to say that the structures enforced by online
platforms have evolved. Changes in structures imply changes in what we consider
to be user content on these platforms and hence, are also related to changes in user actions
executed via the platform.</p>
      <p>In their current state, social media platforms’ priorities (and the corresponding structures) play a
huge role in determining user behavior on these platforms. Considering these mechanisms by
which platforms constrain user actions online, continuing to treat them as neutral publishers is
misleading at best and socially detrimental at worst.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusion</title>
      <p>
        My goal with this article is to emphasize the point that the role of social media platforms in
explaining user behaviors and actions online needs to be revisited. Note that I am not arguing
for repealing the legal immunity enjoyed by the platforms. Repealing the legal immunity can
potentially lead to platforms censoring any content that is deemed even marginally offensive.
However, I do assert that the laws governing these platforms need to be revisited so that we
can debate the best way to protect online forums for public discussions while ensuring that
these forums primarily serve user interests – see Van Alstyne et al. [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] for a discussion of
potential platform modifications that would encourage greater transparency while maintaining
free discourse.
      </p>
      <p>The rise of online polarization, the spread of misinformation through social media, and the
creation of online echo chambers are not phenomena that can be individualistically explained.
There are social structures involved in the occurrence of these phenomena as well as platform
structures that continue to enable their development. Ensuring that we appropriately consider and
discuss platform responsibilities in enabling these phenomena will be crucial to realizing the goal
of creating a free and accessible Internet.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A.</given-names>
            <surname>Johnson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Castro</surname>
          </string-name>
          ,
          <article-title>How Other Countries Have Dealt With Intermediary Liability</article-title>
          ,
          <source>Technical Report, Information Technology and Innovation Foundation</source>
          ,
          <year>2021</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Z. E.</given-names>
            <surname>Chow</surname>
          </string-name>
          ,
          <article-title>Evaluating the approaches to social media liability for prohibited speech</article-title>
          ,
          <source>NYUJ Int'l L. &amp; Pol.</source>
          <volume>51</volume>
          (
          <year>2018</year>
          )
          <fpage>1293</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S.</given-names>
            <surname>Haslanger</surname>
          </string-name>
          ,
          <article-title>What is a (social) structural explanation?</article-title>
          ,
          <source>Philosophical Studies</source>
          <volume>173</volume>
          (
          <year>2016</year>
          )
          <fpage>113</fpage>
          -
          <lpage>130</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>F.</given-names>
            <surname>Jackson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Pettit</surname>
          </string-name>
          ,
          <article-title>Structural explanation in social theory</article-title>
          (
          <year>1992</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>J. R.</given-names>
            <surname>Halverson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. K.</given-names>
            <surname>Way</surname>
          </string-name>
          ,
          <article-title>The curious case of Colleen LaRose: Social margins, new media, and online radicalization</article-title>
          ,
          <source>Media, War &amp; Conflict</source>
          <volume>5</volume>
          (
          <year>2012</year>
          )
          <fpage>139</fpage>
          -
          <lpage>153</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>C.</given-names>
            <surname>Bernatzky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Costello</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hawdon</surname>
          </string-name>
          ,
          <article-title>Who produces online hate?: An examination of the effects of self-control, social structure, &amp; social learning</article-title>
          ,
          <source>American Journal of Criminal Justice</source>
          (
          <year>2021</year>
          )
          <fpage>1</fpage>
          -
          <lpage>20</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>J.</given-names>
            <surname>Claussen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Kretschmer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Mayrhofer</surname>
          </string-name>
          ,
          <article-title>The effects of rewarding user engagement: The case of Facebook apps</article-title>
          ,
          <source>Information Systems Research</source>
          <volume>24</volume>
          (
          <year>2013</year>
          )
          <fpage>186</fpage>
          -
          <lpage>200</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>A.</given-names>
            <surname>Anandhan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Shuib</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Ismail</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Mujtaba</surname>
          </string-name>
          ,
          <article-title>Social media recommender systems: review and open research issues</article-title>
          ,
          <source>IEEE Access</source>
          <volume>6</volume>
          (
          <year>2018</year>
          )
          <fpage>15608</fpage>
          -
          <lpage>15628</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>S.</given-names>
            <surname>Perez</surname>
          </string-name>
          ,
          <article-title>Facebook partially documents its content recommendation system</article-title>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>R.</given-names>
            <surname>Levy</surname>
          </string-name>
          ,
          <article-title>Social media, news consumption, and polarization: Evidence from a field experiment</article-title>
          ,
          <source>American Economic Review</source>
          <volume>111</volume>
          (
          <year>2021</year>
          )
          <fpage>831</fpage>
          -
          <lpage>870</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>M.</given-names>
            <surname>Cinelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>De Francisci Morales</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Galeazzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Quattrociocchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Starnini</surname>
          </string-name>
          ,
          <article-title>The echo chamber effect on social media</article-title>
          ,
          <source>Proceedings of the National Academy of Sciences</source>
          <volume>118</volume>
          (
          <year>2021</year>
          )
          <elocation-id>e2023301118</elocation-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>J.</given-names>
            <surname>Burrell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Kahn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Jonas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Griffin</surname>
          </string-name>
          ,
          <article-title>When users control the algorithms: values expressed in practices on Twitter</article-title>
          ,
          <source>Proceedings of the ACM on Human-Computer Interaction</source>
          <volume>3</volume>
          (
          <year>2019</year>
          )
          <fpage>1</fpage>
          -
          <lpage>20</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>M.</given-names>
            <surname>Möhlmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Zalmanson</surname>
          </string-name>
          ,
          <article-title>Hands on the wheel: Navigating algorithmic management and Uber drivers' autonomy</article-title>
          , in:
          <source>Proceedings of the International Conference on Information Systems (ICIS)</source>
          , Seoul, South Korea,
          <year>2017</year>
          , pp.
          <fpage>10</fpage>
          -
          <lpage>13</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>M.</given-names>
            <surname>Van Alstyne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. D.</given-names>
            <surname>Smith</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <article-title>Improving Section 230, preserving democracy, and protecting free speech</article-title>
          ,
          <source>Communications of the ACM</source>
          <volume>66</volume>
          (
          <year>2023</year>
          )
          <fpage>26</fpage>
          -
          <lpage>28</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>