<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>The challenge of personal attribute preferences in recommending diverse, reliable news sources</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Brooke Auxier</string-name>
          <email>bauxier@umd.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jennifer Golbeck</string-name>
          <email>jgolbeck@umd.edu</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Philip Merrill College of Journalism, University of Maryland</institution>
          ,
          <addr-line>College Park, Maryland</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>School of Information Studies, University of Maryland</institution>
          ,
          <addr-line>College Park, Maryland</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The affordances of social media and the internet allow users to encounter and engage with diverse and novel news content. However, user preferences and biases may limit the content consumed in these spaces. Signals of reliability, which have been studied as they relate to content and information sharers in social media environments, could be integrated into a recommender system that suggests news content to users. However, assessments of trustworthiness and reliability carry user bias, and an algorithm that relies on these preferences could be severely limiting. We propose the following solutions: (1) using trusted social connections to surface content; (2) using bots to broaden a user's information ecosystem.</p>
      </abstract>
      <kwd-group>
        <kwd>news recommendations</kwd>
        <kwd>reliability</kwd>
        <kwd>news sources</kwd>
        <kwd>social media</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Billions of people across the globe use social media sites. In 2016,
2.28 billion people were on social media, and that number is
expected to increase to 3.02 billion by 2021 [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Though there are
many reasons for using social media, news consumption is a common
activity on these sites. In the U.S., 67 percent of adults report getting
at least some of their news on social media sites [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Though these
figures are already substantial, the numbers undoubtedly increase
when news usage metrics from mobile apps, messaging platforms
like WhatsApp, WeChat and GroupMe, and news websites are also
considered.
      </p>
      <p>Copyright © CIKM 2018 for the individual papers by the papers'
authors. Copyright © CIKM 2018 for the volume as a collection
by its editors.</p>
      <p>
        Research suggests that when social media users encounter
unfamiliar information sharers on a platform, they use multiple
heuristic cues to gauge the trustworthiness and credibility of both
the source and the message. Earlier work by
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] found that tweets with non-standard grammar and
punctuation were viewed as having low credibility, whereas tweets
that included links to high-quality sources were seen as more
credible.
      </p>
      <p>
        Audiences also judge the information sharers themselves. Users
that had a default image, cartoon or avatar as their profile photos
were rated as having low credibility, whereas users with high
follower counts who had a Twitter bio were seen as more credible
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. The same study found that the sharer’s @handle or screen name
also influenced perceived credibility. Topically relevant screen
names were seen as more credible than ‘internet style names.’
Similarly, our more recent research suggests that users perceive
certain features of social media profiles as more reliable than others
[
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. For example, the study (N=261) found that when respondents were
exposed to neutral, news-oriented, Twitter-like content from unknown
information sharers, sharers with Western names, gender-neutral
names, and female avatars were perceived as most reliable.
Respondents were also most likely to share content from
information sharers with Western names, gender-neutral names, and
non-human avatars (e.g., logos and other non-human objects).
      </p>
    </sec>
    <sec id="sec-2">
      <title>Proposed solutions and recommendations</title>
      <p>These signals of reliability could be integrated into a recommender
system to suggest content to users. If we know users are likely to
engage more with posts from accounts with certain profile features,
an algorithm could be designed to surface more content from those
types of accounts.</p>
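      <p>As a minimal sketch, a recommender could fold these reliability signals into ranking as score multipliers. All attribute names, weights, and the data format below are hypothetical illustrations, not values from our study.</p>

```python
# Hypothetical sketch: re-scoring candidate posts with per-attribute
# boosts reflecting a user's perceived-reliability preferences
# (e.g., the attribute types rated most reliable in [9]).

attribute_boost = {
    "western_name": 1.2,         # illustrative multipliers, not measured values
    "gender_neutral_name": 1.15,
    "female_avatar": 1.1,
}

def rescore(candidate):
    """Multiply base relevance by boosts for the sharer's profile attributes."""
    score = candidate["relevance"]
    for attr in candidate["profile_attributes"]:
        score *= attribute_boost.get(attr, 1.0)
    return score

posts = [
    {"id": "a", "relevance": 0.8, "profile_attributes": ["western_name"]},
    {"id": "b", "relevance": 0.9, "profile_attributes": []},
]
ranked = sorted(posts, key=rescore, reverse=True)
```

      <p>A multiplicative boost leaves the underlying relevance ranking intact whenever no preferred attributes are present.</p>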
      <p>
        However, these evaluations of trustworthiness and credibility
carry some bias. There are plenty of reliable,
trustworthy, and talented journalists who do not have Western-sounding
names and who use cartoon avatars as their profile photos,
yet subjects in our study gave these types of users among the
lowest credibility ratings. There are also many nefarious sources
with profiles that match the traditional markers of trustworthiness.
This raises the question of how to build a recommender system that
(1) incorporates information about profile attribute credibility
assessments, (2) suggests useful, reliable content that users will
perceive as such, and (3) does not reinforce unfair biases.
First, recent results from our work [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] and others [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] indicate a
preference for news shared by accounts with certain personal
attributes. These preferences may vary among users. Leveraging
profile attribute preferences has the potential to improve perceived
recommendation quality, but this requires both empirical and
experimental validation.
      </p>
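      <p>One way to ground such an empirical analysis is to estimate, per user, how engagement rates vary with sharer attributes. The log format and attribute labels below are hypothetical.</p>

```python
from collections import defaultdict

def preference_rates(log):
    """Engagement rate per sharer attribute for one user's impression log:
    engagements with sharers having the attribute / impressions of them."""
    shown = defaultdict(int)
    engaged = defaultdict(int)
    for event in log:
        for attr in event["sharer_attributes"]:
            shown[attr] += 1
            if event["engaged"]:
                engaged[attr] += 1
    return {attr: engaged[attr] / shown[attr] for attr in shown}

# Toy impression log for a single user.
log = [
    {"sharer_attributes": ["cartoon_avatar"], "engaged": False},
    {"sharer_attributes": ["western_name"], "engaged": True},
    {"sharer_attributes": ["western_name"], "engaged": True},
    {"sharer_attributes": ["western_name", "cartoon_avatar"], "engaged": False},
]
rates = preference_rates(log)
```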
      <p>If these preferences do improve recommendation accuracy, they
will negatively impact the diversity of sources that are
recommended and potentially the diversity of news itself. To
counter this, we see research paths in countering these preferences
and in playing to them. To counter such preferences, we may look
to strong, trusted existing social connections and highlight
engagements (e.g. likes, shares) from a user’s friend who shares
content from a less-preferred profile type. This may weaken the
bias toward certain sources. When those biases are difficult to
overcome, we are interested in the impact of bots. Recommender
systems or news organizations may consider creating automated
accounts with profiles that play to a user's reliability bias, and have
those accounts share news that will broaden the user's information
ecosystem.</p>
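      <p>The friend-endorsement idea above could be sketched as a re-ranking rule: when a trusted connection has engaged with a post from a less-preferred profile type, boost it. The threshold and boost values are illustrative assumptions.</p>

```python
PREF_THRESHOLD = 0.5  # below this, a sharer profile type counts as less preferred
FRIEND_BOOST = 1.5    # applied when a trusted friend liked or shared the post

def adjusted_score(post, user_pref, trusted_friends):
    """Relevance weighted by profile-type preference, with a boost when a
    trusted friend endorsed a post from a less-preferred profile type."""
    pref = user_pref.get(post["sharer_type"], 1.0)
    score = post["relevance"] * pref
    endorsed = any(f in trusted_friends for f in post["engaged_by"])
    if pref < PREF_THRESHOLD and endorsed:
        score *= FRIEND_BOOST  # surface the friend's endorsement
    return score

user_pref = {"cartoon_avatar": 0.4, "western_name": 0.9}
post = {"relevance": 0.8, "sharer_type": "cartoon_avatar", "engaged_by": ["alice"]}
boosted = adjusted_score(post, user_pref, {"alice"})
```

      <p>Under this rule, an endorsed post from a less-preferred profile type recovers part of the score it lost to the preference penalty, partially offsetting the bias.</p>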
    </sec>
    <sec id="sec-3">
      <title>Conclusion</title>
      <p>Recommender systems help people find reliable news in an
ecosystem full of sources and perspectives. Previous results show
that users have preferences for a range of personal attributes of
news-sharing accounts, including avatar type, name, screen name,
and gender. This raises several interesting research questions.
Recommender systems may be able to improve their performance
by incorporating profile attribute preferences. This would need
experimental validation. However, any improved performance may
come at the cost of bias in the type of sources and in the diversity
of news a person sees. Countering user bias through
recommendation and leveraging the perceived reliability of a
profile to bring more diverse news to users are both interesting
research challenges going forward. They highlight the complexity
of social interaction and the role it plays in recommendation, but
also the opportunities that arise from a deeper understanding of how
users assess people they encounter online.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1] eMarketer.
          <article-title>Number of social media users worldwide from 2010 to 2021 (in billions)</article-title>
          . https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Elisa</given-names>
            <surname>Shearer</surname>
          </string-name>
          and
          <string-name>
            <given-names>Jeffrey</given-names>
            <surname>Gottfried</surname>
          </string-name>
          . (
          <year>2017</year>
          ).
          <article-title>News Use Across Social Media Platforms</article-title>
          . http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Seth</given-names>
            <surname>Flaxman</surname>
          </string-name>
          , Sharad Goel and
          <string-name>
            <given-names>Justin M.</given-names>
            <surname>Rao</surname>
          </string-name>
          . (
          <year>2016</year>
          ).
          <article-title>Filter Bubbles, Echo Chambers, and Online News Consumption</article-title>
          .
          <source>Public Opinion Quarterly</source>
          ,
          <volume>80</volume>
          (
          <issue>S1</issue>
          ),
          <fpage>298</fpage>
          -
          <lpage>320</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Thomas J.</given-names>
            <surname>Johnson</surname>
          </string-name>
          , Shannon Bichard and
          <string-name>
            <given-names>Weiwu</given-names>
            <surname>Zhang</surname>
          </string-name>
          .
          (
          <year>2009</year>
          ).
          <article-title>Communication Communities or “CyberGhettos?”: A Path Analysis Model Examining Factors that Explain Selective Exposure to Blogs</article-title>
          .
          <source>Journal of Computer-Mediated Communication</source>
          ,
          <volume>15</volume>
          (
          <issue>1</issue>
          ),
          <fpage>60</fpage>
          -
          <lpage>82</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Solomon</given-names>
            <surname>Messing</surname>
          </string-name>
          and
          <string-name>
            <given-names>Sean J.</given-names>
            <surname>Westwood</surname>
          </string-name>
          .
          (
          <year>2014</year>
          ).
          <article-title>Selective Exposure in the Age of Social Media: Endorsements Trump Partisan Source Affiliation When Selecting News Online</article-title>
          .
          <source>Communication Research</source>
          ,
          <volume>41</volume>
          (
          <issue>8</issue>
          ),
          <fpage>1042</fpage>
          -
          <lpage>1063</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Dimitar</given-names>
            <surname>Nikolov</surname>
          </string-name>
          , Diego F. M. Oliveira, Alessandro Flammini and
          <string-name>
            <given-names>Filippo</given-names>
            <surname>Menczer</surname>
          </string-name>
          . (
          <year>2015</year>
          ).
          <article-title>Measuring Online Social Bubbles</article-title>
          .
          <source>PeerJ Computer Science</source>
          ,
          <volume>1</volume>
          (
          <issue>38</issue>
          ),
          12
          pages.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Stephan</given-names>
            <surname>Winter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Miriam J.</given-names>
            <surname>Metzger</surname>
          </string-name>
          and
          <string-name>
            <given-names>Andrew J.</given-names>
            <surname>Flanagin</surname>
          </string-name>
          .
          (
          <year>2016</year>
          ).
          <article-title>Selective Use of News Cues: A Multiple-Motive Perspective on Information Selection in Social Media Environments</article-title>
          .
          <source>Journal of Communication</source>
          ,
          <volume>66</volume>
          (
          <issue>4</issue>
          ),
          <fpage>669</fpage>
          -
          <lpage>693</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Meredith Ringel</given-names>
            <surname>Morris</surname>
          </string-name>
          , Scott Counts, Asta Roseway, Aaron Hoff and
          <string-name>
            <given-names>Julia</given-names>
            <surname>Schwarz</surname>
          </string-name>
          . (
          <year>2012</year>
          ).
          <article-title>Tweeting is Believing? Understanding Microblog Credibility Perceptions</article-title>
          . In
          <source>Proceedings of CSCW 2012</source>
          . ACM, Seattle, Washington,
          <fpage>1</fpage>
          -
          <lpage>10</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Brooke</given-names>
            <surname>Auxier</surname>
          </string-name>
          and
          <string-name>
            <given-names>Jennifer</given-names>
            <surname>Golbeck</surname>
          </string-name>
          . (
          <year>2018</year>
          ).
          <article-title>Factors influencing perceived reliability of information-sharers in social media spaces</article-title>
          . University of Maryland, College Park, MD.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>