<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>European Conference on Information Retrieval</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Online Information Disorder &amp; Children</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Hrishita Chakrabarti</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Diletta Micol Tobia</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Monica Landoni</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Maria Soledad Pera</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Delft University of Technology (TU Delft)</institution>
          ,
          <addr-line>Mekelweg 5, 2628 CD Delft</addr-line>
          ,
          <country country="NL">Netherlands</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>USI Università della Svizzera italiana (USI)</institution>
          ,
          <addr-line>Via Giuseppe Bufi 13, 6900 Lugano</addr-line>
          ,
          <country country="CH">Switzerland</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <volume>10</volume>
      <issue>2025</issue>
      <fpage>0000</fpage>
      <lpage>0001</lpage>
      <abstract>
        <p>The rise of digital platforms for accessing online content, from popular search engines to social media sites, has contributed to the (un)intentional propagation of misleading information. This phenomenon, known as Information Disorder, affects individuals and society. Extensive research has been conducted to study and address Information Disorder as it pertains to the general population. Yet, little is known about how children, who have specific needs and behaviours when interacting with digital content, deal with misleading information, and how the algorithms that underlie the information access tools they use mitigate or exacerbate the issue. Through a systematic literature review, we present research efforts that address or discuss the impact of Information Disorder on children and their overall information-seeking experience. We analyse the literature from various perspectives, including children's behaviour across platforms and the solutions developed to mitigate misleading information. Inspired by the knowledge distilled and gaps identified in our review, we discuss research directions that tackle both technological and human-centred challenges children face when dealing with misleading information, seeking to establish a foundation to mitigate the effects of Information Disorder among children.</p>
      </abstract>
      <kwd-group>
        <kwd>Children</kwd>
        <kwd>Information Disorder</kwd>
        <kwd>Misinformation</kwd>
        <kwd>Disinformation</kwd>
        <kwd>Information Access</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Information Disorder [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] describes the intentional and unintentional circulation of misleading
information in the public sphere. Misleading information can be broadly categorised as: (1) misinformation, i.e.,
false information shared without the intention to harm; (2) disinformation, i.e., false information shared
with the intent to harm; or (3) malinformation, i.e., genuine information meant to be private but
shared in public spheres for malicious purposes. Information Disorder is pervasive in online media
in the current digital era. For instance, individuals can come across false advertisements produced
by companies to promote the sale of their products [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Such false claims manipulate consumers into
buying products they do not need but otherwise pose only a minimal risk to them. Information Disorder
can also lead to far more harmful consequences, e.g., individuals refusing vaccines against deadly
diseases due to misinformation about their side effects [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ], or deliberate propaganda circulated by
domestic or foreign forces to manipulate the public’s political opinions [
        <xref ref-type="bibr" rid="ref5 ref6 ref7">5, 6, 7</xref>
        ].
      </p>
      <p>
        Information Disorder is particularly worrisome in online environments, as it directly hinders
information access. Individuals regularly turn to search engines as the first point of information discovery on
diverse topics, e.g., the weather, news, and healthcare. In response to users’ search queries, popular
search engines provide a list of resources, but not all of these are genuine and factual sources of information
[
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ]. Search engines are no longer the only portal for information discovery. People nowadays also rely
on social media platforms like YouTube or TikTok and voice-based virtual assistants like Siri or Alexa
[
        <xref ref-type="bibr" rid="ref10 ref11">10, 11</xref>
        ]. These platforms offer new ways of accessing information; they also introduce new challenges
related to Information Disorder. For example, users browsing social media, often without actively
searching for information, are nonetheless exposed to a range of content they interpret as information [12].
Integrating generative AI technologies into information access tools has exacerbated the proliferation of
false and misleading information online [13, 14, 15]. Given the increasing pervasiveness of Information
Disorder in our everyday lives, extensive research efforts have been allocated to study and address the
issues related to Information Disorder—from investigating reasons for users’ vulnerability to
Information Disorder [e.g. 16, 17, 18] to detecting and mitigating Information Disorder [e.g. 19, 20, 21, 22], along
with interventions to empower digital citizens against Information Disorder [e.g. 23, 24]. Most efforts,
however, have examined the issue concerning the general population, i.e., English-speaking adults from
the global north [25].
      </p>
      <p>A user group known for their extensive use of online information access tools who visibly differ
from mainstream users are children1. They are at a point in their lives when they become cognizant of
the world around them and develop their own principles and opinions. The information they encounter
during this phase plays a vital role in their development [27]. However, they do not yet possess the critical
thinking skills to discern the credibility of resources encountered via online information access tools
[28, 29]. They also tend to trust the Internet and often do not judge the nature of the source, believing
all content they encounter to be factual [29, 30]. Given children’s in-development cognitive abilities and
distinct expectations of information access tools, we argue that their needs and challenges related to
Information Disorder differ from those of mainstream users. This makes it crucial to understand what
research efforts have been undertaken on the topic of Information Disorder with children specifically
as the main stakeholders.</p>
      <p>In this work, we conduct a systematic literature review to gain a comprehensive overview of research
related to online Information Disorder and children’s information access. For our review, we focus on
literature that considers children as distinct stakeholders and captures their own perceptions rather than those of an external
person acting as their proxy. Our analysis, through the lens of the Interactive Information Retrieval
(IIR) community, involves various perspectives encompassing the user demographics and struggles,
the online tools and platforms studied, as well as the solutions proposed to address the highlighted struggles
of this user group. Based on the knowledge distilled from our analysis, we discuss open challenges and
propose future research directions, reflecting on the technical and ethical challenges that researchers
may encounter when advancing research in this area.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Literature Selection Process</title>
      <p>In this section, we describe the process used to gather and select manuscripts to inform our analysis;
the different stages of this process are shown in Figure 1.</p>
      <p>To gather the set of publications that anchors our comprehensive overview of research related to
children and online Information Disorder, we formulate a search query capturing keywords reflecting
term variations related to children (e.g. child, teen, adolescent) and online Information Disorder (e.g.
misinformation, disinformation, fake news). Given their global research coverage, we rely on two
popular databases: DBLP2 and SCOPUS3. We adjusted the search query to abide by the syntax of each
of the considered databases. For SCOPUS, we limit the results to works published in venues related to
computer science due to the database’s wider coverage, unlike DBLP, which is a database for computer
science research literature specifically. We also limited SCOPUS search results to English publications
but could not do so in DBLP due to limitations in the offered search features. The final search queries used
in the databases are listed below.</p>
      <p>#KEYWORD QUERY
(child* OR kid* OR youngster* OR teen* OR adolescent*)
AND
(misinformation OR disinformation OR credibility OR pollution
OR "fake news" OR malinformation OR "Information Disorder")</p>
      <p>#DBLP
(child*|kid*|youngster*|teen*|adolescent*)
(misinformation|disinformation|credibility|pollution|fake news|malinformation|Information Disorder)</p>
      <p>#SCOPUS
TITLE-ABS-KEY (
( misinformation OR disinformation OR credibility
OR "information pollution" OR "fake news"
OR malinformation OR "Information Disorder" )
AND ( child* OR kid* OR teen* OR adolescen* OR youngster* )
)
AND ( LIMIT-TO ( SUBJAREA,"COMP" ) )
AND ( EXCLUDE ( DOCTYPE,"cr" ) )
AND ( LIMIT-TO ( LANGUAGE,"English" ) )</p>
      <p>1UNICEF’s Convention on the Rights of the Child [26] defines a child as any individual below the age of 18.
2https://dblp.org/
3https://www.scopus.com/search/form.uri?display=basic#basic</p>
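As an illustration of how the keyword query above translates into an automated first-pass filter, the sketch below applies the same term variants to a record's title and abstract and removes duplicates. This is a hypothetical helper for illustration only, not the authors' actual pipeline; in practice the database search interfaces and Covidence performed these steps.

```python
import re

# Term variants mirroring the paper's keyword query; the helpers themselves
# are a hypothetical sketch, not the review's actual tooling.
CHILD_TERMS = r"\b(child\w*|kid\w*|youngster\w*|teen\w*|adolescent\w*)\b"
DISORDER_TERMS = (
    r"\b(misinformation|disinformation|credibility|pollution|"
    r"fake news|malinformation|information disorder)\b"
)

def matches_query(title: str, abstract: str) -> bool:
    """Return True if a record satisfies both halves of the keyword query."""
    text = f"{title} {abstract}".lower()
    return bool(re.search(CHILD_TERMS, text)) and bool(re.search(DISORDER_TERMS, text))

def deduplicate(records):
    """Drop duplicate records, keyed on a normalised title."""
    seen, unique = set(), []
    for rec in records:
        key = re.sub(r"\W+", " ", rec["title"].lower()).strip()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

Note that a wildcard like `kid*` also matches "kidney", which is exactly the kind of false positive the manual screening phases described below had to remove.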
      <p>The search yielded 259 publications; removing 23 duplicate works led to the 236 publications which two authors
screened in two phases using Covidence4, an online systematic review tool. In the first phase, the
authors individually ascertained the relevance of each work for analysis using its title and abstract,
based on the following inclusion criteria:
• The target user group is (or consists of) children.
• The term “child” (and related terms) is used to describe human children.
• Terms related to online Information Disorder (such as “credibility”) are used in the same context
in the title and abstract.
• The online content under consideration adheres to the definition of Information Disorder, i.e.,
content considered misinformation, disinformation, or malinformation.</p>
      <p>Following the first screening phase, only 76 publications remained. The pronounced drop in
publications was largely due to the context in which the query terms were used in the excluded publications.
For example, publications discussing the propagation of misinformation in a network that used the term
“child” to refer to a child node in a graph were excluded; so were publications discussing misinformation
in healthcare related to kidney diseases, which were retrieved due to the term “kid*” in the search query.</p>
      <p>In the second phase, the authors scanned the full body of the remaining publications and excluded
those contributing to medical or environmental sciences, considering adults’ viewpoints only as a
proxy for children’s experiences, unrelated to online Information Disorder, or about the propagation of
Information Disorder in social networks. The second screening phase resulted in 38 publications; one
was excluded as it described a user study but discussed no preliminary results. Thus, the final set
of publications for analysis comprises 37 publications.</p>
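The attrition across the selection stages described above can be tallied programmatically; the following is a minimal sketch with counts taken from the text (the function and stage labels are our own, for illustration):

```python
# Publication counts at each stage of the selection process, as reported in the text.
FUNNEL = [
    ("retrieved by search", 259),
    ("after removing duplicates", 236),
    ("after title/abstract screening", 76),
    ("after full-text screening", 38),
    ("after excluding one study with no results", 37),
]

def attrition(funnel):
    """Return (stage, count, dropped_since_previous_stage) triples."""
    rows, prev = [], None
    for stage, count in funnel:
        rows.append((stage, count, 0 if prev is None else prev - count))
        prev = count
    return rows
```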
    </sec>
    <sec id="sec-3">
      <title>3. Emergent Themes</title>
      <p>We analyse the selected manuscripts (Section 2) via an inductive thematic analysis approach [31].
The manuscripts were split randomly between two authors to look for patterns—from an IIR lens—in
the purpose, target population, methodology, and findings reported in each manuscript. The analysis
resulted in nine themes; for an overview of the themes and the associated manuscripts, see Table 1.
Theme#1: Research venues &amp; perspectives. To gain an overview of the research efforts related to
Information Disorder and children, we consider the publication venue of the reviewed manuscripts,
in addition to the focus of the research presented in these manuscripts. Among the 37 manuscripts
reviewed, 20 were published in peer-reviewed journals, 15 in conferences, and the rest in workshops and
symposia. Although 37 manuscripts is a limited number compared to those addressing Information
Disorder in general, the fact that most of the manuscripts focusing on children are part of journals and
conference proceedings indicates that Information Disorder with children as the main target is an
area of interest for a wide group of researchers and not limited to niche research groups. Most of
these manuscripts were published in Social Sciences (21), Information Science (9), or Human-Computer
Interaction venues (6). Despite limiting our search to publications related to computer science, and the
context of online Information Disorder being closely related to Information Retrieval (IR), only one
manuscript, i.e., [49], was published in a core IR venue, raising a critical concern about the limited efforts
from the IR community in this area.</p>
      <p>In terms of research focus, 23 manuscripts explore, through user studies and empirical analysis,
different aspects of the interaction between children and online information. Another
14 manuscripts propose strategies to support children in their struggles with Information Disorder, ranging
from proposals aimed at the child’s cognitive learning to more concrete solutions to educate children
to tackle Information Disorder. Overall, we find that research efforts so far have largely focused on
children and not much on the systems that children use to access information.
Table 1 (fragment): Research venues &amp; perspectives, i.e., an overview of the most common research
communities and perspectives when advancing knowledge on Information Disorder as it pertains to
children [32, 33, 34, 35, 36, 37, 38, 39, 28, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 12, 54,
55, 56, 57, 58, 59, 23, 60, 61, 62, 63, 64, 65]; and Age, i.e., how Information Disorder is addressed
across different age groups within the broad category of children.
Theme#2: Age. More than 85% of the manuscripts (32 out of 37) explicitly define the age range
of the population probed, whether via user tests or empirical studies. Among these 32 papers, 8
focus on primary and middle school children, 9 examine high school students, and 15 address children
spanning both primary and secondary school years, highlighting some differences between age groups
[52, 34, 55, 37, 48]. It emerges from these works that children’s ability to assess information
evolves with age. For primary school children, online content interpretation depends more on personal
experiences or emotional responses than on objective criteria [52]. Younger children are less
likely to compare multiple sources, making them more vulnerable to misinformation [37, 48]. Moreover,
they struggle to detect fake news without any initial training, despite their high cognitive reflection
scores [28]. High school children, instead, exhibit a more advanced approach to credibility evaluation.
Studies indicate that they are more likely to check the source of information, compare multiple sources,
and rely on external references when assessing information [46, 55, 42, 52]. These findings suggest an
improvement in judging online information due to greater experience and school education, leading to
the development of more critical thinking skills.</p>
      <p>Theme#3: Ethnography. As the majority of the manuscripts consist of studies involving human
participants (33 out of 37), ethnographic attributes inherent to the participants stood out, particularly
the gender and country of origin of the population under study.</p>
      <p>From a gender diversity perspective, 23 papers report the gender ratio of participants; most works
adopt a simplified binary gender definition, with few exceptions [34, 38, 43, 46, 47, 50, 23]. Largely,
participants reflect a gender-balanced population (from a binary perspective), which suggests that the
insights gained from the analysis are generalizable to children of either gender.</p>
      <p>From a geographical perspective, except for the user study conducted in [43], each study targeted a
specific country, covering a total of 17 countries. Specifically, 43% of the user studies involve children from the
USA [e.g. 28, 40] and another 43% children from European countries [e.g. 32, 35]. The studies focus
on understanding and addressing the challenges children face due to their inherently limited cognitive
abilities. Nevertheless, socio-economic factors like financial status or ethnicity–although sometimes
reported when describing the participants–are never considered as a factor influencing the reported
conclusions. This lack of a cultural component in the analysis is acknowledged as a limitation in only a
small subset of the user studies under review [35, 61, 33]. Studies involving children from Singapore [56]
and Brazil [65] report that external factors, such as political and religious ideologies and other
socio-economic factors, have a prominent influence on how children access information online and on
their perception of Information Disorder. Similarly, in the interviews conducted
by Jean et al. [37], children expressed a lack of trust in their government, which influenced the online sources
they prefer when accessing information about their health.</p>
      <p>Theme#4: Perception. Adults often believe children to be naive and unaware of online Information
Disorder [41]. However, findings from 11 user studies involving focus groups or interviews with children
about their preferences when accessing information online reveal that, contrary to popular belief,
children are aware of online Information Disorder but possess a limited perception of what constitutes
online Information Disorder [41, 44].</p>
      <p>Children equate Information Disorder with content that is “obviously false” [41], often relying on their
“gut feeling” to ascertain whether the information is to be trusted [64, 43]. If children deem an
online resource trustworthy, they consider it a reliable source of information and rarely reflect on the
content presented in the source. For example, in a study conducted by Jean et al. [37], children were observed
to search for keywords that match their inquiry to determine if the information is suitable for
their information needs, without paying attention to the context in which the keywords are used. Children also
tend to assume the benevolence of the content creator and do not consider the creator’s intentions
behind producing the content [34, 60]. A direct consequence of this presumed benevolence of the
author was observed in a study conducted by Hartwig et al. [47], in which children failed to recognise
the misleading information in satirical content.</p>
      <p>Theme#5: Discerning information disorder. Given the recurrent theme of children’s (limited)
awareness of online Information Disorder, we look for common strategies that children undertake–
consciously or otherwise–to tackle Information Disorder. It became apparent that children consider
various factors to ascertain whether they should trust the information they encounter. Most often, these factors
are surface-level elements such as the amount of information and graphics presented, the vocabulary,
and the tone of writing [37, 64, 59]. Some studies report that children consider source attributes, such as
the author’s domain expertise and the presence of evidence supporting the information, to ascertain the
credibility of an online resource [34, 37, 42]. Social factors like the source’s popularity as well as other
people’s reactions to the particular piece of media also influence the child’s decision on whether to
trust the information [41, 47, 59, 42]. Children tend to exhibit confirmation biases, relying on their
prior knowledge of the subject to determine if the information can be trusted [34, 41, 47].</p>
      <p>Children seem to account for various factors when determining whether online resources are to
be trusted. However, studies that examine children’s evaluation process find that children evaluate in a
very heuristic manner [44, 65, 63, 54]. For instance, while children use the presence of supportive evidence to
evaluate the quality of information, they struggle to assess whether the evidence actually
corroborates the information [34, 60]. Children’s inability to reason deeply is also reflected in the
vague and sometimes irrelevant reasons they provided when prompted to justify their evaluation of the
quality of information presented on an online resource [34, 41, 65].</p>
      <p>Theme#6: Context-dependent vigilance. Despite children’s self-reported awareness of Information
Disorder and their vigilance in tackling it, there appears to be a marked discrepancy between what they
report doing and what they have been observed to do [59, 65]. Children’s trust in online information
is heavily dependent on the context in which they access the information. Children are more critical of
the online content they consume when they actively seek information about their personal interests
or education [65]. Even in these scenarios, however, children often accept the first response they
receive from the information access system for their inquiry [48, 52] and compare different sources
only when exposed to contrasting viewpoints [52]. A classroom setting can also make
children lenient in their critical analysis, as seen in a study conducted with children in a primary school
in the Netherlands [32]. Some participants in that study reported feeling shocked that, despite having
received training in media literacy, they were unable to realize that the information provided under
the guise of a classroom assignment was fake. In the case of information access for leisure and
entertainment, children undertake a very different approach to ascertaining the trustworthiness of the
content. Instead of analysing the content, they focus on social factors such as the popularity of the
content and other people’s reactions to it [41, 47, 59, 42].</p>
      <p>Theme#7: Information &amp; media literacy. Looking at the challenges highlighted in user studies and
addressed in proposed solutions, close to 30% of the manuscripts (10 out of 37) focus on how children’s
ability to assess online information is influenced by their understanding of the tools they use to access
it. According to [47, 48], many children tend to blindly trust algorithms, such as AI systems or Google,
believing the information they provide to be accurate. Since they do not yet have the literacy skills to assess credibility
[37, 49], without any tool or knowledge to question online information, they remain vulnerable to
algorithms, which hinders the development of their critical thinking skills [36, 41]. Information access
tools, in turn, often reinforce this blind trust: transparency regarding algorithms is not
commonly addressed when developing new technical solutions, and many children lack an understanding
of how algorithms work [36, 35]. Moreover, the rise of digital influencers leads this vulnerable group to
trust everything influencers promote, without fully understanding the dynamics behind social media [39].
Even though children may be cautious about online information sources, their limited media literacy
and understanding of Information Disorder prevent them from critically evaluating the content [41, 44].
Theme#8: Research outcomes. From the review, it emerges that only 14 out of 37 manuscripts
offer solutions or strategies to address the challenges children face in relation to Information Disorder.
Although limited in number, these proposals aim to address the different needs of children as information
seekers by considering the challenges from distinct points of view. However, our findings suggest that most
research efforts prioritize children over the systems they use to access online information, resulting in
a gap in algorithmic solutions specifically tailored for them.</p>
      <p>Without proposing definitive solutions, 6 papers offer techniques that can be adopted to enhance
children’s evaluation of online information. The work of Kiili et al. [34], for example, suggests that
longer deliberation on the credibility of sources leads to more accurate evaluations among children.
Additionally, three studies [51, 45, 38] indicate that increasing awareness of the importance of credibility
evaluation encourages children to adopt a more critical approach to the quality of the information they encounter.
Promoting emotional engagement has also been identified as effective in tackling Information Disorder
[34], acknowledging the influence of emotions on how children assess information. Furthermore,
research by [54] suggests that enhancing reading skills can improve children’s ability to evaluate the
credibility of information. While these insights point to cognitive developmental interventions as
potential solutions to combat Information Disorder, 8 out of the 14 papers suggest a more educational
approach, with a range of solutions designed to engage children and enhance their skills. Interactive
lessons and hands-on activities—such as serious games or classroom exercises to create fake content—
have been shown to improve children’s critical thinking skills [45, 32, 38, 40, 54]. Some studies focus more on
interface design solutions, such as incorporating warning labels or increasing the transparency of source
attributes [47, 64], which can provide visual cues to help children recognize misleading information.
Furthermore, using verification tools, such as fact-checking apps, has proven beneficial in discerning
fake news from credible information [43].</p>
      <p>Theme#9: Relevant vs. useful. From an IR viewpoint, an online resource is relevant if it meets
the user’s information needs. For children, however, relevance goes beyond resources matching the
keywords expressed in a query or prior user interactions. For instance, the online content’s emotional
undertone, as well as its alignment with the sentiment in the child’s search query, plays an important role
in relevance from a child’s point of view [66, 67]. Children’s perceived complexity of the content also
influences which resource they choose to engage with [66, 49]. Findings from two user studies [64, 37]
highlight a similar trade-off in children’s online search behaviour between what children deem to be the
“best” response to their inquiry and which response they find to be helpful, i.e., the comprehensibility of
the content in an online resource is sometimes more important to children than the reliability of the
source. This trade-off presents a unique perspective on the kind of resources children gravitate
towards when seeking information, which can also influence children’s exposure to misleading content,
as discussed in another reviewed manuscript [49]. However, none of the other reviewed papers provide
insights into this perspective.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Discussion</title>
      <p>In this section, we reflect on the knowledge distilled from our literature analysis and discuss open
problems, research directions, and potential challenges that may hinder research aimed at supporting
children’s information access in the presence of Information Disorder.</p>
      <p>
        Revising the concept of Information Disorder. The majority of the reviewed manuscripts
discuss children dealing with fake news, which is only one aspect of the much broader Information Disorder
concept [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], overlooking components such as disinformation, malinformation, and, in general, any
intentional or unintentional circulation of misleading or misinterpreted information. Further, the definition
of Information Disorder examines online content along two dimensions: falseness and intent to harm
[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Both these aspects concern the source and, at most, the propagator of the content, not the
consumer. However, in the case of children, genuine information lacking any intent to harm can also
be misleading due to children’s inability to interpret the content appropriately (e.g. satirical content)
[36, 41, 47]. This distinction between misinformation and misinterpretation is particularly important
when analysing children’s interaction with online content, emphasizing the need to re-evaluate and
expand our understanding of Information Disorder to include the cognitive challenges children face
when interpreting even credible content. In other words, future research is needed to help reshape the
definition of this concept when considering children.
      </p>
      <p>Inclusive user studies. Researchers tend to generalise findings based on observations of children
of a specific age to a much broader age group. However, in the first 18 years of their lives, children
go through various developmental phases that influence how they perceive the world around them
[27]. This raises the question of whether findings from existing literature that focus on specific ages
truly apply to the full spectrum of the definition of ‘children’. Furthermore, most insights reported
in Section 3 reflect the perceptions and behaviour of children from the global north. In an expert
survey conducted by Blair et al. [25], there is a prominent difference in the intervention strategies
that experts believe would be effective for (general) populations from the global north and south in
combating misinformation. Given the impact of cultural and socio-economic factors on children’s
information-seeking behaviour [56, 65, 37], this difference may persist for children. The issue may also
be exacerbated in children given the influence of social factors on their natural strategies for identifying
Information Disorder [41, 47, 59, 42].</p>
      <p>Another aspect that underlies the challenge of inclusivity–particularly noted to be lacking in user
studies in the reviewed manuscripts–is the impact of language on children’s struggles with Information
Disorder. Familiarity with the language can greatly influence the way a person interprets content and
with English being the lingua franca of the Internet [68, 69], the problem of Information Disorder for
children growing up in non-English speaking communities may be exacerbated.</p>
      <p>Designing algorithms for the retrieval, ranking, and filtering of resources to address Information Disorder
on online platforms based on the knowledge acquired from the aforementioned studies can only
be expected to cater to and reflect the needs and expectations of children from the communities
represented in those studies. While a valuable starting point, to enable the design of inclusive
solutions that more broadly tackle children’s struggles with online Information Disorder, future user
studies would benefit from exploring the differences in children’s behaviour across developmental stages,
as well as extending their reach to involve children from different cultural communities.</p>
      <p>ROMCIR from children’s perspective. The IR community has conducted extensive research
to understand user perceptions of and challenges related to the problem of Information Disorder [e.g.
70, 71, 72, 73]. It has also developed algorithms to detect and mitigate misleading information on online
platforms [e.g. 19, 20, 21, 22]. These algorithms, however, do not address the problem as it pertains to
children. A clear indication of the absence of children’s perspective in the designed solutions is the data
used to train and validate these algorithms.</p>
      <p>Datasets related to Information Disorder, like FEVER [74] and LIAR [75], contain content that mostly
adults are exposed to, labelled as genuine or fake from an adult’s perspective. However, in the
case of children, even genuine content can contribute to Information Disorder. For example, children
struggle to differentiate between factual and opinionated content, relying on both as objective sources
of information [40]. Limited reading and critical thinking skills also lead them to misinterpret content
such as satire [47]. With this in mind, we argue that existing IR algorithms must be validated against
labelled datasets containing content that constitutes Information Disorder as it pertains to children.
Unfortunately, such datasets do not exist. This gap highlights the need for more targeted resources
that reflect children’s interactions with online information. To enable research in this direction, the IR
community must also allocate effort to creating such representative datasets.</p>
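      <p>To make the adult-centric framing concrete, the sketch below illustrates how detectors are typically validated against LIAR-style labels [75]. The six-way-to-binary collapse (with the threshold at “half-true”) and the detector interface are illustrative assumptions, not details from the reviewed works; note how satirical or opinionated content a child might misread is scored as genuine under this scheme.</p>

```python
# Hedged sketch: binary validation of a detector against LIAR-style labels.
# The label collapse threshold and the detector interface are assumptions.
LIAR_TO_BINARY = {
    "pants-fire": "fake", "false": "fake", "barely-true": "fake",
    "half-true": "genuine", "mostly-true": "genuine", "true": "genuine",
}

def evaluate(detector, samples):
    """Accuracy of a binary detector over (text, liar_label) pairs."""
    correct = 0
    for text, liar_label in samples:
        gold = LIAR_TO_BINARY[liar_label]
        pred = "fake" if detector(text) else "genuine"
        correct += int(pred == gold)
    return correct / len(samples)

# Hypothetical samples: satire and opinion count as genuine for adults,
# yet may still mislead a child; this label scheme cannot express that.
samples = [
    ("Satirical headline exaggerating a policy change", "true"),
    ("Opinion column written in a factual tone", "mostly-true"),
    ("Fabricated miracle-cure story", "pants-fire"),
]

always_genuine = lambda text: False  # trivial baseline detector
print(evaluate(always_genuine, samples))
```

      <p>A child-centred dataset would require labels that capture such cases, which is precisely what the binary adult-centric scheme above cannot do.</p>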
      <p>Children as research subjects. The online resources children are exposed to potentially influence their
development [49]. This makes it crucial for the research community to not only focus on how children
interact with misleading information but also consider the ethical implications of exposing them to
such content. The study by Loos et al. [32] highlights the benefits of teacher or parental involvement
in mitigating the adverse consequences of exposure to content that is known in advance to elicit
emotional duress in children. Their study suggests that guidance from trusted adults can help children
better contextualise and critically engage with the information encountered, reducing the risk of harm.
Similarly, Landoni et al. [49] argue that ensuring a safe and positive online experience for children
requires expert guidance. However, since most of the reviewed user studies directly expose children
to possibly harmful content, it is also important for researchers to ensure that children develop a solid
understanding of its reliability.</p>
      <p>As children may not fully comprehend the risks associated with participating in user studies, it
is fundamental for researchers to obtain explicit consent from both children and their guardians,
clearly explaining the nature and implications of the study, as well as how their data will be used
[76]. Furthermore, recent work has emphasised the value of engaging children not merely as passive
testers but as active collaborators in studies [49]. This participatory approach, rooted in the "child
as protagonist" principle [77], ensures that research is more inclusive and reflective of children’s
experiences. When directly involved in the design process, children become experts in the field,
making more valuable contributions to understanding how they deal with online Information Disorder.
By considering children as research partners, studies can better account for their unique cognitive,
emotional, and developmental needs, leading to more inclusive outcomes.</p>
      <p>
        Finally, children are not merely miniature adults; they have distinct needs [78, 79]. As it emerged from
our review, they navigate and interpret online information in unique ways, challenging the traditional
concept of Information Disorder [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Their ability to engage with, understand, and evaluate online
content differs from that of the general population, demanding a tailored approach in the research field.
While it is crucial to involve children in the research process, there is a need to design more tailored
and accurate studies that reflect children’s cognitive abilities and respect their needs and perceptions
when dealing with online Information Disorder.
      </p>
      <p>Scaffolding for independence. Children are at a stage in their lives where they are developing
their sense of self and becoming cognizant of the world around them. The opinions and personal beliefs
they form at this age greatly influence the choices they make, and the information they consume has
a strong influence on their development [27]. Due to the pervasiveness of digital technology, online
platforms are the most common source of information for children. Even when not actively seeking
information, they come across content that they perceive as information. Amid never-ending exposure
to information, often in the absence of adult supervision, children need to be empowered to tackle the
problem of Information Disorder independently.</p>
      <p>Preventive algorithms that detect and mitigate misleading online content can support children in
combating Information Disorder. However, it is impossible to anticipate, and thus prevent exposure to,
every scenario of Information Disorder children might encounter. Moreover, an overly protective
ecosystem hinders children from developing the essential digital skills they need to become independent.
Hence, rather than a protectionist approach, research must adopt a holistic approach that introduces
children to online Information Disorder in a manner that enables them to distinguish genuine, factual
sources of information from misleading online content.</p>
      <p>For the IIR community, a holistic approach would entail going beyond optimising resource retrieval
for children towards the retrieval and ranking of resources that invite critical reflection on the presented
content. For instance, children must read and interact with an IAS response to extract information
from it; given that children are often drawn to resources that evoke emotional engagement [67],
novel (re-)ranking strategies that promote resources whose content aligns with the affective
expectations of the child during their search task could foster the child’s engagement with the
results. Children also tend to be more critical when exposed to multiple and contrasting points of
view [52]; novel child-centred retrieval and ranking strategies could therefore retrieve resources
that provide information from multiple (factual) perspectives, some of which may be conflicting
in nature.</p>
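      <p>A minimal sketch of such a strategy, not a method from the reviewed literature: a linear interpolation of topical relevance and affect alignment, with a greedy novelty bonus for documents introducing a (factual) perspective not yet shown. All document fields (<italic>relevance</italic>, <italic>affect_match</italic>, <italic>perspective</italic>) and weights are illustrative assumptions.</p>

```python
# Hedged sketch of child-centred re-ranking: combine topical relevance with
# (a) alignment to the child's expected affect and (b) a bonus for documents
# that introduce a not-yet-seen (factual) perspective. All fields and
# weights are illustrative assumptions, not a published method.

def rerank(docs, alpha=0.6, beta=0.3, gamma=0.1):
    """docs: dicts with 'relevance' and 'affect_match' in [0, 1] plus a
    'perspective' label. Greedily emits docs in descending combined score,
    rewarding perspectives that have not yet been shown."""
    ranked, seen_perspectives = [], set()
    remaining = list(docs)
    while remaining:
        def score(d):
            novelty = 0.0 if d["perspective"] in seen_perspectives else 1.0
            return (alpha * d["relevance"]
                    + beta * d["affect_match"]
                    + gamma * novelty)
        best = max(remaining, key=score)
        remaining.remove(best)
        seen_perspectives.add(best["perspective"])
        ranked.append(best)
    return ranked

docs = [
    {"id": "a", "relevance": 0.9, "affect_match": 0.2, "perspective": "pro"},
    {"id": "b", "relevance": 0.7, "affect_match": 0.9, "perspective": "pro"},
    {"id": "c", "relevance": 0.6, "affect_match": 0.8, "perspective": "con"},
]
print([d["id"] for d in rerank(docs)])
```

      <p>Under these weights, an affectively engaging document can outrank a more topically relevant one, and a contrasting perspective is surfaced early; tuning such trade-offs for children would itself require the user studies discussed above.</p>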
      <p>Interdisciplinary research. The digital environment alone cannot bear the responsibility for
children’s development. To ensure the feasibility of proposed solutions to the problem of Information
Disorder for children, the IIR community must adopt a more interdisciplinary approach that combines
cognitive, educational, and technical interventions. Further, technological solutions can support
children’s skill development only when children are taught to make use of the scaffolding. This can be
enabled by revising school curricula to raise awareness among children about Information Disorder
from an early age and to instil the habit of using information access systems with a critical mindset.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Concluding Remarks</title>
      <p>This work offers a comprehensive overview of the status quo of computer science research on online
Information Disorder, specifically concerning children. We present our findings in a summary table
that future researchers can consult to enhance their understanding of this topic. Through our literature
analysis through the lens of the IIR community, we identified nine themes that highlight different
perspectives on the issue of Information Disorder as it relates to children.</p>
      <p>Although children are aware of misleading information, they struggle to recognise Information
Disorder, often relying on gut feeling or superficial cues. This difficulty primarily stems from their limited
media and information literacy, which leads them to blindly trust whatever algorithm lies behind
information access tools. Additionally, their approach to managing Information Disorder is influenced
by their young age and various ethnographic factors, including cultural and socio-political contexts. This
underscores the importance of including children from diverse cultural and socio-economic backgrounds
in research. In general, we noted that research community efforts to combat Information Disorder
among children are predominantly published in Social Science and Human-Computer Interaction venues,
proposing educational interventions and cognitive strategies to improve the critical thinking and
credibility assessment skills of this vulnerable group. We therefore advocate for the direct involvement
of the IR community, possibly following the outlined research directions, which, combined with
our reflections on the potential challenges, provide valuable guidance for shaping future research and
practical implementations to address the problem.</p>
      <p>In today’s digital era, children frequently turn to online information access tools for both learning
and leisure. In this work, we largely focus our discussion on children’s struggles with Information
Disorder when they actively search for information. However, we must also acknowledge that not all
the content they come across results from active information seeking. With the increasing presence of
recommender algorithms on the platforms they visit, particularly social media platforms such as
YouTube and TikTok [80], the distinction between active information seeking and passive exposure to
content, which children may view as informative, is becoming less clear. To navigate this landscape
effectively, children must be equipped with the essential digital skills to recognise Information Disorder
when exposed to online content “out in the wild” [81]. For this, it is crucial to reflect not only on
what the research community has done to understand the problems that hinder children’s information
discovery and the associated risks, but also on what remains to be done to enhance children’s overall access
to online information, for which our literature review may serve as the needed spark of inspiration.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgments</title>
      <p>Work supported by SNSF Award # [IC00I0-227887 project n.10000973 SOL].</p>
    </sec>
    <sec id="sec-7">
      <title>References</title>
      <p>[11] A. Express, Using tiktok as a search engine | adobe express, 2024. URL: https://www.adobe.com/
express/learn/blog/using-tiktok-as-a-search-engine.
[12] A. Hassoun, I. Beacock, S. Consolvo, B. Goldberg, P. G. Kelley, D. M. Russell, Practicing information
sensibility: How gen z engages with online information, in: 2023 CHI Conference on Human
Factors in Computing Systems, CHI ’23, Association for Computing Machinery, New York, NY,
USA, 2023.
[13] S. Monteith, T. Glenn, J. R. Geddes, P. C. Whybrow, E. Achtyes, M. Bauer, Artificial intelligence
and increasing misinformation, The British Journal of Psychiatry 224 (2024) 33–35.
[14] S. S. Sohail, F. Farhat, Y. Himeur, M. Nadeem, D. Ø. Madsen, Y. Singh, S. Atalla, W. Mansoor, The
future of gpt: A taxonomy of existing chatgpt research, current challenges, and possible future
directions, Current Challenges, and Possible Future Directions (April 8, 2023) (2023).
[15] S. R. Ahmed, E. Sonuç, M. R. Ahmed, A. D. Duru, Analysis survey on deepfake detection and
recognition with convolutional neural networks, in: 2022 International Congress on
Human-Computer Interaction, Optimization and Robotic Applications (HORA), IEEE, 2022, pp. 1–7.
[16] B. Vidgen, H. Taylor, M. Pantazi, Z. Anastasiou, B. Inkster, H. Margetts, Understanding vulnerability
to online misinformation, The Alan Turing Institute. Retrieved September 27 (2021) 2021–02.
[17] M. Pantazi, S. Hale, O. Klein, Social and cognitive aspects of the vulnerability to political
misinformation, Political Psychology 42 (2021) 267–304.
[18] J. De Freitas, B. A. Falls, O. S. Haque, H. J. Bursztajn, Vulnerabilities to misinformation in online
pharmaceutical marketing, Journal of the Royal Society of Medicine 106 (2013) 184–189.
[19] R. Hou, V. Perez-Rosas, S. Loeb, R. Mihalcea, Towards automatic detection of misinformation in
online medical videos, in: 2019 International Conference on Multimodal Interaction, ICMI ’19,
Association for Computing Machinery, New York, NY, USA, 2020, p. 235–243.
[20] J. Jing, F. Li, B. Song, Z. Zhang, K.-K. R. Choo, Disinformation propagation trend analysis
and identification based on social situation analytics and multilevel attention network, IEEE
Transactions on Computational Social Systems 10 (2022) 507–522.
[21] K. Sharma, F. Qian, H. Jiang, N. Ruchansky, M. Zhang, Y. Liu, Combating fake news: A survey on
identification and mitigation techniques, ACM Transactions on Intelligent Systems and Technology
(TIST) 10 (2019) 1–42.
[22] Y. Chen, N. K. Conroy, V. L. Rubin, News in an online world: The need for an “automatic crap
detector”, Proceedings of the Association for Information Science and Technology 52 (2015) 1–4.
[23] G. Orosz, L. Faragó, B. Paskuj, P. Krekó, Strategies to combat misinformation: Enduring effects of
a 15-minute online intervention on critical-thinking adolescents, Computers in Human Behavior
(2024) 108338.
[24] J. Roozenbeek, S. Van Der Linden, T. Nygren, Prebunking interventions based on “inoculation”
theory can reduce susceptibility to misinformation across cultures (2020).
[25] R. A. Blair, J. Gottlieb, B. Nyhan, L. Paler, P. Argote, C. J. Stainfield, Interventions to counter
misinformation: Lessons from the global north and applications to the global south, Current
Opinion in Psychology 55 (2024) 101732.
[26] UNICEF, Convention on the rights of the child | ohchr, 1990. URL: https://www.ohchr.org/en/
instruments-mechanisms/instruments/convention-rights-child.
[27] J. H. Danovitch, Growing up with google: How children’s understanding and use of internet-based
devices relates to cognitive development, Human Behavior and Emerging Technologies 1 (2019)
81–90.
[28] S. Xu, A. Shtulman, A. G. Young, Can children detect fake news?, in: Proceedings of the Annual
Meeting of the Cognitive Science Society, volume 44, 2022.
[29] M. J. Metzger, A. J. Flanagin, A. Markov, R. Grossman, M. Bulger, Believing the unbelievable:
Understanding young people’s information literacy beliefs and practices in the united states,
Journal of Children and Media 9 (2015) 325–348.
[30] L. N. Girouard-Hallam, Y. Tong, F. Wang, J. H. Danovitch, What can the internet do?: Chinese
and american children’s attitudes and beliefs about the internet, Cognitive Development 66 (2023)
101338.
[31] V. Braun, V. Clarke, Using thematic analysis in psychology, Qualitative Research in Psychology 3
(2006) 77–101. URL: https://www.tandfonline.com/doi/abs/10.1191/1478088706qp063oa.
[32] E. Loos, L. Ivan, D. Leu, “save the pacific northwest tree octopus”: a hoax revisited. or: how
vulnerable are school children to fake news?, 2018. doi:10.1108/ILS-04-2018-0031.
[33] A. Pellegrino, A. Stasi, A bibliometric analysis of the impact of media manipulation on
adolescent mental health: Policy recommendations for algorithmic transparency, Online Journal of
Communication and Media Technologies 14 (2024) e202453.
[34] C. Kiili, I. Bråten, H. I. Strømsø, M. S. Hagerman, E. Räikkönen, A. Jyrkiäinen, Adolescents’
credibility justifications when evaluating online texts, Education and Information Technologies 27
(2022) 7421–7450.
[35] C. Papapicco, I. Lamanna, F. D’Errico, Adolescents’ vulnerability to fake news and to racial hoaxes:
A qualitative analysis on italian sample, Multimodal Technologies and Interaction 6 (2022).
[36] E. Theophilou, F. Lomonaco, G. Donabauer, D. Ognibene, R. J. Sánchez-Reina, D. Hernàndez-Leo,
Ai and narrative scripts to educate adolescents about social media algorithms: Insights about
ai overdependence, trust and awareness, in: O. Viberg, I. Jivet, P. Muñoz-Merino, M. Perifanou,
T. Papathoma (Eds.), Responsive and Sustainable Educational Futures, Springer Nature Switzerland,
Cham, 2023, pp. 415–429.
[37] B. S. Jean, N. G. Taylor, C. Kodama, M. Subramaniam, Assessing the health information source
perceptions of tweens using card-sorting exercises, Journal of Information Science 44 (2018)
148–164.
[38] C.-A. W. Axelsson, T. Nygren, J. Roozenbeek, S. van der Linden, Bad news in the civics
classroom: How serious gameplay fosters teenagers’ ability to discern misinformation techniques,
Journal of Research on Technology in Education 0 (2024) 1–27.
[39] D. Soares, J. Reis, Behaviour of the Adolescents and Their Parents in Relation to the
Micro-Influencers in Instagram, 2023, pp. 361–374.
[40] S. Ali, D. DiPaola, I. Lee, V. Sindato, G. Kim, R. Blumofe, C. Breazeal, Children as creators, thinkers
and citizens in an ai-driven future, Computers and Education: Artificial Intelligence 2 (2021)
100040. URL: https://www.sciencedirect.com/science/article/pii/S2666920X21000345.
[41] F. Sharevski, J. Vander Loop, Children, parents, and misinformation on social media, in: 2024 IEEE
Symposium on Security and Privacy (SP), IEEE, 2024, pp. 1536–1553.
[42] A. Kolarić, M. Juric, N. Peša Pavlović, College students’ credibility judgments on healthy diet
information on social media, in: European Conference on Information Literacy, Springer, 2021, pp.
62–74.
[43] T. Nygren, M. Guath, C.-A. W. Axelsson, D. Frau-Meigs, Combatting visual fake news with a
professional fact-checking tool in education in france, romania, spain and sweden, Information 12
(2021). URL: https://www.mdpi.com/2078-2489/12/5/201.
[44] K. L. Powers, J. E. Brodsky, F. C. Blumberg, P. J. Brooks, Creating developmentally-appropriate
measures of media literacy for adolescents, in: Proceedings of the Technology, Mind, and Society,
2018, pp. 1–5.
[45] J. v. Helvoort, M. Thissen, Creating news: An activating approach to make children news literate,
in: European Conference on Information Literacy, Springer, 2021, pp. 29–37.
[46] C. Kiili, E. Räikkönen, I. Bråten, H. I. Strømsø, M. S. Hagerman, Examining the structure of
credibility evaluation when sixth graders read online texts, Journal of Computer Assisted Learning
39 (2023) 954–969.
[47] K. Hartwig, T. Biselli, F. Schneider, C. Reuter, From adolescents’ eyes: Assessing an indicator-based
intervention to combat misinformation on tiktok, in: Proceedings of the 2024 CHI Conference on
Human Factors in Computing Systems, CHI ’24, Association for Computing Machinery, New York,
NY, USA, 2024. URL: https://doi.org/10.1145/3613904.3642264.
[48] N. Abdullah, S. K. R. Basar, How children gauge information trustworthiness in online search:
Credible or convenience searcher?, Pakistan Journal of Information Management &amp; Libraries 21
(2019) 1–19.
[49] M. Landoni, E. Murgia, T. Huibers, M. S. Pera, How does information pollution challenge children’s
right to information access? (2023).
[50] C.-A. W. Axelsson, M. Guath, T. Nygren, Learning how to separate fake from real news: Scalable
digital tutorials promoting students’ civic online reasoning, Future Internet 13 (2021).
[51] T. Nygren, M. Guath, Mixed digital messages: The ability to determine news credibility among
swedish teenagers, International Association for Development of the Information Society (2018).
[52] L. Salmerón, M. Macedo-Rouet, J.-F. Rouet, Multiple viewpoints increase students’ attention to
source features in social question and answer forum messages, Journal of the Association for
Information Science and Technology 67 (2016) 2404–2419.
[53] T. Al-Dala’in, J. H. S. Zhao, Overview of the benefits deep learning can provide against fake
news, cyberbullying and hate speech, in: Proceedings of the Second International Conference on
Innovations in Computing Research (ICR’23), Springer, 2023, pp. 13–27.
[54] E. K. Hämäläinen, C. Kiili, M. Marttunen, E. Räikkönen, R. González-Ibáñez, P. H. Leppänen,
Promoting sixth graders’ credibility evaluation of web pages: An intervention study, Computers
in Human Behavior 110 (2020) 106372.
[55] T. Mladenova, I. Valova, Research on the ability to detect fake news in users of social networks,
in: 2022 International Congress on Human-Computer Interaction, Optimization and Robotic
Applications (HORA), IEEE, 2022, pp. 01–04.
[56] A. M. Duffy, T. Liying, L. Ong, Singapore teens’ perceived ownership of online sources and
credibility, First Monday (2010).
[57] T. G. Meitz, A. Ort, A. Kalch, S. Zipfel, G. Zurstiege, Source does matter: contextual effects
on online media-embedded health campaigns against childhood obesity, Computers in Human
Behavior 60 (2016) 565–574.
[58] R. Anttonen, K. Kiili, E. Räikkönen, C. Kiili, Storifying instructional videos on online credibility
evaluation: Examining engagement and learning, Computers in Human Behavior 161 (2024)
108385. URL: https://www.sciencedirect.com/science/article/pii/S074756322400253X.
[59] A. L. Terra, S. Sá, Strategies to assess web resources credibility: results of a case study in primary
and secondary schools from portugal, in: Worldwide Commonalities and Challenges in Information
Literacy Research and Practice: European Conference on Information Literacy, ECIL 2013 Istanbul,
Turkey, October 22-25, 2013 Revised Selected Papers 1, Springer, 2013, pp. 492–498.
[60] E. K. Hämäläinen, C. Kiili, E. Räikkönen, M. Marttunen, Students’ abilities to evaluate the credibility
of online texts: The role of internet-specific epistemic justifications, Journal of Computer Assisted
Learning 37 (2021) 1409–1422.
[61] C. Rodríguez-Hidalgo, D. Rivera-Rogel, A. M. B. Flandoli, R. C. Tapia, L. A. Vargas, Teens facing
fake news. media literacy needs in the classroom, in: 2023 18th Iberian Conference on Information
Systems and Technologies (CISTI), 2023, pp. 1–7.
[62] F. Aprin, N. Malzahn, F. Lomonaco, G. Donabauer, D. Ognibene, U. Kruschwitz, D. Hernández-Leo,
G. Fulantelli, H. U. Hoppe, The “courage companion”–an ai-supported environment for training
teenagers in handling social media critically and responsibly, in: International Workshop on
Higher Education Learning Methodologies and Technologies Online, Springer, 2022, pp. 395–406.
[63] A. Flanagin, M. Metzger, The perceived credibility of online encyclopedias among children, in:
Proceedings of the AAAI Conference on Web and Social Media, volume 4, 2010, pp. 239–242.
[64] L. Bowler, J. Monahan, W. Jeng, J. S. Oh, D. He, The quality and helpfulness of answers to eating
disorder questions in yahoo! answers: Teens speak out, Proceedings of the Association for
Information Science and Technology 52 (2015) 1–10.
[65] C. Almeida, M. Macedo-Rouet, V. B. de Carvalho, W. Castilhos, M. Ramalho, L. Amorim, L.
Massarani, When does credibility matter? the assessment of information sources in teenagers
navigation regimes, Journal of Librarianship and Information Science 55 (2023) 218–231.
[66] F. van der Sluis, B. van Dijk, A closer look at children’s information retrieval usage, in: 33rd Annual
International ACM SIGIR Conference on Research and Development in Information Retrieval
(SIGIR’10), 2010.
[67] M. Landoni, M. S. Pera, E. Murgia, T. Huibers, Inside out: Exploring the emotional side of search
engines in the classroom, in: Proceedings of the 28th ACM conference on user modeling, adaptation
and personalization, 2020, pp. 136–144.
[68] I. S. Foundation, What are the most used languages on the internet? , 2023. URL:
https://www.isocfoundation.org/2023/05/what-are-the-most-used-languages-on-the-internet/#:
~:text=English%20was%20often%20considered%20the,Internet%20were%20also%20English%
20speakers.
[69] Statista, Most used languages online by share of websites 2024 | statista, 2024. URL:
https://www.statista.com/statistics/262946/most-common-languages-on-the-internet/.
[70] M. Fernández-Pichel, M. Bink, D. E. Losada, D. Elsweiler, A user study on people’s perception to
the credibility of online health information, in: ROMCIR@ECIR, 2024, pp. 2–16.
[71] C. Stöcker, How facebook and google accidentally created a perfect ecosystem for targeted
disinformation, in: Disinformation in Open Online Media: First Multidisciplinary International
Symposium, MISDOOM 2019, Hamburg, Germany, February 27–March 1, 2019, Revised Selected
Papers 1, Springer, 2020, pp. 129–149.
[72] E. Hussein, P. Juneja, T. Mitra, Measuring misinformation in video search platforms: An audit
study on youtube, Proceedings of the ACM on Human-Computer Interaction 4 (2020) 1–27.
[73] E. Mustafaraj, P. T. Metaxas, The fake news spreading plague: Was it preventable?, in: Proceedings
of the 2017 ACM on web science conference, 2017, pp. 235–239.
[74] J. Thorne, A. Vlachos, O. Cocarascu, C. Christodoulopoulos, A. Mittal, The fever2.0 shared task,
in: Proceedings of the 2nd workshop on Fact Extraction and VERification (FEVER), 2019, pp. 1–6.
[75] W. Y. Wang, “Liar, liar pants on fire”: A new benchmark dataset for fake news detection, arXiv
preprint arXiv:1705.00648 (2017).
[76] N. Thomas, C. O’Kane, The ethics of participatory research with children, Children &amp; Society 12
(1998) 336–348. URL: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1099-0860.1998.tb00090.x.
[77] O. S. Iversen, R. C. Smith, C. Dindler, Child as protagonist: Expanding the role of children in
participatory design, in: Proceedings of the 2017 Conference on Interaction Design and Children,
IDC ’17, 2017, p. 27–37.
[78] M. Landoni, T. Huibers, E. Murgia, M. S. Pera, Good for children, good for all?, in: European
Conference on Information Retrieval, Springer, 2024, pp. 302–313.
[79] D. Bilal, The mediated information needs of children on the autism spectrum disorder (asd), in:
Proceedings of the 31st ACM SIGIR Workshop on Accessible Search Systems, Geneva, Switzerland,
ACM Geneva, 2010, pp. 42–49.
[80] A. Anandhan, L. Shuib, M. A. Ismail, G. Mujtaba, Social media recommender systems: Review and
open research issues, IEEE Access 6 (2018) 15608–15628. doi:10.1109/ACCESS.2018.2810062.
[81] J. Pilgrim, Are we preparing students for the web in the wild? an analysis of features of websites
for children, The Journal of Literacy and Technology 20 (2019) 97–124.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>C.</given-names>
            <surname>Wardle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Derakhshan</surname>
          </string-name>
          , Information disorder:
          <article-title>Toward an interdisciplinary framework for research and policymaking</article-title>
          , volume
          <volume>27</volume>
          , Council of Europe Strasbourg,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A. M. A.</given-names>
            <surname>Ahmed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. K.</given-names>
            <surname>Othman</surname>
          </string-name>
          ,
          <article-title>False advertising and consumer online purchase behaviour</article-title>
          ,
          <source>Journal of Emerging Economies and Islamic Research</source>
          <volume>12</volume>
          (
          <year>2024</year>
          )
          <fpage>1521</fpage>
          -
          <lpage>1521</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>T.</given-names>
            <surname>Burki</surname>
          </string-name>
          ,
          <article-title>Vaccine misinformation and social media</article-title>
          ,
          <source>The Lancet Digital Health</source>
          <volume>1</volume>
          (
          <year>2019</year>
          )
          <fpage>e258</fpage>
          -
          <lpage>e259</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>S.</given-names>
            <surname>Loomba</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>de Figueiredo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. J.</given-names>
            <surname>Piatek</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>de Graaf</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. J.</given-names>
            <surname>Larson</surname>
          </string-name>
          ,
          <article-title>Measuring the impact of covid-19 vaccine misinformation on vaccination intent in the uk and usa</article-title>
          ,
          <source>Nature Human Behaviour</source>
          <volume>5</volume>
          (
          <year>2021</year>
          )
          <fpage>337</fpage>
          -
          <lpage>348</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>K.</given-names>
            <surname>Munger</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. J.</given-names>
            <surname>Egan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Nagler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ronen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Tucker</surname>
          </string-name>
          ,
          <article-title>Political knowledge and misinformation in the era of social media: Evidence from the 2015 uk election</article-title>
          ,
          <source>British Journal of Political Science</source>
          <volume>52</volume>
          (
          <year>2022</year>
          )
          <fpage>107</fpage>
          -
          <lpage>127</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>J.</given-names>
            <surname>Green</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Hobbs</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>McCabe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Lazer</surname>
          </string-name>
          ,
          <article-title>Online engagement with 2020 election misinformation and turnout in the 2021 georgia runoff election</article-title>
          ,
          <source>Proceedings of the National Academy of Sciences</source>
          <volume>119</volume>
          (
          <year>2022</year>
          )
          <elocation-id>e2115900119</elocation-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>K.</given-names>
            <surname>Nagi</surname>
          </string-name>
          ,
          <article-title>New social media and impact of fake news on society</article-title>
          ,
          <source>ICSSM Proceedings, July</source>
          (
          <year>2018</year>
          )
          <fpage>77</fpage>
          -
          <lpage>96</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>N.</given-names>
            <surname>Arif</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Al-Jefri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. H.</given-names>
            <surname>Bizzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. B.</given-names>
            <surname>Perano</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Goldman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Haq</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. L.</given-names>
            <surname>Chua</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Mengozzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Neunez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Smith</surname>
          </string-name>
          , et al.,
          <article-title>Fake news or weak science? Visibility and characterization of antivaccine webpages returned by Google in different languages and countries</article-title>
          ,
          <source>Frontiers in Immunology</source>
          <volume>9</volume>
          (
          <year>2018</year>
          )
          <fpage>1215</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>D.</given-names>
            <surname>Metaxa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Torres-Echeverry</surname>
          </string-name>
          ,
          <article-title>Google's role in spreading fake news and misinformation</article-title>
          , Available at SSRN 3062984 (
          <year>2017</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>K.</given-names>
            <surname>Haan</surname>
          </string-name>
          ,
          <article-title>Is social media the new Google? Gen Z turn to Google 25% less than Gen X when searching</article-title>
          ,
          <year>2024</year>
          . URL: https://www.forbes.com/advisor/business/software/social-media-new-google/.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>