Cognitive Security and Resilience: A Social Ecological
Model of Disinformation and other Harms with
Applications to COVID-19 Vaccine Information
Behaviors
Shawn Janzen1,2, Caroline Orr1, Ph.D., and Sara-Jayne Terp2
1 Applied Research Laboratory for Intelligence and Security (ARLIS), University of Maryland, College Park, Maryland, USA
2 College of Information Studies (iSchool), University of Maryland, College Park, Maryland, USA


Abstract
Access to and discovery of credible information is the product of numerous, interacting factors, including individual characteristics and behaviors as well as features of the information environment; social, cultural, and institutional norms; policies and regulations; and more. To date, most research on information disorder has focused either on the individual or on the information environment (or on the technology that allows an individual to access the information environment), but there is a lack of systematic, theory-driven research on the dynamic relationship between the individual and their environment. In this study, we propose a novel application of Bronfenbrenner’s social ecological model to the study of cognitive security and resilience in the context of information disorder. First, we describe the refitting of the model from public health and human development to cognitive security. Using extant literature in the field, we identify the key factors at each level of influence — including individual-level factors such as attitudes/beliefs, knowledge/experience, and demographic characteristics, as well as higher-level factors at the interpersonal, organizational/institutional, community, and policy/culture levels — that shape susceptibility and resilience to information disorder. We also consider the dynamic interactions between individuals, groups, societies, and characteristics of the technological environment, including how algorithms and artificial intelligence interact with individual behaviors, policies, and organizational decision-making to shape access to and discoverability of credible information. Finally, we describe an application of the model to a use case involving COVID-19-related information behaviors. To our knowledge, this is the first time Bronfenbrenner’s social ecological model has been applied in full as a conceptual foundation for the study of cognitive security and resilience. Our findings provide important new insight into the social, cultural, and structural factors that shape information behaviors and access to credible information, as well as the impact of information disorder. The results can be used to identify vulnerabilities and targets for future information-related initiatives and interventions (such as fact-checking and journalism initiatives), to inform evaluations of such initiatives, and to better understand variation in susceptibility and resilience to information disorder. Further, this study lays an important conceptual foundation for future research to expand on this use case and refine the application of the social ecological model to the information domain.

Keywords
cognitive security, social ecological model, misinformation, information disorder, information behavior, COVID-19



ROMCIR 2022: The 2nd Workshop on Reducing Online Misinformation through Credible Information Retrieval, held as
part of ECIR 2022: the 44th European Conference on Information Retrieval, April 10-14, 2022, Stavanger, Norway
sjanzen@umd.edu (S. Janzen); corr@umd.edu (C. Orr); sjterp@umd.edu (S. Terp)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073, http://ceur-ws.org
1. Introduction
The United States of America (US) presidential elections in 2016 and 2020, the Brexit referendum
in 2016, and now the ongoing coronavirus pandemic, have thrust the issue of information
disorder into the global spotlight, leading to greater awareness of the challenge and a surge of
new efforts to address it. First conceptualized by Wardle and Derakhshan, information disorder
describes the creation and/or sharing of false or misleading information, whether deliberately
or unwittingly, with or without the intent to cause harm [1]. The concept encompasses misin-
formation — that is, the unintentional sharing of false information — as well as disinformation,
or information that is deliberately false or misleading [2]. It also includes a third category of
misleading content, malinformation, which describes genuine information that is shared with
the intent to cause harm.
   Although the spread of false and misleading content is not a new phenomenon, the internet,
and particularly social media, have given rise to a fundamental change in how people commu-
nicate and how information is disseminated and accessed, which has contributed to the viral
spread of mis- and disinformation as well as targeted propaganda and influence campaigns.
The “three Vs” of volume, velocity, and variety from big data also contribute to the reach of disinformation, to cognitive overload in processing it, and to the ability of disinformation creators to rapidly test and adjust messaging, channels, and media to maximize impact. The scale of the
problem and its far-reaching effects have created an urgent need to develop effective strategies
to counter information disorder and facilitate better access to and engagement with credible
information, but these efforts have been hampered by fundamental challenges including incon-
sistent terminology, a lack of integration of research from different disciplines, and underuse
of theory. To more effectively address the challenges posed by information disorder, it is first
necessary to develop a better understanding of the problem and identify the most promising
avenues to counter it.
   The causes of and contributors to information disorder are complex and multifaceted, and
the existing literature on information disorder spans numerous fields including communication,
media studies, public health, psychology, computer and information science, and security studies.
Studies have examined the characteristics that make individuals susceptible to information
disorder, as well as the characteristics of the information itself, the networks in which it spreads,
and the platforms and technologies that enable individuals to form networks and engage with
information. However, there is a lack of foundational, theory-based research examining how
these agents, processes, and environments interact with each other and respond to change.
   In this paper, we propose a novel framework for conceptualizing cognitive security and
resilience in the context of information disorder and information-based harms. We take a
sociotechnical systems view of mis- and disinformation, drawing from information security tools
and processes, as well as cognitive security’s twin definitions of adversarial machine learning
that affects machine beliefs, and social engineering at large scale, which we term adversarial
cognition [3, 4]. In this paper, we define cognitive security as the ability to detect, characterize,
and counter misinformation, disinformation, and other information-based harms and forms of
malign influence among people. Resilience, as part of cognitive security, includes the structural
context that protects humans from exposure to disinformation in the first place, as well as the
ability to identify it, limit its spread, and mitigate its effects once exposed. Throughout this
paper, we use the term “information-based harms” to refer to misinformation, disinformation,
conspiracy theories, and a variety of other types of potentially harmful information.
   The framework we are proposing builds on existing work that has applied fundamental
concepts from the field of public health to the study of information and information disorder.
For example, the spread of rumors and other falsehoods on social media is often compared
to the spread of contagious viruses, which is why widely-shared posts are said to have “gone
viral.” However, there are limits to this epidemiology-based model; discussing rumors and
misinformation as malicious viruses to be contained and removed stigmatizes those who participate in them, especially where rumors arise from genuine information behaviors and social interaction.
There are parallels here to the first applications of the Social Ecological Model and other
ecological frameworks in the field of the public health, which came in response to criticism that
traditional approaches to studying health and disease — which largely focused on individual
characteristics and behaviors — promoted a victim-blaming mentality in which blame for poor
health outcomes was placed on the shoulders of individuals, often without consideration of the
structural and environmental causes.
   Our proposed model is an adaptation of Bronfenbrenner’s Social Ecological Model (SEM),
which was initially developed as a framework through which to understand human development,
with a particular emphasis on the dynamic interactions between individuals and their environ-
ments [5]. Since its inception in the 1970s, the SEM has been applied in various formats across
a variety of domains including public health [6], health literacy [7], media and communications
[8], risk management [9], and organizational change [10]. The model recognizes individuals as
embedded within multiple levels of interacting systems, and within each of these systems there
are myriad factors that directly and indirectly influence the individual and are influenced by
the individual. This core assumption — that individuals can influence their environment and
that the environment can influence the individual — is known as reciprocal causation.
   Where and how you retrieve information matters: information seekers not only analyze
retrieved information, they also co-create the information search terms that they use with
different levels of their personal SEMs, leading to term-based information silos [11]. The
credibility of retrieved information is based on factors that include, according to Self as quoted
by Pasi, "(i) the source of information, (ii) the information that is diffused, i.e., the message,
considering both its structure and its content, and (iii) the media used to diffuse information"
[12, 13]. The Admiralty Code [14] is widely used to assess credibility in information retrieval and open source intelligence (OSINT) by separately rating information content and sources, but there is scant research on the effects of the type of source and its relationship to the information seeker, or on the credibility seekers assign to information retrieved through personal communication, online search, networking, and OSINT. Our SEM extends information retrieval with ways to consider the source, and the effects of source assessment on the credibility of retrieved information.
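   As a minimal sketch of how this two-axis rating could be represented in code (the class and the scale wording below are our own paraphrase of the commonly cited A–F source reliability and 1–6 information credibility scales, not an official schema):

from dataclasses import dataclass

# Admiralty Code axes: source reliability (A-F) and information
# credibility (1-6) are rated independently of each other.
SOURCE_RELIABILITY = {
    "A": "completely reliable",
    "B": "usually reliable",
    "C": "fairly reliable",
    "D": "not usually reliable",
    "E": "unreliable",
    "F": "reliability cannot be judged",
}
INFO_CREDIBILITY = {
    1: "confirmed by independent sources",
    2: "probably true",
    3: "possibly true",
    4: "doubtful",
    5: "improbable",
    6: "truth cannot be judged",
}

@dataclass
class AdmiraltyRating:
    """Two-axis rating of a single retrieved item of information."""
    source: str   # key into SOURCE_RELIABILITY
    content: int  # key into INFO_CREDIBILITY

    def label(self) -> str:
        return f"{self.source}{self.content}"

# Example: a usually reliable source sharing unconfirmed but plausible content.
rating = AdmiraltyRating(source="B", content=3)
print(rating.label(), "=", SOURCE_RELIABILITY[rating.source],
      "/", INFO_CREDIBILITY[rating.content])

Keeping the two axes separate makes explicit that a trusted source can carry doubtful content and vice versa, which is the distinction the surrounding discussion draws between source assessment and content assessment.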
   While the levels of the SEM have been conceptualized and labeled in different ways over
the course of the past five decades, this study builds on a version of the model that is widely
used in public health, health promotion, and behavior change research. This framework, an
adaptation of Bronfenbrenner’s model put forth by McLeroy and colleagues, specifies five levels
of influence that interact with each other and with the individual, starting with the individual
level, which encompasses the most proximal layer of influences such as demographic factors,
identity, political ideology, attitudes, beliefs, emotions, knowledge/skills, behaviors, and more
[15]. The second level of the model, the interpersonal level, comprises the external social
influences of family, friends, and other close relationships, as well as related social factors
such as group norms and social support. The organizational level of influence describes the
organizations and institutions in which social relationships occur and in which policies and
regulations originate. In the public health context, this would include local, state, federal, and
global health agencies such as the Centers for Disease Control and Prevention (CDC) and the
World Health Organization (WHO). The next level of influence is the community level, which
focuses on the networks that connect organizations and institutions, the settings in which they
exist, and the culture and norms that emanate from these spaces. Examples include the public
health community, the global aid community, the information security community, and the
education community. The fifth level is the policy/societal level, which includes broad societal
factors that create a climate in which certain practices, behaviors, and phenomena are either
reinforced/encouraged or inhibited/discouraged, as well as factors such as poverty, inequality,
discrimination and bias, and strength of democracy. This level also includes the policies that
create or reduce poverty, inequality, discrimination, and related factors, as well as policies
focused on technology, information, security, and defense. For the purposes of this study, we
chose to describe these layers separately, as we identified several key areas where policy and
society were moving at different speeds, and/or where coalitions involved in policy-making
spanned numerous, heterogeneous societies and thus were not accurately captured in a single
level.
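   To make the layered structure concrete, the levels used in this paper (with policy and society described separately, as noted above) can be sketched as an ordered mapping from level to example factors; the factor lists are illustrative samples drawn from Section 3, not an exhaustive taxonomy:

# Illustrative sketch of the SEM levels used in this paper, innermost first.
SEM_LEVELS = {
    "individual": ["demographics", "psychosocial factors",
                   "digital literacy and numeracy", "information behaviors"],
    "interpersonal": ["family and peers", "relationship quality",
                      "homophily", "social roles and overloading"],
    "organizational": ["monitoring and response resources",
                       "collaboration resources", "cognitive security plans"],
    "community": ["trust in local sources", "existing shared beliefs",
                  "response origins"],
    "policy": ["legislation and regulation", "stakeholder involvement",
               "funding and resources", "research and reporting mandates"],
    "society": ["culture and ideology", "media and free press",
                "social media reliance", "economy", "marginalization"],
}

def levels_outward_from(level: str) -> list[str]:
    """Return the given level and all more distal levels, in order.
    Useful for asking which layers might mediate an intervention."""
    order = list(SEM_LEVELS)
    return order[order.index(level):]

print(levels_outward_from("organizational"))
# ['organizational', 'community', 'policy', 'society']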
   Throughout this paper, we also consider different stages of cognitive security. These include planning and preparation of responses, prediction and prevention, intervention and interdiction, and reaction and recovery. Planning builds connections to responders across the SEM model and creates a “muscle memory” for later responses. Prediction works to identify potential and emerging cognitive security incidents. Intervention responds to incidents and reduces the capability of incident components to do damage (e.g., by rate-limiting botnets). Recovery is damage repair after an incident, including evaluating incident responses and learning from them. One issue with these stages is that responders often find themselves in multiple stages at the same time; another is that there are observe–orient–decide–act (OODA) loop-style feedback loops in these cycles. Figure 1 illustrates this.

Figure 1: Emergency Cycle Crisis Stages
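A minimal sketch of these stages as code (the stage names follow the text; the set-based bookkeeping is our own illustration, not a prescribed implementation) tracks active stages as a set, with recovery feeding back into planning:

from enum import Enum

class Stage(Enum):
    PLAN = "planning and preparation"
    PREDICT = "prediction and prevention"
    INTERVENE = "intervention and interdiction"
    RECOVER = "reaction and recovery"

# Nominal forward cycle; recovery feeds lessons learned back into
# planning, giving the OODA-loop-style feedback noted in the text.
NEXT = {
    Stage.PLAN: Stage.PREDICT,
    Stage.PREDICT: Stage.INTERVENE,
    Stage.INTERVENE: Stage.RECOVER,
    Stage.RECOVER: Stage.PLAN,
}

class Responder:
    """Tracks stages as a set, since responders are often in several
    stages at once (e.g., recovering from one incident while
    predicting the next)."""
    def __init__(self) -> None:
        self.active = {Stage.PLAN}

    def advance(self, stage: Stage, done: bool = False) -> None:
        self.active.add(NEXT[stage])
        if done:
            self.active.discard(stage)

r = Responder()
r.advance(Stage.PLAN)  # begin prediction while still planning
print({s.name for s in r.active})  # {'PLAN', 'PREDICT'} (set order varies)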


2. Related Work
The SEM has been used as a framework for research and program planning in a variety of
fields, particularly within the areas of human development, public health, and intervention
planning. In recent years, increasing attention has been given to ecological models like the
SEM, in large part because federal agencies like the National Institutes of Health (NIH) Office
of Behavioral and Social Science Research (OBSSR) and the CDC have issued calls for more
research incorporating transdisciplinary science and systems science methods in an effort to
better understand the multilevel influences on health and disease [16]. Although there are
many parallels between the study of information disorder and the fields of public health and
behavioral science, there are few applications of the SEM and related models in the area of
information-based harms.
   Lewin was among the first scientists to adopt an ecological approach to understanding
human behavior [17]. Bronfenbrenner, a student of Lewin, is credited with formulating ecological
systems theory, which views individuals as agents who influence, and are influenced by, their
environments [5]. The SEM, which is rooted in ecological systems theory, also incorporates
principles from systems science, including Watzlawick, Weakland, and Fisch’s theory of problem
formation and problem resolution, which provides a framework for understanding why certain
problems persist while others are resolved [18]. Watzlawick’s theory is based on the idea that
there are at least two distinct types of change: first-order change and second-order change.
First-order change occurs when a change is made inside a system, or a particular symptom or
need is addressed, but the system itself (and its processes and structures) remains unchanged.
Second-order change, on the other hand, occurs when a modification is made to the system
itself, such as when a process or structure is added or removed from the system.
   Expanding on this line of work, Glass and McAtee developed a multilevel model of human
behavior to advance the study of behavioral science within the context of public health [19]. At
the time, the public health community was struggling to develop more effective approaches to
reduce behavioral risk factors such as smoking and physical inactivity. Historically, behavioral
science had approached these problems by focusing on individual characteristics and behaviors,
but by the 1990s it was becoming increasingly clear that there was a pressing need to better
understand the social context in which these behaviors are shaped. In response to calls from
leading public health scholars to advance a new research agenda focusing on the social causes
and context of disease [20, 21, 22], Glass and McAtee proposed an integrated approach to
studying health behavior that recognized that nearly all public health problems have multiple
causes and are shaped by multiple factors at different levels of influence, and that behavioral
health is the product of both social-environmental and biological processes and systems [19].
   In the fields of public health and behavioral science, there is strong evidence that interventions
are more likely to be effective if they are based on ecological models like the SEM, rather than
individual-level theories, because of the SEM’s focus on multicausality and multilevel influences
on health and behavior [23]. As such, the SEM has been applied in numerous ways to a variety
of health conditions, behaviors, and public health problems, ranging from intimate partner
violence [24] and firearm injuries [25], to obesity [26] and cancer screening [27], to bullying
[28] and suicide prevention [29]. The SEM has also been used during the COVID-19 pandemic
as a framework to study the determinants of preventive behaviors related to the virus, including
vaccine intentions [30] and mask use [31], to understand vulnerability and resilience among
elderly populations [32], to explain country-level variation in COVID-19 abatement efforts [33],
and to conceptualize the impact of the pandemic on other health issues such as opiate use [34].
   McCormack and colleagues applied the SEM to the study of health literacy and patient engage-
ment, showing how both concepts are influenced by social and contextual factors such as the
delivery of health-related information, the communication skills of public health professionals
and medical providers, the characteristics of public health institutions, and the policies that
affect health-related organizations, providers, and patients [35]. After identifying the factors
at each level that influence health literacy and patient engagement, the authors incorporated
ecological processes such as pooled interdependence — a term that describes the cumulative
impact of intervention effects — to specify intervention strategies that could be used to target
factors at each level of the model.
   The SEM and other ecological models have also been applied to the study of infectious disease
outbreaks [36] and emerging infectious diseases [37] to understand the dynamic interactions
between pathogens, hosts, individuals, and their environments, and how changes to any one of
these can influence the spread of disease, the susceptibility of populations and subgroups, the
severity of disease outcomes, and more. In this context, ecological models have primarily been
used for the purposes of risk and needs assessment, identifying priorities for intervention, and
evaluating the impact of prevention and treatment strategies.
   Additionally, the SEM was used by members of an international coalition funded by the United
States Agency for International Development (USAID) to identify ideal communication strategies
to promote health behavior change in response to the 2014 Ebola epidemic, during which fear,
mistrust, and miscommunication severely hampered outbreak response efforts [38]. Although
the Ebola epidemic differs from the coronavirus pandemic in many key aspects, there are also
many parallels between the two situations — namely, the challenge of effective communication
in the face of an unprecedented crisis, widespread mistrust eroding public health efforts, and a
rapidly evolving, emotionally-charged situation that left the population vulnerable to rumors
and misinformation — that make the Ebola epidemic an important example from which we
can learn key lessons to apply in the present. Initial communication efforts during the Ebola
outbreak were largely focused on psychosocial determinants of behavior change, however as the
authors noted, it soon “became evident that controlling the epidemic required communication
interventions to address levels higher than the individual, namely, community and normative
level factors that could influence the desired behaviors, service-level factors that provided critical
resources for the ill, and policy-level factors to guide a coordinated response within a very
limited timeframe” [38]. As such, members of the USAID-sponsored Health Communication
Capacity Collaborative (HC3) project turned to the SEM to formulate a more comprehensive
strategy that explicitly identified possible causal mechanisms to promote behavior change
through domain-based communication activities focused on community dialogue, social change,
service delivery, and individual and household factors.
   Most recently, during the coronavirus pandemic, the SEM has been used to guide the explo-
ration of COVID-19 vaccine intentions and identify subgroups with negative vaccine intentions,
who may represent ideal targets for intervention [30]. The study used survey data and broke
down the items into the levels of the SEM, then used univariate and multivariate models to
compare participants who intended to get vaccinated against COVID-19 to respondents who
did not intend to get vaccinated or who were ambivalent about getting vaccinated. The results
pointed to several potential factors to target in vaccine promotion campaigns, including gender
(males were significantly more likely to have negative intentions to get vaccinated), race (partic-
ipants who identified as Black were significantly more likely to have negative intentions to get
vaccinated), conservative political ideology, and social norms (participants whose peers did not
engage in or support COVID-19 prevention behaviors were significantly more likely to have
negative intentions). This builds on previous research using the SEM and related ecological
models to investigate vaccine attitudes, including Walker and colleagues’ qualitative study
of confusion, mistrust, and hesitancy among mothers who had accepted the HPV vaccine for
their children but were not confident in their decision [39]. This is an important subgroup for
several reasons. First, many vaccinations require multiple doses to be effective, so ongoing
hesitancy after initial acceptance can be a barrier to completing a full vaccination series. Sec-
ondly, although individuals may choose to accept one vaccine and reject others (or vice versa),
there is a risk that hesitancy about one vaccine could develop into more generalized vaccine
hesitancy. For these reasons, individuals who have accepted a vaccine but remain hesitant are
still a key group to consider in vaccine communication and promotion activities. In the study
of HPV vaccine-accepting mothers, interview data revealed that media and social media were
key sources of mistrust and confusion, and that although most mothers indicated a high degree
of trust in their children’s health care providers, the information they got from providers was
often undermined by information they got from other sources, such as friends, family, and the
media. The authors suggested that, in light of parents’ increased access to and engagement with
credible and noncredible sources of information online, and the subsequent expectation to be
more involved in health decision-making, traditional models of the patient-provider relationship
and communication may need to be revised [39].
   Much like the field of public health at the turn of the century, we now find ourselves facing a
complex challenge that threatens the health of both individuals and societies, but which has
been resistant to most efforts to promote change. To date, the vast majority of research on
information-related harms has focused either on individual characteristics and behaviors — such
as why certain people are more susceptible to mis- and disinformation — or on the platforms and
technologies that facilitate the spread of the problem. Other research has approached this prob-
lem by exploring characteristics of information itself, such as why certain misleading content is
more likely to go viral. However, there is a lack of theory-based research that integrates these
different approaches and explicitly considers the interactions between individuals, information,
and the technologies and environments that enable individuals to encounter and engage with
information. We hope to help fill that gap with our proposal for a novel application of the SEM
to the study of cognitive security.


3. Social Ecological Model of Cognitive Security
Figure 2 offers a graphical representation of the SEM for cognitive security with factors discussed
in the sections below.
Figure 2: Social Ecological Model for Cognitive Security and Resilience
3.1. Individual
Individual-level determinants of cognitive security and resilience encompass a wide variety
of characteristics including demographic and psychosocial factors, ideology, knowledge and
technical skills, digital literacy and digital numeracy, as well as information needs, information
evaluation, and information behaviors. These factors interact with each other and with the
information environment, creating a dynamic situation in which characteristics of the individual
influence their information needs, evaluation, and behavior, which in turn influences the types
of information they seek, attend to, share, and recall.
   Demographics: Cognitive security and resilience are influenced by a variety of demographic
factors, including age, race, gender, socioeconomic status, language and vocabulary. For ex-
ample, older individuals (over the age of 65) have been found to be more vulnerable to false
information disseminated via social media and messaging apps [40, 41]. Among youth, social
and cognitive development are important determinants of cognitive security and resilience
due to their influence on information evaluation and uptake. For example, younger children
struggle with certain website design features such as content lists and maps, but benefit more
than older children from learning cues such as pop-ups explaining the main point of a webpage
[42]. Although children generally struggle more than adults with tasks that require analytical
thinking and complex judgments, youth tend to be more comfortable using new technologies
and are often more motivated to engage with emerging technologies [42]. Other research has
found enduring racial, socioeconomic, and age-based divides in access to and use of commu-
nication technologies [43, 44, 41]. Certain dynamics of religiosity, including fundamentalism
and dogmatism, have been found to be associated with reduced analytical thinking and, sub-
sequently, greater susceptibility to conspiracy theories and other falsehoods [45]. Political
ideology has also been shown to influence susceptibility to misinformation, such that individ-
uals who identify as conservative appear to be more susceptible to political misinformation
than left-leaning or ideologically neutral individuals [46]. Language and vocabulary are also
important determinants of cognitive security and vulnerability. These factors can interact with
the technological and information environments, leading to challenges such as the “vocabulary
mismatch problem,” which describes a phenomenon in which different people and/or systems
use different labels to describe the same concept. Put differently, vocabulary mismatch occurs
when “the way users express concepts differs from the way they appear in the corpus” [47] or
when the terms between queries and documents are lexically different but semantically similar
[48].
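   As a toy illustration of this mismatch (the synonym table is a hand-built stand-in for the query-expansion or embedding techniques used in practice), a purely lexical match fails when a query and a document use different labels for the same concept:

# Toy illustration: the query and document describe the same concept
# ("jab" vs. "vaccination") with no shared terms, so lexical matching
# fails; a simple synonym expansion recovers the match.
SYNONYMS = {
    "jab": {"vaccine", "vaccination", "shot"},
    "shot": {"vaccine", "vaccination", "jab"},
}

def tokens(text: str) -> set[str]:
    return set(text.lower().split())

def lexical_match(query: str, doc: str) -> bool:
    return bool(tokens(query) & tokens(doc))

def expanded_match(query: str, doc: str) -> bool:
    expanded = set()
    for t in tokens(query):
        expanded |= {t} | SYNONYMS.get(t, set())
    return bool(expanded & tokens(doc))

query = "jab side effects"
doc = "adverse events reported after vaccination"
print(lexical_match(query, doc))   # False: no shared terms
print(expanded_match(query, doc))  # True: "jab" expands to "vaccination"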
   Psychosocial factors: Cognitive security and resilience are also influenced by a variety of
psychosocial determinants, including attitudes and beliefs about technology and about the topic
at hand, trust in information and sources of information, as well as the technology used to
access it, subjective norms surrounding source credibility and information-sharing, cognitive
biases, risk perceptions, stress, trauma, emotional state and emotional reactivity, and more
[49, 50].
   Digital literacy and numeracy: Technical knowledge and skills, as well as subject-specific
knowledge and familiarity with the topic at hand, are also important determinants of cognitive
security and resilience. In particular, digital literacy, which the Department of Education defines
as “the skills associated with using technology to enable users to find, evaluate, organize, create,
and communicate information,” [51] and digital numeracy, which involves the ability to process
basic numeric concepts and is closely tied to decision-making ability, have been shown to
be associated with susceptibility to misinformation, such that individuals with low digital
numeracy tend to be more likely to believe misinformation they encounter online [52]. In one
recent study on coronavirus-related misinformation, digital numeracy was found to be the
strongest predictor of susceptibility to misinformation [49].
   Information-related factors: Other individual-level determinants of cognitive security and
resilience include information needs and how they are expressed (e.g., how people interact with
search engines), evaluation of information and sources (e.g., relevance and credibility judgments),
perceived usefulness (of the information), time spent with the information, and a variety of
information behaviors including search behaviors and information seeking, information sharing,
and engaging with information.

3.2. Interpersonal
At the relationships layer, influence on an individual’s cognitive security depends upon the
viewpoints of family, peers, and other social connections. The social environment enables indi-
viduals to affect how others receive, perceive, and understand information, as well as to be affected by others. Likewise, social norms link individuals through interpersonal relationships within the social environment, contributing toward cognitive security development within the
group. Similar protective information sharing occurs through prosocial gossip as a way to
defend members of the social group [53] and between the social strata of leadership [54, 55].
    Families and peers: Families and peers are the first connections in the social environment.
This group serves as a natural support system, a signal for belonging, and a potential source of and filter for credible information. Family and peers serve as fact-checkers and validators to increase individual
cognitive security [56]. Yet, individual attributes of family members and peers, as well as group
demographics, mediate credibility [41]. Additionally, the influence of family and peers as sources of credibility information and agents of attitude change is further mediated by individual cognitive ability [57].
    Relationship quality: In addition to the presence of this social connection, the quality of that
interpersonal relationship matters. Where an individual may initially recognize and dismiss
instances of disinformation, they are more likely to become involved when family and peers
consume or are affected by that disinformation [58]. Attempts to correct incorrect information,
an outward cognitive security exercise, could be interpreted as quarrelsome behavior. Therefore, correction is more likely to occur among family and peers, where the relationship can mitigate perceptions of aggressive communication [58].
    Homophily: Homophily is another factor that shapes social connections, which in the context
of cognitive security considers diversity of information behaviors and receptiveness to new
information. Degrees of homophily allow individual cognitive security to reflect and transfer
within the group. The greater the homophily within a group, the more close-knit it is and the greater the chance that it becomes an echo chamber. Differentiating individual opinions to correctly recognize disinformation in groups with echo chambers can be difficult if some individuals
lack the “necessary instruments and cognitive abilities to assess the level of credibility of pieces
and sources of information with which they come into contact” [59]. When disinformation is
accepted within groups with high homophily, it diffuses quickly through the group and bridges
to similar groups [60]. On the flipside, groups with low homophily may prevent disinformation
from having wide acceptance within the group [60]. Groups with low homophily then demon-
strate a greater chance for individuals within the group to intercept the disinformation and
benefit from differing levels of individual cognitive security within the group. Malicious actors
that produce disinformation recognize the role of homophily and confirmation bias in social
connections and leverage those relationships to create more sophisticated types of antagonistic
information operations [61].
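   The diffusion dynamic described above can be illustrated with a toy simulation (all parameters are arbitrary): treating trust in in-group members as a proxy for homophily, acceptance by a few members spreads much further in a high-trust group than in a low-trust one:

import random

def cascade(n=30, trust=0.9, seeds=3, rounds=10, seed=0):
    """Toy cascade in a fully connected group: each round, a
    non-accepting member adopts the claim with probability
    trust * (fraction of members who have already accepted).
    'trust' stands in for homophily: how readily members take on
    in-group beliefs. Returns the final number of acceptors."""
    rng = random.Random(seed)
    accepted = set(range(seeds))
    for _ in range(rounds):
        for i in range(n):
            if i not in accepted:
                frac = len(accepted) / (n - 1)
                if rng.random() < trust * frac:
                    accepted.add(i)
    return len(accepted)

print("high homophily:", cascade(trust=0.9))  # spreads to most of the group
print("low homophily: ", cascade(trust=0.2))  # spread is slower and partial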
   Social roles and overloading: Other relationship factors include social roles and overloading
behaviors. Social roles can emerge within interpersonal relationships and affect the development
and transmission of cognitive security capabilities. Social exchanges, which include sharing and correcting disinformation, may require the use of social capital within the group. Individuals may weigh their role and value in a relationship or group as a factor in whether to correct others in the group, based on the potential social costs [60]. In addition to social roles, the overall volume of information flowing between social connections can affect the ability to recognize disinformation. Large amounts of information, correct and incorrect, cycle through various communication platforms and pass through groups. The amount and type of information presented within a social circle, particularly within a content delivery system, can overload individuals [62]. Thus, while an overloaded individual could benefit from the cognitive security of others in the group for the information that members share, they would also rely upon those members to identify information injected into the group by content delivery systems.

3.3. Organizations
An organization is a group of people with a common purpose: this includes government de-
partments, businesses (e.g., companies, social media platforms), nonprofits (e.g., United Nations agencies), topic-specific facilities (e.g., hospitals, health facilities), and informal groups (e.g., cognitive security monitoring and response groups such as fact-checking and election monitoring groups) [63]. It also includes the groups and businesses that support the creation, dissemination, and use of mis- and disinformation.
   Organizations’ role in cognitive security: Organizations might maintain their own cognitive
security, or be part of the cognitive security of the vertical (e.g., elections, health, transport)
system. Organizations also affect the cognitive security of individuals, communities, and other
stakeholders, including the end-users that they serve.
   Organizational influence: Organizations influence cognitive security through narrative- and
activity-based mitigations and counters to misinformation and other cyber harms. Narrative-
based counters include prebunking, debunking, and making clear information available in the
spaces where individuals seek, post, and share both information and misinformation. Activity-
based mitigation and counters include reducing the visibility of online misinformation content,
sites, and creators, and training influencers in areas where misinformation has spread offline, in
local languages, or to communities that are hard to reach with broader online campaigns.
   Boundary issues: Organizations implementing cognitive security for themselves have a
boundary problem. Unlike other areas of security, the organization needs to monitor and act on
not just its own systems, but also on systems, e.g., social media platforms, controlled by other
bodies. This forces organizations to cooperate on cognitive security mitigation and counters.
Organizations control the information that they produce, their own responses to external content,
and cooperation with other bodies. Larger organizations have communications and marketing departments that scan media, such as social media, traditional media, and trade publications, for mentions of the organization and subjects that affect it, and that respond to and produce information and media about the organization. Few organizations are scanning for misinformation that
affects them; fewer still (outside media and social media organizations) scan for misinformation
about their vertical, or affecting their stakeholder populations.
   Factors in organizational cognitive security: Factors that affect organizations’ own cognitive
security include the organization’s access to information monitoring and response resources
(which is often related to organization size), access to collaboration resources (e.g., with other
organizations, and communities in its area), and visibility of subcommunities within or overlap-
ping the organization. For example, health organizations often contain medical staff who are
part of their own information communities. Organization structure can create issues for which
the solutions are not always clear, such as acknowledging who is responsible for cognitive
security. The dynamics of an organization’s social power also affect its cognitive security: external perceptions (e.g., stigma), the accessibility of the group to others, and the group’s access to the external mitigation and response resources that it needs. As with individuals, organizational characteristics can make cognitive security for the organization and its people easier or harder to obtain.
   The importance of plans: An important first step in implementing cognitive security with organizations is to create a cognitive security plan detailing the potential and allowed responses to a disinformation incident, the steps in those responses, contacts for internal and external collaboration (e.g., image production, social media contacts), and mitigation steps that could be taken to reduce the potential spread and effect of future events (e.g., creating and amplifying narratives in information voids). A minimal sketch of such a plan appears below.
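The sketch renders such a plan as a small structured object; every field name and value is a hypothetical placeholder rather than a prescribed schema:

# Hypothetical sketch of a minimal organizational cognitive security plan.
# All names, contacts, and steps are illustrative placeholders.
PLAN = {
    "organization": "Example Health Nonprofit",
    "allowed_responses": [
        "publish debunking article",
        "notify platform trust and safety contact",
        "amplify prebunking narratives in known information voids",
    ],
    "contacts": {
        "internal": {"comms": "comms-lead@example.org",
                     "legal": "counsel@example.org"},
        "external": {"platform": "trust-safety@platform.example",
                     "fact_checkers": "desk@factcheck.example"},
    },
    "mitigations": [
        "pre-produce clear health information in local languages",
        "monitor vertical-specific narratives weekly",
    ],
}

def incident_checklist(plan: dict) -> list[str]:
    """Flatten the plan into an ordered checklist for responders."""
    steps = [f"respond: {r}" for r in plan["allowed_responses"]]
    steps += [f"contact {role}: {addr}"
              for group in plan["contacts"].values()
              for role, addr in group.items()]
    return steps

for step in incident_checklist(PLAN):
    print(step)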

3.4. Communities
A community is a group of people who may or may not be spatially connected and could be local,
national or international, but who share common interests, concerns, or identities. Factors that
affect community cognitive security include the community structure (e.g. community cohesion),
trust levels, existing shared beliefs, and the challenges specific to that community, including
communications challenges. Other factors include literacy, language, existing communications
channels and communication skills (e.g., some communities don’t understand maps, and there
are similar issues with other new information types), access to information and bandwidth, and
the monetary cost of access (for searching, posting, and sharing information).
   Trust: Internews’ work on community-based misinformation response covers several commu-
nity factors [64]. Trust is key in community cognitive security incidents and defenses. Trusted
community information sources include individual influencers (e.g., community leaders, and
community information sources like librarians and local officials), and influential organizations
(e.g., religious bodies) and meeting points (e.g., barbershops). Less-trusted information sources
are generally less local, and include government organizations and websites, science, and main-
stream media. Community cognitive security plans should take these different levels of trust,
and their management, into account.
   Existing shared beliefs: Disinformation often takes advantage of existing in-group narra-
tives and schisms between groups. Examples include the use of Buddhist/Muslim tensions, Dinka/Nuer tensions in South Sudan, and community-level distrust of government fueling polarization during COVID-19. Contributing factors include political conflict, social upheaval, economic stress, and other sociological or psychological factors [65].
   Response origins: Where cognitive security plans originate is also important. Heeks categorized development work as pro-poor, para-poor, or per-poor, and this categorization also applies to other types of community [66]. Pro-community work occurs outside commu-
nities, but on their behalf; para-community work is done working alongside communities;
and per-community work occurs within and by communities. Local context is important to
cognitive security: pro-community centralized responses miss that context, e.g., that the US contains multiple Hispanic communities with different cognitive security needs, while grassroots per-community cognitive security originates from the community’s needs. This can create a
disconnect between what the community thinks is appropriate intervention, and what funders
and information controlling organizations think it should be, and often gives rise to discussions
about who represents a community, and whose voices in it should be heard.
   Online communities: Online and physical communities differ in several ways [65]. Lieberman
and Schroeder identified four main differences as fewer nonverbal cues, greater anonymity,
more opportunity to form new social ties and bolster weak ties, and wider dissemination of
information [67]. Online disinformation takes advantage of the three “Vs” of big data: greater
volume, at greater velocity, and over a wider variety of channels, languages, formats, and
community structures. Anonymity allows users to spoof (pretend) membership of offline
communities, increasing trust in their roles as community members or influencers. Wider
dissemination of information gives access to many more communities than those bound by
geography or spoken language, with automation and electronic content production making the
management of multiple “sock puppet” accounts and groups feasible [68]. More opportunity to
create and affect social graphs (the ability to form new ties and bolster weak ones) has been
useful to the creators of misinformation-led online groups [69]. Despite this, online communities
do still have geographical factors, including algorithmic containment of what they view because
of their location, language, search terms, and influencers.
   Finally, not all community falsehoods are bad: community information and coherence are also sometimes based on false information in the form of myths, convenient untruths such as backstories, and other misinformation. These can be used as signals of belonging to a community without being believed. Santa Claus dropping down chimneys is clearly a myth, but which false information is normal to a community, and thus not to be countered, might not be easy to determine from outside.

3.5. Policy
Public policy plays a substantial role in the governance of the flow, use, and storage of information, as well as in guiding actors within the information and social environments. Such policies also
influence and are influenced by individual, group, and organizational culture and dynamics,
creating a symbiotic system that can reinforce systems of cognitive security. Moreover, policy
is one vehicle that guides how individuals navigate and interact with the institutions and other
actors in the cognitive security landscape. Policy affects several key factors which can trickle
down toward the development of individual cognitive security. Such factors include areas such
as legislation and executive policy, coordinated planning documents, stakeholder involvement,
policy hesitancy or resistance, funding and resources, and research and reporting endeavors.
   Whether individually or as a collaborative group, countries can take a proactive approach to
stem the flow of disinformation to their populations. For example, the US and the European
Union (EU) actively work with experts to identify and counter sources through legislation [70].
Regulation is one type of legislation and executive policy that has a forcing function on the
transmission of disinformation. Regulation in the US requires advertisers to track, sometimes publicly, who purchased ads and for how much, in an effort to improve election transparency [71, 70].
At a subnational level, the US states of Washington and California introduced policy to improve
media literacy in schools [72, 73]. Separate similar policies from different groups, such as
intergovernmental or across sectors, can form overlapping mosaics that enhance or hinder
cognitive security against botnets and offset disinformation. These policy mosaics are program
laboratories that produce vital knowledge developments. Yet, policy mosaics can also result
in the creation of a fractured field of consent management platforms for data protection,
where platforms and use of data vary in response to each privacy policy directive [74, 75].
Governments can also use policy planning documents, such as strategic security plans, to
help set agendas, as well as establish leadership and security topic importance. The Biden
Administration’s recent updates to the United States National AI Initiative through AI.gov
present strategic pillars like education and training to increase and improve the workforce
pipeline and emphasize the importance of incorporating socio-technical perspectives [76].
   The stakeholders involved in policy development and implementation have direct and down-
stream impacts on the type, quality, and quantity of policies that affect cognitive security and
disinformation through their formal and informal policy institutions and networks. The number
and type of stakeholders across organizations to be involved in public and other macro-level
policies vary by purpose, while Maynard and colleagues identified nine groups of stakehold-
ers that should be involved information security processes within an organization: executive
management, ICT specialists, security specialists, legal and regulatory, business unit representa-
tives, the user community, human resources, public relations, and external representatives [77].
Moreover, stakeholders should remain involved throughout information security life cycles
[78, 79]. Whether within or across stakeholder organizations, each participant and group brings its own context regarding what matters. Kshetri found the contexts of formal and informal
institutions and their institutional changes to be informative toward the networks formed and
relationship dynamics, which in turn impacted cloud technology and its security development
[80]. As cognitive security would encompass use of cloud systems, the contextual drivers and
premise may also be applicable more broadly to cognitive security and resilience.
   The private sector may hesitate or resist policy if there is perceived overreach, censorship, or a
lack of limits and boundaries. Technology remains politically contentious. Examples of national
security policy resistance include allowances of data collection and use, disclosure mandates,
costs to small businesses, unclear liabilities, information sharing that becomes adversarial
roadmaps, and failure to use pre-existing legislation [81]. Reevaluating policy through other
lenses and being interdisciplinary are two avenues to help mitigate policy resistance and
align matters of importance, particularly for wicked problems like disinformation. Cognitive
science-based approaches serve as a lens to better understand the ethics and policy implications
of issues such as AI and botnet mitigation [82]. Funding, timing, and resource allocation
are central elements of policy which are also determined by involved stakeholders, and may
influence policy resistance. Funding, timing, and resources also serve as signals which affect
communities, organizations, and individuals working on cognitive security and resilience topics.
Resource allocation also reflects decision-making priorities and shapes the environment for
future cognitive security efforts. In their work at the National Institute of Standards and
Technology (NIST) advising government executives, Bowen, Chew, and Hash identified capital
planning as an essential element alongside awareness and training as part of information
security program development [83].
   Policy mandates and support for research and reporting endeavors are additional types of barriers to or accelerants of innovation and accountability. The US federal government’s
2020 National Defense Authorization Act (NDAA) homed in on the relationships between
social media platforms and information operations. In response to the NDAA and other rising
disinformation concerns, new organizations, like the Cognitive Security Intelligence Center
and National Commission for Countering Influence Operations (NCCIO), emerged together
to harness academic, civil society, and industry talent to defend against online disinformation [84]. Inadequately funded organizations, such as university labs and private companies, may not be able to participate equally with other peer stakeholders. On the flipside, research
organizations with their reputations on the line may engage less if participation brings unwanted
adversarial attention, such as from botnets [82].
   Yet, for all that policy can do for cognitive security, more policy-related questions arise that
may continue to sway security discussions. This is due to the inherently complex and emergent
nature of the socio-technical challenges that these policies are trying to address. As policies
attempt to eliminate disinformation and reduce its impact, those policy solutions will continue
to raise perceptual, economic, sectoral, ideological, ethical, and legal questions [85]. Many of
these are carried out at the society level.

3.6. Society
A variety of societal characteristics have been shown to influence cognitive security and re-
silience. This level encompasses some of the most impactful but distal determinants that are also
among the most enduring and resistant to change, such as cultural values and traditions, media
and social media influences, economic factors like poverty and inequality, and discrimination
and marginalization.
   Culture and ideology: A recent cross-national comparison found that societal polarization decreases resilience to online disinformation, likely
due to increasingly disparate representations of reality that make it more difficult to distin-
guish between false and correct information [86]. Societies characterized by a higher degree
of populism may also be more susceptible to disinformation due to the underlying worldview,
which includes sentiments such as anti-elitism and mistrust of expert knowledge. The same
factors that make populist societies vulnerable to information disorder also make them more
susceptible to anti-vaccine appeals.
   Media and free press: Weak public broadcasting services and public service media are associated
with greater susceptibility to online disinformation [86]. Similarly, societies with stronger media
infrastructures and an independent and free press tend to be more resilient to disinformation,
while those with higher levels of censorship tend to be more vulnerable [87]. However, societies
in which news consumers are distributed across a diverse and fragmented media ecosystem may
have increased susceptibility because of the greater number of entry points for both foreign and
domestic disinformation [86]. Low levels of distrust in institutional sources of knowledge (such
as science and medicine) and higher levels of funding for public service media are associated with
greater resilience to disinformation, as is a lower degree of media polarization and fragmentation
[87].
   Social media: Societies with greater numbers of social media users, higher rates of social
media use, and greater reliance on social media as a news source tend to be characterized by
poorer knowledge of public affairs, reduced political learning, and increased susceptibility to
online disinformation [86].
   Economy: Economic factors such as poverty rates, unemployment, and resource allocation can also
make populations more vulnerable to information disorder [88]. Similarly, societies with larger
advertising markets and more potential consumers tend to be more susceptible to disinformation
than those with smaller advertising markets [86]. This is attributed in part to the significant
amount of false content that is produced for the purpose of generating advertising revenue.
   Discrimination and marginalization: Factors such as racism, discrimination, and oppression
are also important determinants of cognitive security. Societies characterized by a greater
degree of marginalization of minorities may be more susceptible to disinformation, in part
because of associated perceptions that the political, social, and economic systems are “rigged”
[89, 90].


4. Use Case
Having adapted the SEM to a cognitive security context, we now apply it to a use case: US-based COVID-19 vaccine hesitancy and refusal driven by disinformation, and the response to that disinformation.

4.1. Individual
At the individual level, vaccine-related mis- and disinformation can lead to vaccine hesitancy
through several different pathways, many of which also represent opportunities for possible
intervention and mitigation. Key mediating factors include knowledge and understanding,
personal experiences, cognitive and emotional appraisals, and risk perceptions, as well as
demographic and personal characteristics such as race, religion, and political ideology.
   There is a well-documented pattern of racial disparities in vaccine-related beliefs and behav-
iors, and this has continued during the coronavirus pandemic [91]. In the US, members of racial
minority communities have been specifically targeted by vaccine-related mis- and disinforma-
tion, which is believed to have contributed to increased rates of vaccine hesitancy within these
communities. Given the legacy of racism in science and medicine, race may also impact factors
such as trust in scientific and medical institutions, which could influence vaccine hesitancy
among members of racial minority communities by driving them to seek out information from alternative sources that they trust but that may not always be credible. Further discussion of the
impact of race and racism on vaccine hesitancy, and their relationship with vaccine-related
misinformation, is available in the societal section.
   Previous experiences with vaccination and/or knowledge of someone who suffered from a
vaccine-preventable disease or an adverse event associated with vaccination are associated with
vaccine hesitancy [92], as is being a parent [93]. During the COVID-19 pandemic, particularly,
research has found that Americans who identify as evangelical Christians are more likely to
be vaccine hesitant and less likely to respond to persuasive messaging aimed at shifting their
attitudes about vaccination [94]. Political affiliation has also been shown to influence risk
perceptions during the COVID-19 pandemic, with Republicans far less likely to view the virus
as a major threat to public health [95]. Given that low levels of perceived risk and susceptibility
are associated with a lower likelihood of vaccine uptake, this may be another avenue through
which exposure to misinformation influences vaccine hesitancy [96].
   Previous research shows that certain psychological factors, such as conspiratorial thinking
and endorsement of conspiracy theories, disgust sensitivity, and higher levels of reactance
and non-conformity are associated with negative vaccine attitudes and lower intentions to get
vaccinated [97, 98]. These patterns have also been documented during the COVID-19 pandemic,
with studies showing that conspiratorial beliefs are associated with negative attitudes about the
COVID-19 vaccine and reduced intentions to get vaccinated [99].
   Concerns about the safety of vaccines and the potential risks are two of the main drivers
of both general vaccine hesitancy and, specifically, COVID-19 vaccine hesitancy [100, 101].
Exposure to vaccine-related misinformation may alter risk perceptions, resulting in negative
attitudes toward vaccination and, in some cases, vaccine hesitancy [98, 102]. This may be
particularly true for personal narratives about adverse events associated with vaccination, which
are a very common and influential form of vaccine misinformation [103]. Cognitive biases may
also contribute to vaccine hesitancy after exposure to vaccine-related mis- and disinformation
[100]. For example, since mis- and disinformation tend to be more emotionally salient than
accurate information, these messages may be easier to recall and lead to misperceptions about
the frequency of rare events, ultimately resulting in vaccine hesitancy. Additionally, emotionally
driven messaging and vaccine-related misinformation that manipulate emotions such as fear
and anger have been identified as key contributors to vaccine hesitancy [104]. Exposure to
COVID-19-related misinformation has been found to be associated with increased fear and
stress, which in turn may impair information processing and lead to poorer health-related
decision-making [105]. Studies show that psychological distress is associated with vaccine
hesitancy, and there is evidence that this relationship is mediated by mistrust and belief in
conspiracy theories [106]. Additionally, research indicates that people may be more likely to
share low-credibility sources of COVID-19 vaccine information as they cope with anxiety, anger,
and fear, suggesting that there may be a feedback loop involving exposure to emotionally-driven
mis- and disinformation, emotional appraisal and response, and information behaviors, which
may in turn result in increasing levels of vaccine hesitancy [107].
   People who rely on Facebook as their primary source of news about COVID-19 are more
likely to be vaccine hesitant than those who get their news from other sources [108]. This may
be at least partially attributable to the types of information individuals are likely to encounter
on Facebook, or it may be a reflection of underlying characteristics that make certain individuals
more likely to seek out news content on Facebook and more likely to be skeptical of vaccination.
Either way, the association between using Facebook as a primary news source and vaccine
hesitancy points to a potential avenue for targeting future vaccine communication efforts.
   The majority of interventions aimed at countering or mitigating the effects of vaccine-related
mis- and disinformation target individual-level factors, such as susceptibility, vaccine-related
beliefs, perceptions of personal risk, or digital literacy. Research suggests that refutation
messages that address the affective and cognitive evaluations of vaccine-related misinformation
may help reduce vaccine hesitancy among people exposed to misinformation. Promoting
vaccine uptake by addressing the motivational roots of vaccine hesitancy, such as concerns
about vaccine safety and effectiveness, may be a promising strategy for countering certain
vaccine-related information harms, while fact-based rebuttals focused on knowledge deficits do
not appear to have much of an impact [109, 110]. During the COVID-19 pandemic, there has
also been a focus on recruiting social media influencers to promote vaccine uptake, but research
suggests that exposure to authoritative information about the vaccine is a stronger incentive
to get vaccinated than endorsement from influencers [109]. As is the case in public health,
individual-level approaches to countering information harms are limited in their potential
impact, and are likely to be more effective when paired with approaches targeting higher-level
factors.

4.2. Interpersonal
When it comes to individuals choosing whether to become vaccinated against COVID-19 or
not, family, peers, and other social connections play an important role. These interpersonal
interactions are key factors in shaping individual perceptions of what to believe or reject about
the vaccine. Adolescent doses were approved later than adult doses, and the interim period was rife with a spectrum of mis- and disinformation about whether children should receive the vaccine and in what dose [111]. Rogers and colleagues found that family norms, particularly parental norms, largely influenced adolescent vaccine intent; peer norms had a lesser but still significant impact [112]. Stepping back to consider all adults, surveys around May 2021 consistently found that peer effects, ranging from advice to pressure, had far more influence on vaccination decisions than political preference, despite divergent political views of COVID-19 largely dividing the nation [113].
   As expected, loose social environments, such as Twitter conversations about vaccine fraud involving thousands of people, showed strong associations between low vaccination rates and negative attitudes toward the vaccine; yet that effect disappeared in similar discussions among family and close friends [114]. Close social connections acting as fact checkers serve as a final line of defense against misinformation, outperforming validation from experts like Dr. Anthony Fauci [114]. We speculate that the CDC’s recognition of the influence of interpersonal relationships motivated its creation of entire guides on how to discuss COVID-19 with friends and family [115].
   Other interpersonal factors also shape vaccine uptake and the ability to overcome disinformation. Homophily, particularly on measures of race and ethnicity, was a primary driver of becoming vaccinated and of positive views about the vaccine, which follows expectations from social contagion theory [116]. Similar homophily results are found when comparing COVID-19 vaccine uptake to other vaccines and antibiotic use [117], and in the effects of socio-demographics on various prophylactic measures [118]. Buttenheim’s Congressional expert testimony on strategies to reduce vaccine hesitancy recommended overcoming misinformation by leveraging the social capital of well-known individuals within a community, such as stylists and barbers [119]. Lastly, the length and severity of the COVID-19 pandemic gave rise to large swaths of information, increasing the potential for individuals to experience information overload, in turn leading to issues such as mental fatigue and reliance on cognitive heuristics to judge credibility; groups offset this through collective coping practices that extend individuals’ ability to communicate about COVID-19 [120].

4.3. Organizations
Organizations involved in US COVID-19 vaccine promotion efforts include the CDC, which
together with the WHO is active in trying to counter what has been termed an “infodemic”
CDC’s advice to communities includes using social listening on social media and traditional media channels and logging and analyzing misinformation in those channels; listening to the community to identify content gaps, perceptions, information voids, and misinformation; sharing clear, accurate information; and using trusted messengers (influencers) to boost credibility. Outputs from the CDC include regular State of Vaccine Confidence Insights reports
[123].
   Other organizations involved in US COVID-19 vaccine promotion efforts include state and
local health departments. For example, the Massachusetts State Department of Health has
a Vaccine Ambassadors scheme, making public health professionals available to community
forums and meetings in 12 languages including American Sign Language [124]. However, these
agencies have also faced significant challenges during the pandemic, in large part due to poor
coordination and communication between local, state, and federal health agencies, particularly
in the early months. Miscommunication, lack of or delayed information sharing, incompatible
databases, inconsistent reporting practices, and insecure communication channels are just a few
of the problems that have arisen. Similarly, efforts to develop effective vaccine communication
strategies have been complicated by local and state-level variation in COVID-19 trends, which
at times has resulted in confusion due to seemingly conflicting information coming from federal
versus state and local health agencies. Hospitals, schools, universities, and churches are other
examples of organizations involved in vaccine promotion and communication efforts.

4.4. Communities
Communities are active in vaccine disinformation creation, amplification, mitigation, and countering. Online communities have formed around anti-vaccine narratives and vaccine conspiracy theories, or have been created by admins, some linked to known disinformation creators, seemingly to advance these narratives and increase the division they create. High-profile online influencers also amplify these narratives [125].
   Communities affected by vaccine misinformation include immigrant communities. In Boston,
the Haitian immigrant community sees vaccine misinformation in Haitian Creole; community-
based responses to this include faith leader narratives, such as Pastor Keke on platforms
including local radio, and the Mattapan community health center placing local ads and flyers
[126]. Hispanic communities in the US have also been targeted with vaccine disinformation;
one issue here is the assumption by many higher-level (e.g., government-level) responders that Hispanic communities in the US are a uniform population, rather than populations targeted separately by factors including their countries of origin: the disinformation, narratives, and channels aimed at Cuban-American, Puerto Rican, and Mexican communities vary significantly, from Cuban-American fears of leftist politics to documented histories of medical experimentation on Puerto Ricans [127].
   Other communities targeted by disinformation in the US include Black communities [128], already disproportionately targeted with election-related disinformation, as well as wellness communities and parents of small children. Each of these communities has online groups and spaces that
could be targeted by fake profiles, misinformation, and artificial amplification of vaccine mis-
information. Healthcare worker communities are also targeted: Doximity, the “LinkedIn for
doctors” connecting 80% of US physicians, also contains vaccine disinformation [129].

4.5. Policy
With COVID-19 at a global scale, infecting nearly 300 million people and killing almost 5.5 million,
it was inevitable that governments would get involved. Governments at all levels across the US
activated a wide range of policies to reduce vaccine hesitancy and increase vaccine uptake. Yet,
differences in policy formation and compliance varied largely along political lines. Messaging from
elected officials and political elites drove narratives and vaccine endorsements, or lack thereof,
which shaped early individual perceptions of the vaccine [130, 131, 132]. Despite political
fractures, leadership from the top initiated efforts to reduce misinformation and disinformation,
such as through science-based public health campaigns identified and supported through
the US National Strategy documents [133]. Additionally, the US Department of Health and
Human Services (HHS) initiated a 5-year strategic plan on vaccines, which included goals
specifically toward partnership development to combat disinformation and reduce vaccine
hesitancy [134]. The plan is national in orientation, yet global in scope, and covers more
vaccines than COVID-19, but maintains targets and metrics at the individual level. In addition to
rolling out policy to reduce disinformation and increase vaccinations, some policies were inward-looking, conducting governmental self-checks. The US CARES Act set requirements for the US
Government Accountability Office to perform oversight on COVID-19 related policies, including
stakeholder efforts on vaccine administration and information sharing [135].
   In this use case, these examples combine funding and resource considerations with research. With no end to the pandemic in sight, agencies across the federal government initiated research funding on disinformation and vaccine hesitancy. Even before the COVID-19 pandemic, HHS was sponsoring work to “help individuals make informed decisions about immunization for themselves and their families” [136]. The NIH, of course, is deeply engaged in supporting vaccine and decision-making research; it funds a wide range of grants covering issues such as community-level interventions for vaccination uptake, evaluation of government policies or initiatives that “mitigate or exacerbate disparities in vaccine access, uptake, and series completion”, and examination of barriers, access, and other measures among populations “who experience health disparities” [137]. Even
NASA opened grants for access to its remote sensing and satellite data to better understand
spatial effects on “environmental, economic, and/or societal impacts of the COVID-19 pandemic”
and how its systems can benefit decision-making research [138].
4.6. Society
Anti-Asian sentiment and discrimination have been widely documented during the COVID-19
pandemic, in large part because the virus was first discovered in Wuhan, China, which has led to
the proliferation of conspiracy theories and attributions of blame for the pandemic. This is even
apparent in search terms about the pandemic, which reflect stigmatizing beliefs about the virus
and its origins [139]. There is concern that these beliefs, combined with existing ethnic and
racial biases, may have spilled over to the healthcare system and public health communication,
resulting in culturally insensitive vaccine messaging and poorer quality interactions with
healthcare providers [140, 141]. Given that trust is a key factor in determining vaccine-related
attitudes and behaviors, it is possible that these negative experiences may have contributed to
vaccine hesitancy among some subgroups of Asian Americans.
   The history of anti-Black discrimination and racism in the U.S. healthcare system is also
believed to play a significant role in driving COVID-19 vaccine hesitancy in Black communities
[128]. This problem is compounded by vaccine-related misinformation targeting Black com-
munities, which fuels mistrust and negative attitudes toward vaccination. Additionally, high
levels of mistrust can increase susceptibility to misinformation [49]. Inequality-driven mistrust
has been recognized as a distinct phenomenon among communities who have historically
experienced disenfranchisement [142]. During the COVID-19 pandemic, this has manifested
itself in false belief systems such as the idea that vaccines and therapeutics are being deliberately
withheld from certain racial groups. Recognizing the significant role of racism in fueling mis-
trust and harmful health beliefs such as vaccine hesitancy, researchers are calling for solutions
to information disorder that directly address racism [142].
   Media coverage is believed to have contributed to fear, mistrust, and stress during the
COVID-19 pandemic, which may have resulted in increased vaccine hesitancy [143, 144]. While
clear, accurate, and timely information from trusted sources is necessary to make informed
decisions during public health crises, there is a delicate balance to strike between providing
too little versus too much information. On the one hand, information vacuums and infrequent
updates during ongoing crises can lead to the proliferation of rumors and increased levels of
uncertainty, psychological distress, and fear, but too much information may cause people to
become overwhelmed, confused, and unsure of who or what to trust. Paradoxically, as reporters
and news outlets tried to keep the public informed about the outbreak, excessive exposure
to news stories about COVID-19 may have had a negative impact on preventive behaviors
such as vaccination. This has been attributed in part to the observed impacts of information
overload, which has been shown to lead to maladaptive behaviors and information avoidance
during emergencies [145, 146]. Additionally, perceptions that the media exaggerated the risk of
COVID-19 are associated with vaccine hesitancy, possibly due in part to disengagement with
authoritative sources of information and increased engagement with “alternative” news sources
[143].
   At a national level, social media use and the prevalence of foreign disinformation online
has been shown to be significantly associated with COVID-19 vaccine hesitancy among the
population [147]. Low levels of societal trust in scientific and biomedical institutions, and low
levels of citizen engagement with the scientific community, are also associated with COVID-19
vaccine hesitancy [148]. Societal norms that prioritize individual freedom over the protection
of vulnerable groups have also been identified as a significant driver of COVID-19 vaccine
hesitancy [149]. This coincides with trends in anti-vaccine messaging, which in recent years
have increasingly framed vaccine refusal as a civil right and vaccine mandates as a form of
tyrannical government overreach [150]. Political and voting trends have been shown to be
associated with COVID-19 vaccination attitudes and behaviors, such that higher percentages of
votes for Donald Trump are significantly associated with lower vaccination rates and increased
vaccine hesitancy [151]. Additionally, research suggests that Christian nationalism is among the
strongest predictors of vaccine hesitancy, in large part because of its association with distrust of
science, hostility towards authorities other than the church, and endorsement of misinformation
espoused by Donald Trump [149].

4.7. Integration
While application of the SEM allows breakdown analysis of stress points and interventions for an information-based harm within each level, the levels do not operate in isolation from each other. This integration section details examples of cross-level analysis for a “microchip in the vaccine” misinformation scenario. On October 2, 2020, Charlamagne Tha God, while on The Breakfast Club radio show, claimed, “[m]illions will line up to take the vaccine, and boom,
microchips for all of y’all, right in time for goddamn Thanksgiving” [152]. From there, the
rumor that the vaccine injects microchips spread rapidly. It was exacerbated and amplified by
media outlets and influencers, spinning comments from Bill Gates in a Reddit Ask Me Anything
conversation about digital vaccine cards into a narrative of microchipped vaccine cards as part
of a larger tracking system [153, 154]. As of March 2021, two percent of surveyed individuals
representing the American adult population believed the vaccine contained a microchip, and nearly 27 percent of survey respondents were unsure; together (roughly 29 percent of US adults) these groups accounted for almost 75 million people [155]. In truth, neither the vaccine nor vaccine passports and certification cards
contain microchips, but the microchip information-based harm is widespread [156, 154].
   Despite the prevalence of the microchip rumor, there are visible countermeasures and counter-initiatives that span the social ecological levels. Making the vaccine ingredient list
publicly available is one approach to demonstrate it does not contain any microchips. The CDC
provides vaccine ingredients on their website as part of the broader vaccine information packet,
along with guidance as to who should or should not receive a particular vaccine, possible side
effects, other safety data, and clinical trial data [157, 158]. This information is used by other
agencies, such as the FDA when determining vaccine approval status (organization), but these
ingredient lists also help obtain endorsement from faith-based groups (community). Moreover,
ingredient transparency helped Islamic faith leaders to determine that vaccine uptake follows Sharia (Islamic) law, and helped leadership of The Church of Jesus Christ of Latter-day Saints to actively support vaccination and not provide religious waivers to their membership [159, 160].
   Some actors transformed the injected microchip topic from misinformation to disinformation
by selectively editing video interviews of prominent business leaders and news anchors to reshape a narrative that portrays the microchip claim as true [156]. Institutions such as the CDC [121] and FactCheck.org [161] (organizations) provide mythbusting analysis, and media coverage
carries this message to the public [162, 163] (society). Live conversations between public health
officials, vaccine suppliers, politicians, members of the media, and the public provide another
avenue to overcome the microchip misinformation. A televised question session by the Orange County Board of Supervisors in California with their public health administrator (organization) included an inquiry about injected tracking devices, which was quickly debunked; this engagement may boost public awareness and transparency with constituents (individuals and communities), and it was also distributed by NBC News for wider dissemination (society) [162].
   Beyond traditional news outlets, news is increasingly consumed via social media. Younger individuals are more likely to obtain the majority of their information online [164], and well-established journalistic publications cite social media sites as sources [165]. Microchip
and other vaccine-related rumors spread across social media platforms, and in response, tech
giants like YouTube and Twitter developed COVID-19 misinformation policies that allow for re-
moval of posts and potential account bans (organization, policy) [166, 167]. Similarly, Facebook
and Instagram, following World Health Organization guidance (organization), incorporated group administrators on their sites (community) to help control the presence of COVID-19 misinformation and ban users violating those policies (policy) [168]. Although the creation and implementation of social media COVID-19 misinformation policies have had mixed success [169], any
actions have the potential to affect how platform users create and share their content (in-
terpersonal), as well as engage with influencers, celebrities, public officials, and other users
(communities).
   At a more local and personal level, there are intervention efforts to help individuals overcome
COVID-19 information-based harms, including injectable microchipping. School educators have
professional training available (organization) to help “leaders, teachers and parents to become ‘vaccine ambassadors’ to communicate better with parents” (interpersonal), including how to defuse misconceptions about microchips without being dismissive [170]. Local investigative reporters, like Mahsa Saeidi of WFLA News in Tampa, Florida (society), connect interviews with concerned parents to a physician (community) to address the injected microchip falsehood and to separate fact from fiction [171]. Other groups skeptical of the government include
communities of color, young people, and the LGBTQ+ community. Recognizing this gap, the
Biden administration brought together Dr. Anthony Fauci and teen social media influencers
(community); those influencers then reached out to their millions of followers (society) to
help dispel microchip and other misinformation while also reducing vaccine hesitancy [172].
The influencers bridged Gen-Z, Black, Spanish-speaking, and LGBTQ+ communities. Moreover,
influencer outreach carried into conversations between parents and their youth followers [172].


5. Discussion and Conclusion
Vaccination-related information-based harms continue to spread as the COVID-19 pandemic continues to affect individuals and societies around the world. The web of falsehoods creates and reinforces vaccination hesitancy. The SEM, adapted from Brofenbrenner, offers a holistic approach to understanding information behaviors across a layered spectrum from the individual through society. Our SEM analysis extended previous information disorder and COVID-19 related research to identify contributing factors and complex relationships affecting vaccine uptake.
Through our SEM factors, practitioners can develop interventions that span interdependent
relationships for greater efficacy. Academics may leverage this adaptation of the SEM to link
research across traditional disciplinary boundaries and encourage future work on the causal
relationships within COVID-19 information behaviors.
   The SEM can be used to classify cognitive security factors at the individual, interpersonal, community,
organizational, policy, and societal levels. It should be useful for researchers and responders
assessing the coverage of responses, the implications of actions, and barriers that could diminish
the effectiveness of interventions at each SEM level. Expanding analysis to include not just
the individual, but also the effect of family and friends, communities, etc., and considering the
interactions between SEM levels, potentially increases the reach and scalability of cognitive
security.
   Our usage of the SEM here relies upon a heavily qualitative approach as part of our effort to
establish its foundation within this information science space. The lack of quantified elements
within this SEM may give some readers pause. However, we believe that the lack of quantitative aspects within the SEM we present does not make it any less relevant for quantitative research. Other scholars have identified ways in which the SEM can or could work alongside quantitative methods.
Onwuegbuzie, Collins, and Frels [173] posited that the levels within Brofenbrenner’s SEM [5]
are useful for both qualitative and quantitative research. Moreover, the SEM as a framework aids
the pursuit of generalization, often a focus of quantitative research, by helping scope empirical
methods and designs, such as sampling frames, appropriate to one or more of the model’s levels
[173].
   Our paper offers a novel adaptation of the social ecological model for a cognitive security
context which leans heavily on qualitative description to provide a first-step foundation upon
which future research, including quantitative approaches, may identify measurable variables
within each level. One example of this quantitative use within another context assessed political
violence and child adjustment in Northern Ireland with individual (individual level) and family
(interpersonal level) demographic data, social and economic measures from the local political
and religious communities (community level), as well as education system and attainment
values, sectarian segregation, and policy (all society level) [174]. The political violence study
then used the variables associated with the model to derive index scales and conduct exploratory factor analysis, correlations, and path analysis, also known as structural equation modeling in other disciplines. Regression coefficients and R2 values from the path analysis were then mapped back to the social ecological model levels, illustrating potential relationships for variables within and between the levels [174].
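To make this style of analysis concrete, the sketch below estimates a recursive path model as a pair of chained regressions on synthetic data; every variable name, level assignment, and effect size is a hypothetical stand-in for illustration, not a measure from the cited study.

# A minimal path-analysis sketch: a recursive path model estimated as a
# pair of chained OLS regressions (one per endogenous variable).
# All variables are hypothetical stand-ins for SEM-level measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

df = pd.DataFrame({
    "community_trust": rng.normal(size=n),   # community-level measure
    "policy_exposure": rng.normal(size=n),   # policy-level measure
})
# Interpersonal norms partly shaped by community-level trust (synthetic).
df["family_norms"] = 0.5 * df["community_trust"] + rng.normal(size=n)
# Individual-level outcome shaped by interpersonal and policy levels.
df["vaccine_intent"] = (0.4 * df["family_norms"]
                        + 0.3 * df["policy_exposure"]
                        + rng.normal(size=n))

# Path 1: community level -> interpersonal level.
m1 = smf.ols("family_norms ~ community_trust", data=df).fit()
# Path 2: interpersonal and policy levels -> individual outcome.
m2 = smf.ols("vaccine_intent ~ family_norms + policy_exposure", data=df).fit()

# Coefficients and R2 per equation can then be mapped back onto the
# SEM levels, as in the cited study.
print(m1.params, m1.rsquared)
print(m2.params, m2.rsquared)

Chaining one regression per endogenous variable is the simplest way to express a recursive path model; dedicated SEM software adds fit indices and latent variables, but the mapping of coefficients back to SEM levels works the same way.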
A second example used the levels of a social ecological model to frame interrelated, multi-level, quantifiable characteristics to understand alcohol use behaviors; their statistical approaches included logistic multilevel random effects models and censored regression (TOBIT) random effects models [175].
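As a hedged illustration of such multilevel models, the sketch below fits a random-intercept logistic regression with individuals nested in communities, using statsmodels’ Bayesian mixed GLM; the data, variable names, and effect sizes are invented for the example and do not reproduce the cited analysis.

# A sketch of a multilevel (random-intercept) logistic model:
# individuals nested in communities, with an individual-level predictor.
# Data, names, and effect sizes are all hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n_communities, n_per = 40, 25
community = np.repeat(np.arange(n_communities), n_per)
u = rng.normal(scale=0.8, size=n_communities)        # community intercepts
peer_norms = rng.normal(size=n_communities * n_per)  # individual predictor

logit = -0.2 + 0.9 * peer_norms + u[community]
vaccinated = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"vaccinated": vaccinated,
                   "peer_norms": peer_norms,
                   "community": community})

# The community random intercept captures between-community variation,
# mirroring the community level of the SEM.
model = BinomialBayesMixedGLM.from_formula(
    "vaccinated ~ peer_norms", {"community": "0 + C(community)"}, df)
result = model.fit_vb()
print(result.summary())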
Related, the social ecological model can be used to establish the environment in which computational approaches such as nested network analysis or agent-based models operate. The levels within a social ecological model naturally lend themselves to multi-level modeling. Brown and colleagues discussed the
benefits of using computational approaches to explore human-computer interactions situated
in complex social environments within an HIV prevention context [176]. Of the different
computational perspectives mentioned, they included the development of an agent-based model
that fit within a simplified version of the social ecological model with interactions and strategies
based on the environment’s information landscapes [176].
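A toy sketch in this spirit is given below: agents hold a belief about a vaccine rumor and update it from their neighbors (interpersonal level) and from an environment-wide information landscape (community and society levels); the update rule, parameters, and network are invented purely for illustration.

# A toy agent-based model: each agent's belief in a rumor is updated
# from its own prior belief, its neighbors' beliefs, and a random
# environmental signal drawn from the information landscape.
import random

random.seed(42)

N_AGENTS, N_STEPS = 100, 50
LANDSCAPE_ACCURACY = 0.6  # assumed share of credible content in the environment

# Each agent holds a belief in the rumor (0..1) and has a few random neighbors.
beliefs = [random.random() for _ in range(N_AGENTS)]
neighbors = [random.sample(range(N_AGENTS), 4) for _ in range(N_AGENTS)]

for _ in range(N_STEPS):
    new_beliefs = []
    for i in range(N_AGENTS):
        # Interpersonal level: average belief of the agent's neighbors.
        social = sum(beliefs[j] for j in neighbors[i]) / len(neighbors[i])
        # Information landscape: credible content pushes belief down,
        # misinformation pushes it up.
        signal = 0.0 if random.random() < LANDSCAPE_ACCURACY else 1.0
        new_beliefs.append(0.6 * beliefs[i] + 0.3 * social + 0.1 * signal)
    beliefs = new_beliefs

print(f"mean belief in rumor after {N_STEPS} steps: "
      f"{sum(beliefs) / N_AGENTS:.2f}")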
Saha and colleagues took a different approach to using the social ecological model. Their work focused on using machine learning models to help impute missing data, but the authors relied upon a social ecological model as a theoretical foundation to better understand the context of their data and the environment in which that data originates and interrelates. In turn, they believed an opportunity exists to use their imputation modeling to improve upon and discover new dimensions of intersection within a social ecological model [176].
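As a hedged sketch of this imputation idea, the example below uses scikit-learn’s IterativeImputer, which models each feature with missing values as a function of the other features; the column names are hypothetical stand-ins for measures at different SEM levels, and this is not the cited authors’ pipeline.

# ML-based imputation sketch: mask part of one SEM-level column, then
# impute it from the correlated columns and check recovery error.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(2)
n = 200

# Columns stand in for individual trust, interpersonal norms, and
# community media access (all hypothetical).
X = rng.normal(size=(n, 3))
X[:, 1] = 0.7 * X[:, 0] + 0.3 * rng.normal(size=n)  # correlated levels

# Mask 20% of the interpersonal-norms column at random.
mask = rng.random(n) < 0.2
X_missing = X.copy()
X_missing[mask, 1] = np.nan

# IterativeImputer regresses each incomplete feature on the others.
imputer = IterativeImputer(random_state=0)
X_imputed = imputer.fit_transform(X_missing)

# Compare imputed values against the held-out truth.
rmse = np.sqrt(np.mean((X_imputed[mask, 1] - X[mask, 1]) ** 2))
print(f"imputation RMSE on masked entries: {rmse:.3f}")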
Future use of our model could leverage similar variable identification or modeling perspectives to enhance a quantitatively oriented research design, or be used in a mixed-methods design to improve triangulation.
   The SEM provides a novel framework for conceptualizing cognitive security, but there are
limitations to its use. The primary limitation is a corollary of its strength: The SEM is broad in
scope and is meant to be a comprehensive framework to guide needs assessment, evaluation,
surveillance, and more, but its broad scope comes at the cost of some degree of precision. Relatedly,
the scope of the model also makes it difficult to quantify in a single measure. In previous uses,
efforts to quantify the model have been carried out by using validated measures for factors
at each level of the model, rather than in one comprehensive measure. There have also been
successful efforts to validate hypotheses based on factors included in the SEM by using path
analysis to test various models of the relationship(s) between predictor variables at multiple
levels, and between those variables and a specified behavioral outcome [177]. This can be
further systematized by using SEM to guide meta-analysis and systematic reviews to develop
empirically grounded and testable lists of factors at each level. Integrating the SEM with other
novel approaches like agent-based modeling is another promising approach to harnessing
the breadth and qualitative nature of the SEM. As Rounsevell and colleagues explicated, an
“ABM may include quantitative, equation-based approaches, but the rules that characterize this
approach are qualitative” [178].
   Questions of generalizability arise given the challenges associated with measuring the SEM.
On the issue of generalizability, the findings from the current study would not be expected to
be applicable to other subject matter issues, though the model and its underlying assumptions
and relationships should be expected to remain stable across many different settings. Although
this is a limitation, it is an inherent characteristic of research exploring the dynamic human-
computer-information nexus. As Antill articulated, “By the very act of installing an information system, one is changing the situation into which it is installed. Therefore, no particular ‘experiment’ can be repeated” [179]. Of course, this does not mean that repeatability is null and void.
Rather, it means that a widely-held notion of repeatability — that the same results should
be produced by any researcher in any laboratory anywhere in the world — may need to be
reconceptualized to consider other forms of repeatability, such as the ability to demonstrate
that the same set of variables or assumptions, held to be controlled and identical, do indeed hold
up in multiple tests of the model. Similarly, there are different mechanisms of achieving validity
in qualitative research. Among the most important is face validity, which simply conveys
whether the results were viewed as credible, recognizable, and trustworthy by others. This is
one of the primary mechanisms of transferability in qualitative research — rather than using
statistical inference based on a defined population, qualitative analysis seeks to produce results,
assumptions, relationships, and models that can be generalized to many settings [180].
   The SEM adaptation in this paper is early-stage work, and we applied it to a single use-case falsehood: microchips injected via the vaccine. Future efforts could explore additional
factors within each level of the SEM and reinforce the interdependencies between levels. Next
steps to improve model validity would be to implement this SEM approach for other COVID-19 information disorder use-cases. Another interesting application would be examining algorithms (e.g., social media recommendation algorithms) through the SEM-for-cognitive-security lens,
where the neighbors of individual algorithms would be models and model instantiations sharing
training datasets and results, and communities could form around the pre-trained models used
in e.g., text and image understanding, with model poisoning and other machine information
harms being shared across those communities and so on. Another future pathway extends this SEM adaptation as a theoretical contribution alongside other prominent theories of information behavior within an environment, such as Chatman’s small worlds micro view [181], Habermas’ lifeworld theory macro view [182], and the multilevel view from Jaeger and Burnett’s Information Worlds [183]. Additionally, other information behavior theories could help evolve this SEM adaptation, such as Lee and Butler’s theory of local information landscapes [184] to consider the materiality of information within the environment as a capacity-based construct, directionality of information seeking through Sonnenwald’s information horizons [185], or chance discovery via Williamson’s incidental information acquisition [186] or Agarwal’s information serendipity [187].


Acknowledgments
We would like to thank Simon van Woerden (WHO) and Alex Ruiz (Phaedrus) for conversations
that helped inform our diagram of the crisis stages of Cognitive Security, including discussions on
UN emergency cycle models. Thanks to the developers of ACM consolidated LaTeX styles https://github.com/borisveytsman/acmart and to the developers of Elsevier updated LaTeX templates
https://www.ctan.org/tex-archive/macros/latex/contrib/els-cas-templates.


References
  [1] C. Wardle, H. Derakhshan, Thinking about ‘information disorder’: formats of misinfor-
mation, disinformation, and mal-information, Journalism, ‘Fake News’ & Disinformation (2018) 43–54.
  [2] C. Wardle, Understanding Information disorder, 2020.
  [3] G. Ronchetti, What is Cognitive Security?, 2020. URL: https://xtncognitivesecurity.com/
      what-is-cognitive-security/.
  [4] R. Waltzman, A Center for Cognitive Security - Draft Proposal, 2017. URL:
      https://www.linkedin.com/pulse/center-cognitive-security-draft-proposal-rand-
      waltzman/?trk=public_profile_article_view.
  [5] U. Brofenbrenner, The experimental ecology of human development, Harvard University
      Press: Cambridge, 1979.
  [6] L. Richard, L. Guavin, K. Raine, Ecological Models Revisited: Their Uses and Evolution in
      Health Promotion Over Two Decades, Annual Review of Public Health 32 (2011) 307–326.
  [7] J. Wharf Higgins, D. Begoray, M. MacDonald, A Social Ecological Conceptual Frame-
      work for Understanding Adolescent Health Literacy in the Health Education Classroom,
     American Journal of Community Psychology 44 (2009) 350. doi:10.1007/s10464-009-
     9270-8.
 [8] A. Lindridge, S. Macaskill, W. Gnich, D. Eadie, I. Holme, Applying an ecological model to
     social marketing communications, European Journal of Marketing 47 (2013) 1399–1420.
     doi:10.1108/EJM-10-2011-0561.
 [9] I. Vigna, A. Besana, E. Comino, A. Pezzoli, Application of the Socio-Ecological System
     Framework to Forest Fire Risk Management: A Systematic Literature Review, Sustain-
     ability 13 (2021) 2121. doi:10.3390/su13042121.
[10] F. van Gool, N. Theunissen, J. Bierbooms, I. Bongers, Literature study from a so-
     cial ecological perspective on how to create flexibility in healthcare organisations,
     International Journal of Healthcare Management 10 (2017) 184–195. doi:10.1080/
     20479700.2016.1230581.
[11] F. Tripodi, Searching for Alternative Facts: Analyzing Scriptural Inference in Conservative
     News Practices, Technical Report, Data & Society, 2018. URL: https://datasociety.net/wp-
     content/uploads/2018/05/Data_Society_Searching-for-Alternative-Facts.pdf.
[12] C. C. Self, Credibility, in: D. W. Stacks, M. Salwen (Eds.), An Integrated Approach to
     Communication Theory and Research, 2 ed., Routledge, New York, NY, 2014, pp. 449–470.
[13] G. Pasi, Credibility and Relevance in Information Retrieval, 2021. URL:
     https://romcir2021.disco.unimib.it/wp-content/uploads/sites/90/2021/04/Keynote-
     ROMCIR-LAST.pdf.
[14] UK Ministry of Defence, Joint Doctrine Publication 2-00: Understanding and Intelligence Support to Joint Operations, 2011. URL: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/311572/20110830_jdp2_00_ed3_with_change1.pdf.
[15] K. R. McLeroy, D. Bibeau, A. Steckler, K. Glanz, An Ecological Perspective on Health
     Promotion Programs, Health Education Quarterly 15 (1988) 351–377. doi:10.1177/
     109019818801500401, publisher: SAGE Publications Inc.
[16] P. L. Mabry, D. H. Olster, G. D. Morgan, D. B. Abrams, Interdisciplinarity and systems
     science to improve population health: a view from the NIH Office of Behavioral and
     Social Sciences Research, American Journal of Preventive Medicine 35 (2008) S211–224.
     doi:10.1016/j.amepre.2008.05.018.
[17] K. Lewin, Psycho-sociological problems of a minority group, Character & Personality; A
     Quarterly for Psychodiagnostic & Allied Studies 3 (1935) 175–187. doi:10.1111/j.1467-
     6494.1935.tb01996.x.
[18] P. Watzlawick, J. H. Weakland, R. Fisch, Change: Principles of Problem Formation and
     Problem Resolution, W. W. Norton & Company, 1974.
[19] T. A. Glass, M. J. McAtee, Behavioral science at the crossroads in public health: Extend-
     ing horizons, envisioning the future, Social Science & Medicine 62 (2006) 1650–1671.
     doi:10.1016/j.socscimed.2005.08.044.
[20] N. Krieger, Epidemiology and the web of causation: has anyone seen the spider?, Social
     Science & Medicine (1982) 39 (1994) 887–903. doi:10.1016/0277-9536(94)90202-x.
[21] B. G. Link, J. Phelan, Social Conditions As Fundamental Causes of Disease, Journal of
     Health and Social Behavior (1995) 80–94. doi:10.2307/2626958.
[22] J. B. McKinlay, The New Public Health Approach to Improving Physical Activity and
     Autonomy in Older Populations, in: E. Heikkinen, J. Kuusinen, I. Ruoppila (Eds.), Prepara-
     tion for Aging, Springer US, Boston, MA, 1995, pp. 87–103. doi:10.1007/978-1-4615-
     1979-9_10.
[23] J. Sallis, N. Owen, E. B. Fisher, Ecological Models of Health Behavior, in: K. Glanz,
     B. K. Rimer, K. Viswanath (Eds.), Health Behavior: Theory, Research, and Practice, 4 ed.,
     Jossey-Bass, San Francisco, CA, 2015, pp. 465–485.
[24] D. J. Whitaker, D. M. Hall, A. L. Coker, Primary prevention of intimate partner violence:
     Toward a developmental, social-ecological model, in: Intimate partner violence: A
     health-based perspective, Oxford University Press, New York, NY, US, 2009, pp. 289–305.
[25] A. Durkin, C. Schenck, Y. Narayan, K. Nyhan, K. Khoshnood, S. H. Vermund, Prevention
     of Firearm Injury through Policy and Law: The Social Ecological Model, The Journal of
     Law, Medicine & Ethics 48 (2020) 191–197. doi:10.1177/1073110520979422.
[26] P. Ohri-Vachaspati, D. DeLia, R. S. DeWeese, N. C. Crespo, M. Todd, M. J. Yedidia, The
     relative contribution of layers of the Social Ecological Model to childhood obesity, Public
     health nutrition 18 (2015) 2055. doi:10.1017/S1368980014002365.
[27] T. A. Gregory, C. Wilson, A. Duncan, D. Turnbull, S. R. Cole, G. Young, Demographic,
     social cognitive and social ecological predictors of intention and participation in screening
     for colorectal cancer, BMC Public Health 11 (2011) 38. doi:10.1186/1471-2458-11-38.
[28] D. L. Espleage, S. M. Swearer, A Social-Ecological Model for Bullying Prevention and
     Intervention: Understanding the Impact of Adults in the Social Ecology of Youngsters,
     in: S. R. Jimerson, S. M. Swearer, D. L. Espleage (Eds.), Handbook of Bullying in Schools,
     Routledge, 2009, pp. 71–82.
[29] R. J. Cramer, N. D. Kapusta, A Social-Ecological Framework of Theory, Assessment, and
     Prevention of Suicide, Frontiers in Psychology 8 (2017).
[30] C. Latkin, L. A. Dayton, G. Yi, A. Konstantopoulos, J. Park, C. Maulsby, X. Kong, COVID-
     19 vaccine intentions in the United States, a social-ecological framework, Vaccine 39
     (2021). doi:10.1016/j.vaccine.2021.02.058.
[31] A. R. Casola, B. Kunes, A. Cunningham, R. J. Motley, Mask Use During COVID-19: A
     Social-Ecological Analysis, Health Promotion Practice 22 (2021) 152–155. doi:10.1177/
     1524839920983922.
[32] H. Igarashi, M. L. Kurth, H. S. Lee, S. Choun, D. Lee, C. M. Aldwin, Resilience in Older
     Adults during the COVID-19 Pandemic: A Socioecological Approach, The Journals
     of Gerontology. Series B, Psychological Sciences and Social Sciences (2021) gbab058.
     doi:10.1093/geronb/gbab058.
[33] N. A. Suhud, G. H. T. Ling, P. C. Leng, A. M. R. A. Matusin, Using A Socio-Ecological
     System (SES) Framework to Explain Factors Influencing Countries’ Success Level in
     Curbing COVID-19, Technical Report, 2020. URL: https://www.medrxiv.org/content/
     10.1101/2020.11.17.20226407v1. doi:10.1101/2020.11.17.20226407.
[34] E. Cowan, M. R. Khan, S. Shastry, E. J. Edelman, Conceptualizing the effects of the COVID-
     19 pandemic on people with opioid use disorder: an application of the social ecological
     model, Addiction Science & Clinical Practice 16 (2021) 4. doi:10.1186/s13722-020-
     00210-w.
[35] L. McCormack, V. Thomas, M. A. Lewis, R. Rudd, Improving low health literacy and
     patient engagement: A social ecological approach, Patient Education and Counseling 100
     (2017) 8–13. doi:https://doi.org/10.1016/j.pec.2016.07.007.
[36] L. E. G. Mboera, C. Sindato, I. R. Mremi, S. F. Rumisha, J. George, R. Ngolongolo, G. Misinzo,
     E. D. Karimuribo, M. M. Rweyemamu, N. Haider, M. A. Hamid, R. Kock, Socio-Ecological
     Systems Analysis and Health System Readiness in Responding to Dengue Epidemics in
     Ilala and Kinondoni Districts, Tanzania, Frontiers in Tropical Diseases 2 (2021). URL:
     https://www.frontiersin.org/article/10.3389/fitd.2021.738758.
[37] M. L. Finucane, J. Fox, S. Saksena, J. H. Spencer, A Conceptual Framework for Analyzing
     Social-Ecological Models of Emerging Infectious Diseases, in: M. J. Manfredo, J. J. Vaske,
     A. Rechkemmer, E. A. Duke (Eds.), Understanding Society and Natural Resources: Forging
     New Strands of Integration Across the Social Sciences, Springer Netherlands, Dordrecht,
     2014, pp. 93–109. doi:10.1007/978-94-017-8959-2_5.
[38] M. E. Figueroa, A Theory-Based Socioecological Model of Communication and Behavior
     for the Containment of the Ebola Epidemic in Liberia, Journal of Health Communication
     22 (2017) 5–9. doi:10.1080/10810730.2016.1231725.
[39] K. K. Walker, H. Owens, G. Zimet, “We fear the unknown”: Emergence, route and transfer
     of hesitancy and misinformation among HPV vaccine accepting mothers, Preventive
     Medicine Reports 20 (2020) 101240. doi:10.1016/j.pmedr.2020.101240.
[40] J. A. Bapaye, H. A. Bapaye, Demographic Factors Influencing the Impact of Coronavirus-
     Related Misinformation on WhatsApp: Cross-sectional Questionnaire Study, JMIR Public
     Health and Surveillance 7 (2021) e19858. doi:10.2196/19858.
[41] H. Seo, M. Blomberg, D. Altschwager, H. T. Vu, Vulnerable populations and misinforma-
     tion: A mixed-methods approach to underserved older adults’ online information assess-
     ment, New Media & Society 23 (2021) 2012–2033. doi:10.1177/1461444820925041,
     publisher: SAGE Publications.
[42] U. Gasser, S. Cortesi, M. M. Malik, A. Lee, Youth and Digital Media: From Credibility to
     Information Quality, SSRN Scholarly Paper ID 2005272, Social Science Research Network,
     Rochester, NY, 2012. doi:10.2139/ssrn.2005272.
[43] M. Anderson, A. Perrin, J. Jiang, 11% of Americans don’t use the internet. Who
     are they?, Technical Report, Pew Research Center, Washington, DC, USA, 2018.
     URL: http://www.pewresearch.org/fact-tank/2018/03/05/some-americans-dont-use-the-
     internet-who-are-they/.
[44] H. Seo, J. Erba, M. Geana, C. Lumpkins, Calling Doctor Google? Technology Adop-
     tion and Health Information Seeking among Low-income African-American Older
     Adults, The Journal of Public Interest Communications 1 (2017) 153–153. doi:10.32473/
     jpic.v1.i2.p153.
[45] M. V. Bronstein, G. Pennycook, A. Bear, D. G. Rand, T. D. Cannon, Belief in Fake News
     is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced
     Analytic Thinking, Journal of Applied Research in Memory and Cognition 8 (2019)
     108–117. doi:10.1016/j.jarmac.2018.09.005.
[46] C. Froio, A. Brehm, Digital Media and Populism. The Online News Consumption of
     Citizens Holding Populist Attitudes & its Implication for Democratic Public Spheres, 2021.
     URL: https://hal-sciencespo.archives-ouvertes.fr/hal-03263644.
[47] G. Gopichand, S. Kowshik, C. Reddy, M. Kumar, P. Vardhan, Vocabulary Mismatch
     Avoidance Techniques, International Journal of Scientific & Technology Research 9
     (2020) 2585–2594. URL: http://www.ijstr.org/final-print/apr2020/Vocabulary-Mismatch-
     Avoidance-Techniques.pdf.
[48] S. Jeong, J. Baek, C. Park, J. C. Park, Unsupervised Document Expansion for Information
     Retrieval with Stochastic Text Generation, arXiv:2105.00666 [cs] (2021). URL: http:
     //arxiv.org/abs/2105.00666.
[49] J. Roozenbeek, C. R. Schneider, S. Dryhurst, J. Kerr, A. L. J. Freeman, G. Recchia, A. M.
     van der Bles, S. van der Linden, Susceptibility to misinformation about COVID-19 around
     the world, Royal Society Open Science 7 (2020) 201199. doi:10.1098/rsos.201199.
[50] L. D. Scherer, J. McPhetres, G. Pennycook, A. Kempe, L. A. Allen, C. E. Knoepke, C. E.
     Tate, D. D. Matlock, Who is susceptible to online health misinformation? A test of
     four psychosocial hypotheses, Health Psychology 40 (2021) 274–284. doi:10.1037/
     hea0000978.
[51] Legal Information Institute, 34 CFR § 463.3 - What definitions apply to the Adult Education and Family Literacy Act programs?, n.d. URL: https://www.law.cornell.edu/cfr/text/34/463.3.
[52] E. Peters, D. Västfjäll, P. Slovic, C. Mertz, K. Mazzocco, S. Dickert, Numeracy and
     Decision Making, Psychological Science 17 (2006) 407–413. doi:10.1111/j.1467-
     9280.2006.01720.x.
[53] M. Feinberg, R. Willer, J. Stellar, D. Keltner, The virtues of gossip: Reputational information
     sharing as prosocial behavior., Journal of Personality and Social Psychology 102 (2012)
     1015–1030. doi:10.1037/a0026650.
[54] J. C. Wofford, P. J. Calabro, A. Sims, The Relationship of Information Sharing
     Norms and Leader Behavior, Journal of Management 1 (1975) 15–23. doi:10.1177/
     014920637500100104, publisher: SAGE Publications Inc.
[55] M. Di Maggio, M. W. Van Alstyne, Information Sharing, Social Norms and Performance,
     SSRN Scholarly Paper ID 1893164, Social Science Research Network, Rochester, NY, 2013.
     doi:10.2139/ssrn.1893164.
[56] E. C. Tandoc, R. Ling, O. Westlund, A. Duffy, D. Goh, L. Zheng Wei, Audiences’ acts of
     authentication in the age of fake news: A conceptual framework, New Media & Society
     20 (2018) 2745–2763. doi:10.1177/1461444817731756, publisher: SAGE Publications.
[57] J. De keersmaecker, A. Roets, ‘Fake news’: Incorrect, but hard to correct. The role of
     cognitive ability on the impact of false information on social impressions, Intelligence 65
     (2017) 107–110. doi:10.1016/j.intell.2017.10.005.
[58] E. C. Tandoc, D. Lim, R. Ling, Diffusion of disinformation: How social media users respond
     to fake news and why, Journalism 21 (2020) 381–398. doi:10.1177/1464884919868325,
     publisher: SAGE Publications.
[59] M. Viviani, G. Pasi, Credibility in social media: opinions, news, and health information—a
     survey, WIREs Data Mining and Knowledge Discovery 7 (2017) e1209. doi:10.1002/
     widm.1209, _eprint: https://onlinelibrary.wiley.com/doi/pdf/10.1002/widm.1209.
[60] D. Acemoglu, A. Ozdaglar, J. Siderius, Misinformation: Strategic Sharing, Homophily,
     and Endogenous Echo Chambers, Working Paper 28884, National Bureau of Economic
     Research, 2021. URL: https://www.nber.org/papers/w28884. doi:10.3386/w28884, series:
     Working Paper Series.
[61] C. Kasapoğlu, B. Kırdemir, Wars of None: Artificial Intelligence and the Future of Conflict, Technical Report, Centre for Economics and Foreign Policy Studies, 2019. URL: https://www.jstor.org/stable/resrep21050.
[62] A. E. Holton, H. I. Chyi, News and the overloaded consumer: factors influencing in-
     formation overload among news consumers, Cyberpsychology, Behavior and Social
     Networking 15 (2012) 619–624. doi:10.1089/cyber.2011.0610.
[63] K. M. MacQueen, E. McLellan, D. S. Metzger, S. Kegeles, R. P. Strauss, R. Scotti,
     L. Blanchard, R. T. Trotter, What Is Community? An Evidence-Based Definition
     for Participatory Public Health, American Journal of Public Health 91 (2001). URL:
     https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1446907/.
[64] Internews, What Works: Addressing COVID-19 Misinformation: Lessons from the Front-
     lines in 100 Countries., Technical Report, Internews, 2021. URL: https://internews.org/
     wp-content/uploads/2021/04/COVIDReport_20210416.pdf.
[65] L. Livingston, Understanding the Context Around Content: Looking Behind Misinformation Narratives, Technical Report, Over Zero, 2021. URL: https://www.ned.org/wp-content/uploads/2021/12/Understanding-the-Context-Around-Content-Looking-Behind-Misinformation-Narratives-Laura-Livingston.pdf.
[66] R. Heeks, ICT4D 2.0: The Next Phase of Applying ICT for International Development,
     Computer 41 (2008) 26–33. doi:10.1109/MC.2008.192.
[67] A. Liberman, J. Schroeder, Two social lives: How differences between online and offline
     interaction influence social outcomes, Current opinion in psychology 31 (2020) 16–21.
     doi:10.1016/j.copsyc.2019.06.022.
[68] A. Chen, The Agency, The New York Times (2015). URL: https://www.nytimes.com/
     2015/06/07/magazine/the-agency.html.
[69] K. Gogarty, Despite Facebook’s COVID-19 promises, anti-vaccine groups are thriv-
     ing, 2021. URL: https://www.mediamatters.org/facebook/despite-facebooks-covid-19-
     promises-anti-vaccine-groups-are-thriving.
[70] D. Funke, D. Flamini, A guide to anti-misinformation actions around the world, 2018.
     URL: https://www.poynter.org/ifcn/anti-misinformation-actions/.
[71] C. Lecher, Senators announce new bill that would regulate online political ads, 2017.
     URL: https://www.theverge.com/2017/10/19/16502946/facebook-twitter-russia-honest-
     ads-act.
[72] Media Literacy Now, Washington state media literacy legislation and law | Media Liter-
     acy Now, 2016. URL: https://medialiteracynow.org/your-state-legislation-2/washington-
     legislation/.
[73] S. Minichiello, California now has a law to bolster media literacy in schools,
     2018. URL: https://www.pressdemocrat.com/article/news/california-now-has-a-law-to-
     bolster-media-literacy-in-schools/.
[74] M. Nouwens, I. Liccardi, M. Veale, D. Karger, L. Kagal, Dark Patterns after the GDPR: Scrap-
     ing Consent Pop-ups and Demonstrating their Influence, in: Proceedings of the 2020 CHI
     Conference on Human Factors in Computing Systems, Association for Computing Machin-
     ery, New York, NY, USA, 2020, pp. 1–13. URL: https://doi.org/10.1145/3313831.3376321.
[75] D. J. Weitzner, U.S. Privacy Legislation: Now Is The Time, 2018. URL:
     https://internetpolicy.mit.edu/wp-content/uploads/2018/05/IPRI-Weitzner-BBA-
     Privacy-NowIsTime-2018.pdf.
[76] Biden Administration, The Biden Administration Launches AI.gov Aimed at Broadening
     Access to Federal Artificial Intelligence Innovation Efforts, Encouraging Innovators
     of Tomorrow, 2021. URL: https://www.whitehouse.gov/ostp/news-updates/2021/05/
     05/the-biden-administration-launches-ai-gov-aimed-at-broadening-access-to-federal-
     artificial-intelligence-innovation-efforts-encouraging-innovators-of-tomorrow/.
[77] S. Maynard, A. Ruighaver, A. Ahmad, Stakeholders in security policy development,
     in: 9th Australian Information Security Management Conference, Security Research
     Centre, Edith Cowan University, Perth, Western Australia, Edith Cowan University, Perth
     Western Australia, 2011. doi:10.4225/75/57b546fecd8c6.
[78] S. V. Flowerday, T. Tuyikeze, Information security policy development and imple-
     mentation: The what, how and who, Computers & Security 61 (2016) 169–183.
     doi:10.1016/j.cose.2016.06.002.
[79] H. Paananen, M. Lapke, M. Siponen, State of the art in information security policy devel-
     opment, Computers & Security 88 (2020) 101608. doi:10.1016/j.cose.2019.101608.
[80] N. Kshetri, Privacy and security issues in cloud computing: The role of institutions and
     institutional evolution, Telecommunications Policy 37 (2013) 372–386. doi:10.1016/
     j.telpol.2012.04.011.
[81] L. Gorman, B. Schafer, C. Tsao, D. Ghosh, The Weaponized Web: The National Security
     Implications of Data, Technical Report, Alliance for Securing Democracy, 2021. URL:
     https://securingdemocracy.gmfus.org/national-security-implications-of-data/.
[82] W. Ahmad, Why Botnets Persist: Designing Effective Technical and Policy Interventions,
     2019. URL: https://internetpolicy.mit.edu/publications-ipri-2019-02.
[83] P. Bowen, E. Chew, J. Hash, Information security guide for government executives,
     Technical Report NIST IR 7359, US National Institute of Standards and Technology,
     Gaithersburg, MD, 2007. doi:10.6028/NIST.IR.7359.
[84] S. J. Shackleford, A. E. Brady, Is It Time for a National Cybersecurity Safety Board:
     Examining the Policy Implications and Political Pushback, Albany Law Journal of Science
     & Technology 28 (2017) 56. URL: https://heinonline.org/HOL/Page?handle=hein.journals/
     albnyst28&id=184&div=&collection=.
[85] C. H. Heinl, Artificial (intelligent) agents and active cyber defence: Policy implications,
     in: 2014 6th International Conference On Cyber Conflict (CyCon 2014), 2014, pp. 53–66.
     doi:10.1109/CYCON.2014.6916395.
[86] E. Humprecht, F. Esser, P. Van Aelst, Resilience to Online Disinformation: A Framework
     for Cross-National Comparative Research, The International Journal of Press/Politics 25
     (2020) 493–516. doi:10.1177/1940161219900126.
[87] Y. Benkler, R. Faris, H. Roberts, Network Propaganda: Manipulation, Disinformation, and
     Radicalization in American Politics, Oxford University Press, 2018.
[88] M. X. Jin, S. Rajan, C. E. G. Bicas, M. Hao, L. Dong, B. Mufson, I. Hafiz, Novel Validated
     Index for the Measurement of Disinformation Susceptibility at the County Level, Cureus
     13 (2021). doi:10.7759/cureus.15305.
[89] T. Goertzel, Belief in Conspiracy Theories, Political Psychology 15 (1994) 731–742.
     doi:10.2307/3791630.
 [90] J. W. van Prooijen, J. Staman, A. P. Krouwel, Increased conspiracy beliefs among ethnic
      and Muslim minorities, Applied Cognitive Psychology (2018). doi:10.1002/acp.3442.
 [91] S. C. Quinn, M. P. Andrasik, Addressing Vaccine Hesitancy in BIPOC Communities —
      Toward Trustworthiness, Partnership, and Reciprocity, New England Journal of Medicine
      385 (2021) 97–100. doi:10.1056/NEJMp2103104.
 [92] The SAGE Vaccine Hesitancy Working Group, What influences vac-
      cine acceptance: A model of determinants of vaccine hesitancy, 2013. URL:
      https://www.who.int/immunization/sage/meetings/2013/april/
      1_Model_analyze_driversofvaccineConfidence_22_March.pdf.
 [93] T. Rozbroj, A. Lyons, J. Lucke, Psychosocial and demographic characteristics relating
      to vaccine attitudes in Australia, Patient Education and Counseling 102 (2019) 172–179.
      doi:10.1016/j.pec.2018.08.027.
 [94] S. E. Bokemper, A. S. Gerber, S. B. Omer, G. A. Huber, Persuading US White evan-
      gelicals to vaccinate for COVID-19: Testing message effectiveness in fall 2020 and
      spring 2021, Proceedings of the National Academy of Sciences 118 (2021). doi:10.1073/
      pnas.2114762118.
 [95] C. Funk, A. Tyson, Intent to Get a COVID-19 Vaccine Rises to 60% as Confidence in
      Research and Development Process Increases, Technical Report, Pew Research Cen-
      ter, 2020. URL: https://heatinformatics.com/sites/default/files/images-videosFileContent/
      PS_2020.12.03_covid19-vaccine-intent_REPORT.pdf.
 [96] L. P. Wong, H. Alias, P.-F. Wong, H. Y. Lee, S. AbuBakar, The use of the health belief
      model to assess predictors of intent to receive the COVID-19 vaccine and willingness
      to pay, Human Vaccines & Immunotherapeutics 16 (2020) 2204–2214. doi:10.1080/
      21645515.2020.1790279.
 [97] M. J. Hornsey, E. A. Harris, K. S. Fielding, The psychological roots of anti-vaccination
      attitudes: A 24-nation investigation, Health Psychology 37 (2018) 307–315. doi:10.1037/
      hea0000586.
 [98] D. Jolley, K. M. Douglas, The Effects of Anti-Vaccine Conspiracy Theories on Vaccination
      Intentions, PLOS ONE 9 (2014) e89177. doi:10.1371/journal.pone.0089177.
 [99] P. Bertin, K. Nera, S. Delouvée, Conspiracy Beliefs, Rejection of Vaccination, and Support
      for Hydroxychloroquine: A Conceptual Replication-Extension in the COVID-19 Pandemic
      Context, Frontiers in Psychology 11 (2020) 565128. doi:10.3389/fpsyg.2020.565128.
[100] H. Azarpanah, M. Farhadloo, R. Vahidov, L. Pilote, Vaccine hesitancy: evidence from an
      adverse events following immunization database, and the role of cognitive biases, BMC
      Public Health 21 (2021) 1686. doi:10.1186/s12889-021-11745-1.
[101] L. Thunstrom, M. Ashworth, D. Finnoff, S. Newbold, Hesitancy Towards a COVID-19
      Vaccine and Prospects for Herd Immunity, SSRN Scholarly Paper ID 3593098, Social
      Science Research Network, Rochester, NY, 2020. doi:10.2139/ssrn.3593098.
[102] X. Nan, K. Madden, HPV Vaccine Information in the Blogosphere: How Positive and Neg-
      ative Blogs Influence Vaccine-Related Risk Perceptions, Attitudes, and Behavioral Inten-
      tions, Health Communication 27 (2012) 829–836. doi:10.1080/10410236.2012.661348.
[103] C. Betsch, F. Renkewitz, N. Haase, Effect of Narrative Reports about Vaccine Ad-
      verse Events and Bias-Awareness Disclaimers on Vaccine Decisions: A Simulation
      of an Online Patient Social Network, Medical Decision Making 33 (2013) 14–25.
      doi:10.1177/0272989X12452342.
[104] W.-Y. S. Chou, A. Budenz, Considering Emotion in COVID-19 Vaccine Communication:
      Addressing Vaccine Hesitancy and Fostering Vaccine Confidence, Health Communication
      35 (2020) 1718–1722. doi:10.1080/10410236.2020.1838096.
[105] Y. M. Rocha, G. A. d. Moura, G. A. Desidério, C. H. d. Oliveira, F. D. Lourenço, L. D. d. F.
      Nicolete, The impact of fake news on social media and its influence on health during the
      COVID-19 pandemic: a systematic review, Zeitschrift für Gesundheitswissenschaften
      (2021) 1–10. doi:10.1007/s10389-021-01658-z.
[106] L. Simione, M. Vagni, C. Gnagnarella, G. Bersani, D. Pajardi, Mistrust and Be-
      liefs in Conspiracy Theories Differently Mediate the Effects of Psychological Factors
      on Propensity for COVID-19 Vaccine, Frontiers in Psychology 12 (2021) 683684.
      doi:10.3389/fpsyg.2021.683684.
[107] L. Lu, J. Liu, Y. C. Yuan, K. S. Burns, E. Lu, D. Li, Source Trust and COVID-19 Information
      Sharing: The Mediating Roles of Emotions and Beliefs About Sharing, Health Education
      & Behavior 48 (2021) 132–139. doi:10.1177/1090198120984760.
[108] M. Baum, K. Ognyanova, D. Lazer, A. Wang, J. Lin, J. Druckman, R. H. Perlis, M. Santillana,
      J. Green, M. D. Simonson, A. Uslu, The COVID States Project #58: High public support
      for mandating vaccines, Technical Report, OSF Preprints, 2021. doi:10.31219/osf.io/
      6wcn9.
[109] S. Taylor, C. A. Landry, M. M. Paluszek, R. Groenewoud, G. S. Rachor, G. J. G. Asmundson,
      A Proactive Approach for Managing COVID-19: The Importance of Understanding the
      Motivational Roots of Vaccination Hesitancy for SARS-CoV2, Frontiers in Psychology 11
      (2020). doi:10.3389/fpsyg.2020.575950.
[110] N. Walter, J. J. Brooks, C. J. Saucier, S. Suresh, Evaluating the Impact of Attempts to Correct
      Health Misinformation on Social Media: A Meta-Analysis, Health Communication 36
      (2021) 1776–1784. doi:10.1080/10410236.2020.1794553.
[111] S. Geoghegan, K. P. O’Callaghan, P. A. Offit, Vaccine Safety: Myths and Misinformation,
      Frontiers in Microbiology 11 (2020). URL: https://www.frontiersin.org/article/10.3389/
      fmicb.2020.00372.
[112] A. A. Rogers, R. E. Cook, J. A. Button, Parent and Peer Norms are Unique Correlates of
      COVID-19 Vaccine Intentions in a Diverse Sample of U.S. Adolescents, The Journal of
      Adolescent Health 69 (2021) 910–916. doi:10.1016/j.jadohealth.2021.09.012.
[113] D. A. Cox, Peer pressure, not politics, may matter most when it comes to getting the
      COVID-19 vaccine, 2021. URL: https://www.americansurveycenter.org/commentary/
      peer-pressure-not-politics-may-matter-most-when-it-comes-to-getting-the-covid-19-
      vaccine/.
[114] K. H. Jamieson, How to Debunk Misinformation about COVID, Vaccines and Masks,
      2021. URL: https://www.scientificamerican.com/article/how-to-debunk-misinformation-
      about-covid-vaccines-and-masks/. doi:10.1038/scientificamerican0421-44.
[115] US Centers for Disease Control and Prevention, How to talk about COVID-19 vac-
      cines with friends and family, 2021. URL: https://www.cdc.gov/coronavirus/2019-ncov/
      vaccines/talk-about-vaccines.html.
[116] P. Konstantinou, K. Georgiou, N. Kumar, M. Kyprianidou, C. Nicolaides, M. Karekla, A. P.
      Kassianos, Transmission of Vaccination Attitudes and Uptake Based on Social Contagion
      Theory: A Scoping Review, Vaccines 9 (2021). doi:10.3390/vaccines9060607.
[117] C. E. Wagner, J. A. Prentice, C. M. Saad-Roy, L. Yang, B. T. Grenfell, S. A. Levin, R. Laxmi-
      narayan, Economic and Behavioral Influencers of Vaccination and Antimicrobial Use,
      Frontiers in Public Health 8 (2020) 975. doi:10.3389/fpubh.2020.614113.
[118] G. Burgio, B. Steinegger, A. Arenas, Homophily impacts the success of vaccine roll-outs
      (2021). URL: https://arxiv.org/pdf/2112.08240.pdf.
[119] A. M. Buttenheim, Covid-19 Vaccine Hesitancy and Strategies for Building Vaccine
      Confidence in the Covid-19 Vaccines, 2021. URL: https://www.nationalacademies.org/
      ocga/testimony-before-congress/covid-19-vaccine-hesitancy-and-strategies-for-
      building-vaccine-confidence-in-the-covid-19-vaccines.
[120] J. Sutton, Y. Rivera, T. K. Sell, M. B. Moran, D. Bennett Gayle, M. Schoch-Spana, E. K. Stern,
      D. Turetsky, Longitudinal Risk Communication: A Research Agenda for Communicating
      in a Pandemic, Health Security 19 (2021) 370–378. doi:10.1089/hs.2020.0161.
[121] US Centers for Disease Control and Prevention, How to Address COVID-19 Vaccine
      Misinformation, 2021. URL: https://www.cdc.gov/vaccines/covid-19/health-departments/
      addressing-vaccine-misinformation.html.
[122] World Health Organization, Risk communication and community engagement readi-
      ness and response to coronavirus disease (COVID-19): Interim guidance, 2020.
      URL: https://apps.who.int/iris/bitstream/handle/10665/331513/WHO-2019-nCoV-RCCE-
      2020.2-eng.pdf.
[123] US CDC COVID-19 Response, Vaccine Task Force, Vaccine Confidence & Demand Team,
      Insights Unit, COVID-19 State of Vaccine Confidence Insights Report, Technical Report 20,
      Centers for Disease Control and Prevention, 2021. URL: https://www.cdc.gov/vaccines/
      covid-19/downloads/SoVC_report20.pdf.
[124] Massachusetts Department of Public Health, COVID-19 Vaccine Equity Initiative: DPH
      Vaccine Ambassador program, n.d. URL: https://www.mass.gov/info-details/covid-19-
      vaccine-equity-initiative-dph-vaccine-ambassador-program.
[125] S. Silva, Instagram’s playing whack-a-mole with anti-vaccine influencers, 2021.
      URL: https://www.mediamatters.org/coronavirus-covid-19/instagrams-playing-whack-
      mole-anti-vaccine-influencers.
[126] T. Bedford, Health Workers In Boston Counter Vaccine Misinformation In Haitian
      Communities, NPR (2021). URL: https://www.npr.org/2021/08/26/1031193075/health-
      workers-in-boston-counter-vaccine-misinformation-in-haitian-communities.
[127] J. Longoria, D. Acosta, S. Urbani, R. Smith, A Limiting Lens: How Vaccine Misinforma-
      tion Has Influenced Hispanic Conversations Online, Technical Report, First Draft, 2021.
      URL: https://firstdraftnews.org/long-form-article/covid19-vaccine-misinformation-
      hispanic-latinx-social-media/.
[128] K. Dodson, J. Mason, R. Smith, Covid-19 vaccine misinformation and narratives sur-
      rounding Black communities on social media, Technical Report, First Draft, 2021.
      URL: https://firstdraftnews.org/long-form-article/covid-19-vaccine-misinformation-
      black-communities/.
[129] A. Levy, The social network for doctors is full of vaccine disinformation, CNBC (2021).
      URL: https://www.cnbc.com/2021/08/06/doximity-social-network-for-doctors-full-of-
      antivax-disinformation.html.
[130] T. Bolsen, R. Palm, Politicization and COVID-19 vaccine resistance in the
      U.S., Progress in Molecular Biology and Translational Science (2021). doi:10.1016/
      bs.pmbts.2021.10.002.
[131] C. Lin, P. Tu, L. M. Beitsch, Confidence and Receptivity for COVID-19 Vaccines: A Rapid
      Systematic Review, Vaccines 9 (2021) 16. doi:10.3390/vaccines9010016.
[132] S. L. Pink, J. Chu, J. N. Druckman, D. G. Rand, R. Willer, Elite party cues increase
      vaccination intentions among Republicans, Proceedings of the National Academy of
      Sciences of the United States of America 118 (2021). doi:10.1073/pnas.2106559118.
[133] Biden Administration, National Strategy for the COVID-19 Response and
      Pandemic Preparedness, Technical Report, The White House, 2021. URL:
      https://www.whitehouse.gov/wp-content/uploads/2021/01/National-Strategy-for-
      the-COVID-19-Response-and-Pandemic-Preparedness.pdf.
[134] US Department of Health and Human Services, Vaccines National Strategic Plan 2021-
      2025, 2021. URL: https://www.hhs.gov/sites/default/files/HHS-Vaccines-Report.pdf.
[135] US Government Accountability Office, COVID-19: HHS Agencies’ Planned Reviews of
      Vaccine Distribution and Communication Efforts Should Include Stakeholder Perspectives,
      Report to Congressional Committees, US Government Accountability Office, Washington,
      DC, USA, 2021. URL: https://www.gao.gov/assets/gao-22-104457.pdf.
[136] US Department of Health and Human Services, Funding Opportunity for Vaccine Confi-
      dence Research, 2016. URL: https://www.hhs.gov/vaccines/featured-priorities/vaccine-
      confidence/funding-opportunity-for-vaccine-confidence-research/index.html.
[137] US National Institutes of Health, NOT-MD-22-006: Notice of Special Interest (NOSI):
      Research to Address Vaccine Hesitancy, Uptake, and Implementation among Populations
      that Experience Health Disparities, 2021. URL: https://grants.nih.gov/grants/guide/notice-
      files/NOT-MD-22-006.html.
[138] US NASA Science Mission Directorate, A.28 Rapid Response and Novel Re-
      search in Earth Science, 2020. URL: https://nspires.nasaprs.com/external/
      solicitations/summary.do?solId=%7B3F3DFBFB-8FEE-F317-63FD-CB84ECA833EC%
      7D&path&method=init.
[139] Z. Hu, Z. Yang, Q. Li, A. Zhang, The COVID-19 Infodemic: Infodemiology Study Analyzing
      Stigmatizing Search Terms, Journal of Medical Internet Research 22 (2020) e22639.
      doi:10.2196/22639.
[140] J. H. Lee, Combating Anti-Asian Sentiment — A Practical Guide for Clinicians, New
      England Journal of Medicine 384 (2021) 2367–2369. doi:10.1056/NEJMp2102656.
[141] D. Tahir, M. Ravindranath, How the vaccine campaign overlooks Asian Americans, 2021.
      URL: https://politi.co/3wzBKt5.
[142] J. Jaiswal, C. LoSchiavo, D. C. Perlman, Disinformation, Misinformation and Inequality-
      Driven Mistrust in the Time of COVID-19: Lessons Unlearned from AIDS Denialism,
      AIDS and Behavior 24 (2020) 2776–2780. doi:10.1007/s10461-020-02925-y.
[143] D. R. Garfin, R. C. Silver, E. A. Holman, The Novel Coronavirus (COVID-2019) Outbreak:
      Amplification of Public Health Consequences by Media Exposure, Health Psychology
      39 (2020) 355. doi:10.1037/hea0000875.
[144] Z. Su, D. McDonnell, J. Wen, M. Kozak, J. Abbas, S. Šegalo, X. Li, J. Ahmad, A. Chesh-
      mehzangi, Y. Cai, L. Yang, Y.-T. Xiang, Mental health consequences of COVID-19 media
      coverage: the need for effective crisis communication practices, Globalization and Health
      17 (2021) 4. doi:10.1186/s12992-020-00654-4.
[145] J. S. Brennan, F. Simon, P. N. Howard, R. K. Nielsen, Types, sources, and claims of COVID-
      19 misinformation, 2020. URL: https://reutersinstitute.politics.ox.ac.uk/types-sources-
      and-claims-covid-19-misinformation.
[146] N. Calleja, A. AbdAllah, N. Abad, N. Ahmed, D. Albarracin, E. Altieri, J. N. Anoko, R. Arcos,
      A. A. Azlan, J. Bayer, A. Bechmann, S. Bezbaruah, S. C. Briand, I. Brooks, L. M. Bucci,
      S. Burzo, C. Czerniak, M. D. Domenico, A. G. Dunn, U. K. H. Ecker, L. Espinosa, C. Francois,
      K. Gradon, A. Gruzd, B. S. Gülgün, R. Haydarov, C. Hurley, S. I. Astuti, A. Ishizumi,
      N. Johnson, D. J. Restrepo, M. Kajimoto, A. Koyuncu, S. Kulkarni, J. Lamichhane, R. Lewis,
      A. Mahajan, A. Mandil, E. McAweeney, M. Messer, W. Moy, P. N. Ngamala, T. Nguyen,
      M. Nunn, S. B. Omer, C. Pagliari, P. Patel, L. Phuong, D. Prybylski, A. Rashidian, E. Rempel,
      S. Rubinelli, P. Sacco, A. Schneider, K. Shu, M. Smith, H. Sufehmi, V. Tangcharoensathien,
      R. Terry, N. Thacker, T. Trewinnard, S. Turner, H. Tworek, S. Uakkas, E. Vraga, C. Wardle,
      H. Wasserman, E. Wilhelm, A. Würz, B. Yau, L. Zhou, T. D. Purnat, A Public Health
      Research Agenda for Managing Infodemics: Methods and Results of the First WHO
      Infodemiology Conference, JMIR Infodemiology 1 (2021). doi:10.2196/30979.
[147] S. L. Wilson, C. Wiysonge, Social media and vaccine hesitancy, BMJ Global Health 5
      (2020) e004206. doi:10.1136/bmjgh-2020-004206.
[148] L. Palamenghi, S. Barello, S. Boccia, G. Graffigna, Mistrust in biomedical research and vac-
      cine hesitancy: the forefront challenge in the battle against COVID-19 in Italy, European
      Journal of Epidemiology (2020) 1. doi:10.1007/s10654-020-00675-8.
[149] K. E. Corcoran, C. P. Scheitle, B. D. DiGregorio, Christian Nationalism and COVID-19
      Vaccine Hesitancy and Uptake, Vaccine (2021). doi:10.1016/j.vaccine.2021.09.074.
[150] D. A. Broniatowski, A. M. Jamison, N. F. Johnson, N. Velasquez, R. Leahy, N. J. Restrepo,
      M. Dredze, S. C. Quinn, Facebook Pages, the “Disneyland” Measles Outbreak, and
      Promotion of Vaccine Refusal as a Civil Right, 2009–2019, American Journal of Public
      Health 110 (2020) S312–S318. doi:10.2105/AJPH.2020.305869.
[151] L. Ku, The Association of Social Factors and Health Insurance Coverage with COVID-19
      Vaccinations and Hesitancy, July 2021, Journal of General Internal Medicine (2021).
      doi:10.1007/s11606-021-07213-6.
[152] Charlamagne tha God, Is it a Hoax or is it Real, 2020. URL: https://www.iheart.com/podcast/the-
      breakfast-club-24992238/episode/is-it-a-hoax-or-is-72167513/.
[153] B. Gates, I’m Bill Gates, co-chair of the Bill & Melinda Gates Foundation. AMA
      about COVID-19, 2020. URL: https://www.reddit.com/r/Coronavirus/comments/fksnbf/
      im_bill_gates_cochair_of_the_bill_melinda_gates/fkupg49/?context=3.
[154] I. Sriskandarajah, Where did the microchip vaccine conspiracy theory come from anyway?,
      2021. URL: https://www.theverge.com/22516823/covid-vaccine-microchip-conspiracy-
      theory-explained-reddit.
[155] Ipsos, Axios/Ipsos Poll Wave 42: A Survey of the American General Population (Ages
      18+) Topline and Methodology, 2021. URL: https://www.ipsos.com/sites/default/files/ct/
      news/documents/2021-03/topline-axios-ipsos-coronavirus-index-w42.pdf.
[156] Reuters, Fact check: COVID-19 vaccines do not contain the ingredients listed in these
      posts, Reuters (2021). URL: https://www.reuters.com/article/uk-factcheck-covid-vaccine-
      ingredients-idUSKBN2AQ2SW.
[157] US Centers for Disease Control and Prevention, Information about the Moderna COVID-
      19 Vaccine, 2022. URL: https://www.cdc.gov/coronavirus/2019-ncov/vaccines/different-
      vaccines/Moderna.html.
[158] US Centers for Disease Control and Prevention, Information about the Pfizer-BioNTech
      COVID-19 Vaccine, 2022. URL: https://www.cdc.gov/coronavirus/2019-ncov/vaccines/
      different-vaccines/Pfizer-BioNTech.html.
[159] J. Jenkins, Most Latter-day Saints, Catholics and others see no valid religious reasons for
      vaccine exemptions, 2021. URL: https://www.sltrib.com/religion/2021/12/09/most-latter-
      day-saints/.
[160] Y. Mardian, K. Shaw-Shaliba, M. Karyana, C.-Y. Lau, Sharia (Islamic Law) Perspec-
      tives of COVID-19 Vaccines, Frontiers in Tropical Diseases 2 (2021). URL: https:
      //www.frontiersin.org/article/10.3389/fitd.2021.788188.
[161] S. H. Spencer, COVID-19 Vaccines Don’t Have Patient-Tracking Devices, 2020. URL: https:
      //www.factcheck.org/2020/12/covid-19-vaccines-dont-have-patient-tracking-devices/.
[162] D. K. Li, ’There is not ... period’: Doctor tells California official there’s no tracking device
      in Covid shot, 2021. URL: https://www.nbcnews.com/news/us-news/there-not-period-
      doctor-tells-california-official-there-s-no-n1265843.
[163] K. Schoolov, Why it’s not possible for the Covid vaccines to contain a magnetic tracking
      chip that connects to 5G, 2021. URL: https://www.cnbc.com/2021/10/01/why-the-covid-
      vaccines-dont-contain-a-magnetic-5g-tracking-chip.html.
[164] D. Rosengard, M. Tucker-McLaughlin, T. Brown, Students and Social News: How Col-
      lege Students Share News Through Social Media, Electronic News 8 (2014) 120–137.
      doi:10.1177/1931243114546448.
[165] S. Paulussen, R. A. Harder, Social Media References in Newspapers, Journalism Practice
      8 (2014) 542–551. doi:10.1080/17512786.2014.894327.
[166] YouTube, COVID-19 medical misinformation policy - YouTube Help, 2020. URL: https:
      //support.google.com/youtube/answer/9891785?hl=en.
[167] Twitter, COVID-19 misleading information policy, 2021. URL: https://help.twitter.com/
      en/rules-and-policies/medical-misinformation-policy.
[168] S. Frier, Facebook, Instagram to Ban Accounts Spreading Vaccine Lies, Bloomberg.com
      (2021). URL: https://www.bloomberg.com/news/articles/2021-02-08/facebook-to-ban-
      groups-instagrams-sharing-false-vaccine-info.
[169] N. Krishnan, J. Gu, R. Tromble, L. C. Abroms, Research note: Examining how various
      social media platforms have responded to COVID-19 misinformation, Harvard Kennedy
      School Misinformation Review (2021). doi:10.37016/mr-2020-85.
[170] S. D. Sparks, Communications Expert Explains: How to Talk to Parents About COVID
      Vaccination, Education Week (2022). URL: https://www.edweek.org/leadership/
      communications-expert-explains-how-to-talk-to-parents-about-covid-vaccination/
      2022/01.
[171] M. Saeidi, COVID-19 vaccines and kids: What parents should know, 2021.
      URL: https://www.wfla.com/8-on-your-side/covid-19-vaccines-and-kids-what-parents-
      should-know/.
[172] A. Singh, L. Coburn, A. Yang, S. Fasano, A. Riegle, Teen social media stars in uphill
      battle against COVID-19 vaccine misinformation, ABC News (2021).
      URL: https://abcnews.go.com/Technology/teen-social-media-stars-uphill-battle-covid-
      19/story?id=79625431.
[173] A. J. Onwuegbuzie, K. M. T. Collins, R. K. Frels, Foreword: Using Bronfenbrenner’s
      ecological systems theory to frame quantitative, qualitative, and mixed research, Inter-
      national Journal of Multiple Research Approaches 7 (2013) 2–8. doi:10.5172/mra.2013.7.1.2.
[174] E. M. Cummings, A. C. Schermerhorn, C. E. Merrilees, M. C. Goeke-Morey, P. Shirlow,
      E. Cairns, Political violence and child adjustment in Northern Ireland: Testing pathways
      in a social–ecological model including single- and two-parent families, Developmental
      Psychology 46 (2010) 827–841. doi:10.1037/a0019668.
[175] P. J. Gruenewald, L. G. Remer, E. A. LaScala, Testing a social ecological model of alcohol
      use: the California 50-city study, Addiction 109 (2014) 736–745. doi:10.1111/add.12438.
[176] C. H. Brown, D. C. Mohr, C. G. Gallo, C. Mader, L. Palinkas, G. Wingood, G. Prado,
      S. G. Kellam, H. Pantin, J. Poduska, R. Gibbons, J. McManus, M. Ogihara, T. Va-
      lente, F. Wulczyn, S. Czaja, G. Sutcliffe, J. Villamar, C. Jacobs, A Computational
      Future for Preventing HIV in Minority Communities: How Advanced Technology
      Can Improve Implementation of Effective Programs, JAIDS Journal of Acquired Im-
      mune Deficiency Syndromes 63 (2013) S72. URL: https://journals.lww.com/jaids/Fulltext/
      2013/06011/A_Computational_Future_for_Preventing_HIV_in.13.aspx. doi:10.1097/
      QAI.0b013e31829372bd.
[177] J. Stewart, Using the social ecological model to build a path analysis model of physical
      activity in a sample of active US college students, Dissertation, West Virginia Univer-
      sity, Morgantown, West Virginia, 2019. URL: https://researchrepository.wvu.edu/cgi/
      viewcontent.cgi?article=8542&context=etd.
[178] M. D. A. Rounsevell, D. T. Robinson, D. Murray-Rust, From actors to agents in socio-
      ecological systems models, Philosophical Transactions of the Royal Society B: Biological
      Sciences 367 (2012). URL: https://www.ncbi.nlm.nih.gov/labs/pmc/articles/PMC3223809/.
      doi:10.1098/rstb.2011.0187.
[179] L. Antill, Selection of a Research Method, Elsevier Science & Technology, Oxford, UK,
      1985, pp. 191–204. URL: https://ifipwg82.org/sites/ifipwg82.org/files/Antill.pdf.
[180] B. Kaplan, J. A. Maxwell, Qualitative Research Methods for Evaluating Computer Infor-
      mation Systems, in: J. G. Anderson, C. E. Aydin (Eds.), Evaluating the Organizational
      Impact of Healthcare Information Systems, Health Informatics Series, 2 ed., Springer US,
      2005, pp. 30–55. URL: http://eknygos.lsmuni.lt/springer/147/30-55.pdf.
[181] E. A. Chatman, A theory of life in the round, Journal of the American Society for Informa-
      tion Science 50 (1999) 207–217. doi:10.1002/(SICI)1097-4571(1999)50:3<207::AID-ASI3>3.0.CO;2-8.
[182] G. Burnett, P. T. Jaeger, Small Worlds, Lifeworlds, and Information: The Ramifications
      of the Information Behaviour of Social Groups in Public Policy and the Public Sphere,
      Information Research: An International Electronic Journal 13 (2008).
[183] P. T. Jaeger, G. Burnett, Information Worlds: Behavior, Technology, and Social Context in
      the Age of the Internet, Routledge, 2014.
[184] M. Lee, B. S. Butler, How are information deserts created? A theory of local information
      landscapes, Journal of the Association for Information Science and Technology 70 (2019)
      101–116. doi:10.1002/asi.24114.
[185] D. H. Sonnenwald, Information Horizons, in: K. E. Fisher, S. Erdelez, L. McKechnie
      (Eds.), Theories of Information Behavior, American Society for Information Science and
      Technology, 2005, pp. 191–197.
[186] K. Williamson, Discovered by chance: The role of incidental information acquisition
      in an ecological model of information use, Library & Information Science Research 20
      (1998) 23–40. doi:10.1016/S0740-8188(98)90004-4.
[187] N. K. Agarwal, Towards a Definition of Serendipity in Information Behaviour, Information
      Research: An International Electronic Journal 20 (2015).