<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>International Workshop on Behavior Change Support Systems</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Assessing Trustworthiness in Persuasive Prototypes</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Parinda Rahman</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ifeoma Adaji</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of British Columbia</institution>
          ,
          <addr-line>Kelowna</addr-line>
          ,
          <country country="CA">Canada</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <volume>5</volume>
      <issue>2025</issue>
      <fpage>0000</fpage>
      <lpage>0003</lpage>
      <abstract>
        <p>Trust is a critical factor in the adoption and effectiveness of persuasive technologies, yet many existing systems lack explicit trust-enhancing design features. This study explores how transparency, autonomy, consent, and data privacy influence user trust in persuasive technologies. A focus group of 14 UX/UI experts provided design insights for the ethical design of persuasive technologies, which were implemented into two prototypes: one integrating ethical design principles and the other serving as a control. A user study with 449 participants evaluated trust perceptions using the Human-Computer Trust Scale. Results indicate that transparency significantly enhances trust, while inadequate privacy controls contribute to skepticism. The findings offer actionable insights for designing ethical and trustworthy persuasive systems, emphasizing the need for clear communication, user control, and informed consent mechanisms.</p>
      </abstract>
      <kwd-group>
        <kwd>persuasive design</kwd>
        <kwd>user interface</kwd>
        <kwd>trust</kwd>
        <kwd>ethics</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        With the increasing prevalence of personalized systems, persuasive technologies have been widely
adopted across various domains, including health, education, and e-commerce, to influence user
behavior and encourage positive behavioral change [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. These technologies leverage user data to tailor
interventions, thereby enhancing their effectiveness. However, as persuasive systems become more
integrated into daily life, concerns regarding their ethical implications and trustworthiness have emerged,
particularly in relation to data collection, processing, and usage for persuasive purposes [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. A
significant proportion of users remain skeptical due to the lack of transparency in data handling, limited
user control, and uncertainties surrounding privacy and security. While persuasive technologies are
designed to guide behavior in beneficial ways, their ethical considerations must be prioritized to ensure
they are perceived as trustworthy rather than manipulative.
      </p>
      <p>
        Trust plays a crucial role in the adoption and effectiveness of persuasive systems. If users perceive
a system’s intentions or data practices as untrustworthy, they may disengage or reject it entirely [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
Therefore, for persuasive technologies to be both effective and ethically sound, they must incorporate key
design principles such as transparency, autonomy, informed consent, data privacy, and security—factors
that directly shape user perceptions and interactions [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Prior research by Rahman et al. [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] has
identified autonomy, transparency, consent, and data privacy as critical ethical concerns that contribute
to mistrust in technology. Building on this work, the present study conducted a focus group with 14
user experience (UX) researchers and designers (average experience: 3 years) to explore strategies for
enhancing trust through improved ethical design in transparency, autonomy, consent, data privacy, and
security. This study aims to integrate expert-driven design insights into prototype development and
evaluate the effectiveness of these ethical considerations in fostering user trust.
      </p>
      <p>Despite the growing recognition of trust-related concerns in persuasive technologies, many existing
systems lack explicit trust-enhancing design features, making it challenging for users to assess their
reliability. To address these challenges, this research aims to design and evaluate persuasive prototypes
that incorporate varying levels of transparency, autonomy, consent, data privacy, and security. By
systematically assessing user trust perceptions across different prototypes, this study seeks to establish
best practices for the ethical design of persuasive technologies. The findings will contribute to the
development of systems that are not only effective in influencing behavior but also respectful of user
rights and preferences, ultimately fostering greater trust and acceptance.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        The intersection of trust and persuasive technologies has been a focal area of research, emphasizing how
these systems can ethically and effectively influence user behavior. Rahman and Adaji [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] conducted a
systematic literature review identifying transparency, user autonomy, and informed consent as critical
factors in fostering trust. They highlight risks such as manipulation and privacy invasion, stressing
the need for ethical design practices. Ahmad et al. [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] further examined how cognitive and affective
trust influence decision-making, finding that users generally exhibit low trust in persuasive systems,
necessitating strategies to address this deficit.
      </p>
      <p>
        Transparency plays a foundational role in trust perceptions of persuasive technologies. Users require
clear information on system intent, algorithms, and data practices [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. Without sufficient disclosure,
persuasive technologies risk being perceived as deceptive, leading to skepticism and disengagement [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
McKnight et al. [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] emphasized transparency, system reliability, and user control as key elements shaping
trust. Similarly, Zieglmeier et al. [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] proposed "trustworthy transparency by design," advocating for
explicit data usage policies to enhance user confidence.
      </p>
      <p>
        Autonomy is another essential factor in ethical persuasive design. Users must feel in control rather
than coerced [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Ethical persuasive systems should provide adjustable settings, opt-out options, and
informed decision-making opportunities [12]. However, many interfaces lack autonomy-enhancing
mechanisms, raising concerns about unintentional coercion. Research on how varying levels of user
control impact trust remains limited.
      </p>
      <p>Informed consent, data privacy, and security are also critical for trust in persuasive systems. Users
must be fully aware of and agree to persuasive strategies [13], yet many consent mechanisms lack
context-awareness and adaptability. Privacy concerns persist as many persuasive applications collect
extensive user data without clear privacy settings or security assurances [14]. Although
privacy-by-design principles exist, their effectiveness in UI design remains underexplored.</p>
      <p>While existing studies provide valuable insights, systematic evaluations of trust in persuasive
prototypes remain limited. Most research conceptualizes trust broadly rather than assessing how specific
design features—such as transparency indicators, privacy controls, and consent mechanisms—impact
perceived trustworthiness. Comparative evaluations of trust-focused persuasive prototypes are scarce,
leaving gaps in understanding how users perceive and interact with trust-enhancing design elements [15].
Addressing these gaps is essential for ensuring persuasive systems are not only effective but also ethical
and user-centered.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Methodology</title>
      <sec id="sec-3-1">
        <title>3.1. Design Details</title>
        <p>
          The prototypes were developed using insights from a previous focus group study involving 14 UX/UI
experts. The focus group research questions aimed to identify key design considerations for developing
ethical and trustworthy persuasive systems. The questions primarily focused on Transparency, Consent,
Autonomy, and Data Privacy and Security. Insights from the focus group and expert evaluations were
implemented in User Interface A (UI A), which integrated ethical design principles. In contrast, User
Interface B (UI B) served as a control, omitting these design considerations to assess their impact.
The prototypes were developed using Figma and focused on three domains: health, shopping, and
education. Shopping applications were selected due to their widespread popularity among smartphone
users [16]. Meanwhile, user trust was identified as a critical factor in the effectiveness of both health
and fitness applications, as well as educational applications [17, 18]. Furthermore, another study [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]
identifies unethical design practices as especially prevalent in these domains. Findings
from the prior focus group study further reinforced the importance of transparency, user autonomy,
and privacy in these domains, shaping the final prototype designs.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Study Design</title>
        <p>For the evaluation of the prototypes, participants were recruited via Amazon MTurk. The study received
approval from the Behavioural Research Ethics Board (H24-01325). Recruitment was restricted to MTurk
workers who had previously used persuasive technology, to help ensure the validity of the responses.
A Qualtrics survey was distributed, where participants
assessed the prototypes using the Human-Computer Trust Scale, a validated measure for evaluating
trust in technology [19]. The scale uses 7-point Likert items; for each ethical factor, such as
transparency, the average of all corresponding statements was computed to characterize the overall trust score.
Additionally, participants were asked to indicate which prototype they preferred and perceived as
more trustworthy and ethical. Demographic data, including age, gender, and frequency of persuasive
technology use, was also collected. Prior to participation, informed consent was obtained from all
individuals involved in the study, and participant anonymity was ensured.</p>
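        <p>To make this scoring step concrete, the short Python sketch below averages 7-point Likert items into per-factor trust scores. It is only an illustration: the column names and item groupings are hypothetical placeholders, not the actual Human-Computer Trust Scale items.</p>
        <preformat>
# Hedged sketch: average 7-point Likert items into per-factor trust
# scores, mirroring the scoring described in Section 3.2. Column names
# and item groupings are invented for illustration only.
import pandas as pd

responses = pd.DataFrame({
    "transparency_1": [6, 5, 7],  # hypothetical item ratings (1-7)
    "transparency_2": [5, 5, 6],
    "consent_1":      [4, 6, 5],
    "consent_2":      [5, 7, 6],
})

factors = {
    "Transparency": ["transparency_1", "transparency_2"],
    "Consent":      ["consent_1", "consent_2"],
}

# Each factor's overall trust score is the mean of its Likert items.
scores = {name: responses[items].mean(axis=1) for name, items in factors.items()}
print(pd.DataFrame(scores))
        </preformat>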
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Participants</title>
        <p>The study included a total of 449 participants. The majority were male (62.6%), with females making up
37.4% of the sample. Most participants fell within the 25-34 age range (63.5%), followed by the 35-44
age group (26.5%). Only a small fraction (5.9%) were aged 45 or above. Regarding technology expertise,
most participants were at an intermediate level (58.6%), while 33.2% identified as experts and 8.2% as
novices. In terms of persuasive technology usage, more than half (56.6%) used such technologies daily,
39.9% used them weekly, and a small percentage (3.6%) reported bi-weekly or less frequent usage.</p>
      </sec>
      <sec id="sec-3-4">
        <title>3.4. Data Analysis</title>
        <p>
          The data were analyzed using SPSS due to its robust statistical capabilities for handling survey data,
facilitating both descriptive and inferential analyses [20]. SPSS enables efficient data management,
allowing for the identification of patterns and relationships in user perceptions of trust in persuasive
technologies. The descriptive statistics focus on Transparency, Consent, Autonomy, and Data Privacy
&amp; Security because these factors have been consistently identified in prior research as key
determinants of trust in persuasive systems [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ], [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ].
        </p>
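        <p>As a minimal illustration of this descriptive analysis, the sketch below computes means and standard deviations of trust scores grouped by prototype and ethical factor. The data frame layout and values are assumed placeholders, not the study data, and pandas stands in here for the SPSS workflow.</p>
        <preformat>
# Hedged sketch of the descriptive analysis: mean and standard deviation
# of trust scores per prototype and factor. All values are invented.
import pandas as pd

data = pd.DataFrame({
    "prototype": ["UI A"] * 4 + ["UI B"] * 4,
    "factor":    ["Transparency", "Consent"] * 4,
    "score":     [6.1, 5.6, 5.8, 5.3, 4.9, 4.6, 5.2, 4.4],
})

summary = data.groupby(["prototype", "factor"])["score"].agg(["mean", "std"])
print(summary)
        </preformat>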
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Prototypes</title>
      <sec id="sec-4-1">
        <title>4.1. Transparency Design</title>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Consent Design</title>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Autonomy Design</title>
        <p>Figure 3 presents two user interfaces for subscribing to a premium membership in an e-commerce
platform, illustrating varying levels of user autonomy. UI A ensures equal choice burden by presenting
subscription options with uniform button colors and providing a clearly visible “Skip” button. The
"Close" button is also highlighted, allowing users to exit without subscribing. Additionally, UI A
transparently displays potential savings for users who opt for a yearly subscription. Conversely, UI B
prioritizes the monthly subscription by using a more visually prominent color while obscuring the yearly
subscription. Furthermore, the “Close” button is smaller and faded, making it less noticeable. Research
by Michalski et al. [25] suggests that factors such as color, icon size, and positioning significantly
influence users’ navigation and decision-making within a system.</p>
      </sec>
      <sec id="sec-4-4">
        <title>4.4. Data Privacy and Security</title>
        <p>Figure 4 illustrates two user interface designs for a persuasive health app that tracks daily water
intake, highlighting different approaches to data privacy and security. UI A emphasizes transparency
by informing users about data security protocols through an icon indicating GDPR compliance. Users
are also given the option to download and review their shared data, with additional choices to delete or
retain their records. Furthermore, UI A incorporates a gamified element, rewarding users with in-app
currency for reading more about data regulations, thereby increasing awareness. In contrast, UI B
provides only the option to save user data, without any indication of security protocols or privacy
safeguards. Users are not prompted to learn more about how their data is handled. Prior research [26]
suggests that users who are more informed about an app’s data privacy and security policies are more
likely to trust the system.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Results</title>
      <sec id="sec-5-1">
        <title>5.1. Prototype Evaluations</title>
        <p>Table 2 presents the mean scores and standard deviations for the two user interface (UI) prototypes
across four key ethical and trust-related dimensions: Transparency, Consent, Autonomy, and Data
Privacy and Security. The results indicate that UI A consistently achieved slightly higher mean ratings
than UI B across all dimensions, suggesting a small but consistent preference for UI A in terms of
user experience. However, the standard deviations indicate some variability in responses, with UI B
generally exhibiting higher standard deviation values. This suggests that user opinions about UI B were
more varied, whereas UI A’s ratings were more stable. Table 2 also shows users’ direct preference
between UI A and UI B: participants were asked which of the two interfaces they preferred. A higher
proportion of users favored UI A in all four dimensions, with Transparency showing the most notable
difference, where 66% of users preferred
UI A compared to 34% for UI B. Similarly, Consent (59.5% vs. 40.5%), Autonomy (58.6% vs. 41.4%), and
Data Privacy and Security (62.8% vs. 37.2%) all showed a preference for UI A. These results suggest that
UI A was perceived as a more user-friendly and effective interface compared to UI B.</p>
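        <p>The preference splits reported above are simple proportions over the 449 responses. As a worked example (with a hypothetical raw choice list reconstructed to match the 66%/34% Transparency split), the computation looks like this in Python:</p>
        <preformat>
# Hedged sketch: turn raw preference choices into the percentage split
# reported above. The choice list is reconstructed illustration data:
# 296 of 449 votes is approximately 66%.
from collections import Counter

choices = ["UI A"] * 296 + ["UI B"] * 153
counts = Counter(choices)
total = sum(counts.values())
for ui, n in counts.items():
    print(f"{ui}: {n} votes ({100 * n / total:.1f}%)")
        </preformat>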
        <p>Overall, the results indicate that UI A was perceived as both the more user-friendly and ethical
option, consistently receiving higher mean scores, greater user preference, and stronger perceptions of
ethicality and trustworthiness. While UI B did not outperform UI A in any dimension, the narrower
gap in Data Privacy and Security perceptions suggests that further refinement in UI A’s approach to
privacy features could be beneficial.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Discussion</title>
      <p>This study examined how incorporating ethical design features—specifically transparency, consent,
autonomy, and data privacy/security—affects user trust in persuasive prototypes. The findings highlight
that prototypes embedding these ethical considerations were consistently rated higher in trustworthiness
and ethical perception compared to those lacking such features. Among these factors, transparency
emerged as the most influential, with a significant majority of participants favoring interfaces that
provided clear explanations for system recommendations. This strong preference underscores the
critical role of transparency in fostering trust, as users tend to be more inclined to trust systems that
openly communicate their processes, decisions, and intentions.</p>
      <p>
        These results align with existing literature emphasizing transparency and user control as fundamental
components of trust in persuasive systems. For instance, Ahmad and Ali [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] found that cognitive trust
significantly influences decision-making in persuasive technologies, reinforcing the idea that ethical
design elements can measurably enhance trust and user preference. However, while transparency is
generally regarded as a trust-enhancing feature, its efects are not universally positive. Some studies
suggest that, in certain contexts, increased transparency may paradoxically reduce trust. For example,
Springer and Whittaker [27] observed that when system predictions did not align with user expectations,
greater transparency sometimes led to decreased confidence. Similarly, Eslami et al. [28] found that
algorithmic transparency could polarize users, with some appreciating the additional information while
others found it confusing, ultimately leading to a decline in trust. These findings suggest that while
transparency is valuable, its implementation must be carefully considered to avoid unintended negative
consequences.
      </p>
      <p>Beyond transparency, other factors also influence trust and acceptance. Wanner et al. [29]
demonstrated that while transparency enhances cognitive trust, it does not necessarily lead to greater
acceptance of intelligent systems. This highlights the importance of additional considerations such as
perceived usefulness, user experience, and the level of user control offered. Our study supports this
perspective, suggesting that transparency should be balanced with usability to ensure that users feel
informed without being overwhelmed. If transparency features are overly complex or burdensome, users
may disengage from the system rather than feel reassured. A user-centered approach to transparency,
tailored to the context and needs of different user groups, is therefore essential.</p>
      <p>The practical implications of these findings are significant for designers and developers of persuasive
systems. Transparency features should be presented in a clear and accessible manner, ensuring that
users understand how their data is utilized and how system decisions are made. Providing detailed but
digestible explanations can foster trust while mitigating the risks of information overload. Additionally,
granting users control over their data and implementing real-time consent mechanisms aligns with
ethical best practices and reinforces system integrity. These measures not only improve user trust
but also enhance engagement and satisfaction, leading to greater adoption of persuasive technologies.
However, designers must ensure that transparency is implemented in a way that supports, rather than
complicates, user experience.</p>
      <p>Despite these insights, this study has certain limitations. The participant pool primarily consisted of
individuals aged 25-44, which may limit the generalizability of the findings to other age groups. Moreover,
the study did not report statistical significance tests, effect sizes, or demographic differences. Future
studies should aim to include a broader demographic to explore how trust in persuasive technologies
varies across diferent populations and analyze the data further using statistical tests. Additionally,
reliance on self-reported trust measures introduces the potential for response biases. More objective
trust measurements, such as behavioral data or physiological responses, could provide deeper insights
into user perceptions.</p>
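      <p>As one concrete direction for the statistical tests suggested above, paired per-participant ratings of the two prototypes could be compared with a nonparametric test suited to ordinal Likert data. The sketch below uses SciPy’s Wilcoxon signed-rank test; the rating arrays are invented placeholders, not study results.</p>
      <preformat>
# Hedged sketch: compare paired per-participant transparency ratings for
# UI A vs. UI B with a Wilcoxon signed-rank test. Arrays are invented.
from scipy.stats import wilcoxon

ui_a = [6, 5, 7, 6, 5, 6, 7, 4]  # hypothetical ratings for UI A
ui_b = [5, 4, 6, 5, 5, 4, 6, 3]  # same participants' ratings for UI B

stat, p = wilcoxon(ui_a, ui_b)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
      </preformat>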
      <p>Further research should also adopt longitudinal designs to assess how trust in persuasive systems
evolves over time. Investigating adaptive transparency features—where explanations are adjusted based
on user preferences and system interactions—could offer a more refined approach to fostering trust.
Additionally, real-time consent mechanisms and personalization in ethical design could further enhance
user engagement and confidence in persuasive technologies.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Conclusion</title>
      <p>This study highlights the importance of ethical design principles—transparency, autonomy, consent,
and data privacy—in fostering trust in persuasive technologies. The evaluation of user interfaces
demonstrated that incorporating these features enhances user perceptions of trustworthiness and
ethicality, with transparency emerging as a key determinant. However, the findings also suggest that
poorly implemented transparency may lead to skepticism, reinforcing the need for a balanced and
user-centered approach. Future research should investigate adaptive transparency mechanisms and
real-time consent features to enhance trust-building strategies in persuasive systems.</p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the author(s) used ChatGPT-4 and Grammarly in order to:
grammar and spelling check. After using these tool(s)/service(s), the author(s) reviewed and edited the
content as needed and take(s) full responsibility for the publication’s content.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>B. J.</given-names>
            <surname>Fogg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hreha</surname>
          </string-name>
          , Persuasive technology,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>R.</given-names>
            <surname>Orji</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Moffatt</surname>
          </string-name>
          ,
          <article-title>Persuasive technology for health and wellness: State-of-the-art and emerging trends</article-title>
          ,
          <source>Health informatics journal 24</source>
          (
          <year>2018</year>
          )
          <fpage>66</fpage>
          -
          <lpage>91</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>D. H.</given-names>
            <surname>McKnight</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Choudhury</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Kacmar</surname>
          </string-name>
          ,
          <article-title>The impact of initial consumer trust on intentions to transact with a web site: a trust building model</article-title>
          ,
          <source>The journal of strategic information systems 11</source>
          (
          <year>2002</year>
          )
          <fpage>297</fpage>
          -
          <lpage>323</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>R.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Bush-Evans</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Arden-Close</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Bolat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>McAlaney</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Hodge</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Thomas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Phalp</surname>
          </string-name>
          ,
          <article-title>Transparency in persuasive technology, immersive technology, and online marketing: Facilitating users' informed decision making and practical implications</article-title>
          ,
          <source>Computers in Human Behavior</source>
          <volume>139</volume>
          (
          <year>2023</year>
          )
          <fpage>107545</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>P.</given-names>
            <surname>Rahman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Adaji</surname>
          </string-name>
          ,
          <article-title>Dark patterns in shopping, education &amp; health apps</article-title>
          ,
          <source>in: 2024 IEEE Digital Platforms and Societal Harms (DPSH)</source>
          , IEEE,
          <year>2024</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>P.</given-names>
            <surname>Rahman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Adaji</surname>
          </string-name>
          ,
          <article-title>Ethics in persuasive technologies: A systematic literature review</article-title>
          ,
          <source>in: Proceedings of the International Conference on Mobile and Ubiquitous Multimedia</source>
          ,
          <year>2024</year>
          , pp.
          <fpage>106</fpage>
          -
          <lpage>118</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>W. N. W.</given-names>
            <surname>Ahmad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. M.</given-names>
            <surname>Ali</surname>
          </string-name>
          ,
          <article-title>A study on persuasive technologies: the relationship between user emotions, trust and persuasion</article-title>
          (
          <year>2018</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Y. D.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. H.</given-names>
            <surname>Emurian</surname>
          </string-name>
          ,
          <article-title>Trust in e-commerce: consideration of interface design factors</article-title>
          ,
          <source>Journal of Electronic Commerce in Organizations (JECO) 3</source>
          (
          <year>2005</year>
          )
          <fpage>42</fpage>
          -
          <lpage>60</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>A.</given-names>
            <surname>Spahn</surname>
          </string-name>
          ,
          <article-title>And lead us (not) into persuasion…? Persuasive technology and the ethics of communication</article-title>
          ,
          <source>Science and engineering ethics 18</source>
          (
          <year>2012</year>
          )
          <fpage>633</fpage>
          -
          <lpage>650</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>V.</given-names>
            <surname>Zieglmeier</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Lehene</surname>
          </string-name>
          ,
          <article-title>Designing trustworthy user interfaces</article-title>
          ,
          <source>in: Proceedings of the 33rd Australian Conference on Human-Computer Interaction</source>
          ,
          <year>2021</year>
          , pp.
          <fpage>182</fpage>
          -
          <lpage>189</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>R. B.</given-names>
            <surname>Cialdini</surname>
          </string-name>
          , et al.,
          <source>Influence: Science and practice</source>
          , volume
          <volume>4</volume>
          , Pearson education Boston,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] J. Riegelsberger, M. A. Sasse, J. D. McCarthy, The mechanics of trust: A framework for research and design, International journal of human-computer studies 62 (2005) 381–422.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] H. Nissenbaum, A contextual approach to privacy online, Daedalus 140 (2011) 32–48.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] A. Acquisti, L. Brandimarte, G. Loewenstein, Privacy and human behavior in the age of information, Science 347 (2015) 509–514.</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] J. Hamari, J. Koivisto, T. Pakkanen, Do persuasive technologies persuade? A review of empirical studies, in: International conference on persuasive technology, Springer, 2014, pp. 118–136.</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] A. Adib, R. Orji, A systematic review of persuasive strategies in mobile e-commerce applications and their implementations, in: International conference on persuasive technology, Springer, 2021, pp. 217–230.</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] M. Honary, B. T. Bell, S. Clinch, S. E. Wild, R. McNaney, et al., Understanding the role of healthy eating and fitness mobile apps in the formation of maladaptive eating and exercise behaviors in young people, JMIR mHealth and uHealth 7 (2019) e14239.</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] C. Stockman, E. Nottingham, Dark patterns of cuteness: Popular learning app design as a risk to children’s autonomy, in: Children, Young People and Online Harms: Conceptualisations, Experiences and Responses, Springer, 2024, pp. 113–137.</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>[19] M. Madsen, S. Gregor, Measuring human-computer trust, in: 11th Australasian conference on information systems, volume 53, Citeseer, 2000, pp. 6–8.</mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>[20] A. Field, Discovering statistics using IBM SPSS statistics, Sage publications limited, 2024.</mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>[21] J. Zerilli, U. Bhatt, A. Weller, How transparency modulates trust in artificial intelligence, Patterns 3 (2022).</mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>[22] F. Cabiddu, L. Moi, G. Patriotta, D. G. Allen, Why do users trust algorithms? A review and conceptualization of initial trust and trust over time, European management journal 40 (2022) 685–706.</mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>[23] N. Richards, W. Hartzog, The pathologies of digital consent, Wash. UL Rev. 96 (2018) 1461.</mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>[24] J. Tang, U. Akram, W. Shi, Why people need privacy? The role of privacy fatigue in app users’ intention to disclose privacy: based on personality traits, Journal of Enterprise Information Management 34 (2021) 1097–1120.</mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>[25] R. Michalski, J. Grobelny, W. Karwowski, The effects of graphical interface design characteristics on human–computer interaction task efficiency, International Journal of Industrial Ergonomics 36 (2006) 959–977.</mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>[26] S. Hussain, Future of digital marketing: Innovation, privacy, and consumer trust (????).</mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>[27] A. Springer, S. Whittaker, What are you hiding? Algorithmic transparency and user perceptions, arXiv preprint arXiv:1812.03220 (2018).</mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>[28] M. Eslami, A. Rickman, K. Vaccaro, A. Aleyasen, A. Vuong, K. Karahalios, K. Hamilton, C. Sandvig, “I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in news feeds, in: Proceedings of the 33rd annual ACM conference on human factors in computing systems, 2015, pp. 153–162.</mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>[29] J. Wanner, L.-V. Herm, K. Heinrich, C. Janiesch, The effect of transparency and trust on intelligent system acceptance: Evidence from a user-based study, Electronic Markets 32 (2022) 2079–2102.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>