<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Beyond Diagnosis Approach: Fostering Trust in AI's Supportive Role in Healthcare</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Mohammed Ali Tahtali</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Corné Dirne</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>Trust in AI, Healthcare Integration and Augmentative AI Applications 1</string-name>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <addr-line>Eindhoven, Noord-Brabant</addr-line>
          ,
          <country country="NL">The Netherlands</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Industrial Engineering &amp; Innovation Sciences, Eindhoven University of Technology</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Department of Industrial Engineering &amp; Management, Fontys University of Applied Sciences</institution>
        </aff>
      </contrib-group>
      <abstract>
        <p>This study examines the potential of Artificial Intelligence (AI) to gain trust and improve outcomes in healthcare by serving as a supportive tool rather than replacing human judgment. Despite AI's advancements in diagnosis and treatment, scepticism among individuals persists due to concerns over AI's lack of empathy and the importance of human expertise. By focusing on AI's role as an augmentative rather than substitutive technology, we aim to identify strategies for integrating AI in a manner that complements human skills, thereby enhancing acceptance and trust in AI and improving care for patients.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>As Artificial Intelligence (AI) continues to advance, its role has significantly shifted from
performing routine, repetitive tasks to making complex decisions that traditionally
required expert human judgment. This transformation is particularly evident in critical
sectors such as government and healthcare, where AI is now instrumental in assisting with
decisions once solely made by judges, doctors and other trained professionals. AI's expansion
into these sectors now parallels human expertise and, in some instances, even surpasses it
in efficiency and accuracy. Applications in healthcare include, for example, AI for
image recognition in radiology [1], the use of AI for complex diagnoses [2] and personalised
treatment plans [3]. Despite AI's demonstrated accuracy in these critical tasks, there are
considerable challenges in healthcare, particularly scepticism among people about relying
on AI's decisions.</p>
      <p>
        Delving deeper into this scepticism, the trust of medical professionals in AI, as explored
in the literature, reveals a multifaceted landscape. Longoni et al. [4] emphasise the concerns
over AI's ability to navigate complex, individual patient scenarios and maintain the human
elements of empathy and understanding in healthcare. Complementing this, research from
Juravle et al. [
        <xref ref-type="bibr" rid="ref6">5</xref>
        ], Promberger and Baron [6] and Seitz et al. [7] highlight a clear preference
for human decision-making over AI among both patients and healthcare providers, rooted
in human expertise and emotional connection - areas where AI currently falls short.
Furthermore, studies by Tucci et al. [8], Schwartz et al. [9], Naiseh et al. [10] and Alam and Mueller [
        <xref ref-type="bibr" rid="ref12">11</xref>
        ] emphasise that making AI decisions understandable is vital for fostering trust
among medical professionals and patients.
      </p>
      <p>However, it is essential to note that the majority of tasks AI has been assigned in these
studies focus on diagnosis, often sparking debate regarding its role and effectiveness.
Despite these challenges, research by Verma et al. [12], Tahtali et al. [13] and Longoni et al.
[4] suggests a potential area for AI to gain trust among medical professionals and possibly
among patients as well, when it supports rather than replaces the final decision-making
process. For instance, by automating triage and operational tasks, such as predicting bed
availability and summarising patient consultations, AI significantly expands healthcare
capabilities [13]. This automation enhances patient care and underlines AI's transformative
impact on the sector. Highlighting AI's ability to extend beyond its current diagnostic roles,
the findings support a more widely accepted model where AI serves as a supportive tool
rather than a replacement. Such a paradigm shift could improve trust in AI applications
within healthcare.</p>
      <p>Our study aims to explore and identify more effective and acceptable ways of integrating
AI into a broader range of healthcare tasks. By focusing on AI applications that complement
and augment human expertise rather than replacing it, we seek to foster wider acceptance
and trust in AI among patients. Consequently, the central research question we propose is:
How can AI applications in healthcare, extending beyond direct diagnosis, gain the trust of
patients and contribute to enhancing overall patient care?</p>
    </sec>
    <sec id="sec-2">
      <title>2. Research design</title>
      <p>Our (ongoing) study explores patients' trust in an AI smart camera, currently operated
as a supportive tool for monitoring individuals post-intensive care. The traditional method
involves daily manual patient assessments, such as vital signs. Yet, introducing this AI tool
aims to augment the existing process, potentially transforming it into a more continuous,
AI-driven monitoring system. However, patients' acceptance and trust in AI's (video based
monitoring) capabilities remain uncertain. Through a qualitative research approach, this
paper aims to uncover the nuances of acceptability surrounding the AI support tool,
emphasizing the camera's role as a complement to, rather than a replacement for, human
expertise in healthcare practices.</p>
      <p>Employing semi-structured interviews, we engage patients to determine their
perspectives on and receptiveness to AI technology. We focus on various topics, such as the
camera's usability, trust in the signals it provides and the nurse-patient relationship. This
methodological choice facilitates a deeper understanding of their expectations and
apprehensions, thereby informing the development of AI technologies that foster trust and
meet the healthcare sector's needs. Our research gauges patients' readiness and envisions
AI's integration into healthcare workflows in ways that align with professional standards
and patient comfort, seeking to illuminate paths that maximise AI's benefits while
preserving the irreplaceable value of human clinical insight.
    </sec>
    <sec id="sec-3">
      <title>Acknowledgements</title>
      <p>The authors extend their heartfelt gratitude to all participants willing to share their
insights and experiences, thereby making this research possible. Special thanks go to
Noortje Somers for her assistance in conducting interviews, performing analyses and
participating in discussions.</p>
      <p>This work is part of the project "Trust in Algorithms" (project number 023.017.109),
under the auspices of the Teacher Promotion Grant program, generously funded by the
Dutch Research Council (NWO).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>M. Huisman et al., “<article-title>An international survey on AI in radiology in 1,041 radiologists and radiology residents part 1: fear of replacement, knowledge, and attitude</article-title>,” <source>Eur. Radiol.</source>, vol. <volume>31</volume>, no. <issue>9</issue>, pp. <fpage>7058</fpage>-<lpage>7066</lpage>, <year>2021</year>, doi: 10.1007/s00330-021-07781-5.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>A. V. Eriksen, S. Möller, and J. Ryg, “<article-title>Use of GPT-4 to Diagnose Complex Clinical Cases</article-title>,” <source>NEJM AI</source>, vol. <volume>1</volume>, no. <issue>1</issue>, <year>2023</year>, doi: 10.1056/aip2300031.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation><source>Cancer Manag. Res.</source>, vol. <volume>11</volume>, Dove Medical Press Ltd, pp. <fpage>10851</fpage>-<lpage>10858</lpage>, <year>2019</year>, doi: 10.2147/CMAR.S232473.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>C. Longoni, A. Bonezzi, and C. K. Morewedge, “<article-title>Resistance to Medical Artificial Intelligence</article-title>,” <source>J. Consum. Res.</source>, vol. <volume>46</volume>, no. <issue>4</issue>, pp. <fpage>629</fpage>-<lpage>650</lpage>, Dec. <year>2019</year>, doi: 10.1093/jcr/ucz013.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>G. Juravle, A. Boudouraki, M. Terziyska, and C. Rezlescu, “<article-title>Trust in artificial intelligence for medical diagnoses</article-title>,” in <source>Progress in Brain Research</source>, vol. <volume>253</volume>, Elsevier B.V., <year>2020</year>, pp. <fpage>263</fpage>-<lpage>282</lpage>, doi: 10.1016/bs.pbr.2020.06.006.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>Promberger and Baron, <source>J. Behav. Decis. Mak.</source>, vol. <volume>19</volume>, no. <issue>5</issue>, pp. <fpage>455</fpage>-<lpage>468</lpage>, <year>2006</year>, doi: 10.1002/bdm.542.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>L. Seitz, S. Bekmeier-Feuerhahn, and K. Gohil, “<article-title>Can we trust a chatbot like a physician? A qualitative study on understanding the emergence of trust toward diagnostic chatbots</article-title>,” <source>Int. J. Hum. Comput. Stud.</source>, vol. <volume>165</volume>, Sep. <year>2022</year>, doi: 10.1016/j.ijhcs.2022.102848.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>Tucci et al., <source>J. Med. Artif. Intell.</source>, vol. <volume>5</volume>, <year>2022</year>, doi: 10.21037/jmai-21-25.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>J. M. Schwartz et al., “<article-title>Factors Influencing Clinician Trust in Predictive Clinical Decision Support Systems for In-Hospital Deterioration: Qualitative Descriptive Study</article-title>,” <source>JMIR Hum. Factors</source>, vol. <volume>9</volume>, no. <issue>2</issue>, Apr. <year>2022</year>, doi: 10.2196/33960.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>Naiseh et al., <source>Int. J. Hum. Comput. Stud.</source>, vol. <volume>169</volume>, Jan. <year>2023</year>, doi: 10.1016/j.ijhcs.2022.102941.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>L. Alam and S. Mueller, “<article-title>Examining the effect of explanation on satisfaction and trust in AI diagnostic systems</article-title>,” <source>BMC Med. Inform. Decis. Mak.</source>, vol. <volume>21</volume>, no. <issue>1</issue>, Dec. <year>2021</year>, doi: 10.1186/s12911-021-01542-6.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>H. Verma et al., “<article-title>Rethinking the Role of AI with Physicians in Oncology: Revealing Perspectives from Clinical and Research Workflows</article-title>,” in <source>Conference on Human Factors in Computing Systems - Proceedings</source>, Apr. <year>2023</year>, doi: 10.1145/3544548.3581506.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>M. A. Tahtali, C. Snijders, C. Dirne, and P. Le Blanc, “<article-title>Prioritising Trust: Podiatrists' Preference for AI in Supportive Over Diagnostic Roles in Healthcare: A Qualitative Analysis</article-title>,” JMIR Preprints, 30/03/2024:59010, doi: 10.2196/preprints.59010.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>