<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Persuasive Technology for Suicide Prevention: A Virtual Human mHealth Application</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Sharon Mozgai</string-name>
          <email>mozgai@ict.usc.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Albert “Skipp” Rizzo</string-name>
          <email>rizzo@ict.usc.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Arno Hartholt</string-name>
          <email>hartholt@ict.usc.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>USC Institute for Creative Technologies</institution>
          ,
          <addr-line>Los Angeles, CA</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>We demonstrate Battle Buddy, an mHealth application designed to support access to physical and mental wellness content as well as safety planning for U.S. military veterans. This virtual human interface will collect multimodal data through passive sensors native to popular wearables (e.g., Apple Watch) and deliver adaptive multimedia content specifically tailored to the user in the interdependent domains of physical, cognitive, and emotional health. Battle Buddy can deliver health interventions matched to the individual user via novel adaptive logic-based algorithms while employing various behavior change techniques (e.g., goal-setting, barrier identification, rewards, and modeling). All interactions were specifically designed to engage and motivate by employing the persuasive strategies of (1) personalization, (2) self-monitoring, (3) tunneling, (4) suggestion, and (5) expertise.</p>
      </abstract>
      <kwd-group>
        <kwd>Virtual Human</kwd>
        <kwd>Persuasive Technology</kwd>
        <kwd>mHealth</kwd>
        <kwd>Suicide</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. System Overview</title>
      <p>This mHealth application primarily targets the iPhone and is an extension of a previous
virtual human mHealth application [7,8,10,11]. Given the initial focus on the iPhone and the tight
iOS integration, the primary wearable target device is the Apple Watch, which offers a rich
multimodal set of sensors. Data will be read directly from the iOS HealthKit API and processed
by our custom application software; this allows us to support any hardware device able to
write data to HealthKit, including Fitbit and Garmin devices. Battle Buddy is a Unity application
developed using a custom version of the Virtual Human Toolkit (VHToolkit) [3,4,6] and RIDE [5], a
rapid-prototyping middleware for AI-driven simulations. The VHToolkit provides automatic
audio-visual sensing, speech recognition, natural language processing, nonverbal behavior
generation and realization, text-to-speech generation, and rendering; these features will be
added to Battle Buddy through user-centered design processes. The collected data drives the
decision-making algorithms and the intervention manager.</p>
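      <p>To make this data path concrete, the following minimal sketch shows how a client-side
reader might pull a daily step count from HealthKit. It is illustrative only: the class and
method names of our own code are hypothetical, while the HealthKit calls are standard.</p>
      <preformat>
import Foundation
import HealthKit

// Illustrative client-side reader. Any device that writes to HealthKit
// (Apple Watch, Fitbit, Garmin) feeds the same store, so one query path
// covers all supported wearables.
final class WearableDataReader {
    private let healthStore = HKHealthStore()

    // Ask the user for read access to the quantities analyzed on-device.
    func requestAccess(completion: @escaping (Bool) -> Void) {
        guard let steps = HKQuantityType.quantityType(forIdentifier: .stepCount),
              let heartRate = HKQuantityType.quantityType(forIdentifier: .heartRate) else {
            completion(false)
            return
        }
        healthStore.requestAuthorization(toShare: nil, read: [steps, heartRate]) { granted, _ in
            completion(granted)
        }
    }

    // Sum today's steps; the result feeds real-time feedback and the
    // decision-making algorithms.
    func todaysStepCount(completion: @escaping (Double) -> Void) {
        guard let steps = HKQuantityType.quantityType(forIdentifier: .stepCount) else {
            completion(0)
            return
        }
        let startOfDay = Calendar.current.startOfDay(for: Date())
        let predicate = HKQuery.predicateForSamples(withStart: startOfDay,
                                                    end: Date(),
                                                    options: .strictStartDate)
        let query = HKStatisticsQuery(quantityType: steps,
                                      quantitySamplePredicate: predicate,
                                      options: .cumulativeSum) { _, stats, _ in
            let total = stats?.sumQuantity()?.doubleValue(for: .count()) ?? 0
            completion(total)
        }
        healthStore.execute(query)
    }
}
      </preformat>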
      <p>A subset of the data will be analyzed and evaluated on the client devices to provide users
with real-time, actionable feedback (e.g., daily progress towards personal fitness goals,
questionnaire results); the remainder of the data (e.g., UI interaction data, recorded user
voice audio) will be collected and post-processed on the server. A user-centered iterative design
process was used in the development of this application, leveraging findings of empirical reviews
of effective behavior change techniques employed in mobile health apps [6]. In particular, the
application employs the persuasive strategies of (1) personalization (e.g., creating a customized
safety plan), (2) self-monitoring (e.g., mood tracking), (3) tunneling (e.g., a guided breathing
exercise), (4) suggestion (e.g., app-initiated pushes/prompts), and (5) expertise (e.g., a quick
connection to a suicide hotline).</p>
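      <p>The adaptive logic itself is rule-based. The hypothetical sketch below (type names,
rules, and thresholds are illustrative assumptions, not the shipped implementation) shows how
passively sensed signals and self-reports might be mapped to content tagged with the five
persuasive strategies above.</p>
      <preformat>
import Foundation

// Illustrative rule-based intervention manager; names and thresholds
// are hypothetical.
enum PersuasiveStrategy {
    case personalization, selfMonitoring, tunneling, suggestion, expertise
}

struct UserSnapshot {
    let todaysSteps: Double     // from HealthKit
    let stepGoal: Double        // user-set goal
    let latestMoodScore: Int    // 1 (low) ... 5 (high), from mood tracking
    let reportedCrisis: Bool    // user indicated acute distress
}

struct Intervention {
    let title: String
    let strategy: PersuasiveStrategy
}

struct InterventionManager {
    // Evaluate simple rules in priority order and return matching content.
    func select(for snapshot: UserSnapshot) -> [Intervention] {
        var plan: [Intervention] = []

        // Expertise and personalization: acute distress routes straight to
        // professional help and the user's customized safety plan.
        if snapshot.reportedCrisis {
            plan.append(Intervention(title: "Connect to the crisis hotline",
                                     strategy: .expertise))
            plan.append(Intervention(title: "Open your customized safety plan",
                                     strategy: .personalization))
            return plan
        }

        // Tunneling: low self-reported mood triggers a guided breathing exercise.
        if (1...2).contains(snapshot.latestMoodScore) {
            plan.append(Intervention(title: "Guided breathing exercise",
                                     strategy: .tunneling))
        }

        // Self-monitoring and suggestion: nudge toward the personal step goal.
        if snapshot.stepGoal > snapshot.todaysSteps {
            let remaining = Int(snapshot.stepGoal - snapshot.todaysSteps)
            plan.append(Intervention(title: "You are \(remaining) steps from today's goal",
                                     strategy: .selfMonitoring))
            plan.append(Intervention(title: "Try a short walk before the evening check-in",
                                     strategy: .suggestion))
        }
        return plan
    }
}
      </preformat>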
    </sec>
    <sec id="sec-3">
      <title>Acknowledgements</title>
      <p>This work was sponsored by the US Army under contract W911NF-14-D-0005 and the VA Mission
Daybreak Challenge, in collaboration with SoldierStrong. The content of the information does not
necessarily reflect the position or the policy of the Government, and no official endorsement
should be inferred.</p>
      <p>Mozgai, S., Hartholt, A., Rizzo, A.: The passive sensing agent: A multimodal adaptive mhealth
application. In: 2020 IEEE International Conference on Pervasive Computing and
Communications Workshops (PerCom Workshops). pp. 1–3. IEEE (2020)
Mozgai, S., Hartholt, A., Rizzo, A.S.: An adaptive agent-based interface for personalized health
interventions. In: Proceedings of the 25th International Conference on Intelligent User
Interfaces Companion. pp. 118–119 (2020)
Mozgai, S., Lucas, G., Gratch, J.: To tell the truth: Virtual agents and morningmorality. In:
Intelligent Virtual Agents: 17th International Conference, IVA 2017, Stockholm, Sweden,
August 27-30, 2017, Proceedings 17. pp. 283–286. Springer (2017)
Mozgai, S.A., Femminella, B., Hartholt, A., Rizzo, A.: User-centered design modelfor mobile
health (mhealth) applications: A military case study in rapid assessment process (rap). In:
Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems. pp.
1–8 (2021)
Rizzo, A. Hartholt, A., Mozgai, S.: From combat to covid-19–managing the impactof trauma
using virtual reality. Journal of Technology in Human Services 39(3), 314–347 (2021)
Tam-Seto, L., Wood, V.M., Linden, B., Stuart, H.: A scoping review of mentalhealth mobile apps
for use by the military community. Mhealth 4 (2018)</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <article-title>Military and veteran suicide prevention</article-title>
          (
          <year>Oct 2022</year>
          ), https://afsp.org/military-and-veteran-suicide-prevention
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Bickmore</surname>
            ,
            <given-names>T.W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Utami</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Matsuyama</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Paasche-Orlow</surname>
            ,
            <given-names>M.K.</given-names>
          </string-name>
          :
          <article-title>Improving access to online health information with conversational agents: a randomized controlled experiment</article-title>
          .
          <source>Journal of medical Internet research</source>
          <volume>18</volume>
          (
          <issue>1</issue>
          ),
          <year>e1</year>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Hartholt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fast</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reilly</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Whitcup</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liewer</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mozgai</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Ubiquitous virtual humans: A multi-platform framework for embodied ai agents in xr</article-title>
          .
          <source>In: 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)</source>
          . pp.
          <fpage>308</fpage>
          -
          <lpage>3084</lpage>
          . IEEE (
          <year>2019</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Hartholt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fast</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reilly</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Whitcup</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liewer</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mozgai</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Multiplatform expansion of the virtual human toolkit: Ubiquitous conversational agents</article-title>
          .
          <source>International Journal of Semantic Computing</source>
          <volume>14</volume>
          (
          <issue>3</issue>
          ) (
          <year>2020</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Hartholt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>McCullough</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fast</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leeds</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mozgai</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Aris</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ustun</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gordon</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>McGroarty</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Rapid prototyping for simulation and training with the rapid integration &amp; development environment (ride)</article-title>
          .
          <source>In: Proceedings of the 2021 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)</source>
          (
          <year>2021</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Hartholt</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Traum</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Marsella</surname>
            ,
            <given-names>S.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shapiro</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stratou</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leuski</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Morency</surname>
            ,
            <given-names>L.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gratch</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>All together now, introducing the virtual human toolkit</article-title>
          .
          <source>In: International Workshop on Intelligent Virtual Agents</source>
          . pp.
          <fpage>368</fpage>
          -
          <lpage>381</lpage>
          . Springer (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name><surname>Mozgai</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Hartholt</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Rizzo</surname>, <given-names>A.</given-names></string-name>: <article-title>The passive sensing agent: A multimodal adaptive mhealth application</article-title>. <source>In: 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)</source>. pp. <fpage>1</fpage>-<lpage>3</lpage>. IEEE (<year>2020</year>)
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name><surname>Mozgai</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Hartholt</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Rizzo</surname>, <given-names>A.S.</given-names></string-name>: <article-title>An adaptive agent-based interface for personalized health interventions</article-title>. <source>In: Proceedings of the 25th International Conference on Intelligent User Interfaces Companion</source>. pp. <fpage>118</fpage>-<lpage>119</lpage> (<year>2020</year>)
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name><surname>Mozgai</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Lucas</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Gratch</surname>, <given-names>J.</given-names></string-name>: <article-title>To tell the truth: Virtual agents and morning morality</article-title>. <source>In: Intelligent Virtual Agents: 17th International Conference, IVA 2017, Stockholm, Sweden, August 27-30, 2017, Proceedings 17</source>. pp. <fpage>283</fpage>-<lpage>286</lpage>. Springer (<year>2017</year>)
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name><surname>Mozgai</surname>, <given-names>S.A.</given-names></string-name>, <string-name><surname>Femminella</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Hartholt</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Rizzo</surname>, <given-names>A.</given-names></string-name>: <article-title>User-centered design model for mobile health (mhealth) applications: A military case study in rapid assessment process (rap)</article-title>. <source>In: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems</source>. pp. <fpage>1</fpage>-<lpage>8</lpage> (<year>2021</year>)
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name><surname>Rizzo</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Hartholt</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Mozgai</surname>, <given-names>S.</given-names></string-name>: <article-title>From combat to covid-19: Managing the impact of trauma using virtual reality</article-title>. <source>Journal of Technology in Human Services</source> <volume>39</volume>(<issue>3</issue>), <fpage>314</fpage>-<lpage>347</lpage> (<year>2021</year>)
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name><surname>Tam-Seto</surname>, <given-names>L.</given-names></string-name>, <string-name><surname>Wood</surname>, <given-names>V.M.</given-names></string-name>, <string-name><surname>Linden</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Stuart</surname>, <given-names>H.</given-names></string-name>: <article-title>A scoping review of mental health mobile apps for use by the military community</article-title>. <source>Mhealth</source> <volume>4</volume> (<year>2018</year>)
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>