<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>From Ethics to Values in the Design of Mobile PINC</article-title>
      </title-group>
      <kwd-group>
        <title>Author Keywords</title>
        <kwd>Ethics</kwd>
        <kwd>values</kwd>
        <kwd>Value Sensitive Design</kwd>
        <kwd>PINC</kwd>
        <kwd>persuasive technology</kwd>
        <kwd>mobile phones</kwd>
      </kwd-group>
      <abstract>
        <p>Value Sensitive Design (VSD) is a promising method for addressing ethical issues and opportunities in the design of mobile technologies to promote behavior change. After positioning the work with respect to the PINC strategies (Persuasion, Influence, Nudge, and Coercion), I introduce the VSD method and analyze the role of values inherent to PINC strategies as well as values implicated by the means and ends of behavior change. Finally, I consider value tensions and differences in individual values.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>PINC AND PERSUASIVE TECHNOLOGY</title>
      <p>I approach PINC from the persuasive technology
community. Persuasive technology concerns the use of
technologically generated or mediated information to
change attitudes or behaviors.</p>
      <p>Copyright © 2011 for the individual papers by the papers' authors.
Copying permitted only for private and academic purposes. This volume
is published and copyrighted by the editors of PINC2011.</p>
      <p>
        The PINC framing suggests new approaches to behavior
change. Cialdini's theory of influence gives six specific
strategies that are used by people to influence each other;
some work in persuasive technology has drawn on these
strategies (e.g., [
        <xref ref-type="bibr" rid="ref10 ref15">10,15</xref>
        ]). Nudge comes from the concept of
choice architecture: that our environment structures the
choices available to us, and moreover, that there is
inevitably a default option; designers can carefully select
that default to gently nudge users toward the desired behavior. To the
best of my knowledge, this idea has not been studied
explicitly in the domain of persuasive technology, although
default options can be considered suggestions from the
computer [10, p.126]. Finally, coercion ensures a particular
behavior through threats and force [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. In his framing of the
field of persuasive technology, Fogg explicitly excluded
coercion as distinct from persuasion [10, p.15] and always
unethical [10, p.226]. Thus, there has been little study
of coercion in the persuasive technology community.
      </p>
    </sec>
    <sec id="sec-2">
      <title>VALUE SENSITIVE DESIGN</title>
      <p>
        Value Sensitive Design (VSD) is a theoretical and
methodological framework intended to help designers
account for human values in a principled and
comprehensive way throughout the design process [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ].
VSD emphasizes values of moral import and thus speaks to
ethical concerns in technology design. Key VSD features
include its comprehensive attention to stakeholders and its
tripartite methodology [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ].
      </p>
      <p>
        VSD demands attention to both direct and indirect
stakeholders: not only those who use the technology but
also those who are affected by its use. VSD also suggests
particular attention to vulnerable stakeholders. In the case
of mobile persuasion, this may include teens [
        <xref ref-type="bibr" rid="ref16 ref5">5,16</xref>
        ], U.S.
Latinos and blacks [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ], and Africans [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ] who rely heavily
on their mobile phones for communication and Web access,
and may not have broadband Internet access as an
alternative. Such groups should be neither neglected nor
abused in the design of mobile PINC technologies.
      </p>
      <p>
        VSD's methodology incorporates technical, empirical, and
conceptual investigations. Technical investigations concern
how system features support or undermine particular
values. Empirical investigations address stakeholder
conceptions of values and the human response to the
artifact. For example, an empirical study might present
participants with scenarios designed to push the boundaries
of certain values in the design of persuasive technology,
similar to Page and Kray's recent study of ethical responses
to persuasive technologies [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ]. Finally, conceptual
investigations explore the values at hand and the tensions
between them. The remainder of this paper comprises such
a conceptual investigation.
      </p>
      <p>
        Elsewhere, I have argued that VSD is well-suited to address
ethical concerns in persuasive technology [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. The VSD
methodology draws attention to stakeholder values and
value tensions throughout the design process, so that
barriers can be addressed early [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Here, I extend the brief
value analysis in that earlier work by considering coercion
in addition to persuasion, as well as mobile technology.
      </p>
    </sec>
    <sec id="sec-3">
      <title>VALUE ANALYSIS</title>
      <p>
        I consider three classes of values related to mobile PINC
technologies: values necessarily implicated by the PINC
approach, values implicated by particular methods of
promoting behavior change, and values implicated by the
desired ends. I also consider value tensions and how
differences in values may nonetheless lead to the same
behavior. I will draw particularly upon my earlier analysis
[
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] and Berdichevsky and Neuenschwander's ethical
principles for persuasive technology [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
    </sec>
    <sec id="sec-4">
      <title>The values of PINC</title>
      <p>
        The PINC endeavor is intimately tied to the value of
autonomy [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Autonomy “refers to people's ability to
decide, plan, and act in ways that they believe will help
them to achieve their goals” [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. PINC technology thwarts
autonomy when it is used to get people to do things that are
against their own goals (e.g., to waste money on useless
products). But PINC can also uphold autonomy, when it is
deployed in support of an individual's goals (e.g., to
become more active). Indeed, Oinas-Kukkonen has
proposed the development of theory and methodology for
behavior change support systems as an important direction
for the Persuasive Technology community [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ].
      </p>
      <p>
        How do the PINC approaches stand with respect to
autonomy? As Fogg notes, persuasion implies voluntary
change [10, p.15]. Influence, too, suggests voluntariness;
indeed, Cialdini shows how to recognize and circumvent
influence attempts [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Central to the nudge is the idea of
libertarian paternalism [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ]: though the designers choose a
typically “best” option for the default, individuals always
have the freedom to choose other options according to their
own goals and knowledge. Finally, the force of coercion
inherently diminishes the autonomy of the coerced [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
When might coercion by computers be justified? As
Anderson notes, “few believe that [coercion] is always
unjustified, since it seems that no society could function
without some authorized uses” [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Law gives governments
a limited authority to use coercion. Deploying computers
for law enforcement has potential benefits and costs, as
computers lack the contextual awareness and judgment of a
human being. While this might be seen as an opportunity to
use computers for fair enforcement, unclouded by human
biases, it can be surprisingly difficult to produce computer
systems that are free of bias. Designers can encode
unconscious biases in the system, unintended biases can
emerge in use, and new biases can arise as a system is used
in new contexts [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. Further, although humans often blame
computers for bad outcomes, computers lack moral
accountability [
        <xref ref-type="bibr" rid="ref10 ref14">10,14</xref>
        ]. We should ask, who is held
accountable if a computer's act of coercion is unjust? Thus,
coercion by computer systems engages the further values of
accountability and freedom from bias.
      </p>
      <p>
        Coercive tactics such as threats may be acceptable when
users have freely chosen the system in support of their own
goals. Indeed, Page and Kray report that study participants
found coercive “shock tactics” to be acceptable if it was the
person's own choice to use the system [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ]. This returns us
to our earlier definition of autonomy: “people's ability to
decide, plan, and act in ways that they believe will help
them to achieve their goals” [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ].
      </p>
      <p>However, users need information to assess the suitability of
the system to their goals. Berdichevsky and Neuenschwander
state two ethical principles related to disclosure:</p>
      <p>
        VI) The creators of a persuasive technology should
disclose their motivations, methods, and intended
outcomes, except when such disclosure would
significantly undermine an otherwise ethical goal.
VII) Persuasive technologies must not misinform in order
to achieve their persuasive end. [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]
One step beyond this is the value of informed consent [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]:
that people should not only be informed, but should have an
explicit opportunity to offer or withhold consent. Friedman,
Howe, and Felten identified six components of informed
consent [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]: consent comprises voluntariness, competence,
agreement, and minimal distraction, while being informed
requires not only disclosure, as Berdichevsky and
Neuenschwander exhort, but also comprehension. PINC
technologies that are undermined by informed consent—for
example, Kaptein and Eckles's persuasion profiling [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]—
deserve heightened scrutiny regardless of the acceptability
of their ends. As Michalski [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] and Kaptein and Eckles
[
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] point out, the problem is that the natural incentives
may be against even disclosure, let alone informed consent.
      </p>
    </sec>
    <sec id="sec-5">
      <title>The values of means</title>
      <p>
        Once we have decided to attempt to change another's
behavior, we have a number of means available for doing
so. While, for example, the nudge approach implies a
particular mechanism for influencing choices, persuasion
encompasses a number of means [
        <xref ref-type="bibr" rid="ref10 ref19">10,19</xref>
        ].
      </p>
      <p>
        Many persuasive strategies, such as self-monitoring,
personalization, tailoring, and social comparison [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], rely
on information about the user's context and activities.
Indeed, two of Berdichevsky and Neuenschwander's
principles point to privacy as a value of particular concern:
IV) The creators of a persuasive technology must ensure
that it regards the privacy of users with at least as
much respect as they regard their own privacy.
      </p>
      <p>
        V) Persuasive technologies relaying personal
information about a user to a third party must be
closely scrutinized for privacy concerns. [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]
Mobile phones can capture an unprecedented amount of
information about the user, such as location coordinates,
calls, and text messages, accentuating the need for attention
to privacy [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. But channels such as audio, photographs,
and proximity also capture information about others nearby
—indirect stakeholders [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. In their empirical study of teen
safety scenarios, Czeskis and colleagues learned that teens
were more reluctant to indirectly share information about
their context and activities with friends' parents than to share
such information with their own parents [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Thus, in a
mobile context, it is important to consider the privacy of
companions and bystanders—not only the user.
      </p>
      <p>
        Although privacy is important to many PINC strategies, we
should go beyond privacy to account for values such as
identity, courtesy, and calmness [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] when they are
implicated by the means used to affect behavior. For
example, consider the value of identity, “people's
understanding of who they are over time” [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. The
persuasive strategy of social learning, providing “means to
observe other[s] who are performing their target behaviors
and to see the outcomes of their behavior” [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], should be
more effective when observers share an identity with the
observed. Further, if we are “married” to our cell phones,
we will be more attached to applications that reflect our
identities. As another example, courtesy and calmness are
implicated by technologies that use the suggestion strategy.
Although suggestions must be given at the right time and
place to affect behavior [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], suggestions should be polite
and allow the user to remain peaceful and composed—
unless there is an overriding reason to do otherwise.
      </p>
    </sec>
    <sec id="sec-6">
      <title>The values of ends (target values)</title>
      <p>
        As noted in the introduction, values underlie persuasion. In
persuading someone to act in one way and not another, we
are asserting that the desired behavior will result in a better
outcome. Better for what? Better for our health, for our
family's safety, for national security, for the environment,
and so on. Implicit in every act of persuasion is a value the
persuader wants to support, a target value.
      </p>
      <p>Berdichevsky and Neuenschwander address three principles
to the ends of persuasion:</p>
      <p>I) The intended outcome of any persuasive technology
should never be one that would be deemed unethical
if the persuasion were undertaken without the
technology or if the outcome occurred independently
of persuasion.
II) The motivations behind the creation of a persuasive
technology should never be such that they would be
deemed unethical if they led to more traditional
persuasion.</p>
      <p>
        VIII) The Golden Rule of Persuasion: The creators of a
persuasive technology should never seek to persuade
a person or persons of something they themselves
would not consent to be persuaded to do. [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]
All three principles focus on unacceptable ends for
persuasion. They provide no guidance as to what ends
would be desirable. Attention to values can lead to desirable
ends for behavior change. Indeed, much persuasive
technology has explicitly targeted health or environmental
sustainability. Although these are laudable goals, perhaps
we should also be designing persuasive technology that
helps us to overcome our racial biases (freedom from bias),
control our anger (calmness), and learn to help and rely on
our neighbors (trust). Further, it is important to understand
the values of those we are designing for.
      </p>
    </sec>
    <sec id="sec-7">
      <title>Value tensions</title>
      <p>
        The most obvious value tensions in PINC technology pit
desired behavior changes and the values they implicate
against the intention to change behavior and methods for
doing so. That is, ends can be in tension with means. We
see promoting health, environmental sustainability, and so
on, versus preserving autonomy, privacy, and so on.
However, these are not the only types of tensions. First, the
act of persuasion inherently privileges the values of the
persuader over those of the persuaded. By asking you to
change your behavior, I am saying that my values are more
important than your values (or at least, the values you seem
to be acting on). In the best case, as in behavior change
support systems, the persuader and the persuaded agree on a
value such as health or environmental sustainability; the
persuader provides information or support to help the
persuaded act in accordance with this shared value.
Second, people may agree on values but disagree on
priorities. We might agree that environmental sustainability
is worthwhile—but I might rate the comfort or excitement
of driving as more important. Indeed, Rokeach compared
individuals' value systems solely on the basis of differences
in their rankings of a set of predefined values [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ].
      </p>
    </sec>
    <sec id="sec-8">
      <title>Same behavior, different values</title>
      <p>
        Finally, people may agree on a desired behavior, but have
different reasons for valuing that behavior. For example,
five people might choose to drive below the speed limit,
each for their own reasons: to obey the law; to protect
safety; to practice thrift; to reduce dependence on foreign
oil and protect national security; or to reduce the need for
oil drilling and contribute to environmental sustainability.
As Fogg points out, the mobile phone is an intimate device
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. If it does not share our goals, but rather has goals of its
own, we feel betrayed [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. The same would seem to hold
for values. Suppose that my highest value is the safety of
my children. If I adopt a mobile application to help me
avoid speeding, and it shows me pictures of polar bears, I
will be upset. Because it challenges my values, I see the
application as a threat to my autonomy, and I experience
psychological reactance [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]—leading me to drive even
faster. Instead, I should be reminded of my value of safety.
I see two approaches to addressing individual users' values.
First, designers' value commitments should be made clear
through branding and the informed consent process, so that
users can make informed choices. Second, interfaces such
as Todd, Rogers, and Payne's informative grocery shopping
cart [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ] should be tailorable. They should uphold user
autonomy by allowing users to choose which information
among value-laden options (e.g., sustainability,
healthfulness, and cost) to display most prominently. A
danger is that undisclosed, involuntary tailoring may cross
from persuasion to manipulation [
        <xref ref-type="bibr" rid="ref15 ref17">15,17</xref>
        ].
      </p>
    </sec>
    <sec id="sec-9">
      <title>CONCLUSION</title>
      <p>Attention to values may contribute not only to
Attention to values may contribute not only to
understanding ethical issues of mobile PINC technology—
bringing attention to concerns beyond privacy and
disclosure—but also to increasing their scope and
effectiveness—their power to do good in the world. Further
work should clarify the role of these values through empirical
and technical investigations of PINC technology.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Aleahmad</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          , et al.
          <article-title>Fishing for sustainability: The effects of indirect and direct persuasion</article-title>
          .
          <source>In Ext. Abstracts CHI</source>
          <year>2008</year>
          , ACM Press (
          <year>2008</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Anderson</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>Coercion</article-title>
          . In The Stanford Encyclopedia of Philosophy (
          <year>2006</year>
          ), http://plato.stanford.edu/entries/coercion
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Berdichevsky</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Neuenschwander</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <article-title>Toward an ethics of persuasive technology</article-title>
          .
          <source>CACM 42</source>
          ,
          <issue>5</issue>
          (May
          <year>1999</year>
          ),
          <fpage>51</fpage>
          -
          <lpage>58</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Cialdini</surname>
            ,
            <given-names>R.B.</given-names>
          </string-name>
          <source>Influence: The Science of Persuasion</source>
          , Collins, revised ed. (
          <year>1998</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Czeskis</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , et al.
          <article-title>Parenting from the pocket: Value tensions and technical direction for secure and private parent-teen mobile safety</article-title>
          .
          <source>In Proc. SOUPS</source>
          <year>2010</year>
          , ACM Press (
          <year>2010</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Davis</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>Ethical design methods for persuasive technology</article-title>
          .
          <source>In Proc. PERSUASIVE</source>
          <year>2009</year>
          , ACM Press (
          <year>2009</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Eslambolchilar</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wilson</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Oakley</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          <article-title>PINC: Persuasion, influence, nudge, and coercion through mobile devices</article-title>
          . To appear, Ext.
          <source>Abstracts CHI</source>
          <year>2011</year>
          , ACM Press (
          <year>2011</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Fogg</surname>
            ,
            <given-names>B.J.</given-names>
          </string-name>
          <article-title>The future of persuasion is mobile</article-title>
          .
          <source>In B.J. Fogg and D. Eckles (eds.)</source>
          , Mobile Persuasion, Stanford Captology Media (
          <year>2007</year>
          ),
          <fpage>5</fpage>
          -
          <lpage>11</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Fogg</surname>
            ,
            <given-names>B.J.</given-names>
          </string-name>
          <article-title>Increasing persuasion through mobility</article-title>
          .
          <source>In B.J. Fogg and D. Eckles (eds.)</source>
          , Mobile Persuasion, Stanford Captology Media (
          <year>2007</year>
          ),
          <fpage>155</fpage>
          -
          <lpage>163</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Fogg</surname>
            ,
            <given-names>B.J.</given-names>
          </string-name>
          <source>Persuasive Technology: Using Computers to Change What We Think and Do</source>
          , Morgan Kaufmann (
          <year>2003</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Friedman</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Howe</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Felten</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <article-title>Informed Consent in the Mozilla Browser: Implementing Value-Sensitive Design</article-title>
          .
          <source>In Proceedings of the 35th Hawaii International Conference on System Sciences</source>
          (
          <year>2002</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Friedman</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kahn</surname>
            ,
            <given-names>P.H.</given-names>
          </string-name>
          <string-name>
            <surname>Jr.</surname>
          </string-name>
          , and
          <string-name>
            <surname>Borning</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>Value Sensitive Design and information systems: Three case studies</article-title>
          . In P. Zhang and D. Galletta (eds.),
          <source>Human-Computer Interaction and Management Information Systems: Foundations</source>
          , M. E. Sharpe
          (
          <year>2006</year>
          ),
          <fpage>348</fpage>
          -
          <lpage>372</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Friedman</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Nissenbaum</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>Bias in computer systems</article-title>
          .
          <source>ACM TOIS 14</source>
          ,
          <issue>3</issue>
          (
          <year>1996</year>
          ),
          <fpage>330</fpage>
          -
          <lpage>347</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Johnson</surname>
            ,
            <given-names>D.G.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Mulvey</surname>
            ,
            <given-names>J.M.</given-names>
          </string-name>
          <article-title>Accountability and computer decision systems</article-title>
          .
          <source>CACM 38</source>
          ,
          <issue>12</issue>
          (
          <year>1995</year>
          ),
          <fpage>58</fpage>
          -
          <lpage>64</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Kaptein</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Eckles</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <article-title>Selecting effective means to any end</article-title>
          .
          <source>In Proc. PERSUASIVE</source>
          <year>2010</year>
          , LNCS
          <volume>6137</volume>
          (
          <year>2010</year>
          ),
          <fpage>82</fpage>
          -
          <lpage>93</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Levine</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <article-title>Using technology to promote sexual health</article-title>
          .
          <source>In B.J. Fogg and D. Eckles (eds.)</source>
          , Mobile Persuasion, Stanford Captology Media (
          <year>2007</year>
          ),
          <fpage>15</fpage>
          -
          <lpage>18</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Michalski</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>Ethical dangers of mobile persuasion</article-title>
          .
          <source>In B.J. Fogg and D. Eckles (eds.)</source>
          , Mobile Persuasion, Stanford Captology Media (
          <year>2007</year>
          ),
          <fpage>137</fpage>
          -
          <lpage>142</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Oinas-Kukkonen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>Behavior change support systems: A research model and agenda</article-title>
          .
          <source>In Proc. PERSUASIVE</source>
          <year>2010</year>
          , LNCS
          <volume>6137</volume>
          (
          <year>2010</year>
          ),
          <fpage>4</fpage>
          -
          <lpage>14</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Oinas-Kukkonen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Harjumaa</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>A systematic framework for designing and evaluating persuasive systems</article-title>
          .
          <source>In Proc. PERSUASIVE</source>
          <year>2008</year>
          , LNCS
          <volume>5033</volume>
          (
          <year>2008</year>
          ),
          <fpage>164</fpage>
          -
          <lpage>176</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Page</surname>
            ,
            <given-names>R.E.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Kray</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <article-title>Ethics and persuasive technology: An exploratory study in the context of healthy living</article-title>
          .
          <source>In Proc. First Int. Workshop on Nudge and Influence through Mobile Devices, CEUR Workshop Proceedings</source>
          <volume>690</volume>
          (
          <year>2010</year>
          ),
          <fpage>19</fpage>
          -
          <lpage>22</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Rokeach</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <source>The Nature of Human Values</source>
          . Free Press (
          <year>1973</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Thaler</surname>
            ,
            <given-names>R. H.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Sunstein</surname>
            ,
            <given-names>C.R.</given-names>
          </string-name>
          <source>Nudge: Improving Decisions about Health, Wealth, and Happiness</source>
          , Yale University Press (
          <year>2008</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <surname>Todd</surname>
            ,
            <given-names>P.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rogers</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Payne</surname>
            ,
            <given-names>S.J.</given-names>
          </string-name>
          <article-title>Nudging the cart in the supermarket: How much is enough information for food shoppers?</article-title>
          .
          <source>In Proc. First Int. Workshop on Nudge and Influence through Mobile Devices, CEUR Workshop Proceedings</source>
          <volume>690</volume>
          (
          <year>2010</year>
          ),
          <fpage>23</fpage>
          -
          <lpage>26</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <surname>Washington</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>The new digital divide</article-title>
          .
          <source>Des Moines Register</source>
          (January 9,
          <year>2011</year>
          ),
          <fpage>1AA</fpage>
          ,
          <lpage>5AA</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <surname>Wray</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          <article-title>Africa sees massive growth in mobile web usage</article-title>
          .
          <source>guardian.co.uk</source>
          (December 22,
          <year>2009</year>
          ), http://www.guardian.co.uk/technology/2009/dec/22/mobilephones-internet.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>