<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
<article-title>Organizing Artworks in an Ontology-based Semantic Affective Space</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Federico Bertola</string-name>
          <email>bertola.federico@educ.di.unito.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Viviana Patti</string-name>
          <email>patti@di.unito.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Università degli Studi di Torino, Dipartimento di Informatica</institution>
          <addr-line>c.so Svizzera 185, I-10149 Torino</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>In this paper, we focus on applying sentiment analysis to resources from online collections, by exploiting, as an information source, tags intended as textual traces that visitors leave when commenting on artworks on social platforms. Our aim is to create a semantic social space where artworks can be dynamically organized according to an ontology of emotions. We propose to tackle this issue in a Semantic Web setting, through the development of an ontology of emotional categories based on Plutchik's circumplex model, a well-founded psychological model of human emotions. The ontology has been conceived for categorizing emotion-denoting words and has been populated with Italian terms. The capability of detecting emotions in artworks can foster the development of emotion-aware search engines, emotional tag clouds or interactive maps of emotions, which could enable new ways of accessing and exploring art collections. First experiments on tags and artworks from the ArsMeteo Italian web portal are discussed.</p>
      </abstract>
      <kwd-group>
        <kwd>Ontology of emotions</kwd>
        <kwd>emotion visualization</kwd>
        <kwd>affective computing</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The development of the web and the advent of social media have brought about
new paradigms of interaction that foster first-person engagement and
crowdsourced content creation. In this context the subjective and expressive
dimensions move to the foreground, opening the way to the emergence of an affective
component within a dynamic corpus of digitized contents, which calls for new
techniques for the automatic processing, indexing and retrieval of the affective
information they convey. Recently, therefore, researchers have shown great interest in
developing approaches and tools for sentiment analysis and emotion detection,
aimed at automatically analyzing and processing the affective information conveyed
by social media [
        <xref ref-type="bibr" rid="ref16 ref7">16, 7</xref>
        ]. In addition, the need to support users in accessing and
exploring the outcomes of emotion detection and sentiment analysis
algorithms has fueled interest in research on solutions that address the sentiment
summarization and visualization problem. The organization and manipulation of
social media contents, for categorization, browsing, or visualization purposes, often
needs to rely on a semantic model of their affective qualities (or of their
reception by the users). In particular, a key role in advancing this area
can be played by ontologies and cognitive models of emotions [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], to be defined
and integrated into traditional information processing techniques.
      </p>
      <p>
        In this paper we address the above issues in the context of the
ArsEmotica project [2-4]1. ArsEmotica is an application software that detects emotions
evoked by resources (artworks) from online collections, by exploiting, as an
information source, tags intended as textual traces that visitors leave when commenting
on artworks on social platforms. The final aim is to create a semantic social space
where artworks can be dynamically organized according to an ontology of
emotions. The detected emotions are meant to be the ones that best capture the
affective meaning that visitors, collectively, give to the artworks. We propose
to tackle this issue in a Semantic Web setting, through the development of an
ontology of emotional categories based on Plutchik's circumplex model [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], a
well-founded psychological model of human emotions. The approach to the
sentiment analysis task is, indeed, ontology-driven. In short, given a tagged resource,
the correlation between tags and emotions is computed by referring to the
ontology of emotional categories and by relying on the combined use of Semantic Web
technologies, NLP and lexical resources.
      </p>
      <p>
        In recent years, many cultural heritage institutions have opened their collections
for access on the web (think, for instance, of the Google Art Project2). User
data collected by art social platforms are a precious source of information about
trends and emotions. Therefore, a growing interest in monitoring the sentiment
of visitors in virtual environments can be observed among art practitioners,
curators and cultural heritage stakeholders, as discussed in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
      </p>
      <p>
        In the following, we describe the most recent achievements within the
ArsEmotica project, with a special focus on the development of the ontology of
emotions. The ontology inspired an interactive user interface for visualizing and
summarizing the results of our emotion detection algorithm; detected emotional
responses to artworks are rendered by means of a graphical representation
inspired by Plutchik's emotion wheel. Moreover, we present the first results of
the ongoing experiments of running the ArsEmotica emotion detection engine on
a real dataset of artworks and tags from the ArsMeteo web portal [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. The paper
is organized as follows. Section 2 recalls the ArsEmotica architecture. Section 3
focuses on the ontology of emotions. Section 4 describes the first experiments
on the ArsMeteo dataset. Section 5 discusses the ArsEmotica interactive user
interface. Final remarks end the paper.
      </p>
    </sec>
    <sec id="sec-2">
      <title>The ArsEmotica Framework</title>
      <p>
        In this section, we briefly recall the characteristics and the main components
of ArsEmotica 2.0, the application software that we developed for testing our
ideas. Details can be found in [
        <xref ref-type="bibr" rid="ref2 ref4">4, 2</xref>
        ]. ArsEmotica is meant as a sort of "emotional
      </p>
      <sec id="sec-2-1">
        <title>1 http://di.unito.it/arsemotica 2 http://www.googleartproject.com/</title>
        <p>engine", which can be interfaced with any resource sharing and tagging system
which provides the data to be processed, i.e. digital artworks and their tags.
Social tagging platforms for art collections, having active communities that visit
and comment online the collections, would be ideal data sources.</p>
        <p>
          The pipeline of ArsEmotica includes four main steps.
1. Pre-processing: lemmatization and string sanitizing. In this step,
tags associated with a given artwork are filtered so as to eliminate flaws
such as spelling mistakes, badly accented characters, and so forth. Then, tags
are converted into lemmas by applying a lemmatization algorithm, which
builds upon Morph-it!, a corpus-based morphological resource for the Italian
language [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ].
2. Checking tags against the ontology of emotions. This step checks
whether a tag belongs to the ontology of emotions; in other words, it checks
whether the tags of a given resource are "emotion-denoting" words directly referring
to some emotional categories of the ontology. Tags belonging to the ontology
are immediately classified as "emotional".
3. Checking tags with SentiWordNet. Tags that do not correspond to
terms in the ontology are further analyzed by means of SentiWordNet [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], in
order to distinguish objective tags, which do not bear an emotional meaning,
from subjective and, therefore, affective tags. Only the latter are
presented to the user in order to get feedback on which emotional concept
they convey. The feedback is collected thanks to the interactive user interface
described in Sec. 5, which has been designed in tune with the ontological
model of emotion presented below.
4. Combining emotional data and output. Based on the data collected in the
previous steps, the tool computes and offers as output a set of emotions
associated to the resource. We have implemented a new algorithm for
accomplishing this task, in which emotions collected in the previous steps are
not simply ranked as in [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] but compared and combined. The algorithm
compares the collected emotions by exploiting ontological reasoning on the
taxonomic structure of the ontology of emotions. Moreover, it combines them
by referring to the Hourglass Model [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ], a reinterpretation of Plutchik's
model in which primary emotions are further organized around four
independent but concomitant dimensions (Pleasantness, Attention, Sensitivity and
Aptitude), whose different levels of activation can give rise to a very wide
space of different emotions. In short, in this model different emotions (basic
or compound) result from different combinations of activation levels of the
four dimensions. Each dimension is characterized by six levels of activation,
which determine the intensity of the expressed/perceived emotion as a float
in [−1, +1] (Figure 1, right side). This allows classifying affective information
both in a categorical way (according to a number of emotion categories)
and in a dimensional format (which facilitates comparison and aggregation),
and provided us with a powerful inspiration for implementing a new algorithm for
combining emotional data into a final output.
        </p>
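The four-step pipeline described above can be sketched in a few lines of Python. This is only an illustrative toy, not the ArsEmotica implementation: the Morph-it! lemmatizer, the ontology lookup and the SentiWordNet check are replaced by small hypothetical tables, and the combination step simply averages hourglass-style intensity scores in [−1, +1].

```python
# Toy sketch of the four-step ArsEmotica pipeline; all lexical resources are
# stubbed with hypothetical entries, for illustration only.
LEMMAS = {"tristissimo": "triste"}                    # step 1: lemmatization table (stub)
EMOTION_ONTOLOGY = {"triste": ("Sadness", -0.6),      # step 2: emotion-denoting words (stub)
                    "gioioso": ("Joy", 0.7)}
SUBJECTIVE_WORDS = {"sangue"}                         # step 3: SentiWordNet-like check (stub)

def analyze(tags):
    emotions, to_ask_user = [], []
    for tag in tags:
        lemma = LEMMAS.get(tag.strip().lower(), tag.strip().lower())   # step 1
        if lemma in EMOTION_ONTOLOGY:                                  # step 2
            emotions.append(EMOTION_ONTOLOGY[lemma])
        elif lemma in SUBJECTIVE_WORDS:                                # step 3
            to_ask_user.append(lemma)  # affective but indirect: ask the user for feedback
    # Step 4: combine intensity scores per emotion category (simple average here).
    combined = {}
    for name, score in emotions:
        combined.setdefault(name, []).append(score)
    return {name: sum(s) / len(s) for name, s in combined.items()}, to_ask_user

result, pending = analyze(["Tristissimo", "gioioso", "sangue", "albero"])
# 'albero' is objective and is discarded; 'sangue' is routed to user feedback.
```

In the real system, step 4 additionally reasons over the taxonomy and the Hourglass dimensions rather than averaging flat scores.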
        <p>
          The resulting output can be produced in different modalities. Emotions
evoked by artworks are visualized by a sort of emotion wheel, graphically
inspired by the color wheel used by Plutchik for offering a two-dimensional
representation of his circumplex model of emotions [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] (Sec. 5). Moreover,
the application encodes the output in a machine-readable format, by
relying on W3C standards: RDF and EmotionML, an emerging standard for
emotion annotation3.
        </p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>An Ontology for ArsEmotica</title>
      <p>
        In this section we describe the ontology, which plays a key role in all steps
of the ArsEmotica computation. It is an ontology of emotional categories based on
Plutchik's circumplex model [
        <xref ref-type="bibr" rid="ref14 ref15">15, 14</xref>
        ], a well-founded psychological model of
emotions, and also includes concepts from the Hourglass model in [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. The ontology
is written in OWL. It can be released on demand for academic purposes.
      </p>
      <sec id="sec-3-1">
        <title>Classes, Hierarchy and Properties</title>
        <p>The ontology structures emotional categories in a taxonomy that includes
32 emotional concepts. Due to its role within the ArsEmotica architecture, the
ontology has been conceived for categorizing emotion-denoting words, like the ones
used in the previous version of the application. It includes two root concepts:
Emotion and Word.</p>
        <sec id="sec-3-1-1">
          <title>3 http://www.w3.org/TR/emotionml/</title>
          <p>Class Emotion. As regards the class Emotion, the design of the
taxonomic structure of the emotional categories, of the disjunction axioms and of the object
and data properties mirrors the main features of Plutchik's circumplex model
(see Fig. 1, left side). The model can be represented as a wheel of emotions and
encodes the following elements and concepts:
- Basic or primary emotions: joy, trust, fear, surprise, sadness, disgust,
anger, anticipation (i.e. expectancy); in the color wheel this is represented
by differently colored sectors.
- Opposites: basic emotions can be conceptualized in terms of polar
opposites: joy versus sadness, anger versus fear, trust versus disgust, surprise
versus anticipation.
- Intensity: each emotion can exist in varying degrees of intensity; in the
wheel this is represented by the vertical dimension.
- Similarity: emotions vary in their degree of similarity to one another; in
the wheel this is represented by the radial dimension.
- Complex emotions: complex emotions are mixtures of the primary
emotions; in the model in Fig. 1, emotions in the blank spaces are compositions
of basic emotions called primary dyads.</p>
          <p>Emotion is the root of all the emotional concepts. The Emotion hierarchy
includes all the 32 emotional categories presented as distinguished labels in the
model. In particular, the Emotion class has two subclasses: BasicEmotion and
CompositeEmotion, which are disjoint classes.</p>
          <p>The basic emotions of Plutchik's model (Disgust, Trust, Sadness, Joy,
Anticipation, Surprise, Anger and Fear) are direct subclasses of BasicEmotion.
Each of them is specialized again into two subclasses representing the same
emotion with weaker or stronger intensity (e.g. the basic emotion Joy has
Ecstasy and Serenity as subclasses). Therefore, we have 24 emotional concepts
subsumed by the BasicEmotion concept. The class CompositeEmotion, instead,
has 8 subclasses, corresponding to the primary dyads of Plutchik's model.</p>
          <p>Other relations proposed in Plutchik's model have been expressed in the
ontology by means of the following object properties, where Arch is the set of
the basic emotions and Comp the set of the complex emotions:
- hasOpposite (f: Arch → Arch) encodes the notion of polar opposites;
- hasSibling (f: Arch → Arch) encodes the notion of similarity;
- isComposedOf (f: Comp → Arch) encodes the notion of composition
of basic emotions.</p>
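Assuming the taxonomy just described, the behaviour of these object properties can be illustrated with a small Python sketch (a stand-in for the OWL axioms, showing only a handful of emotions, with the relation tables hand-written here following Plutchik's wheel):

```python
# Sketch of the Plutchik-style relations encoded by the object properties
# (subset of the ontology; names as in the text, data hand-written for illustration).
BASIC = {"Joy", "Sadness", "Anger", "Fear", "Trust", "Disgust", "Surprise", "Anticipation"}

HAS_OPPOSITE = {"Joy": "Sadness", "Sadness": "Joy",          # hasOpposite: Arch -> Arch
                "Anger": "Fear", "Fear": "Anger",
                "Trust": "Disgust", "Disgust": "Trust",
                "Surprise": "Anticipation", "Anticipation": "Surprise"}

IS_COMPOSED_OF = {"Love": {"Joy", "Trust"},                  # isComposedOf: Comp -> Arch
                  "Awe": {"Fear", "Surprise"},               # (primary dyads)
                  "Optimism": {"Anticipation", "Joy"}}

def opposite(emotion):
    return HAS_OPPOSITE[emotion]

def components(compound):
    return IS_COMPOSED_OF[compound]

# Sanity checks mirroring the ontology's constraints:
assert all(opposite(opposite(e)) == e for e in BASIC)            # opposition is symmetric
assert all(parts <= BASIC for parts in IS_COMPOSED_OF.values())  # dyads mix basics only
```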
          <p>
            The data type property hasScore (f: Arch → R) was introduced to link
each emotion with an intensity value mapped into the Hourglass model.
Class Word. Word is the root of the emotion-denoting words, i.e. those words
which each language provides for denoting emotions, in line with related and
previous work [
            <xref ref-type="bibr" rid="ref10 ref2">10, 2</xref>
            ]. Since we have so far applied our application to use cases where
tagging involved Italian communities, we defined and populated the subclass
ItalianWord4. Intuitively, each instance of the Word and Emotion concepts, e.g.
felicità, has two parents: one is a concept from the Emotion hierarchy (the
emotion denoted by the word, e.g. Joy), while the other is a concept from the Word
hierarchy (e.g. Italian, the language the word belongs to).
          </p>
        </sec>
      </sec>
      <sec id="sec-3-2">
        <title>Individuals and Ontology Population</title>
        <p>
          We semi-automatically populated the ontology with Italian words by following
the same methodology described in [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] for populating OntoEmotion, the
ontology used in the previous version of the ArsEmotica prototype. In short, we relied
on the multilingual lexical database MultiWordNet [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ] and its affective domain
WordNet-Affect5, a well-known lexical resource that contains information about
the emotions that words convey, developed starting from
WordNet [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ]. WordNet [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ] is a lexical database in which nouns, verbs, adjectives
and adverbs (lemmas) are organized into sets of synonyms (synsets)
representing lexical concepts. The WordNet-Affect resource was developed through the
selection and labeling of the synsets representing affective concepts.
        </p>
        <p>
          Our population process started by manually selecting a set of
representative Italian emotional words, at least one word per concept. This initial set
included fewer than 90 words classified under our 32 emotional concepts, but
they were only nouns. In order to expand the set of Italian words
representative of emotional concepts with adjectives, we included and classified according to the
ontology6 the list of 32 emotion terms in [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ]: addolorato, allegro, angosciato,
annoiato, ansioso, arrabbiato, contento, depresso, disgustato, disperato, divertito,
entusiasta, euforico, felice, gioioso, imbarazzato, impaurito, indignato, infelice,
irritato, malinconico, meravigliato, preoccupato, risentito, sbalordito, scontento,
sconvolto, sereno, sorpreso, spaventato, stupito, triste.
        </p>
        <p>
          In a second phase, we automatically expanded the set of individuals
(emotion-denoting words) belonging to the emotional concepts by exploiting
MultiWordNet and WordNet-Affect. All manually classified words (nouns and adjectives) were
used as entry lemmas for querying the lexical database. The result for each word
was a synset, representing the "senses" of that word, labeled by MultiWordNet
unique synset identifiers. Each synset was then processed by using
WordNet-Affect [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ]: when a synset is annotated as representing affective information,
then all the synonyms belonging to that synset are imported into the ontology
as relevant Italian emotion-denoting words for the same concept as the entry
lemmas. In other words, we automatically enriched the ontology with synonyms
of the representative emotional words, while also filtering out synsets that do not
convey affective information. As a final step, we further expanded the set of
emotion-denoting words with additional adjectives, verbs and adverbs, by exploiting the
4 The ontology is already designed to be extended with further subclasses of Word,
for representing emotion-denoting words in different languages.
5 http://wndomains.fbk.eu/wnaffect.html
6 This process has been carried out manually, by relying on morpho-semantic
relations between nouns already classified in the ontology and adjectives specified in the
Treccani dictionary (http://www.treccani.it).
        </p>
        <p>WordNet relation derived-from, for which it can be assumed that the affective
meaning is preserved. Therefore, all synsets obtained by an application of the
derived-from relation (and not yet classified in our ontology) were included as
individuals of the proper emotional concept. At the end of the process, a human
expert checked the identified terms. The resulting ontology contains about 700
Italian words referring to the 32 emotional categories of the ontology.</p>
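The population process can be summarized with a toy Python sketch. MultiWordNet, WordNet-Affect and the derived-from relation are stubbed here with tiny hypothetical tables, so the code only mirrors the filtering and expansion logic, not the real resources:

```python
# Toy sketch of the semi-automatic ontology population; MultiWordNet,
# WordNet-Affect and the derived-from relation are stubbed with small tables.
SYNSETS = {"gioia": "n#001", "felicita": "n#001",     # word -> synset id (stub)
           "tristezza": "n#002", "tavolo": "n#003"}
AFFECTIVE = {"n#001": "Joy", "n#002": "Sadness"}      # WordNet-Affect-style labels (stub)
SYNONYMS = {"n#001": {"gioia", "felicita", "letizia"},
            "n#002": {"tristezza"}, "n#003": {"tavolo"}}
DERIVED_FROM = {"gioioso": "gioia"}                   # adjective derived from noun (stub)

def populate(seed_words):
    individuals = {}                                  # word -> emotional concept
    for word in seed_words:
        synset = SYNSETS.get(word)
        if synset in AFFECTIVE:                       # filter out non-affective synsets
            for synonym in SYNONYMS[synset]:          # import the whole synset
                individuals[synonym] = AFFECTIVE[synset]
    for derived, base in DERIVED_FROM.items():        # expand via derived-from,
        if base in individuals and derived not in individuals:  # affect preserved
            individuals[derived] = individuals[base]
    return individuals

onto_words = populate(["gioia", "tristezza", "tavolo"])  # 'tavolo' conveys no emotion
```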
      </sec>
    </sec>
    <sec id="sec-4">
      <title>First Experiments on the ArsMeteo Dataset</title>
      <p>
        We are currently testing version 2.0 of the ArsEmotica prototype against a
dataset of tagged multimedia artworks from the ArsMeteo art portal (http://
www.arsmeteo.org [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]). According to the ArsEmotica emotional analysis, 1705
out of the 9171 artworks in the dataset bear an emotional meaning encoded in
the ontology.
      </p>
      <sec id="sec-4-1">
        <title>The ArsMeteo Dataset</title>
        <p>Our dataset ArsM is a significant set of tagged artworks from the ArsMeteo
web portal. It consists of 9171 artworks with their associated tags7. The ArsMeteo
web platform combines social tagging and tag-based browsing technology with
functionalities for collecting, accessing and presenting works of art together with
their meanings. It enables the collection of digital (or digitalized) artworks and
performances belonging to a variety of artistic forms, including poems, videos,
pictures and musical compositions. Meanings are given by the tagging activity
of the community. Currently, the portal has collected over 10,000 artworks created
by about 300 different artists; its community has produced over 37,000 tags (an
average of 10 tags per artwork).
7 Specifically, ArsM includes comments associated to the artworks by ArsMeteo
users until December 2010.</p>
      </sec>
      <sec id="sec-4-2">
        <title>Emotional Analysis</title>
        <p>Emotions belonging to the ontology are detected in about 20 percent of our
dataset8. Let us denote by AffectiveArsM the set of artworks classified
according to some emotions of our ontology after the emotional analysis performed
by ArsEmotica.</p>
        <p>In ArsMeteo, artworks usually have many tags, expressing a variety of
meanings and thus supporting the emergence of different emotional potentials. This is
consistent with the idea that art can emotionally affect people in different ways.
When this happens, the analysis performed by ArsEmotica provides multiple
emotional classifications. Fig. 2 shows results on the number of emotions detected
for each artwork in AffectiveArsM. About 40% of the artworks received a
multiple classification, i.e. ArsEmotica detected more than one emotion associated
to the artwork.</p>
        <p>As regards the emotion distribution in AffectiveArsM, when we
consider basic emotions in their varying degrees of intensity, the most common
emotions were those belonging to the sadness family (457 artworks) and the joy family
(405 artworks), followed by anticipation, fear, disgust and surprise. Anger was
rarer, and trust was almost nonexistent (see Fig. 3). When we consider complex
emotions, the results are summarized in Fig. 4: love is very common (424 artworks),
optimism and awe are rare, and the other complex emotions are almost
nonexistent.
8 Notice that the tagging activity, monitored in ArsMeteo since 2006, was not
performed with the aim of later applying some kind of sentiment analysis, but as a
form of spontaneous annotation produced by the members of the community.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Visualizing and Summarizing Detected Emotions</title>
      <p>We have developed an interface linked to our ontology of emotions, whose
main aims are: a) to present the outcomes of the emotional analysis for tagged
artworks, b) to propose to the user intuitive metaphors for browsing the emotional
space, and c) to ease the task of emotionally classifying tags having indirect
emotional meanings, by means of emotional concepts from the ontology. From this
perspective, Plutchik's model is very attractive for three main reasons. First,
the reference to a graphical wheel is very intuitive and offers a spatial
representation of emotions and their different relations (similarities, intensities, polar
oppositions). Such a representation conveys to the user rich
information about the emotional model, without referring to a tree-like visualization
of the ontology hierarchy. Second, the use of colors for denoting different
emotions provides a very intuitive communication code. Different color nuances for
different emotions naturally transmit the idea that primary emotions can blend
to form a variety of compound emotions, analogous to the way colors combine
to generate different color gradations. Third, the number of emotional
categories distinguished in the wheel is limited; this aspect facilitates the task of the user
involved in an emotional evaluation.</p>
      <p>The Interface. The sequence of interactions offered to the user follows the flow
of computation sketched in Section 2. After the user selects an artwork from the
collection (Fig. 5, top-left window), the application applies the emotional
analysis to the artwork's tags. The result of this computation, i.e. the evoked emotions,
is summarized for the user by a graphical representation (obtained by adapting
the RGraph tool9) called "La rosa delle emozioni" ("the rose of emotions"), which strongly recalls
Plutchik's color wheel. Let us consider, for instance, running the emotional
analysis on the artwork "Dove la Raffinata Ragazza Bionda guarda il Grosso Toro
Malmorto" by Filippo Valente, belonging to our ArsM dataset. The resulting
window (Fig. 5, top-left window) includes a preview of the artwork and a
summary of related metadata (e.g. title and author of the selected artwork); below,
the four red-colored tags are identified as emotional according to the emotional
ontology: 'orrore', 'infamia', 'cattiveria', 'tristezza'; the presence of emotional
responses related to Sadness and a strong disgust (Loathing) is highlighted by
coloring the sectors of the emotion wheel corresponding to those emotions. The
internal sectors of the ArsEmotica wheel represent light intensity
of emotions, while the external ones represent high intensity. Underlined
blue-colored tags denote tags that have been recognized by the sentiment
analysis stage as possibly conveying some affective meaning. They thus appear as
active links for the user's emotional feedback: see e.g. 'sangue', 'sconfiggere', and
so on.</p>
      <p>Indeed, a user who is not satisfied with the outcome can select a tag to
evaluate among the active ones. The application then opens a pop-up window
showing an uncolored emotion wheel. Users can express their emotional
evaluation in terms of basic emotions with different intensities, and color the
wheel accordingly, by clicking on one of the 24 sectors of the wheel; otherwise,
they can select compound emotions by selecting the wedge-shaped triangles
inserted between the basic emotions. In our example (Figure 5, bottom-center), the
user associated to the tag 'sangue' (blood) the emotions Fear and Disgust (with
high intensity, which corresponds to Loathing). Notice that the tag evaluation
is contextual to the viewing of the artwork, which indeed remains visible in the
background. After the user has expressed her feedback, detected and collected emotions
are combined and the resulting emotional evaluation is again presented to the
user by means of the ArsEmotica wheel.
9 http://www.rgraph.net/</p>
    </sec>
    <sec id="sec-6">
      <title>Conclusion and Future Work</title>
      <p>In this paper we have described the OWL ontology of emotions used in the
ArsEmotica 2.0 prototype, which refers to a state-of-the-art cognitive model of
emotions and inspired an interactive user interface for visualizing and
summarizing the results of the emotion detection algorithm.</p>
      <p>
        Recently, many researchers have been devoting efforts to developing ontologies of
emotions in the Semantic Web context [
        <xref ref-type="bibr" rid="ref10 ref12 ref5">5, 12, 10</xref>
        ], and some of them have addressed
the issue from a foundational point of view. In particular, the Human Emotion
Ontology (HEO), developed in OWL [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], was introduced with the explicit aim of
standardizing the knowledge about emotions and supporting broad semantic
interoperability among affective computing applications. It will be interesting to
study how to link the ArsEmotica ontology of emotions with HEO, which could
play for us the role of an "upper ontology" for emotions, by providing an ontological
definition of the general concept of emotion. In fact, in our ontology the root
concept of the Emotion hierarchy is treated as primitive (it is not semantically
described in terms of characterizing properties).
      </p>
      <p>The Hourglass Model, which we refer to in order to combine detected and collected
emotions in ArsEmotica, allows us to design a fluid and continuous emotional
space in which artworks (but possibly also users' tags) can be positioned. The
current ArsEmotica interface provides our users with the possibility to access the
outcomes of the emotional analysis. Along this line, the next step is to study
innovative strategies for browsing the artworks, relying on their semantic
organization in the ArsEmotica emotional space. The aim is to give users
the possibility to explore the resources by exploiting the various dimensions
suggested by the ontological model. Possible queries could be: "show
me sadder artworks" (intensity relation); "show me something emotionally
completely different" (polar opposites); "show me artworks conveying similar
emotions" (similarity relation).</p>
      <p>
        As regards sentiment visualization, designing engaging interfaces that
allow an appropriate granularity of expression is not an easy task. We plan to
evaluate the new prototype and its interface soon, by carrying out a user test
in which users of the ArsMeteo community, who in the past already actively
participated in a user study on the first version of our prototype [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], will be
involved.
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <given-names>E.</given-names>
            <surname>Acotto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Baldoni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Baroglio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Patti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Portis</surname>
          </string-name>
          , and
          <string-name>
            <given-names>G.</given-names>
            <surname>Vaccarino.</surname>
          </string-name>
          <article-title>Arsmeteo: artworks and tags floating over the planet art</article-title>
          .
          <source>In Proc. of ACM HT '09</source>
          , ACM:
          <volume>331</volume>
          {
          <fpage>332</fpage>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>M.</given-names>
            <surname>Baldoni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Baroglio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Patti</surname>
          </string-name>
          , and
          <string-name>
            <given-names>P.</given-names>
            <surname>Rena</surname>
          </string-name>
          .
          <article-title>From tags to emotions: Ontologydriven sentiment analysis in the social semantic web</article-title>
          .
          <source>Intelligenza Artificiale</source>
          ,
          <volume>6</volume>
          (
          <issue>1</issue>
          ):
          <fpage>41</fpage>
          -
          <lpage>54</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <given-names>M.</given-names>
            <surname>Baldoni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Baroglio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Patti</surname>
          </string-name>
          , and
          <string-name>
            <given-names>C.</given-names>
            <surname>Schifanella</surname>
          </string-name>
          .
          <article-title>Sentiment analysis in the planet art: A case study in the social semantic web</article-title>
          . In Cristian Lai, Giovanni Semeraro, and Eloisa Vargiu, editors,
          <source>New Challenges in Distributed Information Filtering and Retrieval</source>
          , volume
          <volume>439</volume>
          of
          <source>Studies in Computational Intelligence</source>
          , pages
          <fpage>131</fpage>
          -
          <lpage>149</lpage>
          . Springer,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <given-names>F.</given-names>
            <surname>Bertola</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Patti</surname>
          </string-name>
          .
          <article-title>Emotional responses to artworks in online collections</article-title>
          .
          <source>In UMAP Workshops, PATCH 2013: Personal Access to Cultural Heritage</source>
          , volume
          <volume>997</volume>
          of
          <source>CEUR Workshop Proceedings</source>
          . CEUR-WS.org
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <given-names>E.</given-names>
            <surname>Cambria</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Hussain</surname>
          </string-name>
          .
          <source>Sentic Computing: Techniques, Tools, and Applications</source>
          .
          <source>SpringerBriefs in Cognitive Computation Series</source>
          . Springer-Verlag GmbH,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <given-names>E.</given-names>
            <surname>Cambria</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Livingstone</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Hussain</surname>
          </string-name>
          .
          <article-title>The hourglass of emotions</article-title>
          . In Anna Esposito, Antonietta Maria Esposito, Alessandro Vinciarelli, Rudiger Hoffmann, and Vincent C. Muller, editors,
          <source>COST 2102 Training School</source>
          ,
          <source>Revised Selected Papers</source>
          , volume
          <volume>7403</volume>
          of
          <source>Lecture Notes in Computer Science</source>
          . Springer,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <given-names>E.</given-names>
            <surname>Cambria</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Schuller</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Xia</surname>
          </string-name>
          , and
          <string-name>
            <given-names>C.</given-names>
            <surname>Havasi</surname>
          </string-name>
          .
          <article-title>New avenues in opinion mining and sentiment analysis</article-title>
          .
          <source>IEEE Intelligent Systems</source>
          ,
          <volume>28</volume>
          (
          <issue>2</issue>
          ):
          <fpage>15</fpage>
          -
          <lpage>21</lpage>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <given-names>A.</given-names>
            <surname>Esuli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Baccianella</surname>
          </string-name>
          , and
          <string-name>
            <given-names>F.</given-names>
            <surname>Sebastiani</surname>
          </string-name>
          .
          <article-title>SentiWordNet 3.0: An enhanced lexical resource for sentiment analysis and opinion mining</article-title>
          .
          <source>In Proc. of LREC'10</source>
          . ELRA, May
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <given-names>C.</given-names>
            <surname>Fellbaum</surname>
          </string-name>
          , editor.
          <source>WordNet: An Electronic Lexical Database</source>
          . MIT Press,
          <year>1998</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <given-names>V.</given-names>
            <surname>Francisco</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Gervas</surname>
          </string-name>
          , and
          <string-name>
            <given-names>F.</given-names>
            <surname>Peinado</surname>
          </string-name>
          .
          <article-title>Ontological reasoning for improving the treatment of emotions in text</article-title>
          .
          <source>Knowledge and Information Systems</source>
          ,
          <volume>25</volume>
          :
          <fpage>421</fpage>
          -
          <lpage>443</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <given-names>D.</given-names>
            <surname>Galati</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Sini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Tinti</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Testa</surname>
          </string-name>
          .
          <article-title>The lexicon of emotion in the neo-latin languages</article-title>
          .
          <source>Social Science Information</source>
          ,
          <volume>47</volume>
          (
          <issue>2</issue>
          ):
          <fpage>205</fpage>
          -
          <lpage>220</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <given-names>M.</given-names>
            <surname>Grassi</surname>
          </string-name>
          .
          <article-title>Developing HEO human emotions ontology</article-title>
          .
          <source>In Proc. of the 2009 joint COST 2101 and 2102 international conference on Biometric ID management and multimodal communication</source>
          , pages
          <fpage>244</fpage>
          -
          <lpage>251</lpage>
          . Springer-Verlag,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <given-names>E.</given-names>
            <surname>Pianta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Bentivogli</surname>
          </string-name>
          , and
          <string-name>
            <given-names>C.</given-names>
            <surname>Girardi</surname>
          </string-name>
          .
          <article-title>MultiWordNet: developing an aligned multilingual database</article-title>
          .
          <source>In Proc. of Int. Conf. on Global WordNet</source>
          ,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <given-names>R.</given-names>
            <surname>Plutchik</surname>
          </string-name>
          .
          <article-title>The circumplex as a general model of the structure of emotions and personality</article-title>
          . In
          <string-name>
            <given-names>R.</given-names>
            <surname>Plutchik</surname>
          </string-name>
          and
          <string-name>
            <given-names>H. R.</given-names>
            <surname>Conte</surname>
          </string-name>
          , editors,
          <source>Circumplex models of personality and emotions</source>
          , pages
          <fpage>17</fpage>
          -
          <lpage>47</lpage>
          . American Psychological Association,
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <given-names>R.</given-names>
            <surname>Plutchik</surname>
          </string-name>
          .
          <article-title>The Nature of Emotions</article-title>
          .
          <source>American Scientist</source>
          ,
          <volume>89</volume>
          (
          <issue>4</issue>
          ),
          <year>2001</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <given-names>M.</given-names>
            <surname>Schroeder</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Pirker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lamolle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Burkhardt</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Peter</surname>
          </string-name>
          , and
          <string-name>
            <given-names>E.</given-names>
            <surname>Zovato</surname>
          </string-name>
          .
          <article-title>Representing emotions and related states in technological systems</article-title>
          . In Roddy Cowie, Catherine Pelachaud, and Paolo Petta, editors,
          <source>Emotion-Oriented Systems, Cognitive Technologies</source>
          , pages
          <fpage>369</fpage>
          -
          <lpage>387</lpage>
          . Springer,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <given-names>C.</given-names>
            <surname>Strapparava</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Valitutti</surname>
          </string-name>
          .
          <article-title>WordNet-Affect: an affective extension of WordNet</article-title>
          .
          <source>In Proc. of LREC'04</source>
          , volume
          <volume>4</volume>
          , pages
          <fpage>1083</fpage>
          -
          <lpage>1086</lpage>
          ,
          <year>2004</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <given-names>E.</given-names>
            <surname>Zanchetta</surname>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Baroni</surname>
          </string-name>
          .
          <article-title>Morph-it! a free corpus-based morphological resource for the Italian language</article-title>
          .
          <source>Corpus Linguistics 2005</source>
          ,
          <volume>1</volume>
          (
          <issue>1</issue>
          ),
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>