<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <article-meta>
      <title-group>
        <article-title>GaChat: A chat system that displays online retrieval information in dialogue text</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Satoshi Horiguchi</string-name>
          <email>horiguchi@mos.ics.keio.ac.jp</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Akifumi Inoue</string-name>
          <email>akifumi@cs.teu.ac.jp</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tohru Hoshi</string-name>
          <email>hoshi@cs.teu.ac.jp</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kenichi Okada</string-name>
          <email>okada@ics.keio.ac.jp</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Faculty of Science and Technology, Keio University</institution>
          ,
          <addr-line>3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Graduate School of Science and Technology, Keio University</institution>
          ,
          <addr-line>3-14-1 Hiyoshi, Kohoku-ku, Yokohama 223-8522</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>School of Computer Science, Tokyo University of Technology</institution>
          ,
          <addr-line>1404-1 Katakura, Hachioji, Tokyo 192-0982</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2009</year>
      </pub-date>
      <volume>8</volume>
      <issue>2009</issue>
      <abstract>
        <p>Text chat systems are popular and widely used. However, redundant interactions sometimes occur between users because of the low awareness such systems provide. In this paper, we propose a text chat system called "GaChat", which automatically appends information related to the dialogue text exchanged between its users. First, proper nouns are extracted from the dialogue text by morphological analysis. Then online images and articles related to the nouns are displayed alongside the dialogue text. Resolving ambiguity in this way helps users reduce redundant interactions such as searching for or asking about the details of a phrase. This paper describes the prototype implementation and a first evaluation experiment.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Author Keywords</title>
      <p>Text chat communication, Instant messaging service, Web-based communication</p>
    </sec>
    <sec id="sec-2">
      <title>ACM Classification Keywords</title>
      <p>H.5.3 Group and Organization Interfaces: CSCW</p>
    </sec>
    <sec id="sec-3">
      <title>INTRODUCTION</title>
      <p>Text chat systems are popular and widely used as easy communication tools that let users exchange text messages smoothly and quickly. On the other hand, text chat provides less awareness than face-to-face or video-based communication. Under this low-awareness condition it is difficult to infer a partner's vocabulary or subtle differences in nuance, and we often have to exchange many redundant messages to explain a trivial matter.</p>
      <p>One approach to covering the lack of awareness is to attach additional devices such as video cameras or thermometers to the system. However, a text chat system with a video camera is simply a video chat system, not a text chat system any longer, and video chat imposes a higher mental load than text chat. Such devices also compromise the simplicity of the text chat system.</p>
      <p>We believe that a text chat system should be used for casual communication, which is easy to start, easy to keep up, and easy to quit. We propose a text chat system called "GaChat". This system is designed to avoid misunderstandings caused by low awareness without any additional devices; the keyboard remains the only input device. Instead, the system explicitly displays images and comments related to the users' dialogue.</p>
      <p>The remainder of this paper is organized as follows. Section 2 discusses related work on supporting communication with text chat. Section 3 describes the design of our chat system. Section 4 discusses our prototype system. Section 5 describes an example of operating the prototype. Section 6 discusses the current limitations of our prototype system. We conclude in Section 7 by discussing the future work we plan to explore.</p>
    </sec>
    <sec id="sec-4">
      <title>RELATED WORK</title>
      <p>Our system displays complementary information simultaneously with the text messages. Several studies have taken similar approaches in various situations [1, 2, 3]. Lieberman proposed a system that assists users in taking opportunities in daily work through image retrieval and annotation [4]. These studies limit the use of chat to a particular purpose and improve communication within that limited situation. Lock-on-Chat [5] is a chat system for communication at academic conferences: users can share snapshots of the slides among the participants and leave comments freely on a specific part of a snapshot. This system was actually deployed at a conference, and many participants discussed actively with it. However, our goal is not such a specific situation but daily casual use.</p>
      <p>Munemori et al. [6] proposed a pictograph ("emoticon") chat system in which only pictographs can be used, not plain text messages. Although such a system can convey universal messages regardless of language, the content of the messages is limited.</p>
      <p>While communicating with text chat systems, we often find unknown or unclear words in the messages. Most of us turn to a web search engine to resolve the question. This search action is troublesome because it requires a web browser in addition to the chat system, and whether to search at all is left to the user's own choice. Windows Live Messenger [7], one of the most popular chat systems, has already integrated a web search function: a user can search for keywords entered in the message area by pushing the "search" button instead of the "submit" button. However, the result is returned in URL form; a retrieval example is illustrated in Figure 1. To see what the URL points to, the user has to launch a browser again, and has to chat and search at the same time. From a questionnaire given to our colleagues, we found that the "retrieval (search)" button was not used positively; its existence was not even clear to them.</p>
    </sec>
    <sec id="sec-5">
      <title>METHOD</title>
    </sec>
    <sec id="sec-6">
      <title>Outline</title>
      <p>The outline of the proposed method is shown in Figure 2. A user inputs and sends a message in the same manner as in a normal text chat system. The message is then sent to the GaChat server. The server extracts the proper noun from the message and fetches an article about the noun from Wikipedia. The additional data are automatically displayed on both chat windows.</p>
      <p>If the proposed method is used, extra activity between the chat users can be suppressed. By an extra activity we mean additional conversation spent searching for an unknown word and reaching a mutual understanding of it.</p>
    </sec>
    <sec id="sec-7">
      <title>About online retrieval information</title>
      <p>Typical words and phrases that can be understood differently are proper nouns. A proper noun indicates a specific object such as the name of a person or a place: one either knows the object or does not. If a user does not have enough knowledge about the proper nouns appearing in an exchanged message, the message might not be understood properly. We also have to pay attention to differences in nuance and connotation.</p>
      <p>Proper nouns are also frequently used as search terms because they make a search more specific. If we use general nouns as search terms, the results may be enormous and ambiguous, and such information does not help mutual understanding in text chat communication. Therefore, our system uses only proper nouns to fetch the supplemental images and articles. If there are multiple proper nouns in a chat message, we choose the last one, because Japanese grammar (the authors' native language) tends to place more emphasis on the last part of a sentence.</p>
      <p>As the method of presenting information about the proper noun, we consider presenting image information and formal textual information. Supplementary information should be strongly related to the object the user intended to explain. Our system fetches those contents from Wikipedia. For instance, when one tries to explain the school "Tokyo University of Technology" to a friend who does not know it, a symbolic picture and outline information about its history and faculties may be memorable and aid understanding (Figure 3). Such well-edited contents are currently collected in encyclopedic sites.</p>
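As an illustration of the last-proper-noun rule above, here is a minimal Java sketch. It is a hypothetical simplification: the (surface, part-of-speech) pairs stand in for the output of a morphological analyzer such as Sen, and the tag name "proper-noun" is invented for this example.

```java
import java.util.List;

public class LastProperNoun {
    // A (surface, part-of-speech) pair standing in for an analyzer's token.
    record Token(String surface, String pos) {}

    // Returns the surface form of the last proper noun, or null if none:
    // the paper's rule, since Japanese tends to emphasize the sentence end.
    static String lastProperNoun(List<Token> tokens) {
        String result = null;
        for (Token t : tokens) {
            if (t.pos().equals("proper-noun")) {
                result = t.surface(); // keep overwriting: the last one wins
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Token> tokens = List.of(
            new Token("Yokohama", "proper-noun"),
            new Token("to", "particle"),
            new Token("Tokyo Tower", "proper-noun"));
        System.out.println(lastProperNoun(tokens)); // prints "Tokyo Tower"
    }
}
```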
    </sec>
    <sec id="sec-8">
      <title>Operation</title>
      <p>Our proposed system consists of a chat server and a chat client. The chat server (GaChat server) manages the conversation, analyzes the conversation text, performs the image search, and retrieves articles from the encyclopedia. The chat client (GaChat client) uses these additional data while the users talk with each other.</p>
      <p>(Figure 3: the explanation about Tokyo University of Technology, consisting of an image and textual information: Established 1947; Location Hachioji, Tokyo, JP; Website http://www.teu.ac.jp/; etc.)</p>
      <p>GaChat is a chat system with the following functions:
(1) a function to acquire a proper-noun phrase from the content of a text chat remark;
(2) a function to perform an image search for the phrase acquired in (1) and to acquire the URL of a pertinent image;
(3) a function to perform an encyclopedia retrieval for the phrase acquired in (1) and to acquire its content;
(4) a function to display the image and the encyclopedia article acquired in (2) and (3) automatically, in addition to the remark.</p>
      <p>In outline, when a sentence is input, the system extracts the proper noun and uses it as the retrieval word for both the image search and the encyclopedia retrieval. The image search yields the URL of an image, and the encyclopedia retrieval yields the title and body text of the article. As a result, the remark is shown with the user name in the message area, the image is shown in the image area, and the retrieval word, title, and article text are shown in the encyclopedia area.</p>
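The four functions above could be wired together roughly as follows. This is a hypothetical sketch: all method names are invented here, and the search and retrieval calls are stubbed with canned data, since the real system queries web services over HTTP.

```java
import java.util.Map;

public class GaChatPipeline {
    // (1) extract a proper noun from the remark (stub for morphological analysis)
    static String extractProperNoun(String message) {
        return message.contains("Tokyo Tower") ? "Tokyo Tower" : null;
    }

    // (2) image search: phrase -> URL of a pertinent image (stubbed)
    static String searchImage(String phrase) {
        return "http://images.example/" + phrase.replace(" ", "+");
    }

    // (3) encyclopedia retrieval: phrase -> article digest (stubbed)
    static String fetchArticle(String phrase) {
        return phrase + ": a communications tower in Minato, Tokyo.";
    }

    // (4) bundle the remark with the retrieved data for display
    static Map<String, String> annotate(String message) {
        String noun = extractProperNoun(message);
        if (noun == null) {
            return Map.of("message", message); // nothing to append
        }
        return Map.of("message", message,
                      "phrase", noun,
                      "image", searchImage(noun),
                      "article", fetchArticle(noun));
    }

    public static void main(String[] args) {
        System.out.println(annotate("I went to Tokyo Tower yesterday."));
    }
}
```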
    </sec>
    <sec id="sec-9">
      <title>IMPLEMENTATION OF GACHAT</title>
      <p>This chapter describes the chat system "GaChat", which automatically provides retrieval information for conversation phrases based on the proposed method.</p>
    </sec>
    <sec id="sec-10">
      <title>Environment</title>
      <p>GaChat consists of a client and a server, both implemented in Java. Sen [8] is used as the morphological analysis engine; the server extracts proper nouns from the conversation text received from the client. The Yahoo! image search API [9] is used for the extracted phrase, and the SimpleAPI Wikipedia API [10] is used for article retrieval (Figure 4).</p>
    </sec>
    <sec id="sec-11">
      <title>GaChat server</title>
      <p>Besides relaying messages as in a normal text chat, the GaChat server performs text analysis, image search, and encyclopedia retrieval. The text analysis extracts the proper noun from a remark, and the extracted phrase is used for retrieving the image and the encyclopedia article (Figure 5).</p>
      <p>We now discuss the technical details of our system. A request URL is constructed in REST form and sent to the image search web service; the phrase must be URL-encoded as UTF-8. The image URLs are obtained from the response field "/ResultSet/Result/Url". We extract the URLs of the retrieved images from the top 12 results, i.e., the number of results shown on the first page, because a study by a research company in the United States indicates that 62% of users browse only the first page of results [11].</p>
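This request-building and extraction step might look as follows in Java. The endpoint URL and parameter names here are illustrative stand-ins for the 2008-era Yahoo! image search API, and a canned XML string replaces the network response; only the "/ResultSet/Result/Url" field path is taken from the text above.

```java
import java.io.ByteArrayInputStream;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.NodeList;

public class ImageSearchSketch {
    // Build the REST request URL with the phrase URL-encoded as UTF-8.
    // The host and parameter names are assumptions for illustration.
    static String buildRequestUrl(String phrase) {
        return "http://search.example/ImageSearchService/V1/imageSearch"
             + "?query=" + URLEncoder.encode(phrase, StandardCharsets.UTF_8)
             + "&results=12"; // only the first result page (see [11])
    }

    // Extract image URLs from the response field /ResultSet/Result/Url.
    static List<String> extractUrls(String responseXml) {
        try {
            var doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                    responseXml.getBytes(StandardCharsets.UTF_8)));
            NodeList nodes = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate("/ResultSet/Result/Url", doc, XPathConstants.NODESET);
            List<String> urls = new ArrayList<>();
            for (int i = 0; i < nodes.getLength(); i++) {
                urls.add(nodes.item(i).getTextContent());
            }
            return urls;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(buildRequestUrl("Tokyo Tower"));
        String canned = "<ResultSet><Result><Url>http://a.example/1.jpg</Url>"
            + "</Result><Result><Url>http://a.example/2.jpg</Url></Result></ResultSet>";
        System.out.println(extractUrls(canned));
    }
}
```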
      <p>The encyclopedia retrieval follows the specification of the Wikipedia API and generates a request URL that likewise adds the phrase encoded as UTF-8. XML is selected as the output format and is parsed. The title of the article is obtained from the response field "/results/result/title" and the digest of the article from "/results/result/body".</p>
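The corresponding extraction for the encyclopedia response can be sketched the same way; the canned XML here is only shaped after the two response field paths named above, not the exact schema of the SimpleAPI service.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;

public class WikipediaSketch {
    // Pull the article title and digest out of the XML response using the
    // field paths /results/result/title and /results/result/body.
    static String[] titleAndBody(String responseXml) {
        try {
            var doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                    responseXml.getBytes(StandardCharsets.UTF_8)));
            var xp = XPathFactory.newInstance().newXPath();
            return new String[] {
                xp.evaluate("/results/result/title", doc),
                xp.evaluate("/results/result/body", doc)
            };
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String canned = "<results><result><title>Tokyo Tower</title>"
            + "<body>A communications tower in Minato, Tokyo.</body></result></results>";
        String[] r = titleAndBody(canned);
        System.out.println(r[0] + " / " + r[1]);
    }
}
```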
    </sec>
    <sec id="sec-12">
      <title>GaChat client</title>
      <p>We explain each function of the GaChat client through its GUI (Figure 5):
(1) registration of the chat user name and the chat room;
(2) an area displaying the registered name and chat room;
(3) input and sending of messages;
(4) an area displaying the conversational content;
(5) an area displaying the proper noun and the URL associated with that phrase;
(6) an area displaying an image related to the phrase;
(7) an area displaying the Wikipedia article related to the phrase.</p>
      <p>(Figure 6: for the remark "I went to Tokyo Tower yesterday", a Wikipedia article and an image related to Tokyo Tower are displayed.)</p>
    </sec>
    <sec id="sec-13">
      <title>EXPERIMENT</title>
      <p>We first verified the operation of GaChat. As a precondition, we confirmed that a proper noun is extracted from each remark, that the image search is performed and its result displayed for the extracted phrase, and that the Wikipedia article is retrieved and displayed. The retrieval and display of these two functions take a little time, but we found that this does not become an obstacle.</p>
      <p>We then tried actual communication with GaChat. Test users were recruited from the authors' laboratory to explore the characteristics of communication using GaChat; only text communication was used throughout. The participants were accustomed to ordinary chat software and could touch-type well enough that entering messages with the keyboard did not obstruct the communication. One example of such communication is introduced below; the conversation is shown in Figures 6 and 7.</p>
      <p>(Figure 7: for the remark "Good. Have you been to Zojo-ji Temple near Tokyo Tower?", a Wikipedia article and an image related to Zojo-ji Temple are displayed.)</p>
    </sec>
    <sec id="sec-14">
      <title>DISCUSSION</title>
      <p>One advantage of this system is that the phrase (proper noun) is extracted from the content of the conversation automatically, and the function that displays the retrieval result does not require any special work from the user.</p>
      <p>Each user might be implicitly consulting the article information in the encyclopedia simply to confirm a phrase he or she does not understand. It will be necessary to capture and investigate such behavior in the future.</p>
      <p>Because the image is taken from the top 12 results, the image information and the Wikipedia article may not match in the current GaChat implementation. For example, for the retrieval word "Ozawa", an image of an actress named Ozawa is displayed while the Wikipedia article describes the politician Ichiro Ozawa (Figure 8). A person's name is problematic when only the family name is extracted and two or more famous people share that family name. Although such mismatches must be handled as a problem, the phenomenon might also add "interest" to the communication.</p>
      <p>The conversation in the example is as follows. Horiguchi: "I went to Tokyo Tower yesterday." Kodaira: "Good. Have you been to Zojo-ji Temple near Tokyo Tower?" The phrase (proper noun) extracted from Horiguchi's remark is "Tokyo Tower", so a picture and a Wikipedia article related to Tokyo Tower are displayed. The phrase extracted from Kodaira's remark is "Zojo-ji Temple", so a picture and a Wikipedia article related to Zojo-ji Temple are displayed. Zojo-ji Temple is a Buddhist temple in the Shiba neighborhood of Minato-ku in Tokyo, Japan.</p>
      <p>Although matching the image information with the Wikipedia article may be important, the system need not serve general information that satisfies everyone. Because the concept of GaChat is to fix the topic through the conversation text and the synchronously displayed related information, the accuracy of the retrieval result is not so important. More important is stability: the image information and the Wikipedia article are always displayed alongside the conversation, and there is no problem in that respect at present.</p>
      <p>In this case, when related information on the topic was presented synchronously during the text communication, we could confirm that users reacted especially to the image. However, including this conversation example, we could not confirm at the present stage that the article information in the encyclopedia influenced the topic.</p>
      <p>We want to continue the trial evaluation and analyze further characteristics in the future. Moreover, a quantitative evaluation is necessary to judge the communication in more detail and to analyze its state; selecting and constructing the evaluation criteria is therefore important.</p>
      <p>Moreover, when the Wikipedia article itself contains an image, a method of displaying that image should be devised; it will be necessary to implement such a function and compare it with the proposed technique in the future. This time we could not confirm cases in which the Wikipedia article information was referred to frequently; we intend to continue this examination and search for application examples.</p>
      <p>(Figure 8: for the extracted keyword "Ozawa", a family name, the displayed image shows an actress named Ozawa while the Wikipedia article describes a politician named Ozawa.)</p>
    </sec>
    <sec id="sec-15">
      <title>CONCLUSIONS</title>
      <p>In this paper, we proposed GaChat. So far we have outlined how our chat system copes with low awareness without any additional devices. Our system aims at avoiding obstructions to communication and helping users reach a mutual understanding. We presented a technique that narrows down the intended meaning of a phrase in the conversation by synchronously presenting related information. Compared with approaches that transmit awareness information to solve the problem, we believe this keeps the lighthearted communication that is the advantage of text chat. The opinions received from the test users were roughly favorable.</p>
      <p>In the evaluation using the prototype, we confirmed that attention gathered on the image information displayed synchronously with the conversation text and that the topic was thereby fixed. Moreover, attention turned to the presented image information even when the user had no prior knowledge of it. This suggests that the system reduces obstructions to communication and assists the users in aligning their knowledge. We also think that information displayed synchronously with the message plays a role in changing the topic.</p>
      <p>It will be necessary to continue the trial evaluation and to carry out a quantitative evaluation in the future. We want to improve the system based on the problems clarified by the evaluation experiments, and to investigate the role that the synchronously displayed information plays in changing the topic when communication goes smoothly.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>1. <string-name><given-names>Gareth J. F.</given-names> <surname>Jones</surname></string-name> and <string-name><given-names>Peter J.</given-names> <surname>Brown</surname></string-name>. <article-title>Information access for context-aware appliances (poster session)</article-title>. In <source>SIGIR '00: Proceedings of the 23rd annual international ACM SIGIR conference on Research and development in information retrieval</source>, pages <fpage>382</fpage>-<lpage>384</lpage>, New York, NY, USA, <year>2000</year>. ACM.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>2. <string-name><given-names>J.</given-names> <surname>Budzik</surname></string-name>, <string-name><given-names>K. J.</given-names> <surname>Hammond</surname></string-name>, and <string-name><given-names>L.</given-names> <surname>Birnbaum</surname></string-name>. <article-title>Information access in context</article-title>. <source>Knowledge-Based Systems</source>, <volume>14</volume>(<issue>1-2</issue>):<fpage>37</fpage>-<lpage>53</lpage>, Mar. <year>2001</year>.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>3. <string-name><given-names>B.</given-names> <surname>Rhodes</surname></string-name>. <article-title>Using physical context for just-in-time information retrieval</article-title>. <source>IEEE Transactions on Computers</source>, <volume>52</volume>(<issue>8</issue>):<fpage>1011</fpage>-<lpage>1014</lpage>, Aug. <year>2003</year>.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>4. <string-name><given-names>Henry</given-names> <surname>Lieberman</surname></string-name>, <string-name><given-names>Elizabeth</given-names> <surname>Rosenzweig</surname></string-name>, and <string-name><given-names>Push</given-names> <surname>Singh</surname></string-name>. <article-title>Aria: An agent for annotating and retrieving images</article-title>. <source>Computer</source>, <volume>34</volume>(<issue>7</issue>):<fpage>57</fpage>-<lpage>62</lpage>, <year>2001</year>.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>5. <string-name><given-names>Takeshi</given-names> <surname>Nishida</surname></string-name> and <string-name><given-names>Takeo</given-names> <surname>Igarashi</surname></string-name>. <article-title>Lock-on-chat: Boosting anchored conversation and its operation at a technical conference</article-title>. In <source>INTERACT 2005</source>, Springer LNCS, pages <fpage>970</fpage>-<lpage>973</lpage>, <year>2005</year>.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6. <string-name><given-names>Jun</given-names> <surname>Munemori</surname></string-name>, <string-name><given-names>Shunsuke</given-names> <surname>Miyai</surname></string-name>, and <string-name><given-names>Junko</given-names> <surname>Ito</surname></string-name>. <article-title>Development and application of pictograph chat communicator</article-title>. In <source>IPSJ SIG Technical Reports</source>, 2006-GN-61, <year>Sep 2006</year> (in Japanese).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7. WindowsLiveMessenger. http://messenger.live.jp/,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>8. Sen: <article-title>Japanese morphological analysis system</article-title>. http://ultimania.org/sen/, <year>2008</year>.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>9. <article-title>Image search Web service, developer network</article-title>. http://developer.yahoo.co.jp/search/image/V1/imageSearch.html, <year>2008</year>.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10. WikipediaAPI. SimpleAPI vol.
          <volume>3</volume>
          . http://wikipedia.simpleapi.net/,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11. iProspect.
          <article-title>iprospect search engine user behavior study</article-title>
          . http://www.iprospect.com/index.htm,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>