<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Augmented Reality Cooking System Using Tabletop Display Interface</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Han-byul Jang</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jang-woon Kim</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Chil-woo Lee</string-name>
        </contrib>
      </contrib-group>
      <pub-date>
        <year>2007</year>
      </pub-date>
      <abstract>
<p>Cooking is a topic that interests everyone, since food is essential to life. Everyone wants to try cooking, but fire and knives are dangerous for children to handle, so it is difficult for children to cook on their own. In this paper, we describe the development of a cooking system that gives children an imaginary cooking experience. Our cooking system combines two major technologies, augmented reality and a tabletop display, through which the user can interact with the system. Augmented reality gives the user a more engaging and accessible virtual cooking environment, while the multi-touch tabletop display provides effective interaction between the user and the cooking system.</p>
      </abstract>
      <kwd-group>
        <kwd>Augmented Reality</kwd>
        <kwd>Cook</kwd>
        <kwd>Education</kwd>
        <kwd>Virtual Reality</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION</title>
<p>Cooking excites interest in everyone, as it is the act of preparing the food that is essential to life. People enjoy delicious food and want to cook with their own hands. For children, however, handling fire or knives while cooking is dangerous, so it is difficult for them to cook food themselves. The augmented reality cooking system using a tabletop display presented here is a virtual cooking application for children. The system applies augmented reality technology to overlay 3D graphic objects on real-world images, so it is more realistic than an ordinary 3D virtual environment. It also uses a tabletop display interface that supports multi-touch and multiple users, which allows us to provide an intuitive interface between the users and the computer.</p>
    </sec>
    <sec id="sec-2">
      <title>II. COMPOSITION OF COOKING SYSTEM</title>
      <sec id="sec-2-1">
        <title>A. Configuration of whole system</title>
<p>The hardware of the system consists of a miniature kitchen set and a tabletop display. The miniature kitchen is the place to put cards representing kitchen utensils, ingredients, and so on. Each card carries a marker used for augmented reality, which our system recognizes in order to render the augmented reality scene. The tabletop display shows the augmented reality image of the cooking system on its screen. The user can interact with both the tabletop display and the miniature kitchen set. Figure 1 shows an overview of our cooking system.</p>
<p>Figure 1. Overview of the cooking system.</p>
      </sec>
      <sec id="sec-2-2">
        <title>B. Interaction system</title>
        <p>
          In a typical augmented reality system, interaction usually depends on markers, which generally serve as buttons or manipulators. “3D Pottery Modeling in Augmented Reality” [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] is such a case. However, a marker-dependent system is affected by the camera view, and the user is restricted to operating the system through real objects such as markers. Moreover, in our system the state of the food changes continuously as the cooking progresses, so it is inappropriate to drive the interaction through the markers that represent the cooking ingredients. To reduce these problems, we use a tabletop display, which lessens the dependency on markers and lets the user directly access and intuitively manipulate the augmented reality objects.
        </p>
        <p>
          A tabletop display can provide rich and intelligent interaction through analysis of touch input [
          <xref ref-type="bibr" rid="ref3 ref4 ref5">3, 4, 5</xref>
          ]. We analyze gestures and build an interaction system that exploits the particular qualities of the tabletop display: the system analyzes and distinguishes the meaning of each hand gesture on the tabletop display and derives a suitable result. The gesture analysis has three main steps: hand gesture, command, and event. A hand gesture is the input generated by a fingertip touching the tabletop screen. A command is the specific meaning obtained by analyzing the hand gesture. An event is the result produced by interpreting the command in our cooking system. This analysis has two advantages. First, one hand gesture can serve various interaction interfaces, because the same gesture can be interpreted as different commands according to the situation. Second, we can compose the commands organically and thereby build a suitable interaction system.
        </p>
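        <p>As an illustration only, the three analysis steps can be sketched as a small lookup pipeline. All names here (Gesture, CONTEXT_RULES, analyze) and the gesture vocabulary are our assumptions for the sketch, not the system's actual implementation; the point is that the same gesture kind maps to different commands depending on the touched object.</p>

```python
# Sketch of the three-step analysis: gesture -> command -> event.
# Names and the gesture vocabulary are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str    # e.g. "tap", "up_down", "circle"
    target: str  # object under the fingertip, e.g. "egg", "fry_pan"

# Step 1 -> 2: the same gesture kind yields different commands
# depending on which object is touched (the "situation" in the text).
CONTEXT_RULES = {
    ("up_down", "egg"): "break_egg",
    ("tap", "bowl"): "mix_materials",
    ("circle", "fry_pan"): "turn_on_fire",
}

# Step 2 -> 3: each command produces an event in the cooking system.
EVENTS = {
    "break_egg": "egg is broken",
    "mix_materials": "egg and milk are mixed",
    "turn_on_fire": "bread is roasted",
}

def analyze(gesture):
    """Interpret a touch gesture as a command, then emit its event."""
    command = CONTEXT_RULES.get((gesture.kind, gesture.target))
    if command is None:
        return None  # unrecognized gesture in this situation
    return EVENTS[command]

print(analyze(Gesture("up_down", "egg")))  # egg is broken
print(analyze(Gesture("tap", "bowl")))
```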
      </sec>
    </sec>
    <sec id="sec-3">
      <title>III. FRENCH TOAST COOKING SYSTEM</title>
<p>In this paper, we select French toast from among the many possible recipes and build the system around it, because French toast is a popular food for children and its preparation is simple: bread soaked in a mixture of egg and milk is simply roasted. We therefore judged French toast to be suitable for our research. Figure 2 shows French toast.</p>
      <sec id="sec-3-1">
        <title>A. Kitchen stuff and ingredients of cooking</title>
<p>Table 1 shows the kitchen stuff and ingredients for French toast.</p>
        <p>Table 1. Kitchen stuff and ingredients for French toast.
Main ingredients: bread, egg, milk, salt, strawberry jam.
Sub-ingredients: sugar powder, oil, butter.
Kitchen stuff: frying pan, bowl.</p>
<p>To keep the process simple, we omit the less important materials: the sugar powder and the butter that is put on the frying pan. We use only the essential materials, namely bread, egg, and milk. Finally, we create markers to represent these materials.</p>
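        <p>Conceptually, each recognized marker simply names one of the three retained materials. The following lookup is purely illustrative; the marker IDs are assumptions, not the system's actual marker set.</p>

```python
# Illustrative mapping from recognized marker IDs to the three
# materials kept in the simplified recipe (IDs are assumptions).
MARKER_TO_MATERIAL = {
    0: "bread",
    1: "egg",
    2: "milk",
}

def material_for(marker_id):
    """Return the material a detected marker represents."""
    return MARKER_TO_MATERIAL.get(marker_id, "unknown marker")

print(material_for(1))  # egg
```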
      </sec>
      <sec id="sec-3-2">
        <title>B. French toast manufacture by interaction</title>
<p>French toast is made by roasting bread that has been soaked in a mixture of egg and milk. The necessary hand gestures, commands, and events are shown in Table 2. The “Break egg” command generates an event that changes the state of the egg object to “broken”. The “Material mixture” command generates an event that changes the state of the objects to “mixed”. The “Fire on” command generates an event that changes the state of the bread on the frying pan to “roasted”. The cooking system is driven by these gesture commands.</p>
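        <p>The command-driven state changes described above behave like a small state machine over the food objects. The sketch below is illustrative only; the class, command names, and state strings are our assumptions rather than the system's actual code.</p>

```python
# Minimal state-machine sketch of the command -> state changes:
# break_egg -> "broken", mix_materials -> "mixed", fire_on -> "roasted".
class FoodObject:
    def __init__(self, name):
        self.name = name
        self.state = "raw"  # every ingredient starts raw

def apply_command(command, objects):
    """Change object states according to the gesture command."""
    if command == "break_egg":
        objects["egg"].state = "broken"
    elif command == "mix_materials":
        objects["egg"].state = "mixed"
        objects["milk"].state = "mixed"
    elif command == "fire_on":
        objects["bread"].state = "roasted"

# Run the French toast sequence from Table 2.
kitchen = {n: FoodObject(n) for n in ("bread", "egg", "milk")}
apply_command("break_egg", kitchen)
apply_command("mix_materials", kitchen)
apply_command("fire_on", kitchen)
print(kitchen["bread"].state)  # roasted
```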
<p>Table 2. Necessary hand gestures, commands, and events.
Gesture: select the egg, then move it up and down. Command: Break egg. Event: the egg is broken.
Gesture: touch the bowl containing the materials several times. Command: Material mixture. Event: the egg and milk are mixed.
Gesture: one fingertip moves in a circle after two fingertips touch the frying pan. Command: Turn on fire. Event: the fire is on, and the bread is roasted.</p>
      </sec>
    </sec>
    <sec id="sec-10">
      <title>IV. CONCLUSION</title>
<p>Our cooking system takes advantage of augmented reality technology to let the user experience the cooking process. By using a tabletop display interface, we can provide intuitive interaction. Children can experience cooking in imagination, since our cooking system is not dangerous to use. At present we have developed a simple cooking system; in future research we will develop more intuitive cooking gestures and handle various dishes.</p>
    </sec>
    <sec id="sec-11">
      <title>ACKNOWLEDGEMENTS</title>
      <p>This research was accomplished with research funding from the C.N.U Culture Technology Institute, supported by MCT and KOCCA, Korea.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] Emerging Technologies section, ACM SIGGRAPH 2006 Full Conference.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Gabjong</given-names>
            <surname>Han</surname>
          </string-name>
          , Jane Hwang, Seungmoon Choi, Gerard Jounghyun Kim.
          <article-title>“3D Pottery Modeling in Augmented Reality”</article-title>
          ,
          <source>HCI</source>
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Dietz</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Leigh</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <year>2001</year>
          .
          <article-title>DiamondTouch: A Multi-User Touch Technology</article-title>
          .
          <source>In Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology (Orlando, Florida, November 11 - 14</source>
          ,
          <year>2001</year>
          ).
          <source>UIST '01</source>
          . ACM Press, New York, NY,
          <fpage>219</fpage>
          -
          <lpage>226</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Han</surname>
            <given-names>J. Y.</given-names>
          </string-name>
          <article-title>Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection</article-title>
          .
          <source>In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology</source>
          , ACM Press, New York, NY,
          <year>2005</year>
          ,
          <fpage>115</fpage>
          -
          <lpage>118</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Matsushita</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iida</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ohguro</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shirai</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kakehi</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Naemura</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          <year>2004</year>
          .
          <article-title>Lumisight Table: A Face-to-face Collaboration Support System That Optimizes Direction of Projected Information to Each Stakeholder</article-title>
          .
          <source>In Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work</source>
          (Chicago, Illinois, USA, November 06 - 10,
          <year>2004</year>
          ), CSCW '04
          . ACM Press, New York, NY,
          <fpage>274</fpage>
          -
          <lpage>283</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>