<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Expanding the design possibilities of tabletop tangible user interfaces</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Jeremy Laviole</string-name>
          <email>j.laviole@catie.fr</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Quentin Gobert</string-name>
          <email>q.gobert@catie.fr</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>CATIE, Centre Aquitain des Technologies de l'Information et Electroniques</institution>
          ,
          <addr-line>Talence</addr-line>
          ,
          <country country="FR">France</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The creation of tangible interfaces can be achieved by multiple means: using electronics, computer vision, or electromagnetic sensing. In this workshop we propose to create tangible interfaces and interactive experiences using simple colour detection that can be easily integrated onto objects or used for the creation of dedicated interactors. The goal of this workshop is to enable the emergence of new tangible interfaces, with a highlight on projection-based augmented reality. A first group created an emergency situation management mock-up, and the second a pedagogical tool to teach the impact of the Moon and Sun on Earth.</p>
      </abstract>
      <kwd-group>
        <kwd>tangible interaction</kwd>
        <kwd>augmented reality</kwd>
        <kwd>interactive projection</kwd>
        <kwd>paper interfaces</kwd>
        <kwd>education</kwd>
        <kwd>maker</kwd>
      </kwd-group>
      <conference>
        <conf-date>7-10 November 2022</conf-date>
        <conf-name>ETIS'22, Fifth European Tangible Interaction Studio (CEUR Workshop Proceedings)</conf-name>
        <conf-loc>ENAC, Toulouse, France</conf-loc>
      </conference>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Tangible interfaces can provide user interfaces through object manipulation. However, object
identification, tracking, and instrumentation usually require a long design process and complex or
costly dedicated tracking mechanisms using cameras [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        The origin and motivation of this workshop are twofold:
• We propose a simple, low-cost vision-based tracking system using coloured dots. This
tracking system was initially inspired by Dynamicland [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], and relies on a pre-existing
augmented reality library for see-through AR and projection-based AR [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
• Over the past few years, this tracking system has proven itself in the creation of the various
user interfaces described in section 4. New design possibilities seem to be offered by
this tracking system. It has a low visual impact and a smaller size compared to
markers [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] such as ARToolkitPlus [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] or ArUco [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], and it can detect large and small objects alike
with simple tweaking of the detection sizes and colours.
      </p>
      <p>In this paper and the workshop, we present a few different examples of application concepts
and realisations using this kind of marker. The main context is projection-based AR. The
system uses a projector and a camera calibrated together, located above a table,
creating an interactive AR surface on the table.</p>
      <p>We created different interactors, menus, physical items, and selection tokens in various
experiments and a commercial application. In this workshop we want to expose these tools to the
community to enlarge the design space of TUIs.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Organizers</title>
      <p>
        The workshop is proposed by CATIE, a technology resource centre. The first goal
of CATIE is to enable companies to understand new technologies: between communications,
marketing, and research articles, little can be applied in real-world scenarios. We also focus on
research with short-term (2-5 years) potential, in order to provide new technologies to
companies. In this context, augmented reality and tangible interfaces already play a role in current
technologies as well as in future uses. The Human Factors (HF) team works on the popularization of
behavioural and cognitive user studies with a dedicated web platform (Peac²h [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]) that collects
metrics and surveys on user experience.
      </p>
      <p>The main contact person is Jeremy Laviole. He has a PhD in Augmented Reality and created a
toolkit for see-through AR and projection-based AR. This toolkit has been developed for over 10
years, with a few years' gap here and there. He created an AR company to push projection-based
interfaces, and is now part of CATIE as a research engineer.</p>
      <p>Quentin Gobert is a student at the Optics Institute of Aquitaine, and will join CATIE as an
engineer. His education is physics-based, and he specialized in the computer-science branch of optics
(3D rendering, AR, and VR technology). During his internship, he developed a demonstrator
based on PapARt, a projection-based AR technology developed by Jeremy Laviole; the demonstrator's
goal is to be shown in a showroom at the Estia engineering school in Bidart, France.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Workshop Structure</title>
      <p>The workshop lasted a little less than the 4 hours initially planned. Here is the plan we
followed:
• 15 min: Workshop introduction; presentation of the subject, materials, and previous
results.
• 15 min: Presentation of the participants, creation of groups, and subject picking or
creation.
• 1 h: Group work, first part.
• 30 min: Coffee break.
• 30 min: Group work, second part.
• 30 min: Test implementation of the groups' work.
• 30 min: Group presentations with demonstrations.</p>
      <p>
        During the workshop, each group was invited to discuss around the demonstrations and ask
questions about their workshop activities and the project. Slides are available at
https://doc.natar.fr/doc/presentations. Here is what was demonstrated:
• Circular detection, colour reading inside the circle, and discrimination of 5 colours in the CIELAB
colour space. HSV, RGB, and XYZ are also available.
• Circular detection at scale: an 8 mm detector will get 8 mm circular objects, and a 25 mm one
will only get bigger ones, not seeing the smaller ones. It is impossible to mix sizes in groups and
lines for now.
• Tracking of dots, groups, and lines over time, and filtering with the 1€ filter [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ].
• Example with IoT devices: Puck.js and the 6TRON Z-Motion by CATIE, both using Bluetooth.
      </p>
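      <p>The colour-discrimination step above can be sketched as nearest-reference classification in CIELAB space. This is an illustrative sketch, not the workshop's implementation: the reference L*a*b* values and colour names are hypothetical stand-ins for values that a calibration step would produce.</p>
      <preformat>
```python
import math

# Hypothetical CIELAB reference values for five sticker colours; a real
# deployment would fill these in from a calibration step under the projector.
REFERENCES = {
    "red":    (54.0, 81.0, 70.0),
    "green":  (88.0, -79.0, 81.0),
    "blue":   (32.0, 79.0, -108.0),
    "yellow": (97.0, -16.0, 93.0),
    "purple": (30.0, 58.0, -36.0),
}

def classify_lab(lab):
    """Return the reference colour closest to a measured (L*, a*, b*) triple,
    using the Delta E 1976 metric (plain Euclidean distance in CIELAB)."""
    return min(REFERENCES, key=lambda name: math.dist(lab, REFERENCES[name]))
```
      </preformat>
      <p>Working in CIELAB rather than RGB makes the distance roughly perceptually uniform, which is what allows a handful of colours to be discriminated reliably under varying lighting.</p>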
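      <p>The 1€ filter [8] used for dot tracking is an adaptive low-pass filter whose cutoff frequency grows with the estimated speed: a low cutoff removes jitter while a dot is still, and a high cutoff reduces lag when it moves fast. A minimal one-dimensional sketch following the published algorithm; the default parameter values here are illustrative, not the toolkit's:</p>
      <preformat>
```python
import math

def smoothing_factor(cutoff, dt):
    """Exponential smoothing factor for a given cutoff frequency (Hz)."""
    r = 2.0 * math.pi * cutoff * dt
    return r / (r + 1.0)

class OneEuroFilter:
    def __init__(self, min_cutoff=1.0, beta=0.0, d_cutoff=1.0):
        self.min_cutoff = min_cutoff  # baseline cutoff (jitter control)
        self.beta = beta              # speed coefficient (lag control)
        self.d_cutoff = d_cutoff      # cutoff for the derivative estimate
        self.x_prev = None
        self.dx_prev = 0.0

    def __call__(self, x, dt):
        if self.x_prev is None:       # first sample: nothing to smooth yet
            self.x_prev = x
            return x
        # estimate the speed, smoothed with a fixed cutoff
        dx = (x - self.x_prev) / dt
        a_d = smoothing_factor(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # cutoff grows with speed: less jitter when slow, less lag when fast
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = smoothing_factor(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```
      </preformat>
      <p>Each detected dot coordinate gets its own filter instance, called once per camera frame with the elapsed time since the previous frame.</p>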
    </sec>
    <sec id="sec-4">
      <title>4. Group work and workshop sample activities</title>
      <sec id="sec-4-1">
        <title>4.1. Group work</title>
        <p>The group work focused on the ideation and creation of interfaces. In order to kickstart
ideas on TUIs using this system, we introduce here some of our explorations from the past years.
Groups then have the possibility to build ideas on top of the concepts presented here or to propose
completely different ones.</p>
        <p>The strengths and weaknesses of the system will be tested during the workshop, in order to
check the real-world possibilities and constraints. Here is an example group work program, for
an accessibility-focused group:
• Discussions on the subject that gathered the group together: tangible UIs that are
recognizable by touch.
• Proposition to recreate an existing work of physical UI shapes recognizable by touch.
• Creation of a low-fidelity mock-up on paper and cardboard.
• Test of the detection and usage constraints using the provided detection system.
• Preparation of the group presentation with the structure: problem tackled, existing
project, current exploration, strengths and limitations, conclusion.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Sample activities</title>
        <p>
          We present here multiple projects and pieces of design that use tangible interfaces
permitted by these small markers. Stickers or coloured dots can be used to create UIs; notably,
a coloured sticker can be applied on a fingernail to achieve finger tracking on the table.
Likewise, coloured nail polish also works. Using the same idea, markers can be put on objects
or mock-ups, augmenting the visualization of the model [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ]. Detection can lead to events, as
can the absence of detection, as in [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. Here are a few examples of explorations created over
the past few years.
        </p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Interactive object presentation</title>
        <p>Keywords: Communication project, electronics, technology demonstration</p>
        <p>
          The project aims at presenting electronic boards from the 6TRON project [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], the open-source
electronics design platform created at CATIE. We created a projection-based AR
demonstrator for a showroom at the Estia school in Bidart, France. We placed coloured stickers
on the electronic board to identify it, as seen in figure 1. 3D mapping on the board makes it possible
to create animations on it at the correct size. When the board is identified by the camera, we display
several related pieces of information around it. The tangible interactor is a physical token on which we
placed coloured stickers. In a future iteration we plan to use a coloured token with a logo instead of
the two dots. One can manipulate the token and place it in the AR space. There are two ways
one can interact with the display:
• by placing the token directly around the board, on the displayed text. This interaction
enables the discovery of the board's components directly around it, with co-located
information.
• by placing the token on a menu next to the display. This menu is more explicit for people
who want to know about a specific component.
        </p>
      </sec>
      <sec id="sec-4-4">
        <title>4.4. Scientific teaching about artificial neural networks</title>
        <p>Keywords: Scientific popularization, Neural Networks, Hackathon</p>
        <p>
          Artificial Neural Networks (ANNs) and Deep Neural Networks (DNNs) are popular machine
learning tools. They can be quite obscure and hard to comprehend because of their large,
partly bio-inspired structure. This example is taken from a workshop on AI [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ], and
was presented as a scientific mediation tool. Most of the elements of the ANN are physically
embedded. The learning data set consists of cards, and the ANN is constructed using red tokens
that have different shapes for the user yet are all detected as circular by the system. The sliders
on the right in figure 2 are used for training, and the top slider switches between the ANN
construction, learning, and prediction modes.
        </p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Post-workshop plans</title>
      <p>
        The goal of this workshop is to open new possibilities for researchers and students. The
open-source project [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] has been used in many internships and group projects over the years.
      </p>
      <p>Sticker tracking enables easy tangible design compared to electronic instrumentation or
depth cameras. In our opinion, it can be a first step before using more complex tracking
techniques: commercial software for feature-based tracking, model-based tracking, or
retro-reflective infrared markers. The design of projection-based interfaces and experiences is
still a recent research field, and we hope it can lead to new explorations.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Material</title>
      <p>The materials available were:
• Play dough in 8 different colours.
• Paper, cardboard, scissors, post-it notes.
• Coloured stickers, felt-tip pens, pencils.</p>
      <p>The organizers brought a projector-camera system to prototype tracking and display for the
groups if necessary. The possibility for workshop members to code and create applications was
taken out for the sake of time and ease of use.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Workshop outcomes</title>
      <p>The workshop gathered 11 people, who split into two groups. The
first group worked on an emergency situation management control system. The second group
created an educational tool to experiment with the Earth-Moon-Sun trio.</p>
      <sec id="sec-7-1">
        <title>7.1. Group 1: Emergency management</title>
        <p>The first group created a container to store five tangible objects. Each object had its own use,
unique shape, and colour. Two objects were used to position elements on a map: a bus and a
helicopter to pinpoint the locations they should go to in order to evacuate the population. The
second type of object was placed in a container, and their order in the container, from top
to bottom, indicated which layers to overlay on the map and their visual intensity. The goal was to
visualize and understand how water will flow, to predict which roads will be blocked in the near
future, and to plan an evacuation limiting the number of people who could be stuck or endangered
by the flood.</p>
        <p>The colour tracking implementation using circle detection proved difficult for light colours,
which was a known limitation. The tracking is shown in figure 3. The system managed to
distinguish the colours after a calibration step. However, the detected shapes were more ellipses
than circles, so for real-world use we recommend considering a large object as a set of multiple
small coloured objects. Using this, it is also possible to retrieve the orientation of these objects.</p>
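        <p>As a sketch of the orientation idea above: with two dots of distinguishable colours stuck on an object, the in-plane orientation follows directly from their detected centres. This is an illustrative helper, not part of the toolkit; the dot roles and coordinate convention are assumptions.</p>
        <preformat>
```python
import math

def token_orientation(front_dot, back_dot):
    """In-plane orientation (degrees, counter-clockwise from the x-axis)
    of an object carrying two detected dots, given their (x, y) centres
    in table coordinates."""
    dx = front_dot[0] - back_dot[0]
    dy = front_dot[1] - back_dot[1]
    return math.degrees(math.atan2(dy, dx))
```
        </preformat>
        <p>With more than two dots, an averaged estimate over dot pairs would also tolerate one occluded or misdetected dot.</p>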
      </sec>
      <sec id="sec-7-2">
        <title>7.2. Group 2: Celestial bodies' impact on Earth</title>
        <p>The second group created a pedagogical experience to play with the Earth, Moon, and Sun
and see some of the impacts of their locations on Earth. The Sun is fixed in the center of the
manipulation area, as seen in Figure 3. When the Earth rotates on itself, the day/night cycle
is updated on the top right (a). When the Moon rotates around the Earth, the tides change in
the middle right part (b). When the Earth moves around the Sun, the seasons change on the
bottom right part (c).</p>
        <p>The three celestial bodies were tracked using colour tracking; each had its own colour and size,
adjusted to be tracked on paper. In order for it to work properly, the Earth was simplified from
blue and green to a green sphere, and the Moon was changed from grey to blue. The experience
could be achieved with this implementation for (b) and (c). However, the rotation of the Earth on
itself could not be achieved like this. Using our system, there are two simple ways to create it:
either using small IoT devices with an accelerometer, or using a set of dots that enables
rotation tracking.</p>
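        <p>The dot-set option can be sketched as a least-squares rigid rotation estimate: with the dots matched by colour between a reference layout and the current detection, the in-plane angle falls out of two sums. This is an illustrative 2D Kabsch-style computation, not the toolkit's implementation; the layouts here are hypothetical.</p>
        <preformat>
```python
import math

def rigid_rotation_2d(ref, cur):
    """Estimate the rotation (degrees) mapping a reference dot layout onto
    the currently detected layout; ref and cur are matched lists of (x, y)."""
    n = len(ref)
    # center both point sets on their centroids
    rcx = sum(p[0] for p in ref) / n
    rcy = sum(p[1] for p in ref) / n
    ccx = sum(p[0] for p in cur) / n
    ccy = sum(p[1] for p in cur) / n
    s_cos = s_sin = 0.0
    for (rx, ry), (cx, cy) in zip(ref, cur):
        rx -= rcx; ry -= rcy
        cx -= ccx; cy -= ccy
        s_cos += rx * cx + ry * cy   # dot-product term, proportional to cos(theta)
        s_sin += rx * cy - ry * cx   # cross-product term, proportional to sin(theta)
    return math.degrees(math.atan2(s_sin, s_cos))
```
        </preformat>
        <p>Three dots of distinct colours around the Earth token would be enough to track its spin continuously, without any embedded electronics.</p>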
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>E.</given-names>
            <surname>Molla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Lepetit</surname>
          </string-name>
          ,
          <article-title>Augmented reality for board games</article-title>
          ,
          <source>in: 2010 IEEE International Symposium on Mixed and Augmented Reality</source>
          , IEEE,
          <year>2010</year>
          , pp.
          <fpage>253</fpage>
          -
          <lpage>254</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>B.</given-names>
            <surname>Victor</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Horowitz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Iannini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Rizwan</surname>
          </string-name>
          , Dynamicland,
          <source>Retrieved November</source>
          <volume>23</volume>
          (
          <year>2017</year>
          )
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J.</given-names>
            <surname>Laviole</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Hachet</surname>
          </string-name>
          ,
          <article-title>Papart: interactive 3d graphics and multi-touch augmented paper for artistic creation</article-title>
          ,
          <source>in: 2012 IEEE symposium on 3D user interfaces (3DUI)</source>
          , IEEE,
          <year>2012</year>
          , pp.
          <fpage>3</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>M.</given-names>
            <surname>Billinghurst</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Kato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Poupyrev</surname>
          </string-name>
          , et al.,
          <source>Tangible augmented reality, Acm siggraph asia 7</source>
          (
          <year>2008</year>
          )
          <fpage>1</fpage>
          -
          <lpage>10</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>D.</given-names>
            <surname>Wagner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Schmalstieg</surname>
          </string-name>
          ,
          <article-title>Artoolkitplus for pose tracking on mobile devices</article-title>
          (
          <year>2007</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>F. J.</given-names>
            <surname>Romero-Ramirez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Muñoz-Salinas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Medina-Carnicer</surname>
          </string-name>
          ,
          <article-title>Speeded up detection of squared fiducial markers</article-title>
          ,
          <source>Image and vision Computing</source>
          <volume>76</volume>
          (
          <year>2018</year>
          )
          <fpage>38</fpage>
          -
          <lpage>47</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>CATIE</string-name>
          ,
          <article-title>Peac²h platform for the integration of human factor, 2022</article-title>
          . URL: https://peac2h.io.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>G.</given-names>
            <surname>Casiez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Roussel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Vogel</surname>
          </string-name>
          ,
          <article-title>1€ filter: a simple speed-based low-pass filter for noisy input in interactive systems</article-title>
          ,
          <source>in: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems</source>
          ,
          <year>2012</year>
          , pp.
          <fpage>2527</fpage>
          -
          <lpage>2530</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>A.</given-names>
            <surname>Gillet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sanner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Stofler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Olson</surname>
          </string-name>
          ,
          <article-title>Tangible interfaces for structural molecular biology</article-title>
          ,
          <source>Structure</source>
          <volume>13</volume>
          (
          <year>2005</year>
          )
          <fpage>483</fpage>
          -
          <lpage>491</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>G. A.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Billinghurst</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <article-title>Occlusion based interaction methods for tangible augmented reality environments</article-title>
          ,
          <source>in: Proceedings of the 2004 ACM SIGGRAPH international conference on Virtual Reality continuum and its applications in industry, 2004</source>
          , pp.
          <fpage>419</fpage>
          -
          <lpage>426</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <article-title>CATIE, 6TRON is a development environment of professional solutions in the field of industrial internet of things, 2016</article-title>
          . URL: https://6tron.io.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>H.</given-names>
            <surname>Cerveau</surname>
          </string-name>
          ,
          <article-title>Le hackathon sur le cerveau et l'ia. hackathon on ai and brain</article-title>
          .,
          <year>2017</year>
          . URL: https://mindlabdx.github.io/hack1cerveau/.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>J.</given-names>
            <surname>Laviole</surname>
          </string-name>
          , Papart: Paper augmented reality toolkit, https://github.com/natar-io/PapARt,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>