=Paper=
{{Paper
|id=Vol-3328/workshop6
|storemode=property
|title=Expanding the design possibilities of tabletop tangible user interfaces
|pdfUrl=https://ceur-ws.org/Vol-3328/workshop6.pdf
|volume=Vol-3328
|authors=Jeremy Laviole,Quentin Gobert
|dblpUrl=https://dblp.org/rec/conf/etis/LavioleG22
}}
==Expanding the design possibilities of tabletop tangible user interfaces==
Jeremy Laviole¹, Quentin Gobert¹
¹ CATIE, Centre Aquitain des Technologies de l'Information et Electroniques, Talence, France
Abstract
Tangible interfaces can be created by multiple means: electronics, computer vision, or electromagnetic sensing. In this workshop we propose to create tangible interfaces and interactive experiences using simple color detection that can easily be integrated onto objects or used to create dedicated interactors. The goal of this workshop is to enable the emergence of new tangible interfaces, with a highlight on projection-based augmented reality. A first group created an emergency situation management mock-up, and the second a pedagogical tool to teach the impact of the Moon and Sun on the Earth.
Keywords
tangible interaction, augmented reality, interactive projection, paper interfaces, education, maker, emergency management
1. Introduction
Tangible interfaces can provide user interfaces through object manipulation. However, object identification, tracking, and instrumentation usually require a long design process and complex or costly dedicated tracking mechanisms using cameras [1].
The origin and motivation of this workshop is twofold:
• We propose a simple, low-cost, vision-based tracking system using coloured dots. This tracking system was initially inspired by DynamicLand [2], and relies on a pre-existing augmented reality library for see-through AR and projection-based AR [3].
• Over the past few years, this tracking system has proven itself for the creation of the various user interfaces described in section 4. It appears to offer new design possibilities: it has a low visual impact and a smaller footprint than fiducial markers [4] such as ARToolkitPlus [5] or ArUco [6], and it can detect large and small objects alike with simple tweaking of the detection sizes and colours.
In this paper and the workshop we present a few examples of application concepts and realisations using these markers. The main context is projection-based AR: the system uses a projector and a camera calibrated together, located above a table, turning the table into an interactive AR surface.
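When everything of interest lies on the table plane, the camera-projector coupling can be illustrated by a single planar homography between camera pixels and projector pixels. The sketch below is a minimal illustration only, not the toolkit's actual calibration: it fits a homography from four hypothetical corner correspondences with a basic direct linear transform.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points (basic DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of the stacked constraints.
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

# Hypothetical pixel correspondences: the table corners seen by the camera,
# and the same corners in the projected 1280x800 image.
cam_pts = [(102, 88), (518, 95), (530, 402), (95, 390)]
proj_pts = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = fit_homography(cam_pts, proj_pts)

def cam_to_proj(x, y):
    """Send a point detected in the camera image to projector coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With such a mapping, a sticker detected in the camera image can be highlighted by drawing at `cam_to_proj(x, y)` in the projected image, which is what makes the table surface feel interactive.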
Etis'22, Fifth European Tangible Interaction Studio – Nov. 7-10 2022, ENAC Toulouse, France
j.laviole@catie.fr (J. Laviole); q.gobert@catie.fr (Q. Gobert)
https://github.com/poqudrof (J. Laviole)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
We created different interactors, menus, physical items, and selection tokens in various experiments and a commercial application. In this workshop we want to expose these tools to the community to enlarge the design space of TUIs.
2. Organizers
The workshop is proposed by CATIE, a technology resource center. The first goal of CATIE is to enable companies to understand new technologies: between communications, marketing, and research articles, little can be applied in real-world scenarios. We also focus on research with short-term (2-5 years) potential, to provide new technologies for companies. In this context, augmented reality and tangible interfaces already play a role in current technologies as well as in future uses. The Human Factors (HF) team works on the popularization of behavioural and cognitive user studies with a dedicated web platform (Peac²h [7]) that collects metrics and surveys on user experiences.
The main contact person is Jeremy Laviole. He has a PhD in augmented reality, and created a toolkit for see-through AR and projection-based AR. This toolkit has been developed for over 10 years, with a few years' gap here and there. He created an AR company to push projection-based interfaces, and is now part of CATIE as a research engineer.
Quentin Gobert is a student at the Optics Institute of Aquitaine, and will join CATIE as an engineer. His education is physics-based, and he specialized in the computer-science branch of optics (3D rendering, AR and VR technology). During his internship, he developed a demonstrator based on PapARt, a projection-based AR technology developed by Jeremy Laviole, intended to be shown in a showroom at the Estia engineering school in Bidart, France.
3. Workshop Structure
The workshop lasted a little less than the 4 hours initially planned. Here is the plan we followed:
• 15 min: Workshop introduction, presentation of the subject, materials and previous results.¹
• 15 min: Presentation of the participants, creation of groups, and subject picking or creation.
• 1 hour: Group work (first part).
• 30 min: Coffee break.
• 30 min: Group work (second part).
• 30 min: Test implementation of the groups' work.
• 30 min: Group presentations with demonstrations.
During the workshop, each group was invited to discuss the demonstrations and ask questions about their workshop activities and the project. Here is what was demonstrated:
¹ Slides are available at: https://doc.natar.fr/doc/presentations
• Circular detection, color reading inside the circle, and discrimination of 5 colors in the CIELAB color space. HSV, RGB, and XYZ are also available.
• Circular detection at scale: an 8 mm detector finds 8 mm circular objects, while a 25 mm detector only finds bigger ones and misses the smaller. Mixing sizes within groups and lines is not possible for now.
• Tracking of dots, groups, and lines over time, with filtering using the 1€ filter [8].
• Examples with IoT devices: Puck.js and the 6TRON Z-Motion by CATIE, both using Bluetooth.
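The colour discrimination demonstrated above can be approximated by a nearest-neighbour match in CIELAB. The sketch below is a minimal, illustrative version, not the system's actual implementation; the five reference RGB values are hypothetical and would in practice come from the calibration step.

```python
import math

def srgb_to_lab(rgb):
    """Convert an 8-bit sRGB triple to CIELAB (D65 white point)."""
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    # sRGB -> XYZ, normalized by the D65 white point.
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x), f(y), f(z)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# Hypothetical reference stickers, as calibrated under the table lighting.
REFS = {name: srgb_to_lab(rgb) for name, rgb in {
    "red": (200, 40, 40), "green": (40, 160, 60), "blue": (40, 70, 190),
    "yellow": (220, 200, 50), "purple": (140, 60, 170)}.items()}

def classify(rgb):
    """Return the reference colour closest to the sample (Euclidean ΔE in Lab)."""
    lab = srgb_to_lab(rgb)
    return min(REFS, key=lambda name: math.dist(lab, REFS[name]))
```

Because CIELAB is close to perceptually uniform, a plain Euclidean distance is often enough to separate five well-chosen sticker colours; light colours remain the hard case, as the workshop later confirmed.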
4. Group work and workshop sample activities
4.1. Group work
The group work is focused on the ideation and creation of interfaces. To kickstart ideas on TUIs using this system, we introduce here some of our explorations over the past years. Groups then have the possibility to build ideas on top of the concepts presented here or to propose completely different ones.
The strengths and weaknesses of the system will be tested during the workshop, in order to check the real-world possibilities and constraints. Here is an example group-work program, for an accessibility-focused group:
• Discussions on the subject that gathered the group together: tangible UIs that are recognizable by touch.
• Proposition to recreate an existing work of physical UI shapes recognizable by touch.
• Creation of a low-fidelity mock-up on paper and cardboard.
• Test of the detection and usage constraints using the provided detection system.
• Preparation of the group presentation with the structure: problem tackled, existing project, current exploration, strengths and limitations, conclusion.
4.2. Sample activities
We present here multiple projects or pieces of design that uses tangible interfaces which are
permitted by these small markers. Stickers or colored dots can be used to create UIs, moreover
it is notable that a colored sticker can be applied on a nail to achieve finger tracking on the table.
Likewise, coloured nail polish also works. Using the same idea, markers can be put on objects
or mock-ups augmenting the visualization of the model [9]. Detection can lead to events, as
well as the absence of detection like in [10]. Here a few examples of explorations created over
the few years.
4.3. Interactive object presentation
Keywords: Communication project, electronics, technology demonstration
The project aims at a presentation of electronic cards from the 6TRON project [11], the open-source electronics design platform created at CATIE. Hence, we created a projection-based AR demonstrator for a showroom at the Estia school in Bidart, France. We placed colored stickers
Figure 1: Left: electronic card identification using colored dots, and a QR code to get more information on a phone. Right: item selection using tokens.
on the electronic card to identify it, as seen in figure 1. 3D mapping on the card makes it possible to create animations on the card at the correct size. When the camera identifies the card, we display several pieces of related information around it. The tangible interactor is a physical token on which we placed colored stickers. In a next iteration we plan to use a colored token with a logo instead of the two dots. One can manipulate the token and place it in the AR space. There are two ways one can interact with the display:
• by placing the token directly around the card, on the displayed text. This interaction enables discovering the components of the card directly around it, with co-located information.
• by placing the token on a menu next to the display. This menu is more explicit for people who want to learn about a specific component.
4.4. Scientific teaching about artificial neural networks
Keywords: Scientific popularization, Neural Networks, Hackathon
Artificial Neural Networks (ANNs) and Deep Neural Networks (DNNs) are popular machine learning tools. They can be quite obscure and hard to comprehend because of their large, partly bio-inspired structure. This example is taken from a workshop on AI [12], and was presented as a scientific mediation tool. Most elements of the ANN are physically embodied: the learning data set consists of cards, and the ANN is constructed using red tokens that have different shapes for the user yet are all detected as circles by the system. The sliders on the right in figure 2 are used for training, and the top slider switches between the ANN construction, learning, and prediction modes.
5. Post-workshop plans
The goal of this workshop is to open new possibilities for researchers and students. The
open-source project[13] has been used in many internships and group projects over the years.
Figure 2: Construction of a simple artificial neural network using a set of tokens. On the left, the 4 input values taken from an image; on the right, the three prediction classes. The middle columns are the hidden layers of the ANN. The green lines represent the grid detection.
Sticker tracking makes tangible design easier compared to electronic instrumentation or depth cameras. In our opinion, it can be a first step before using more complex tracking techniques, such as commercial software for feature-based tracking, model-based tracking, or retro-reflective infrared markers. The design of projection-based interfaces and experiences is still a recent research field, and we hope it could lead to new explorations.
6. Material
The materials available were:
• Play dough in 8 different colors.
• Paper, cardboard, scissors, post-it notes.
• Colored stickers, felt-tip pens, pencils.
The organizers brought a projector-camera system to prototype tracking and display for the groups when necessary. The possibility for workshop members to code and create applications was dropped for the sake of time and ease of use.
7. Workshop outcomes
The workshop gathered 11 people, who split into two groups. The first group worked on an emergency situation management control system. The second group created an educational tool to experiment with the Earth, Moon, Sun trio.
7.1. Group 1: Emergency management
The first group created a container to store five tangible objects. Each object had its own use, unique shape, and color. Two objects were used to position elements on a map: a bus and a helicopter, to pinpoint the locations they should go to in order to evacuate the population. The second type of object was placed on a container, and its order in the container, from top to bottom, indicated which layer to overlay on the map and its visual intensity. The goal was to visualize and understand how water would flow, to predict which roads would be blocked in the near future, and to plan an evacuation limiting the number of persons who could be stuck or endangered by the flood.
The color tracking implementation using circle detection proved difficult for light colors, which was a known limitation. The tracking is shown in figure 3. The system managed to distinguish the colors after a calibration step. However, the detected shapes were more ellipses than circles, so for real-world use we recommend treating a large object as a set of multiple small colored dots. With this approach, it is also possible to retrieve the orientation of such objects.
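To make the dot-pair recommendation concrete, here is a minimal sketch of how position and orientation can be recovered from two small dots fixed on one object. The function and coordinate convention are illustrative assumptions, not part of the workshop system; note that in image coordinates y usually grows downward, so the angle's sign may need flipping in practice.

```python
import math

def pose_from_dots(front, back):
    """Recover an object's pose from two tracked dot centers:
    position = midpoint of the dots, heading = angle of the
    back-to-front vector in degrees (counter-clockwise from +x)."""
    (fx, fy), (bx, by) = front, back
    center = ((fx + bx) / 2.0, (fy + by) / 2.0)
    heading = math.degrees(math.atan2(fy - by, fx - bx))
    return center, heading
```

A token such as the group's bus, carrying for instance a blue dot at the front and a red dot at the back, could then be both placed and oriented on the map from the two detected centers.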
Figure 3: Group 1 at the tracking phase: on the left, the objects placed to prototype the color tracking; on the right, the mock-up map. Group 2 at the demonstration phase: on the left, the Earth, Moon, Sun trio made of play dough; on the right, the paper prototype of a user interface showing the day/night cycle, tides, and seasons.
7.2. Group 2: Celestial bodies impact on Earth
The second group created a pedagogical experience to play with the Earth, Moon and Sun and see some of the impacts of their positions on Earth. The Sun is fixed in the center of the manipulation area, as seen in figure 3. When the Earth rotates on itself, the day/night cycle is updated in the top-right part (a). When the Moon rotates around the Earth, the tides change in the middle-right part (b). When the Earth moves around the Sun, the seasons change in the bottom-right part (c).
The three celestial bodies were tracked using color tracking; each had its own color, with sizes adjusted to be tracked on paper. For the tracking to work properly, the Earth was simplified from blue and green to a green sphere, and the Moon was changed from grey to blue. The experience could be achieved with this implementation for (b) and (c). However, the rotation of the Earth on itself could not be tracked this way. Using our system, there are two simple ways to achieve it: either using small IoT devices with an accelerometer, or using a set of dots that enables rotation tracking.
References
[1] E. Molla, V. Lepetit, Augmented reality for board games, in: 2010 IEEE International
Symposium on Mixed and Augmented Reality, IEEE, 2010, pp. 253–254.
[2] B. Victor, J. Horowitz, L. Iannini, O. Rizwan, Dynamicland, Retrieved November 23 (2017)
2018.
[3] J. Laviole, M. Hachet, PapARt: interactive 3D graphics and multi-touch augmented paper for artistic creation, in: 2012 IEEE Symposium on 3D User Interfaces (3DUI), IEEE, 2012, pp. 3–6.
[4] M. Billinghurst, H. Kato, I. Poupyrev, et al., Tangible augmented reality, ACM SIGGRAPH Asia 7 (2008) 1–10.
[5] D. Wagner, D. Schmalstieg, ARToolKitPlus for pose tracking on mobile devices, 2007.
[6] F. J. Romero-Ramirez, R. Muñoz-Salinas, R. Medina-Carnicer, Speeded up detection of squared fiducial markers, Image and Vision Computing 76 (2018) 38–47.
[7] CATIE, Peac²h platform for the integration of human factors, 2022. URL: https://peac2h.io.
[8] G. Casiez, N. Roussel, D. Vogel, 1€ filter: a simple speed-based low-pass filter for noisy
input in interactive systems, in: Proceedings of the SIGCHI Conference on Human Factors
in Computing Systems, 2012, pp. 2527–2530.
[9] A. Gillet, M. Sanner, D. Stoffler, A. Olson, Tangible interfaces for structural molecular
biology, Structure 13 (2005) 483–491.
[10] G. A. Lee, M. Billinghurst, G. J. Kim, Occlusion based interaction methods for tangible
augmented reality environments, in: Proceedings of the 2004 ACM SIGGRAPH interna-
tional conference on Virtual Reality continuum and its applications in industry, 2004, pp.
419–426.
[11] CATIE, 6TRON is a development environment of professional solutions in the field of
industrial internet of things, 2016. URL: https://6tron.io.
[12] Hack 1 Cerveau, Hackathon on AI and the brain (Le hackathon sur le cerveau et l'IA), 2017. URL: https://mindlabdx.github.io/hack1cerveau/.
[13] J. Laviole, PapARt: Paper Augmented Reality Toolkit, https://github.com/natar-io/PapARt, 2022.