=Paper=
{{Paper
|id=Vol-260/paper-10
|storemode=property
|title=The User Interface of Button Type for Stereo Video-See-Through Device
|pdfUrl=https://ceur-ws.org/Vol-260/paper10.pdf
|volume=Vol-260
|dblpUrl=https://dblp.org/rec/conf/isuvr/ChoiS07
}}
==The User Interface of Button Type for Stereo Video-See-Through Device==
Youngju Choi and Yongduek Seo
Abstract—This paper proposes a user interface for a see-through system environment that displays the images from two different cameras, so that even an ordinary user can easily control computer systems and other processes. To this end, we use AR technology to synthesize a virtual button into the image captured by the cameras in real time. We locate the hand position in the image to judge whether the finger selects the button, and the result of this judgment is visualized by changing the color of the button. The user can easily interact with the system by watching the screen, moving her fingers in the air, and selecting the virtual button on the screen.

Index Terms—AR, HCI (Human Computer Interaction), See-Through, Synthesis, Plane Projective Transformation
1 Introduction

This paper proposes a button-type user interface for a see-through device, a class of device that is attracting interest as a new computer user interface. The interface is based on AR (Augmented Reality): virtual buttons are synthesized into the display of a device equipped with two cameras. Through this, the user perceives the buttons as floating in the air and can select them, upon which the corresponding action event is executed. In this paper, we change the color of a button to express visually whether it is selected. Moreover, since the device is equipped with two cameras like human eyes, it produces an effect similar to using depth information about objects; consequently, the impression is more realistic and accurate than with a single camera. [Picture1] shows the structure of the system; an arrow substitutes for a finger.

Picture1. Left) The structure of the test environment, Right) Homography relationship

The system captures images (image-i, i ∈ {1, 2}) of the world coordinate frame (world plane) with the two cameras and draws a virtual red button on top of the images. If a finger lies in a row that contains a specific button but the button is not selected, the button becomes a candidate button and turns blue; if the finger is actually on the button, it becomes the selected button and turns green. If neither case applies, we regard the button as being in the ready state and set it to red. From these colors, the user can tell which button is currently selected.

2 Main

The overall process is composed of a preprocessing section, a waiting section, and a main section.

2.1 Preprocess

The preprocess has two steps. First, it analyzes a sample hand image taken with the cameras and computes its mean and standard deviation. These values characterize the hand color and become the reference data for finding the hand region in the captured images. To reduce the influence of lighting, this paper considers only hue and saturation in the HSV space and computes the mean (h_m, s_m) and standard deviation (h_σ, s_σ) of the sample image. Under the assumption that the hand color follows a normal distribution, we regard a pixel (x, y) as part of the hand-color region if it satisfies

\[
\frac{\lvert h_m^i - h_{x,y}^i \rvert}{h_\sigma^i} < 2.5
\quad\text{and}\quad
\frac{\lvert s_m^i - s_{x,y}^i \rvert}{s_\sigma^i} < 2.5,
\qquad i \in \{1, 2\}.
\tag{1}
\]
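As an illustration, a minimal sketch of this classification step in Python with OpenCV and NumPy; the function and variable names are ours rather than the paper's, and only the 2.5 threshold is taken from equation (1):

```python
import cv2
import numpy as np

def hand_color_stats(sample_bgr):
    # Preprocessing: mean and standard deviation of hue and
    # saturation over a sample hand image (computed per camera).
    hsv = cv2.cvtColor(sample_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    h, s = hsv[:, :, 0], hsv[:, :, 1]
    return h.mean(), h.std(), s.mean(), s.std()

def hand_mask(frame_bgr, h_m, h_sigma, s_m, s_sigma, k=2.5):
    # Equation (1): a pixel is hand-colored when both its hue and its
    # saturation lie within k standard deviations of the sample stats.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    h, s = hsv[:, :, 0], hsv[:, :, 1]
    near = (np.abs(h_m - h) / h_sigma < k) & (np.abs(s_m - s) / s_sigma < k)
    return near.astype(np.uint8) * 255  # white = hand color, black = rest
```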
Next, we compute the homography matrix that represents the transformation between the coordinates of the images captured by each camera. The virtual button, which lies on the world plane, is projected by this transformation matrix into the image captured by each camera, so the virtual button can be synthesized into the image as if it were located at a specific position in the real world. For this, as in the right-hand image of [Picture1], we built a pattern image (wImage) composed of black and white rectangles and captured it with the two cameras. We then obtained 2-dimensional points from the image plane and 3-dimensional points from the world plane. The relationship between corresponding points is given by equation (2):

\[
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
=
\begin{bmatrix} u \\ v \\ w \end{bmatrix}
=
\begin{bmatrix}
h_{11} & h_{12} & h_{13} \\
h_{21} & h_{22} & h_{23} \\
h_{31} & h_{32} & h_{33}
\end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\tag{2}
\]

After rearranging this formula into the form Ah = 0, we compute the matrix H through the SVD (singular value decomposition).
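A sketch of this DLT estimation, under the assumption that the point correspondences from the pattern image are already available; each correspondence contributes two rows to A, and h is the right singular vector associated with the smallest singular value:

```python
import numpy as np

def estimate_homography(world_pts, image_pts):
    # Build A of Ah = 0 from equation (2): each pair (x, y) -> (u, v)
    # yields two linear constraints on the nine entries of H.
    A = []
    for (x, y), (u, v) in zip(world_pts, image_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=np.float64))
    H = vt[-1].reshape(3, 3)  # null-space vector of A
    return H / H[2, 2]        # fix the scale ambiguity
```

A button corner (x, y) on the world plane is then drawn at the dehomogenized point H·(x, y, 1)ᵀ in each camera image.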
2.2 Waiting process

In the waiting process the system captures the camera image, analyzes it, and checks for the start sign in real time. The start sign is given when the blob center is located inside a special square. If the sign is confirmed, the main process starts; otherwise the next image is captured and the start sign is checked again.
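This waiting loop might be sketched as follows, reusing the hypothetical hand_mask helper from above; the start square and the blob-center test follow the text, while the moment-based center is our own choice:

```python
import cv2

def wait_for_start(capture, stats, start_square):
    # Poll camera frames until the hand blob's center falls inside
    # the start square (x0, y0, x1, y1); then the main process begins.
    x0, y0, x1, y1 = start_square
    while True:
        ok, frame = capture.read()
        if not ok:
            continue
        m = cv2.moments(hand_mask(frame, *stats), binaryImage=True)
        if m["m00"] == 0:
            continue  # no hand-colored pixels in this frame
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return  # start sign confirmed
```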
2.3 Main process

When the main process starts, an image is captured and the hand region is extracted from it. The extraction of the hand region has two steps.

First, using the mean and standard deviation obtained from the sample data, the pixels of the given image are separated into those that belong to the hand-color region and those that do not. In general, however, as the left-hand image of [Picture2] shows, this extracts not only the hand region but also other background regions whose color is similar to the hand.
Picture2. Left) The black-and-white image of the hand color, Right) The hand blob obtained by CCL

To obtain only the hand region, we apply CCL (Connected Component Labeling) to the black-and-white image and designate the largest of the resulting blobs as the hand. The color of the obtained hand blob is set to green and the color of the background to black.
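The paper states that OpenCV is used for the CCL step; one way to render it with the library's connected-components routine (this particular call and the helper name are our assumption, not the paper's code) is:

```python
import cv2
import numpy as np

def largest_blob(mask):
    # Label the connected components of the binary hand mask and
    # keep only the largest one, which is taken to be the hand.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:  # label 0 is the background; no foreground blob found
        return np.zeros_like(mask)
    areas = stats[1:, cv2.CC_STAT_AREA]
    hand = 1 + int(np.argmax(areas))
    return np.where(labels == hand, 255, 0).astype(np.uint8)
```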
The next task is to judge, from the extracted hand blob, which button the finger is actually pointing at. Since the pointing finger generally reaches farthest within the hand blob, the system seeks the longest row, designates the axis of that row as the location of the hand, and sets the button belonging to that row as the candidate button. The next step is to scan the region of the candidate button and check whether green is present. If green exists in the button region, the finger is on the button, so we regard the button as selected and set its color to green.

Picture3. Above) State of the candidate button and its histogram, Bottom) State of the selected button and its histogram

When the above process is extended to the device with two cameras, the candidate button is sought in the image obtained from each camera. If the two candidate buttons have the same number, the color of the buttons becomes blue; and if green exists simultaneously in the regions of both candidate buttons with the same number, the color of the buttons becomes green, because the button has actually been selected. Consequently, the color of the buttons is determined by each run of the main process, and the main process repeats until the end event occurs.
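The resulting per-frame decision can be sketched as follows; the red/blue/green states mirror the convention above, and the longest-row search is simplified to the row containing the most hand pixels (an assumption, since the paper does not spell out the exact measure):

```python
READY, CANDIDATE, SELECTED = "red", "blue", "green"

def candidate_button(hand_blob, button_rows):
    # Take the row where the hand blob has the most pixels as a proxy
    # for the pointing finger's row, and return the index of the
    # button whose row range [top, bottom] contains it, or None.
    widths = hand_blob.sum(axis=1)
    if widths.max() == 0:
        return None
    tip = int(widths.argmax())
    for idx, (top, bottom) in enumerate(button_rows):
        if top <= tip <= bottom:
            return idx
    return None

def button_state(cand1, on1, cand2, on2):
    # Stereo rule: the same candidate number in both views -> blue;
    # hand pixels on that button in both views -> green; else red.
    if cand1 is None or cand1 != cand2:
        return None, READY
    if on1 and on2:
        return cand1, SELECTED
    return cand1, CANDIDATE
```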
3 Experiment and Result

The two cameras used in this paper are Logitech QuickCam Pro 5000 units, and the resolution is 320 x 240. The experiment was carried out indoors. The CCL computation used the OpenCV library.

Picture4. Above) Original image & binary image, Middle) The state of the candidate button, Bottom) The state of the selected button

As a result of the experiment, we found that the extraction of the hand region is not greatly affected by the properties of the lighting or the camera and is accomplished well. The process also ran in real time without difficulty, thanks to its robust and fast processing. In conclusion, the proposed user interface method decides whether a virtual button is selected by locating the hand with the two cameras. With this, the user can easily interact with an application system by selecting the virtual button. The method demands neither an additional device for capturing hand information nor much learning by the user, so users will be able to operate the system easily and intuitively.
4 References

[1] Richard Hartley and Andrew Zisserman, Multiple View Geometry in Computer Vision, pp. 24-64, 2003.

[2] Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, pp. 282-344, 2001.

[3] Intel Open Computer Vision Library, http://sourceforge.net/projects/opencvlibrary