Improving User Interfaces for Physicians through
New Materials, Tangible Interaction, and Tactile
Feedback
Anke V. Reinschluessel, Tanja Döring and Rainer Malaka
University of Bremen, Digital Media Lab, Bremen, Germany

Proceedings of ETIS 2020, November 16–20, 2020, Siena, Italy
email: areinsch@uni-bremen.de (A. V. Reinschluessel); tanja.doering@uni-bremen.de (T. Döring); malaka@uni-bremen.de (R. Malaka)
url: https://www.uni-bremen.de/dmlab-1/team/anke-reinschluessel (A. V. Reinschluessel)
© 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073


Abstract
In the field of medicine, human-computer interaction is still widely based on classical WIMP interfaces with 2D screens. Due to the ever-increasing number of computational systems in medical care, these systems are hard to handle, especially as a number of different activities often need to be executed in parallel. Furthermore, physicians have little time to familiarize themselves with software and hardware. For these reasons, multimodal interaction approaches with devices that offer clear affordances and that are directly integrated into existing medical activities – instead of introducing additional computer-based tasks – have the potential to significantly improve physicians’ work. Especially as medical treatments often are manual tasks, enhancing existing medical tools or using new materials and tangibles for interaction fits the target group of medical professionals well. This ongoing research explores this design space. In this paper, we present three different user interfaces incorporating electroluminescence displays, tactile feedback and 3D-printed tangibles in combination with virtual reality to support physicians.

Keywords
health, medical imaging, tangible user interface, TUI, vibrotactile feedback, electroluminescence, VR, user study




1. Introduction
In the fields of medicine and health, advancements in technology can have an enormous impact on the well-being of individuals. For serious diseases like cancer, new methods might even tip the balance between life and death, especially as cancer is among the world’s leading causes of morbidity and mortality [1]. Among various other tests, one diagnostic tool is imaging of the patient’s body. Using computed tomography (CT) scans or magnetic resonance imaging (MRI/MR), physicians create a vast amount of image data to locate potentially malignant areas. To verify whether a suspicious region really is cancerous tissue, physicians take tissue samples, i.e. perform a biopsy, if possible. Modern digital tools, such as visualization software, can be powerful instruments assisting the surgeon [2, 3]. Nevertheless, to the best of our knowledge, all commercial tools still use a 2D display and some form of WIMP (windows, icons, menus, pointer) interface. The task itself, on the other hand, is a 3D task, as

the human body extends in all three dimensions. Therefore, this viewing and interaction concept may not be the most suitable one, and there are various opportunities to design better interfaces using, for example, new materials or tangibles. These approaches fit medical professionals especially well, as (1) medicine relies on manual skills and the sense of touch, e.g. during surgery or when assessing the patient’s physical condition, and therefore tangible devices fit this line of work, (2) physicians are often reluctant to adopt new technology even if they perceive it as useful [4] and (3) physicians are always short on time, and therefore the devices they use need clear affordances and are best designed in a way that feels familiar to them. A good design in this field makes use of a physician’s everyday objects and skills.
   In the following sections, we describe three works: two focus on how to improve or change the workflow of biopsy taking, and one discusses another way of viewing medical data generated by CT or MR scans. In each section, we briefly discuss relevant related work. The first two works use new materials like electroluminescence displays and tactile feedback to present information to physicians. The last work, about viewing medical data, uses the benefits of tangible interaction in combination with virtual reality (VR) systems.


2. Needle Guidance
After certain regions of tissue are suspected to be malignant, a biopsy is performed. To determine the exact kind of cancer, various tissue samples from the respective organ, e.g. prostate or liver, have to be taken. In the following, we present two ways to support the biopsy workflow: on the one hand with an illuminated needle guidance template using electroluminescence and on the other hand with vibration guidance.

2.1. Related Work
Various approaches have been developed to support surgical navigation. Different devices have been evaluated to overlay navigational information onto the operator’s view using augmented reality (AR) or mixed reality (MR), e.g., head-mounted displays (HMDs), tablet computers [5, 6, 7] or projection [8]. Additional approaches incorporate instrument-mounted displays [9] or audio, i.e. sonification of navigational information [10]. As some biopsies and treatments take place within the MRI, MRI-safe technologies are of interest. The aforementioned technologies are not MRI-safe, i.e., they are not allowed in the strong magnetic field of an MRI scanner and thus are not feasible options for MRI-guided interventions.
   Besides manual placement, robotic placement is also an area of research. There are movable templates that can be manipulated via two rotary guides [11, 12] with the help of a robot. Zangos et al. [13] tested a commercially available MRI-compatible manipulator (Innomotion) for prostate biopsy.

2.2. Needle Guidance in MRIs
One very common cancer among males is prostate cancer [1, 14]. To obtain prostate tissue samples and also to perform therapies such as radiation therapy [15], cryoablation or thermal ablation [16], a needle-shaped tool is inserted into the prostate through either the perineum
Figure 1: The illuminated template prototype (left) & the visual feedback of the illuminated template (right).



(the area between the anus and the outer sexual organs) or the rectum. Besides a skilled physician placing the needle correctly, a needle guidance template can support the physician during this task [17]. This template works best in combination with the software 3D Slicer1 and an MRI scanner for needle and template tracking.
   The template consists of 210 holes (see Figure 1 (left)), each with a diameter of 1.3 mm and a spacing of 5 mm. The template is placed at arm’s length at the centre of an MRI scanner to achieve the best image quality. Even though the software 3D Slicer already calculates which holes to use, the operating physician still has to count the holes manually, which prolongs the procedure and is prone to human error. We added an EL (electroluminescent) display to the tangible needle guidance template, which points towards the insertion point using a cross-hair metaphor (see Figure 1 (right)). Adding this type of display reduced the task completion time for inserting the needle by 51 % and the task load by 47 % [18]. Additionally, we increased the usability by 30 % compared to regular manual counting. EL displays have the advantage that they can be used in the critical environment of MRI scanners, where other technologies are not allowed due to the strong magnetic field. This shows that “novel” materials, which have not been applied in this context before, can overcome design problems and therefore support physicians in their daily work.
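   To illustrate the core computation behind such a guidance template, the following minimal sketch maps a planned entry point in template coordinates to the nearest guidance hole on a regular 5 mm grid, whose centre could then drive the EL cross-hair. Only the hole count (210) and the 5 mm spacing come from the description above; the 14 × 15 layout, the coordinate origin and all names are assumptions made for this sketch.

```python
# Minimal sketch: mapping a planned insertion point (in template
# coordinates, mm) to the nearest guidance hole on a regular grid.
# Assumptions: 5 mm hole spacing (as in the paper); the 14 x 15 layout
# and the origin at a corner hole are illustrative guesses.

SPACING_MM = 5.0
N_ROWS, N_COLS = 14, 15  # 210 holes in total; the actual layout is an assumption


def nearest_hole(target_x_mm: float, target_y_mm: float):
    """Return (row, col) of the hole closest to the planned entry point."""
    col = round(target_x_mm / SPACING_MM)
    row = round(target_y_mm / SPACING_MM)
    # Clamp to the physical grid so we never index a non-existent hole.
    col = max(0, min(N_COLS - 1, col))
    row = max(0, min(N_ROWS - 1, row))
    return row, col


def hole_center_mm(row: int, col: int):
    """Template-space centre of a hole, e.g. to position the EL cross-hair."""
    return col * SPACING_MM, row * SPACING_MM


if __name__ == "__main__":
    row, col = nearest_hole(32.4, 51.8)
    print(row, col, hole_center_mm(row, col))  # -> 10 6 (30.0, 50.0)
```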
   Although displaying the insertion point already helps, interesting future work would be to explore which other information is helpful and how to visualise it in a useful manner. As the depth of the needle is another crucial aspect of needle placement, one could think about ways to visualise the insertion depth. Here, the question arises whether to show the insertion depth in a linear or non-linear fashion, as more precision is needed the closer the needle is to the target; one non-linear option is sketched below.
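   As a purely illustrative example of such a non-linear mapping, the following sketch compresses the remaining distance to the target logarithmically, so that the feedback resolution increases as the needle approaches the target. The maximum depth, the number of feedback levels and the log1p mapping are assumptions, not part of the published system.

```python
import math

# Illustrative non-linear depth cue: remaining distance to the target is
# compressed logarithmically, so the display changes faster (higher
# resolution) near the target. All constants are assumptions.

MAX_DEPTH_MM = 80.0  # assumed maximum insertion depth


def depth_cue(remaining_mm: float, levels: int = 10) -> int:
    """Map remaining depth to a discrete feedback level (0 = at target)."""
    d = max(0.0, min(MAX_DEPTH_MM, remaining_mm))
    # log1p compresses large distances and spreads out small ones, so a
    # 5 mm change near the target moves the cue by several levels, while
    # the same change far from the target barely registers.
    fraction = math.log1p(d) / math.log1p(MAX_DEPTH_MM)
    return round(fraction * (levels - 1))
```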

2.3. Needle Guidance using Vibrations
As already mentioned, the needles can also be placed manually. There are also systems like the CAS-One optical surgical navigation system2 that support this process by showing a 3D view with the needle, the target and the optimal trajectory to reach the target on a 2D monitor. In real settings, the monitor is not

1 https://www.slicer.org/
2 http://cascination.com/
Figure 2: The vibro-band.


always placed ergonomically, e.g., far to the left or right of the users, and the users have to split their attention between the screen, the needle and the patient. Therefore, we investigated with a vibration wristband (“vibro-band”) (see Figure 2) (1) whether users can integrate tactile navigational cues with the visual navigation, (2) whether they can correctly identify the vibration direction and (3) whether the vibrations disrupt their performance of fine motor skills [19]. The results are promising, as all participants performed the fine motor skill task without interference and identified the vibration directions well. Both conditions achieved similar results in terms of time, accuracy, task load and usability. The video analysis also showed that some participants focused more on the “patient” in the vibration condition than in the visual-only condition. This nicely shows how a multimodal approach could shift the focus towards the patient. One interesting question to answer in future work would be whether users focus their attention even more on the patient after more training with the system, as identifying the directions already worked well after a minimal training period.
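   To make the idea of directional vibration cues concrete, the following sketch maps a needle deviation vector (in the plane perpendicular to the planned trajectory) to one of four wristband motors and a vibration intensity. The four-motor layout, dead zone and scaling constants are assumptions for illustration; the study above only reports that directional vibrations were used.

```python
import math

# Illustrative mapping of a needle deviation to a directional vibration
# cue on a four-motor wristband (right/up/left/down). The motor layout,
# dead zone and intensity scaling are assumptions, not the published design.

DEAD_ZONE_MM = 1.0   # below this deviation, no cue is given
MAX_DEV_MM = 10.0    # deviation mapped to full intensity

MOTORS = ["right", "up", "left", "down"]  # at 0, 90, 180, 270 degrees


def vibration_cue(dx_mm: float, dy_mm: float):
    """Return (motor, intensity in [0, 1]) for a deviation vector."""
    magnitude = math.hypot(dx_mm, dy_mm)
    if magnitude < DEAD_ZONE_MM:
        return None, 0.0  # close enough to the planned trajectory
    angle = math.degrees(math.atan2(dy_mm, dx_mm)) % 360
    motor = MOTORS[round(angle / 90) % 4]  # nearest of the four motors
    intensity = min(1.0, magnitude / MAX_DEV_MM)
    return motor, intensity
```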


3. Tangibles and Medical 3D Data
While the two previous approaches are about either verifying a diagnosis or treating the disease with radiotherapy or ablations, physicians view the medical data and discuss possible diagnoses beforehand. Currently, it is still state of the art to view the 2D images created by CT and MRI scanners as they are. The physicians have to make the effort to create their own mental 3D model based on the images and their prior anatomical knowledge. This is challenging, and as a cooperating head surgeon once stated: “The ability to imagine the 3D situation can decide between a palliative or a curative approach for the patient.” One way to support this challenge could be 3D models and VR. The 3D models could either be viewed in VR to achieve a better understanding of the spatial features or be 3D printed to add a haptic component as well.

3.1. Related Work
Researchers have investigated the use of virtual environments to show medical image data. One example is the VR system by Reinschluessel et al. [20], who evaluated different gestures to interact with 2D images shown on a screen in a virtual setting. King et al. [21] explored showing 2D images
Figure 3: The tangible organ models and the virtual 3D model (bottom right). Top: 100 %, middle: 75 %, bottom: 50 %. Scales are relative to a real patient’s organ size.



of MRI and CT scans across the complete field of view in virtual space, thereby allowing more images to be compared at once.
   The use of 3D printing in medicine has also been increasing since 2000 [22]. A literature review by Martelli et al. [23] for the period between 2005 and 2015 showed that 71.5 % of the included studies used it to produce anatomical models. The main advantages reported were better preoperative planning (48.7 %) and saved time in the operating room (32.9 %). As 3D printing is an expensive endeavor, there are efforts to make it more cost-effective and affordable, as published by Witowski et al. [24]. One example of using 3D-printed organ models was presented by Zein et al. [25]. In this work, the authors used 3D-printed livers or parts of the liver to discover potential anomalies in the anatomical structure before transplantations.

3.2. 3D-Printed Organ Models with VR
All VR solutions so far use either gestures or a device to explore the images or the model. 3D prints, on the other hand, as used by Zein et al. [25] for example, have the benefit that the haptics match the view, but the view is fixed. Therefore, we aim at combining both modalities into one system. This should lower the cognitive load, as the viewing interaction can be as natural as with the “real” organ or a real object. Our first approach already showed that surgeons are very fond of the idea and rate our approach as useful [26]. We showed them an early prototype of a 3D model of a liver in VR (see Figure 3 (bottom right)), which at that point was controlled with standard HTC VIVE3 controllers, and a transparent 3D print of the same liver (at 50 % of the real size). The participants of the workshop highlighted the advantage of grasping the spatial relations more easily. Furthermore, we observed that a physical object of the same shape naturally sparked discussions.
   The next question to answer concerns the size of the liver controller, i.e. the liver-shaped object used to control the view of the 3D model in VR. The liver print used in the workshop was 50 % of the original size and therefore “convenient” to handle. Previous research suggests that the real shape results in higher satisfaction [27, 28, 29] and that tangible interaction is faster and more intuitive [30] because it benefits from the natural spatial memory of humans [31]. Yet, existing research barely addresses the size of the tangible objects. For example, when building tangible devices in games research, like guns for shooters, researchers tried to match the size [28].
3 https://www.htcvive.com/
de Tinguy et al. [32] explored the just-noticeable difference regarding width, local orientation and curvature when grasping an object, which indicates that tangible and virtual objects do not need to match exactly.
   Still, there are no results on how large the size difference may be. This is especially important for large objects like the liver. A real liver weighs about 1400–1800 g (49.4–63.5 oz), has a transversal size of 20–23 cm (7.87–9.05 inch) and a lateral size of 15–18 cm (5.90–7.08 inch)4, depending on gender, age, body size, weight and health. Combined with the shape of the liver, this results in a rather unwieldy object when thought of as an interaction device. In VR, though, we still need the real size to be able to make decisions about cutting planes and infiltrations. Therefore, we evaluated three different sizes (50 %, 75 % and 100 %) of the liver controller to investigate whether we needed a 100 % sized object as used in related work (see Figure 3). Further interesting research questions could explore which haptic properties of the liver controller increase immersion and create a more convincing experience for medical professionals. Therefore, we aim at experimenting with 3D printing materials varying in softness and texture. As medical professionals judge usefulness also by ease of use [4], the integration of features like zooming, marking and changing views, as well as patient data, poses design challenges in terms of intuitive use.
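   The key interaction idea, decoupling the print scale of the controller from the render scale of the virtual model, can be sketched in a few lines. The 4 × 4 pose-matrix convention and the constant names below are our own assumptions; only the 50 % print and the requirement of a real-size virtual model come from the text above.

```python
import numpy as np

# Minimal sketch: a scaled-down printed liver controller driving a
# full-size virtual liver. The tracking and matrix conventions are
# assumptions; only the idea of decoupling the print scale from the
# render scale comes from the project described above.

# The print is fabricated at PRINT_SCALE; the tracker reports the
# controller pose directly, so PRINT_SCALE matters only at fabrication
# time and needs no positional compensation here.
PRINT_SCALE = 0.5    # e.g. a 50 % print, as used in the workshop
RENDER_SCALE = 1.0   # the VR model stays at 100 % of the patient's organ


def virtual_model_matrix(tracked_pose: np.ndarray) -> np.ndarray:
    """Compose the virtual liver's transform from the tracked 4x4 pose.

    Rotation and position follow the physical controller 1:1, so turning
    the print in one's hand turns the virtual organ; only the uniform
    scale is set independently, keeping the model at real size for
    decisions about cutting planes and infiltrations.
    """
    scale = np.diag([RENDER_SCALE, RENDER_SCALE, RENDER_SCALE, 1.0])
    return tracked_pose @ scale  # the pose is rigid; scale acts in local space
```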


4. Conclusion
This work gave an overview of different approaches showing how new (tangible) technologies can support surgeons. By using materials like electroluminescence displays, we were able to design for a very specific design space – the MRI scanner – and improve an existing tangible user interface. The results of 51 % faster task completion in combination with significantly reduced task load and improved usability clearly show the benefit of our solution. Yet, we only improved one aspect of this process. Working on a similar problem, needle placement, we showed that tactile feedback can be an option to present information in a medical context without disrupting the process. Its full impact has yet to be investigated, but we can carefully interpret the results to mean that it helps focus the user’s attention more on the task instead of on a computer screen, which might even be placed in a non-ergonomic way. Last, we discussed how tangibles in the shape of the visually presented 3D models can improve the viewing process of medical data. Although 3D prints are already used in medicine and other fields, the question about the appropriate size of an interaction device in relation to its visual representation in VR remains. All three projects have in common that they aim at better representations of spatial information, which is among the most important aspects for medical professionals. This was highlighted in the various interviews during the design processes. Therefore, using new materials to overcome design space restrictions, exploring new ways of feedback and especially using tangibles are promising approaches for this application domain.




4 https://radiopaedia.org/articles/hepatomegaly
Acknowledgments
I would especially like to thank Dr. Tanja Döring and Prof. Dr. Rainer Malaka for their support. Additionally, I would like to thank my collaboration partners for their contributions to the presented work.


References
 [1] J. Ferlay, E. Steliarova-Foucher, J. Lortet-Tieulent, S. Rosso, J. Coebergh, H. Comber, D. Forman, F. Bray, Cancer incidence and mortality patterns in Europe: Estimates for 40 countries in 2012, European Journal of Cancer 49 (2013) 1374–1403. doi:10.1016/j.ejca.2012.12.027.
 [2] I. Endo, R. Matsuyama, R. Mori, K. Taniguchi, T. Kumamoto, K. Takeda, K. Tanaka, A. Köhn,
     A. Schenk, Imaging and surgical planning for perihilar cholangiocarcinoma, Journal of
     Hepato-Biliary-Pancreatic Sciences 21 (2014) 525–532.
 [3] A. Schenk, D. Haemmerich, T. Preusser, Planning of image-guided interventions in the liver, IEEE Pulse 2 (2011) 48–55.
 [4] P. Ifinedo, Technology acceptance by health professionals in Canada: An analysis with a modified UTAUT model, in: 2012 45th Hawaii International Conference on System Sciences, 2012, pp. 2937–2946.
 [5] F. S. Azar, N. Perrin, A. Khamene, S. Vogt, F. Sauer, User performance analysis of different image-based navigation systems for needle placement procedures, volume 5367, 2004.
 [6] J. Traub, P. Stefan, S. M. Heining, T. Sielhorst, C. Riquarts, E. Euler, N. Navab, Hybrid
     navigation interface for orthopedic and trauma surgery, in: International Conference
     on Medical Image Computing and Computer-Assisted Intervention, Springer, 2006, pp.
     373–380.
 [7] M. A. Livingston, W. F. Garrett, G. Hirota, M. C. Whitton, E. D. Pisano, H. Fuchs, et al.,
     Technologies for augmented reality systems: Realizing ultrasound-guided needle biopsies,
     in: Proceedings of the 23rd SIGGRAPH, ACM, 1996, pp. 439–446.
 [8] K. A. Gavaghan, S. Anderegg, M. Peterhans, T. Oliveira-Santos, S. Weber, Augmented
     reality image overlay projection for image guided open liver ablation of metastatic liver
     cancer, in: Workshop on Augmented Environments for Computer-Assisted Interventions,
     Springer, 2011, pp. 36–46.
 [9] M. Herrlich, P. Tavakol, D. Black, D. Wenig, C. Rieder, R. Malaka, R. Kikinis, Instrument-
     mounted displays for reducing cognitive load during surgical navigation, International
     Journal of Computer Assisted Radiology and Surgery 12 (2017) 1599–1605.
[10] C. Hansen, D. Black, C. Lange, F. Rieber, W. Lamadé, M. Donati, K. J. Oldhafer, H. K. Hahn,
     Auditory support for resection guidance in navigated liver surgery, The International
     Journal of Medical Robotics and Computer Assisted Surgery 9 (2013) 36–43.
[11] K. Fujimoto, Y. Arimitsu, N. Hata, S.-E. Song, J. Tokuda, Needle placement manipulator
     with two rotary guides, 2015.
[12] S.-E. Song, J. Tokuda, K. Tuncali, A. Yamada, M. Torabi, N. Hata, Design evaluation of a double ring RCM mechanism for robotic needle guidance in MRI-guided liver interventions, in: Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on, IEEE, 2013, pp. 4078–4083.
[13] S. Zangos, A. Melzer, K. Eichler, C. Sadighi, A. Thalhammer, B. Bodelle, R. Wolf, T. Gruber-Rouh, D. Proschek, R. Hammerstingl, et al., MR-compatible assistance system for biopsy in a high-field-strength system: Initial results in patients with suspicious prostate lesions, Radiology 259 (2011) 903–910.
[14] If prostate cancer screening test results aren’t normal, 2016. URL: https://www.cancer.org/
     cancer/prostate-cancer/early-detection/if-test-results-not-normal.html.
[15] A. V. D’Amico, R. Cormack, C. M. Tempany, S. Kumar, G. Topulos, H. M. Kooy, C. N.
     Coleman, Real-time magnetic resonance image-guided interstitial brachytherapy in the
     treatment of select patients with clinically localized prostate cancer, Int. J. Radiat. Oncol.
     Biol. Phys. 42 (1998) 507–515.
[16] M. Valerio, Y. Cerantola, S. E. Eggener, H. Lepor, T. J. Polascik, A. Villers, M. Emberton,
     New and established technology in focal ablation of the prostate: A systematic review,
     Eur. Urol. 71 (2017) 17–34.
[17] J. Tokuda, K. Tuncali, I. Iordachita, S.-E. Song, A. Fedorov, S. Oguro, A. Lasso, F. M. Fennessy, C. M. Tempany, N. Hata, In-bore setup and software for 3T MRI-guided transperineal prostate biopsy, Physics in Medicine and Biology 57 (2012) 5823.
[18] A. V. Reinschluessel, M. Herrlich, T. Döring, M. Vangel, C. Tempany, R. Malaka, J. Tokuda, Insert needle here! A custom display for optimized biopsy needle placement, in: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI ’18, Association for Computing Machinery, New York, NY, USA, 2018. doi:10.1145/3173574.3173837.
[19] A. V. Reinschluessel, S. C. Cebulla, M. Herrlich, T. Döring, R. Malaka, Vibro-band: Sup-
     porting needle placement for physicians with vibrations, in: Extended Abstracts of the
     2018 CHI Conference on Human Factors in Computing Systems, CHI EA ’18, Association
     for Computing Machinery, New York, NY, USA, 2018. doi:10.1145/3170427.3188549.
[20] A. V. Reinschluessel, J. Teuber, M. Herrlich, J. Bissel, M. van Eikeren, F. Ganser, J. Koeller, F. Kollasch, T. Mildner, L. Raimondo, et al., Virtual reality for user-centered design and evaluation of touch-free interaction techniques for navigating medical images in the operating room, in: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, ACM, 2017, pp. 2001–2009.
[21] F. King, J. Jayender, S. K. Bhagavatula, P. B. Shyn, S. Pieper, T. Kapur, A. Lasso, G. Fichtinger,
     An immersive virtual reality environment for diagnostic imaging, Journal of Medical
     Robotics Research 1 (2016) 1640003.
[22] C. L. Ventola, Medical applications for 3D printing: Current and projected uses, Pharmacy and Therapeutics 39 (2014) 704.
[23] N. Martelli, C. Serrano, H. van den Brink, J. Pineau, P. Prognon, I. Borget, S. E. Batti, Advantages and disadvantages of 3-dimensional printing in surgery: A systematic review, Surgery 159 (2016) 1485–1500.
[24] J. S. Witowski, M. Pędziwiatr, P. Major, A. Budzyński, Cost-effective, personalized, 3D-printed liver model for preoperative planning before laparoscopic liver hemihepatectomy for colorectal cancer metastases, International Journal of Computer Assisted Radiology and Surgery 12 (2017) 2047–2054.
[25] N. N. Zein, I. A. Hanouneh, P. D. Bishop, M. Samaan, B. Eghtesad, C. Quintini, C. Miller,
     L. Yerian, R. Klatte, Three-dimensional print of a liver for preoperative planning in living
     donor liver transplantation, Liver Transplantation 19 (2013) 1304–1310.
[26] A. V. Reinschluessel, T. Muender, V. Uslar, D. Weyhe, A. Schenk, R. Malaka, Tangible organs: Introducing 3D printed organ models with VR to interact with medical 3D models, in: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, CHI EA ’19, ACM, New York, NY, USA, 2019, pp. LBW1816:1–LBW1816:6. doi:10.1145/3290607.3313029.
[27] M. Azmandian, M. Hancock, H. Benko, E. Ofek, A. D. Wilson, Haptic retargeting: Dynamic
     repurposing of passive haptics for enhanced virtual reality experiences, in: Proceedings of
     the 2016 CHI Conference on Human Factors in Computing Systems, CHI ’16, ACM, New
     York, NY, USA, 2016, pp. 1968–1979. doi:10.1145/2858036.2858226.
[28] A. Krekhov, K. Emmerich, P. Bergmann, S. Cmentowski, J. Krüger, Self-transforming controllers for virtual reality first person shooters, in: Proceedings of the Annual Symposium on Computer-Human Interaction in Play, CHI PLAY ’17, ACM, New York, NY, USA, 2017, pp. 517–529. URL: http://doi.acm.org/10.1145/3116595.3116615. doi:10.1145/3116595.3116615.
[29] T. Muender, A. V. Reinschluessel, S. Drewes, D. Wenig, T. Döring, R. Malaka, Does it feel real?: Using tangibles with different fidelities to build and explore scenes in virtual reality, in: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, CHI ’19, ACM, New York, NY, USA, 2019, pp. 673:1–673:12. URL: http://doi.acm.org/10.1145/3290605.3300903. doi:10.1145/3290605.3300903.
[30] L. Besançon, P. Issartel, M. Ammi, T. Isenberg, Mouse, tactile, and tangible input for
     3d manipulation, in: Proceedings of the 2017 CHI Conference on Human Factors in
     Computing Systems, CHI ’17, ACM, New York, NY, USA, 2017, pp. 4727–4740.
[31] A. Cockburn, B. McKenzie, Evaluating the effectiveness of spatial memory in 2d and 3d
     physical and virtual environments, in: Proceedings of the SIGCHI Conference on Human
     Factors in Computing Systems, CHI ’02, ACM, New York, NY, USA, 2002, pp. 203–210.
[32] X. de Tinguy, C. Pacchierotti, M. Emily, M. Chevalier, A. Guignardat, M. Guillaudeux, C. Six, A. Lécuyer, M. Marchal, How different tangible and virtual objects can be while still feeling the same?, in: 2019 IEEE World Haptics Conference (WHC), 2019, pp. 580–585. doi:10.1109/WHC.2019.8816164.