=Paper=
{{Paper
|id=Vol-3172/short4
|storemode=property
|title=Possibilities Emerging on the Trajectory from IoT to IoMusT: Enabling Ubiquitous Musical Interactions for Wellbeing
|pdfUrl=https://ceur-ws.org/Vol-3172/short4.pdf
|volume=Vol-3172
|authors=Azeema Yaseen,Joseph Timoney
|dblpUrl=https://dblp.org/rec/conf/avi/YaseenT22
}}
==Possibilities Emerging on the Trajectory from IoT to IoMusT: Enabling Ubiquitous Musical Interactions for Wellbeing==
Azeema Yaseen, Joseph Timoney
Maynooth University, Co. Kildare, Ireland
Abstract
The Internet of Musical Things (IoMusT) and ubiquitous music (ubimus) are interrelated research fields
concerned with the design and development of advanced technology to support novel musical and artistic
experiences. IoMusT primarily motivates the deployment of embedded computing devices and platforms
for musically oriented individual or collaborative activities. Ubimus schemes are musically driven,
and the participating agents and entities (such as tablets or computers), interfaces (hardware or
software), interaction metaphors (visual, tactile, or gestural), and instruments (traditional or novel)
need not be complex. Ubimus also encourages the design of new musical interactions for participants
with little know-how of the musical domain. Within the context of initiatives embracing the internet of things
(IoT) for healthcare, alongside remodeling IoT devices for IoMusT interactions, this paper considers the
area of ubiquitous musical interaction design for music therapies, offering an example application. This
convergence, with the appropriate technologies, could enable new opportunities for the promotion of
wellbeing, both individually and collectively, in-person and remotely.
Keywords
Internet of Things (IoT), Internet of Musical Things (IoMusT), Ubiquitous music (ubimus), Wellbeing,
Music Therapy
1. Introduction
The Internet of Musical Things (IoMusT) is a confluence of cross-disciplinary fields including music
technology, the internet of things, human–computer interaction, and artificial intelligence
applied to musical contexts. From a technological perspective, the IoMusT ecosystem is composed
of three core components: i) musical things, ii) connectivity, and iii) applications and services.
In the IoMusT network, musical things are computing devices of any form such as wearables,
computers, and tablets dedicated to the production and/or reception of musical content. Musical
information is data detected and processed by a musical thing and is sent to a human or
another musical thing across a network [1]. IoMusT scenarios offer possibilities to support
remote musical experiences and interactions between users involved in a musical activity.
These interactions are also supported by the direct interconnection of interoperable musical
instruments. Smart musical instruments (SMIs) are a subpart of IoMusT applications. SMIs are
embedded with sensors and actuators, and are able to perform intelligent operations allowing
connectivity to both local networks and the web. Meanwhile, the growth of the IoT has
prompted many proposals for healthcare technologies that aim to improve quality of life. The
main purpose of this paper is to discuss the integration of the IoMusT, ubimus, and IoT paradigms,
to present a framework, and to discuss a simple application. The promise of this area lies in
applications that could quickly succeed in supporting new creative therapies, helping the
musically experienced and inexperienced alike. We commence with a brief review of
music-technology-enabled therapies for wellbeing and then outline some established ubimus
contributions regarding everyday devices for music making. IoMusT would be the glue that
joins them together.
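As a concrete illustration of musical information traveling between musical things, the minimal sketch below shows one device publishing a note event to a peer over a local network. The peer address, port, and JSON message format are illustrative assumptions for this sketch, not part of any IoMusT standard.

```python
# A minimal sketch of a "musical thing" publishing musical information to a
# peer over UDP. Address, port, and message format are illustrative
# assumptions, not an IoMusT standard.
import json
import socket
import time

PEER = ("192.168.0.42", 9000)  # hypothetical address of another musical thing

def send_note_event(sock, pitch, velocity):
    """Serialise a note event and send it across the local network."""
    event = {
        "type": "note_on",
        "pitch": pitch,           # MIDI note number
        "velocity": velocity,     # 0-127 loudness
        "timestamp": time.time()  # lets the receiver compensate for latency
    }
    sock.sendto(json.dumps(event).encode("utf-8"), PEER)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_note_event(sock, pitch=60, velocity=100)  # middle C
```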
2. Related Work for Motivation
2.1. Wellbeing and its Relationship to Music Therapy
Wellbeing is broader than just the absence of disease in the human body; it encompasses
psychological, social, and spiritual health alongside physical health. Music has been widely used in
therapies that improve resilience and build confidence to deal with various scenarios of anxiety and
stress. Recently, Crooke and McFerran [2] used beat-making technologies for improvisation during
music therapy. An Ableton Push 2, a Roland TR-8 drum machine, and an audio interface were used
for the sessions, with these MIDI-linked devices set up to play and control presets at a synchronized
tempo. Improvisation is a particularly free form of music therapy in which users are deeply involved
in spontaneous music creation using different instruments, tools, or technologies.
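As a hedged sketch of what driving MIDI-linked hardware at a synchronized tempo can look like in code, the example below selects a preset and sends one bar of MIDI clock. It assumes the `mido` package with a working MIDI backend and a connected device; the tempo and preset number are invented for illustration, and this is not the setup used in [2].

```python
# A sketch of driving MIDI-linked hardware (e.g. a drum machine) at a
# synchronised tempo. Assumes the `mido` package, a MIDI backend, and a
# connected device; tempo and preset are illustrative.
import time
import mido

BPM = 100
CLOCKS_PER_BEAT = 24              # MIDI clock resolution
TICK = 60.0 / (BPM * CLOCKS_PER_BEAT)

with mido.open_output() as port:  # default MIDI output port
    port.send(mido.Message('program_change', program=5))  # select a preset
    port.send(mido.Message('start'))                      # start the sequencer
    for _ in range(CLOCKS_PER_BEAT * 4):                  # one bar of clock
        port.send(mido.Message('clock'))
        time.sleep(TICK)
    port.send(mido.Message('stop'))
```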
2.2. Ubimus and its Implementation
A core tenet of ubimus is that music is universal and everyone has the potential to create sounds
that are musical. The field emerged in 2007 and affords new metaphors for creative action (e.g., the
soundsphere [3]), interaction modalities (gesture, touch, visualization), and everyday devices for
creating musical content. Ubimus research involves performing participants and material resources,
and extends musical activities by means of creativity-support systems. An example is Playsound.space,
a web-based tool that searches the sounds of Freesound.org, which can then be dragged and dropped
into a mixing window for free improvisation and experimental music production. In [4], a
hand metaphor (Handy hear and Handy see) was developed to enable camera-based touchless
sonic interactions, mapped through gesture recognition, for modifying pitch, amplitude, and/or
duration. These metaphors allow even a layperson to interact with the system. Ryan Monro designed
Bloomish [5], a web-based interface implemented in JavaScript, to generate tones and melodies
by simply tapping the screen. The pitch rises from the bottom to the top of the available space,
and users can either improvise freely or play presets of their choosing.
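To illustrate the kind of position-to-pitch mapping Bloomish exemplifies, the sketch below maps a tap's vertical coordinate to a note, higher taps giving higher pitches. The pentatonic quantisation and parameter values are our own illustrative choices, not taken from Bloomish itself.

```python
# A minimal sketch of a Bloomish-style mapping: the vertical position of a
# screen tap selects a pitch, higher taps giving higher notes. The
# pentatonic quantisation is an illustrative choice.
PENTATONIC = [0, 2, 4, 7, 9]            # C major pentatonic intervals

def tap_to_midi_pitch(y, screen_height, low_note=48, octaves=3):
    """Map a tap's y coordinate (0 = top of screen) to a pentatonic MIDI note."""
    position = 1.0 - (y / screen_height)           # 0 at bottom, 1 at top
    steps = int(position * octaves * len(PENTATONIC))
    octave, degree = divmod(steps, len(PENTATONIC))
    return low_note + 12 * octave + PENTATONIC[degree]

print(tap_to_midi_pitch(y=100, screen_height=800))  # a high note near the top
```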
2.3. Relationship to IoMusT
IoMusT uses Wireless Sensor Networks (WSNs), IoT, and Tactile Internet (TI) as background
communication technologies. These technologies enable the development of SMIs. SMIs cross
boundaries of traditional music settings and venues such as studios and stages due to embedded
technologies for sound production and processing [6]. The smart guitar and smart mandolin
are two examples of SMIs for remote ubimus interactions. These embedded instruments offer the
flexibility to deliver sonic interaction through both hardware (the instrument) and software. The real
benefit of the IoMusT is its built-in assumption of networked interaction: such systems can be
connected via local area networks (LANs), wide area networks (WANs), or WSNs, giving freedom
in the choice of both collaborations and experiences. A well-known example of a networked music
performance (NMP) system is the Reactable [7]. It consists of a touchable table interface that lets
users control musical output by moving objects on its surface, and musicians in different locations
can collaborate with each other.
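Complementing the sender sketched in the introduction, a receiving musical thing on the same network might look as follows. The port and JSON message format are the same illustrative assumptions as before; a real instrument would forward the event to its sound engine rather than print it.

```python
# A sketch of the receiving side of a networked musical interaction,
# matching the illustrative sender above. Port and format are assumptions.
import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))        # listen on the port the sender targets

while True:
    data, addr = sock.recvfrom(1024)
    event = json.loads(data.decode("utf-8"))
    if event.get("type") == "note_on":
        # In a real instrument this would trigger the sound engine;
        # here we only log the incoming musical information.
        print(f"{addr[0]} played pitch {event['pitch']} "
              f"at velocity {event['velocity']}")
```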
3. Ubimus Driven IoT and IoMusT: An Application for Active
Music Therapies
Figure 1 highlights where the IoT and the IoMusT can share the same devices and computing resources
for healthcare and music making. To let cross-modal devices and sensors communicate in such
applications, protocols and communication standards for wireless data collection, processing,
and transmission are required as a common language. For example, an electromyography (EMG)
sensor measures the small electrical signals generated by human muscles as they move.
This includes lifting an arm, clenching a fist, or gestures like moving a finger.
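As a sketch of how such a muscle signal could become a discrete interaction event, the example below applies a simple envelope follower and threshold to a simulated EMG stream; the signal values, smoothing factor, and threshold are all invented for illustration.

```python
# A hedged sketch of turning a raw EMG stream into a discrete gesture event
# (e.g. a fist clench) via envelope following and a threshold. All values
# are invented for illustration.
def emg_envelope(samples, alpha=0.1):
    """Smooth the rectified EMG signal with a simple exponential average."""
    level = 0.0
    for s in samples:
        level = alpha * abs(s) + (1 - alpha) * level
        yield level

def detect_clench(samples, threshold=0.5):
    """Return True once the muscle-activity envelope crosses the threshold."""
    return any(level > threshold for level in emg_envelope(samples))

burst = [0.0] * 20 + [0.9, -0.8, 1.0, -0.9] * 10   # simulated clench burst
print(detect_clench(burst))                         # True
```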
With IoT-based monitoring devices designed in the form of bands or headsets that observe
physiological data, the next stage for the IoMusT is to give meaning to this data. Thus, the data is
analyzed to detect an artistic intention that reflects some emotion, and is then sent to a musical
interface that configures smart-instrument presets or music sound repositories according to the
user's requirements. The physiological data can be mapped to various musical parameters
such as pitch, timbre, or volume. In active music therapy, users create melodies or improvise,
and this process is continuous. Designing interfaces for such interactions is a challenging task.
Predicting human musical activity could also inform the interface configuration to offer
more customized resources, although this, too, is difficult. To accommodate an "active" role for
music, the user would begin the interaction with simpler gestures. In the therapeutic
scenario, the configuration of movement-to-music events would partially depend on the user, but
the system would take responsibility for constraining the user to a selection of musical objects
on the interface that are specifically designed for the IoMusT. If the initial interaction is based on
triggering musical events when the two arms are moved, the user would gain the control to
add more such mappings between their movements and the music objects as the interaction
evolves. It may be too optimistic to say at this stage, but the use of artificial intelligence for
human-centered interfaces presented in [8] suggests a direction towards musical interfaces
that adapt to user scenarios. However, the designer must ultimately draw a line between how
much control users have and where the system takes over. Achieving the correct balance
is crucial for such applications.
Figure 1: IoT-enabled emotion detection and IoMusT-based musical interactions in a therapeutic context.
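To make the mapping stage of Figure 1 concrete, the sketch below turns normalised physiological readings into pitch and volume. The sensor ranges and the two-octave pitch span are illustrative assumptions, not calibrated values from the prototype.

```python
# A minimal sketch of mapping physiological readings to musical parameters:
# heart rate drives pitch, muscle activity drives volume. Ranges are
# illustrative assumptions, not calibrated values.
def normalise(value, lo, hi):
    """Clamp and scale a sensor reading into [0, 1]."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def map_to_music(heart_rate, emg_level):
    """Heart rate drives pitch; muscle activity drives volume."""
    pitch_pos = normalise(heart_rate, 50, 120)   # resting .. elevated bpm
    pitch = 48 + int(pitch_pos * 24)             # two octaves above C3
    volume = normalise(emg_level, 0.0, 1.0)      # 0 = silent, 1 = full
    return {"pitch": pitch, "volume": volume}

print(map_to_music(heart_rate=72, emg_level=0.4))
# {'pitch': 55, 'volume': 0.4}
```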
3.1. Work in Progress and Future Steps
In our ongoing project we are working on modalities for musical interaction, including
color-based metaphors for amateurs and gesture-based musical interactions using computer-vision
techniques. The aim is to design and develop low-cost, easy-to-use, and readily available
technologies for music making, and to define new applications based on these systems within
the field of IoMusT. Currently, sensing of heart rate and galvanic skin response, as shown in
Figure 2(b), is used for a low-fidelity prototype of a wrist band (see Figure 2(a)) that triggers
music based on the Solfeggio note frequencies for body and mind health. These Solfeggio tones
are known as sound-healing modalities for various mental, emotional, and physical ailments [9].
While investigating the multitude of soothing possibilities, we experimented with a vibration
motor to deliver tactile feedback to users' wrists as rhythmic vibrations. In this activity, the
prototype was confirmed to operate as intended, producing the expected output responses for
particular inputs. Aligning movements with IoMusT-based musical interactions requires a
mapping between these two domains; in the current prototype, the selected solfège tones are
independent of the user's control, since only a rhythmic presentation of the sounds and
vibrations was delivered. Thus, this prototype is an example of a passive approach to music
therapy. The long-term goal, however, is to extend it to active therapies by giving users more
control over the devices.
Figure 2: Initial design of the band (a), equipped with sensors (b), to provide music with soothing frequencies.
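As an illustration of the prototype's passive logic, the following sketch selects a Solfeggio tone from a stress estimate that combines heart rate and galvanic skin response. The stress formula and the tone-selection rule are our illustrative assumptions for this sketch, not the implemented firmware.

```python
# A hedged sketch of the passive prototype's logic: a stress estimate from
# heart rate and galvanic skin response selects a Solfeggio tone. The stress
# formula and selection rule are illustrative assumptions.
SOLFEGGIO_HZ = [396, 417, 528, 639, 741, 852]   # commonly cited healing tones

def stress_level(heart_rate, gsr):
    """Combine normalised heart rate and skin conductance into [0, 1]."""
    hr = min(max((heart_rate - 50) / 70, 0.0), 1.0)
    return 0.5 * hr + 0.5 * min(max(gsr, 0.0), 1.0)

def select_tone(heart_rate, gsr):
    """Higher stress selects a lower, more calming tone (our design choice)."""
    level = stress_level(heart_rate, gsr)
    index = int((1.0 - level) * (len(SOLFEGGIO_HZ) - 1))
    return SOLFEGGIO_HZ[index]

print(select_tone(heart_rate=95, gsr=0.7))  # elevated stress -> a low tone
```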
4. Conclusion
This short paper has introduced a music-based rhythmic vibration device that responds to a
user's stress level. It embraces the concepts and ideas of using the IoT (the sensors), ubimus (the
ease of use), and the IoMusT (the musical responsiveness) together in the context of health and
wellbeing. The proposed framework is a work in progress; the mapping of movements to
music, the inclusion of interaction over the network, and user testing are the next tasks.
References
[1] L. Turchet, C. Fischione, G. Essl, D. Keller, M. Barthet, Internet of musical things: Vision
and challenges, IEEE Access 6 (2018) 61994–62017.
[2] A. H. D. Crooke, K. S. McFerran, Improvising using beat making technologies in music
therapy with young people, Music Therapy Perspectives 37 (2019) 55–64.
[3] W. R. B. Bessa, D. Keller, J. B. F. Da Silva, D. F. Da Costa, A metáfora da esfera sonora desde
a perspectiva wydiwyhe [The sound-sphere metaphor from the wydiwyhe perspective],
Journal of Digital Media & Interaction 3 (2020) 60–88.
[4] D. Keller, C. Gomes, L. Aliel, The handy metaphor: Bimanual, touchless interaction for the
internet of musical things, Journal of New Music Research 48 (2019) 385–396.
[5] R. Monro, Bloomish, 2019. URL: https://www.ryanmonro.com/bloomish/.
[6] L. Turchet, P. Bouquet, A. Molinari, G. Fazekas, The smart musical instruments ontology,
Journal of Web Semantics 72 (2022) 100687.
[7] S. Jordà, M. Kaltenbrunner, G. Geiger, R. Bencina, The Reactable, in: Proceedings of the
International Computer Music Conference (ICMC), 2005.
[8] F. Catania, G. D. Luca, N. Bombaci, E. Colombo, P. Crovari, E. Beccaluva, F. Garzotto,
Musical and conversational artificial intelligence, in: Proceedings of the 25th International
Conference on Intelligent User Interfaces Companion, 2020, pp. 51–52.
[9] Y. N. E. H. Baakek, S. M. E. A. Debbal, Digital drugs (binaural beats): how can it affect the
brain/their impact on the brain, Journal of Medical Engineering & Technology 45 (2021)
546–551.