<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Use of the Emotiv Epoc Flex kit in applications involving artificial intelligence</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Dawid Pawuś</string-name>
          <email>pawusdawid@gmail.com</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Paszkiel</string-name>
          <email>s.paszkiel@po.edu.pl</email>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Faculty of Electrical Engineering, Automatic Control and Informatics</institution>
          ,
          <institution>University of Technology</institution>
          ,
          <addr-line>Prószkowska 76 Street, 45-758 Opole</addr-line>
          ,
          <country>Poland</country>
        </aff>
      </contrib-group>
      <fpage>0000</fpage>
      <lpage>0003</lpage>
      <abstract>
        <p>This article presents a wide, proprietary range of Emotiv Epoc Flex Gel headset applications for EEG signal measurement, focusing on its use in systems involving artificial intelligence, such as artificial neural networks and expert systems. The constantly developing field of biomedical engineering, as well as ever newer and more advanced BCI (brain-computer interface) systems, requires designers to continually develop and search for innovative methods. In response to practical requirements and the possibility of using the system in real conditions, the authors propose an advanced solution using EEG (electroencephalography) signal analysis. An AI-based approach to designing the BCI system was used for advanced signal analysis. The article contains a detailed description of two applications based on artificial intelligence using EEG signals: the first for controlling a mobile robot using mental commands, and the second for controlling a mobile robot with verification in the form of an EMG signal. This article provides a comprehensive overview of the integration of the Emotiv Epoc Flex Kit with proprietary AI systems and its significant impact on the field.</p>
        <p>Proceedings ITTAP'2023: 3rd International Workshop on Information Technologies: Theoretical and Applied Problems, November 22-24.</p>
      </abstract>
      <kwd-group>
        <kwd>EEG</kwd>
        <kwd>Emotiv EPOC Flex Gel</kwd>
        <kwd>neural networks</kwd>
        <kwd>EMG</kwd>
        <kwd>LEGO Mindstorms</kwd>
        <kwd>brain-computer interface (BCI)</kwd>
        <kwd>motor imagery verification</kwd>
        <kwd>signal classification</kwd>
        <kwd>expert system</kwd>
        <kwd>artificial intelligence</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>
        Electroencephalography (EEG) is widely used in research involving biomedical engineering,
neuroscience and others (e.g. Brain-Computer Interface, BCI), as well as in sleep analysis and detecting
abnormal brain function. The reasons include, for example, its non-invasiveness and relatively low cost [
        <xref ref-type="bibr" rid="ref1 ref2 ref3">1, 2</xref>
        ].
      </p>
      <p>
        An electroencephalogram (EEG) captures the electrical activity patterns emanating from the cerebral
cortex. Due to the minute nature of these electrical signals, typically measured in microvolts, a
substantial amplification, roughly on the order of a millionfold, is required for them to be visualized on
a computer screen. The recorded signals primarily originate from the neurons, within which a multitude
of bioelectric events occur. These encompass phenomena like action potentials, post-synaptic potentials
(PSP), and the protracted depolarization of neurons over an extended period [
        <xref ref-type="bibr" rid="ref37 ref4 ref5 ref6">36, 3, 4, 5</xref>
        ].
      </p>
      <p>
        Brain-computer interface (BCI) technology facilitates direct communication between the brain and
external devices. The analysis of electroencephalogram (EEG) signals plays a pivotal role in the
ongoing exploration of BCI capabilities. BCI technology emerged in the 1990s, and although it remains
relatively new, its potential to transform the way people interact with computers and other devices is
profound [
        <xref ref-type="bibr" rid="ref10 ref7 ref8 ref9">6, 7, 8, 9</xref>
        ].
      </p>
      <p>2020 Copyright for this paper by its authors. CEUR Workshop Proceedings (ceur-ws.org).</p>
      <p>
        Currently, the most widely adopted methods for computer interaction rely on muscular movements. In
contrast, EEG-based brain-computer interfaces have been under scrutiny for numerous years as a means
of communication and control for individuals with physical disabilities. Through training, individuals
can learn to employ imagined motor actions as input data for computers or to control assistive
technologies. This promises to enhance the accessibility and usability of technology for those with
limited physical mobility [
        <xref ref-type="bibr" rid="ref11 ref12 ref13 ref38">37, 10, 11, 12, 39</xref>
        ].
      </p>
      <p>
        Research on brain-computer interfaces, including research involving projects using artificial
intelligence, was addressed in [
        <xref ref-type="bibr" rid="ref14 ref15 ref16 ref17 ref18 ref19 ref20">13, 14, 15, 16, 17, 18, 19, 38</xref>
        ]. The authors of these papers addressed
various issues, from BCI systems, through classifiers and simulations, to various types of medical and
biomedical applications.
      </p>
      <p>
        Electromyography (EMG) is an electrophysiological test that captures the electrical impulses
generated by muscle activity. In the realm of neuromodulation research, EMG is frequently employed
to assess the diverse effects of stimulation within the brain's motor regions. In clinical settings, EMG
serves as a valuable tool for diagnosing nervous and muscular disorders, allowing for the localization
and characterization of pathologies [
        <xref ref-type="bibr" rid="ref21 ref22">20, 21</xref>
        ].
      </p>
      <p>In clinical applications, EMG may necessitate the insertion of a small needle into muscles to record
electrical activity accurately. However, in the field of biomedical engineering, researchers commonly
opt for non-invasive methods, utilizing surface electrodes placed on the skin to detect muscle-generated
electrical activity. This non-invasive approach eliminates health risks associated with invasive methods.</p>
      <p>
        Myoelectric interfaces also find utility in rehabilitation technology as supportive devices. The EMG
signal, a prominent biological signal, is often harnessed to predict human motor intentions and can be
integrated into human-robot collaboration systems [
        <xref ref-type="bibr" rid="ref23 ref24 ref25 ref26 ref27">22, 23, 24, 25, 26</xref>
        ].
      </p>
      <p>
        The Emotiv EPOC Flex Gel is an affordable, lightweight, wireless brain-computer interface (BCI)
headset that offers reliable control and efficient measurement capabilities. Each sensor on this headset
has the capability to capture real-time data from four distinct brainwave frequencies, including delta,
theta, alpha, and beta [
        <xref ref-type="bibr" rid="ref28 ref29 ref30 ref31 ref32 ref33">27, 28, 29, 30, 31, 32</xref>
        ].
      </p>
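      <p>As a rough illustration of working with these bands (not part of the authors' system), per-band power can be estimated from a raw channel via the discrete Fourier transform. This is a minimal Python sketch; the band edges are conventional textbook values, not Emotiv specifications:</p>

```python
import numpy as np

# Conventional EEG band edges in Hz (delta, theta, alpha, beta) --
# textbook values, assumed here, not taken from Emotiv documentation.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power(signal, fs=128):
    """Mean spectral power of one EEG channel in each conventional band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}
```

      <p>For example, a pure 10 Hz test tone sampled at 128 Hz would show its power concentrated in the alpha band.</p>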
      <p>
        Despite its advanced features and accuracy, the utilization of this particular device (the Flex Gel
version) in research remains relatively scarce. It's worth noting that this headset stands out as one of the
manufacturer's most intricate and precise offerings. However, its adoption is steadily increasing, and its
applications are expanding across various domains [
        <xref ref-type="bibr" rid="ref15 ref16 ref17 ref29 ref34 ref35 ref36">14, 15, 16, 28, 33, 34, 35</xref>
        ].
      </p>
      <p>The first system revolves around research utilizing the Emotiv Epoc Flex kit, developed as a
response to the quest for innovative solutions for controlling robotic components through
user-generated mental commands. In this endeavor, the recorded signal, acquired through a 32-electrode
apparatus, underwent preprocessing for classification. This involved a novel approach that integrated
the EEG signal, thereby producing modified waveforms that could be identified not only by
conventional proprietary software but also by an artificial neural network. Effective signal classification
culminated in the generation of a control signal, which was subsequently employed to govern the actions
of the LEGO EV3 Mindstorms robot.</p>
      <p>The aim of the research included in the second project was to design an EEG signal classification
system for controlling a mobile robot while verifying pure mental commands using a sensor that
measures the EMG signal. This is crucial, because paralyzed people can control objects only by means
of generated changes in the EEG signal, without any additional "support" by movements of the muscles
of the limbs.</p>
    </sec>
    <sec id="sec-3">
      <title>2. Emotiv device and software</title>
      <p>The Emotiv EPOC Flex is a wearable electroencephalography (EEG) headset designed for capturing
and interpreting brain activity. It's a product of Emotiv, a company specializing in brain-computer
interface (BCI) technologies. The EPOC Flex is known for its flexibility and adaptability, making it
suitable for various applications, including brain research, human-computer interaction, and
neurofeedback. An example EEG signal waveform in the Emotiv PRO environment is presented in
Figure 1.</p>
      <p>An equally important issue is the arrangement of electrodes and their nomenclature. This is described
in Table 1.</p>
    </sec>
    <sec id="sec-4">
      <title>3. First system</title>
      <p>
        This chapter introduces an intriguing approach to the task of recognizing and categorizing
electroencephalographic (EEG) signals. The scarcity of studies utilizing the Emotiv Epoc Flex kit
prompted the pursuit of original solutions, particularly in the realm of controlling robotic components
through user-issued mental commands. The measured signal, acquired through a 32–electrode device,
underwent preparation for classification via a novel technique involving the integration of the EEG
signal. This transformation led to the generation of new waveforms, which could subsequently be
recognized by an artificial neural network. Through the appropriate classification of the signal, a control
signal was generated, facilitating the manipulation of the LEGO EV3 Mindstorms robot [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
    </sec>
    <sec id="sec-5">
      <title>3.1. Presentation of methodology, data acquisition and system</title>
      <p>The construction of the EEG signal analysis and classification system described in this article
comprises several components. The raw EEG signal was recorded using an Emotiv EPOC Flex device,
and measurements were acquired using the dedicated EmotivPRO software. In Figure 2, you can
observe the electrode configuration on the headset, along with a vehicle designed by the authors, which
is based on the LEGO EV3 cube.</p>
      <p>
        The electrodes in the set are designated by the following channel names: Cz, Fz, Fp1, F7, F3, FC1,
C3, FC5, FT9, T7, TP9, CP5, CP1, P3, P7, O1, Pz, Oz, O2, P8, P4, CP2, CP6, TP10, FC6, C4, FC2,
F4, F8, and Fp2 [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
      <p>
        Two individuals participated in this study, during which several hundred tests were conducted.
Signals were recorded both during resting states and when commands to move forward, backward,
right, and left were triggered. An essential factor influencing the study's outcomes is the level of focus.
It is imperative that the participants maintain a state of deep concentration, as the absence of focus can
disrupt the test process, potentially leading to inaccurate results. After collecting a substantial number
of measurements, further processing and classification were performed using various methods [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
      <p>
        In this research endeavor, all analyses and classification methodologies were executed using Matlab,
a platform for programming and numerical calculations. The system's operational flow is depicted in
Figure 3. Within this program, it is possible to appropriately filter the signals, followed by the execution
of relevant analyses and classification procedures. In the subsequent stage, the signal is integrated over
time for each of the 32 electrodes individually, spanning one second for the entire signal duration. This
process yields one–second integrated samples, which serve as a foundation for the precise determination
and classification of signal types based on the integrated potentials [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
    </sec>
    <sec id="sec-6">
      <title>3.2. EEG signal integration and filtration method</title>
      <p>
        The EEG signal obtained from the headset undergoes a crucial filtering process. The raw signal
received from the 22 most essential electrodes is not directly subjected to classification by the designed
system. This decision was made to simplify the algorithm's development process, as unfiltered
waveforms exhibit prolonged stabilization and transient times when voltage measured by the sensors
increases. Consequently, a high–pass filter with a sampling frequency of fs= 1000 Hz and a bandwidth
of fp= 200 Hz was employed [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
      <p>
        The dedicated application facilitates EEG signal sampling at a frequency of fe= 128 Hz. With this
knowledge in mind, the authors proposed an innovative approach that involves the rolling integration
of the signal from each electrode for a one–second duration. This approach is unique in the context of
pre–classifying biomedical signals such as electroencephalograms. In terms of methodology, this can
be effectively represented analytically. The variable M represents the number of samples within the
signal, while k signifies the count of full one–second periods in the signal. Given the information about
the sampling frequency fe, it's understood that, for instance, a 5–second EEG signal comprises M= 640
samples. Utilizing the calculations in Equation 1, it is determined that there will be k= 5 full one–second
periods available for classification. Importantly, in the case of a 10.5–second signal, only the first 10
seconds will be considered, amounting to a total of M= 1280 samples, which yields k= 10 integrated
signal samples. This approach is implemented to ensure efficiency and clarity in the interpretation of
each waveform [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
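      <p>The rolling one-second integration described above can be sketched as follows. This is a Python sketch of the idea only (the authors' implementation was in Matlab), assuming a 128 Hz sampling rate and absolute-value integration, with partial trailing seconds discarded as the text specifies:</p>

```python
import numpy as np

FS = 128  # EmotivPRO sampling rate in Hz, as stated in the text

def integrate_eeg(eeg, fs=FS):
    """Rolling one-second integration of a single EEG channel.

    eeg: 1-D array of M raw samples; only the k = M // fs full
    one-second periods are used (a 10.5 s signal yields k = 10).
    Returns S, an array of k integrated values.
    """
    M = len(eeg)
    k = M // fs                      # number of full one-second periods
    trimmed = np.abs(eeg[:k * fs])   # absolute value, drop the partial tail
    return trimmed.reshape(k, fs).sum(axis=1)
```

      <p>With this layout, S for electrode 2 of a 10-second recording is exactly the S2(1…k), k = 10, array described below.</p>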
      <p>
        k = M / 128
        (1)
      </p>
      <p>
        The index of each array corresponds to a specific sample within the integrated value. To illustrate, for a 10-second signal from the second electrode, where k = 10, the S variable takes the form S2(1…k). Organizing data in this tabular manner proves to be both convenient and efficient for applications of this nature [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
      <p>
        The EEG variable represents the signal obtained from individual electrodes within the set. As
outlined in Equation 2, it is evident that the absolute value of the signal from each channel is integrated
over a span of 128 samples. This integration approach facilitates the utilization of the rolling integration
method and enables the recording of these integrated values into the S variables [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
    </sec>
    <sec id="sec-7">
      <title>3.3. Signal classification by the designed neural network</title>
      <p>In this section, we will delve into an advanced approach that incorporates a neural network
algorithm. Specifically, a feed–forward neural network has been put forth for analysis. The
determination of the number of layers, the choice of activation functions, and the allocation of neurons
in each layer were made through a process of trial and error, guided by an expert approach.</p>
      <p>
        To conduct the neural network training procedure, it was essential to prepare a training dataset comprising both input and output data. In the training process, the training input set U was employed, which utilized a pattern matrix u with a total of i = 32 rows. These input values were derived from the number of electrodes in the set [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
      <p>The variable S is individually defined for each electrode, as demonstrated in Equation 2.</p>
      <p>
        For the neural network training, an output set Y was also imperative. This set consisted of an output pattern matrix y with j = 5 rows, reflecting the recognition of 5 distinct signal types: neutral state, forward movement, backward movement, right movement, and left movement. It is noteworthy that both the input and output training patterns comprised an equal number of elements, totaling n = 250. The presentation of the data can be substantiated using the following formulas [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ]:
      </p>
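      <p>A minimal sketch of how such pattern matrices might be assembled follows (Python rather than the authors' Matlab; the one-hot output encoding is an assumption on our part, since the text only states that y has 5 rows, one per recognized signal type):</p>

```python
import numpy as np

# The five recognized signal types, in an assumed fixed order.
CLASSES = ["neutral", "forward", "backward", "right", "left"]

def make_training_sets(integrated_samples, labels):
    """Assemble the U (32 x n) and Y (5 x n) pattern matrices.

    integrated_samples: list of n vectors, each holding the 32
    integrated one-second values (one per electrode).
    labels: list of n class names drawn from CLASSES.
    """
    U = np.column_stack(integrated_samples)      # i = 32 rows, n columns
    Y = np.zeros((len(CLASSES), len(labels)))    # j = 5 rows, n columns
    for col, name in enumerate(labels):
        Y[CLASSES.index(name), col] = 1.0        # assumed one-hot pattern
    return U, Y
```

      <p>With n = 250 labeled one-second samples, this yields the 32 x 250 input set and 5 x 250 output set described above.</p>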
      <p>The chosen neural network architecture incorporates 4 hidden layers. In the first hidden layer, the
tangent function tansig is employed with 35 neurons. The second layer utilizes the logistic sigmoid
activation function logsig and comprises 30 neurons. Moving on to the third layer, it incorporates the
radial basis function radbas with 20 neurons. Finally, the last hidden layer once again employs the
tangent function tansig and is composed of 15 neurons.</p>
      <p>
        The concluding segment of this feed-forward neural network consists of a single layer containing 5
neurons, each utilizing a linear activation function purelin. This linear layer plays a pivotal role in
summing up the outputs from the non-linear neuron activation functions situated in the preceding layers
[
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
(3)
(4)
      </p>
      <p>
        In the network diagram (Figure 4), the output data vector is denoted as y, while the network inputs,
which represent the integrated values of the EEG signal from the 32 electrodes, are indicated as u [
        <xref ref-type="bibr" rid="ref17">16</xref>
        ].
      </p>
    </sec>
    <sec id="sec-8">
      <title>3.4. Selected examples of results</title>
      <p>The ultimate phase of this research involves validating the functionality of the developed
classification system. To accomplish this, both the performance of the classical algorithm and artificial
intelligence solutions were assessed. These tests were conducted using pre–processed EEG signals from
the archive. The outcomes of signal recognition pertaining to the study participants are illustrated in
Figures 5 and 6.</p>
    </sec>
    <sec id="sec-9">
      <title>4. Second system</title>
      <p>This chapter will present issues related to the second designed system. The authors will present the
methodology, data acquisition and system. In addition, a method of preparing the EMG signal for
classification will be shown. Then the classification system is described, and the chapter ends with
examples of results.</p>
      <p>The EEG and EMG signal analysis and classification system described in this paper comprises
several interconnected components. Initially, the raw EEG signals were captured using the Emotiv
EPOC Flex Gel headset, and the data acquisition was facilitated by the dedicated EmotivPRO software.
On the other hand, the EMG signals were acquired using the MyoWare Muscle Sensor device, which
was connected to an Arduino UNO microcontroller. This setup was then synchronized with the Matlab
program, where the acquired data was collected and subjected to analysis. Both the EEG and EMG
signals underwent assessment through an expert system that incorporates artificial intelligence
algorithms in the form of a neural network. Additionally, a conventional algorithm for artifact detection
and activation of the arm muscles was integrated into the system. The schematic representation of this
system can be observed in Figure 7.</p>
      <p>An illustrative example of the research approach is depicted in Figure 8, showcasing the moment
when EEG and EMG signals were recorded from two study participants. In total, a cohort of 10
individuals underwent testing. This sample size was chosen to enable a robust evaluation of the
algorithm's effectiveness and, notably, to design it with a degree of resilience to variations in
measurement values that can be attributed to individual differences.</p>
      <p>The Emotiv EPOC Flex Gel device utilized in the study features 22 electrodes, each designated with specific channel names: Cz, Fz, Fp1, F7, F3, FC1, C3, FC5, FT7, T7, CP5, CP1, CP2, CP6, FT8, FC6, C4, T8, FC2, F4, F8, and Fp2.</p>
      <p>Each participant in the study conducted a series of 15 measurements, involving the repetition of specific mental command sequences. These sequences, each lasting a few seconds, encompassed the following steps: imagining an arm movement (mentally commanding the mobile robot's movement); blinking the right eye (a switch that causes the next move command to move the robot to the right); imagining an arm movement; blinking the left eye (a switch that causes the next move command to move the robot to the left); imagining an arm movement; blinking both eyes (returning the system to the initial state, with subsequent arm movement imaginings directing the robot forward); and imagining an arm movement.</p>
      <p>
        h = N / k
        (1)
      </p>
      <p>
        L(1…k): L(i) = 1 if any EMG sample in the i-th window of h samples satisfies E_EMG ≥ V_max, and L(i) = 0 if every sample in that window satisfies E_EMG &lt; V_max.
        (2)
      </p>
      <p>This carefully orchestrated sequence of activities resulted in measurement signals that extended over
approximately 45 seconds.</p>
      <p>Concurrently with the EEG measurements, participants also wore the MyoWare Muscle Sensor
device on their biceps brachii muscles. This allowed for the effective collection of the EMG signal,
which later facilitated the verification of pure mental commands, excluding any reliance on additional
muscle movements. This aspect is of paramount importance, particularly for individuals with paralysis,
as it enables research on systems exclusively reliant on mental commands for controlling arm
movements.</p>
      <p>
        Participants were instructed to maintain high levels of concentration throughout the experiments, as
lapses in focus could disrupt the research process and lead to erroneous results. Once a sufficient
number of measurements were obtained, subsequent processing and classification were carried out
using a combination of artificial intelligence and conventional algorithms [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
    </sec>
    <sec id="sec-10">
      <title>4.2. Method of preparing the EMG signal for classification</title>
      <p>
        The MyoWare Muscle Sensor (EMG) device is a distinct kit and operates independently from the
Emotiv EPOC Flex Gel (EEG) kit. Consequently, synchronization between these two systems is not
seamless. This disparity impacts the chosen sampling time, which was arbitrarily determined by the
system designers. It's essential to acknowledge that EMG measurement devices are not obligated to
produce signals sampled in identical fashion to an electroencephalograph (EEG). In this context, the
emphasis is not on prolonged signal analysis; rather, the priority lies in promptly and effectively
detecting muscle activation [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
      <p>
        The approach for segmenting the collected samples was based on the formulation provided in Equation 1. For instance, given a 10-second EEG signal with a parameter value of k = 10 and an EMG signal consisting of N = 100 samples, it can be computed that for every second of the electroencephalographic signal there will be h = 10 samples of the EMG waveform [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
      <p>The electromyographic signal will be assessed in a binary manner, as outlined in Equation 2. The variable L will store a sequence of binary values, where 0 signifies no muscle movement (E_EMG &lt; V_max) and 1 indicates muscle activity (E_EMG ≥ V_max). Here, V_max represents the maximum voltage level, determined through observations of the system when the arm muscles are deemed to be at rest, and set at 4.5 V.</p>
      <p>
        To illustrate, suppose that within the 40…50 sample range of the EMG signal at least one of the samples registers a value greater than or equal to the specified voltage, i.e., E_EMG(40…50) ≥ V_max; then L(5) will be set to 1. This will unequivocally halt any commands to control the robot, even if the conditions for imagining arm movement are met [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
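      <p>Equations 1 and 2 can be sketched together as follows. This Python fragment is an illustration under the stated assumptions (V_max = 4.5 V, one window of h EMG samples per EEG second), not the authors' Matlab code:</p>

```python
import numpy as np

V_MAX = 4.5  # resting-threshold voltage from the text, in volts

def classify_emg(emg, k, v_max=V_MAX):
    """Binary muscle-activity flags, one per EEG second (Equation 2).

    emg: EMG samples covering the same interval as k seconds of EEG.
    Returns L, where L[i] = 1 if any sample in the i-th window of
    h = N // k samples reaches v_max (Equation 1), else 0.
    """
    N = len(emg)
    h = N // k                                   # EMG samples per EEG second
    windows = np.asarray(emg[:h * k], dtype=float).reshape(k, h)
    return (windows.max(axis=1) >= v_max).astype(int)
```

      <p>Because h is derived from k rather than from the sensor clock, the sketch shares the independence from the MyoWare sampling rate that the text emphasizes.</p>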
      <p>
        This approach offers distinct advantages, with the primary one being its independence from the
sampling rate of the MyoWare Muscle Sensor. The method employed allows for arbitrary sampling of
the sensor, as it hinges on the variable k, which governs the range of samples h per second of the EEG
signal [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
    </sec>
    <sec id="sec-11">
      <title>4.3. The functioning of the designed neural network</title>
      <p>
        The neural network was trained using two primary datasets: I (input) and O (output). The first
dataset, denoted as the input training set, comprises 22 network inputs corresponding to the
measurement channels of the EEG signal. These inputs were organized as a pattern matrix with a total
of a= 22 rows, each representing integrated one-second segments of the EEG signal. The output training
data was stored in a separate matrix, which had b= 2 rows. This specific data structure aligns with the
requirements of the neural network used, which was designed for recognizing binary 0–1 signals,
specifically related to the imagination of arm movement or its absence [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
      <p>The input and output training patterns were determined through an iterative process, and it was
determined that a satisfactory set size was n= 450. In Figure 9, you can see the architecture of the feed–
forward neural network employed for classifying periodically integrated EEG signals.</p>
      <p>
        This feed–forward network utilized in the research comprises 4 hidden layers and one output layer.
The first layer is equipped with a tangential activation function and consists of 25 neurons. The second
layer employs a logarithmic sigmoid activation function with 20 neurons. The third layer, featuring 15
neurons, employs a radial activation function. Moving further, the penultimate fourth layer, comprising
10 neurons, employs the tangential activation function. Finally, the output layer contains a linear
activation function and consists of 2 neurons, which aligns with the number of neural network outputs.
The primary role of this output layer is to aggregate the outputs of the non–linear neuron activation
functions [
        <xref ref-type="bibr" rid="ref15">14</xref>
        ].
      </p>
    </sec>
    <sec id="sec-12">
      <title>4.4. Selected examples of results</title>
      <p>In this section, the authors provide an illustrative test of the designed classification system, which
encompasses the classification and validation of both EEG and EMG signals. This evaluation utilizes
an expert system that combines classical algorithms with artificial intelligence techniques.</p>
    </sec>
    <sec id="sec-13">
      <title>5. Discussion and conclusions</title>
      <p>The first system under investigation focuses on the Emotiv EPOC Flex headset, which is relatively
uncommon compared to other products from the same manufacturer. In considering its potential
applications in controlling robotic components, it becomes evident that the EEG signal classification
methods outlined in this paper open up new avenues for research in this domain. It's noteworthy that
the research employs a novel approach to EEG signal analysis, revolving around periodic signal
integration. Furthermore, it incorporates artificial intelligence techniques and an expert–driven
approach to signal classification. An additional factor contributing to the innovative nature of this study
is the utilization of the Emotiv Epoc Flex Gel headset.</p>
      <p>The second system detailed in this paper offers numerous prospects for further development and is
well–suited for subsequent research catering to individuals with disabilities. The validation of the EMG
signal serves as a means to assess the algorithm's performance in scenarios where mental commands
are issued without concurrent muscle activation, as is often the case with paralyzed individuals who
lack limb mobility. The results obtained in this experiment conclusively demonstrate the system's ability
to classify integrated EEG signals and verify muscle movements, thereby achieving its intended
objectives.</p>
    </sec>
    <sec id="sec-14">
      <title>6. References</title>
      <p>[38] Nisha, S. S., &amp; Meeral, M. N. Applications of deep learning in biomedical engineering. In Handbook of Deep Learning in Biomedical Engineering (pp. 245-270). Academic Press, (2021).</p>
      <p>[39] Khalil, M. A., &amp; George, K. Using Neural Network Models for BCI Based Lie Detection. In 2022 IEEE 13th Annual Ubiquitous Computing, Electronics &amp; Mobile Communication Conference (UEMCON) (pp. 0505-0509). IEEE, (2022).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Wang</surname>
            , Xin,
            <given-names>Tapani</given-names>
          </string-name>
          <string-name>
            <surname>Ahonen</surname>
            , and
            <given-names>Jari</given-names>
          </string-name>
          <string-name>
            <surname>Nurmi</surname>
          </string-name>
          .
          <article-title>"Applying CDMA technique to network-onchip." IEEE transactions on very large scale integration (VLSI) systems 15</article-title>
          .10 (
          <year>2007</year>
          ):
          <fpage>1091</fpage>
          -
          <lpage>1100</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Bricker</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          <article-title>The neural and cognitive mechanisms of knowledge attribution: An EEG study</article-title>
          .
          <source>Cognition</source>
          ,
          <volume>203</volume>
          ,
          <fpage>104412</fpage>
          , (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Gordleeva</surname>
            ,
            <given-names>S. Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lobov</surname>
            ,
            <given-names>S. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grigorev</surname>
            ,
            <given-names>N. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Savosenkov</surname>
            ,
            <given-names>A. O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shamshin</surname>
            ,
            <given-names>M. O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lukoyanov</surname>
            ,
            <given-names>M. V.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Kazantsev</surname>
            ,
            <given-names>V. B.</given-names>
          </string-name>
          <article-title>Real-time EEG-EMG human-machine interface-based control system for a lower-limb exoskeleton</article-title>
          .
          <source>IEEE Access</source>
          ,
          <volume>8</volume>
          ,
          <fpage>84070</fpage>
          -
          <lpage>84081</lpage>
          , (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Sherman</surname>
            ,
            <given-names>D. L.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Thakor</surname>
            ,
            <given-names>N. V.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Eeg signal processing: Theory and applications</article-title>
          .
          <source>Neural Engineering</source>
          ,
          <fpage>97</fpage>
          -
          <lpage>129</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [4]
          <string-name>
            <surname>He</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Astolfi</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Valdés-Sosa</surname>
            ,
            <given-names>P. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Marinazzo</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Palva</surname>
            ,
            <given-names>S. O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bénar</surname>
            ,
            <given-names>C. G.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Koenig</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>Electrophysiological brain connectivity: theory and implementation</article-title>
          .
          <source>IEEE transactions on biomedical engineering</source>
          ,
          <volume>66</volume>
          (
          <issue>7</issue>
          ),
          <fpage>2115</fpage>
          -
          <lpage>2137</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Baravalle</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guisande</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Granado</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rosso</surname>
            ,
            <given-names>O. A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Montani</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>Characterization of visuomotor/imaginary movements in EEG: An information theory and complex network approach</article-title>
          .
          <source>Frontiers in Physics</source>
          ,
          <volume>7</volume>
          ,
          <fpage>115</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Tudor</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tudor</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Tudor</surname>
            ,
            <given-names>K. I.</given-names>
          </string-name>
          (
          <year>2005</year>
          ).
          <article-title>Hans Berger (1873-1941)--the history of electroencephalography</article-title>
          .
          <source>Acta Medica Croatica: casopis Hrvatske akademije medicinskih znanosti</source>
          ,
          <volume>59</volume>
          (
          <issue>4</issue>
          ),
          <fpage>307</fpage>
          -
          <lpage>313</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Paszkiel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rojek</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lei</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Castro</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>A Pilot Study of Game Design in the Unity Environment as an Example of the Use of Neurogaming on the Basis of brain-computer interface Technology to Improve Concentration</article-title>
          .
          <source>NeuroSci</source>
          ,
          <volume>2</volume>
          (
          <issue>2</issue>
          ),
          <fpage>109</fpage>
          -
          <lpage>119</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Kaiser</surname>
            ,
            <given-names>D. A.</given-names>
          </string-name>
          (
          <year>2005</year>
          ).
          <article-title>Basic principles of quantitative EEG</article-title>
          .
          <source>Journal of Adult Development</source>
          ,
          <volume>12</volume>
          ,
          <fpage>99</fpage>
          -
          <lpage>104</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Pfurtscheller</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Da Silva</surname>
            ,
            <given-names>F. L.</given-names>
          </string-name>
          (
          <year>1999</year>
          ).
          <article-title>Event-related EEG/MEG synchronization and desynchronization: basic principles</article-title>
          .
          <source>Clinical neurophysiology</source>
          ,
          <volume>110</volume>
          (
          <issue>11</issue>
          ),
          <fpage>1842</fpage>
          -
          <lpage>1857</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Lotte</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Congedo</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lécuyer</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lamarche</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Arnaldi</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          (
          <year>2007</year>
          ).
          <article-title>A review of classification algorithms for EEG-based brain-computer interfaces</article-title>
          .
          <source>Journal of neural engineering</source>
          ,
          <volume>4</volume>
          (
          <issue>2</issue>
          ),
          <fpage>R1</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Lotte</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bougrain</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cichocki</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Clerc</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Congedo</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rakotomamonjy</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Yger</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update</article-title>
          .
          <source>Journal of neural engineering</source>
          ,
          <volume>15</volume>
          (
          <issue>3</issue>
          ),
          <fpage>031005</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Renard</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lotte</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gibert</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Congedo</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maby</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Delannoy</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Lécuyer</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2010</year>
          ).
          <article-title>OpenViBE: An open-source software platform to design, test, and use brain-computer interfaces in real and virtual environments</article-title>
          .
          <source>Presence</source>
          ,
          <volume>19</volume>
          (
          <issue>1</issue>
          ),
          <fpage>35</fpage>
          -
          <lpage>53</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Sokół</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pawuś</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Majewski</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Krok</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>The Study of the Effectiveness of Advanced Algorithms for Learning Neural Networks Based on FPGA in the Musical Notation Classification Task</article-title>
          .
          <source>Applied Sciences</source>
          ,
          <volume>12</volume>
          (
          <issue>19</issue>
          ),
          <fpage>9829</fpage>
          , (
          <year>2022</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Pawuś</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Paszkiel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>Application of EEG Signals Integration to Proprietary Classification Algorithms in the Implementation of Mobile Robot Control with the Use of Motor Imagery Supported by EMG Measurements</article-title>
          .
          <source>Applied Sciences</source>
          ,
          <volume>12</volume>
          (
          <issue>11</issue>
          ),
          <fpage>5762</fpage>
          , (
          <year>2022</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Pawuś</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Paszkiel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>BCI Wheelchair Control Using Expert System Classifying EEG Signals Based on Power Spectrum Estimation and Nervous Tics Detection</article-title>
          .
          <source>Applied Sciences</source>
          ,
          <volume>12</volume>
          (
          <issue>20</issue>
          ),
          <fpage>10385</fpage>
          , (
          <year>2022</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Pawuś</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Paszkiel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>The application of integration of EEG signals for authorial classification algorithms in implementation for a mobile robot control using movement imagery-Pilot study</article-title>
          .
          <source>Applied Sciences</source>
          ,
          <volume>12</volume>
          (
          <issue>4</issue>
          ),
          <fpage>2161</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Ko</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jeon</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jeong</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Suk</surname>
            ,
            <given-names>H. I.</given-names>
          </string-name>
          <article-title>Multi-scale neural network for EEG representation learning in BCI</article-title>
          .
          <source>IEEE Computational Intelligence Magazine</source>
          ,
          <volume>16</volume>
          (
          <issue>2</issue>
          ),
          <fpage>31</fpage>
          -
          <lpage>45</lpage>
          , (
          <year>2021</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Tortora</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ghidoni</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chisari</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Micera</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Artoni</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          <article-title>Deep learning-based BCI for gait decoding from EEG with LSTM recurrent neural network</article-title>
          .
          <source>Journal of neural engineering</source>
          ,
          <volume>17</volume>
          (
          <issue>4</issue>
          ),
          <fpage>046011</fpage>
          , (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Hosseini</surname>
            ,
            <given-names>M. P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hosseini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Ahi</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          <article-title>A review on machine learning for EEG signal processing in bioengineering</article-title>
          .
          <source>IEEE reviews in biomedical engineering</source>
          ,
          <volume>14</volume>
          ,
          <fpage>204</fpage>
          -
          <lpage>218</lpage>
          , (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Reilly</surname>
            ,
            <given-names>R. B.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>T. C.</given-names>
          </string-name>
          <article-title>Electrograms (ECG, EEG, EMG, EOG)</article-title>
          .
          <source>Technology and Health Care</source>
          ,
          <volume>18</volume>
          (
          <issue>6</issue>
          ),
          <fpage>443</fpage>
          -
          <lpage>458</lpage>
          , (
          <year>2010</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Farina</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Merletti</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Enoka</surname>
            ,
            <given-names>R. M.</given-names>
          </string-name>
          <article-title>The extraction of neural strategies from the surface EMG: an update</article-title>
          .
          <source>Journal of Applied Physiology</source>
          ,
          <volume>117</volume>
          (
          <issue>11</issue>
          ),
          <fpage>1215</fpage>
          -
          <lpage>1230</lpage>
          , (
          <year>2014</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Chai</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guan</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lu</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Niu</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>A hybrid BCI-controlled smart home system combining SSVEP and EMG for individuals with paralysis</article-title>
          .
          <source>Biomedical Signal Processing and Control</source>
          ,
          <volume>56</volume>
          ,
          <fpage>101687</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>He</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhou</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gu</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Ming</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Incorporating EEG and EMG patterns to evaluate BCI-based long-term motor training</article-title>
          .
          <source>IEEE Transactions on Human-Machine Systems</source>
          ,
          <volume>52</volume>
          (
          <issue>4</issue>
          ),
          <fpage>648</fpage>
          -
          <lpage>657</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ji</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yu</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jin</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Ye</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>A sequential learning model with GNN for EEG-EMG-based stroke rehabilitation BCI</article-title>
          .
          <source>Frontiers in Neuroscience</source>
          ,
          <volume>17</volume>
          ,
          <fpage>1125230</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [25]
          <string-name>
            <surname>Chowdhury</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Raza</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Meena</surname>
            ,
            <given-names>Y. K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dutta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Prasad</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>An EEG-EMG correlation-based brain-computer interface for hand orthosis supported neuro-rehabilitation</article-title>
          .
          <source>Journal of neuroscience methods</source>
          ,
          <volume>312</volume>
          ,
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [26]
          <string-name>
            <surname>Sarhan</surname>
            ,
            <given-names>S. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Al-Faiz</surname>
            ,
            <given-names>M. Z.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Takhakh</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>A review on EMG/EEG based control scheme of upper limb rehabilitation robots for stroke patients</article-title>
          .
          <source>Heliyon</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [27]
          <string-name>
            <surname>Lang</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>Investigating the Emotiv EPOC for cognitive control in limited training time</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [28]
          <string-name>
            <surname>Browarska</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kawala-Sterniuk</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zygarlicki</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Podpora</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pelc</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martinek</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Gorzelańczyk</surname>
            ,
            <given-names>E. J.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Comparison of smoothing filters' influence on quality of data recorded with the emotiv epoc flex brain-computer interface headset during audio stimulation</article-title>
          .
          <source>Brain sciences</source>
          ,
          <volume>11</volume>
          (
          <issue>1</issue>
          ),
          <fpage>98</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [29]
          <string-name>
            <surname>Mudgal</surname>
            ,
            <given-names>S. K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sharma</surname>
            ,
            <given-names>S. K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chaturvedi</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Sharma</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Brain computer interface advancement in neurosciences: Applications and issues</article-title>
          .
          <source>Interdisciplinary Neurosurgery</source>
          ,
          <volume>20</volume>
          ,
          <fpage>100694</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [30]
          <string-name>
            <surname>Kaur</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Singh</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Roy</surname>
            ,
            <given-names>P. P.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>EEG based emotion classification mechanism in BCI</article-title>
          .
          <source>Procedia Computer Science</source>
          ,
          <volume>132</volume>
          ,
          <fpage>752</fpage>
          -
          <lpage>758</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          [31]
          <string-name>
            <surname>Cimtay</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Ekmekcioglu</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Investigating the use of pretrained convolutional neural network on cross-subject and cross-dataset EEG emotion recognition</article-title>
          .
          <source>Sensors</source>
          ,
          <volume>20</volume>
          (
          <issue>7</issue>
          ),
          <fpage>2034</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          [32]
          <string-name>
            <surname>Sasaki</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iversen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Callan</surname>
            ,
            <given-names>D. E.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>Music improvisation is characterized by increase EEG spectral power in prefrontal and perceptual motor cortical sources and can be reliably classified from non-improvisatory performance</article-title>
          .
          <source>Frontiers in Human Neuroscience</source>
          ,
          <volume>13</volume>
          ,
          <fpage>435</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          [33]
          <string-name>
            <surname>Ghosh</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Deb</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sengupta</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Phukan</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Choudhury</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kashyap</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Dutta</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>SAM 40: Dataset of 40 subject EEG recordings to monitor the induced-stress while performing Stroop color-word test, arithmetic task, and mirror image recognition task</article-title>
          .
          <source>Data in Brief</source>
          ,
          <volume>40</volume>
          ,
          <fpage>107772</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>
          [34]
          <string-name>
            <surname>Antoniou</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bozios</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Christou</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tzimourta</surname>
            ,
            <given-names>K. D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kalafatakis</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tsipouras</surname>
            ,
            <given-names>M. G.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Tzallas</surname>
            ,
            <given-names>A. T.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>EEG-based eye movement recognition using brain-computer interface and random forests</article-title>
          .
          <source>Sensors</source>
          ,
          <volume>21</volume>
          (
          <issue>7</issue>
          ),
          <fpage>2339</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref36">
        <mixed-citation>
          [35]
          <string-name>
            <surname>Williams</surname>
            ,
            <given-names>N. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>McArthur</surname>
            ,
            <given-names>G. M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Badcock</surname>
            ,
            <given-names>N. A.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>It's all about time: precision and accuracy of Emotiv event-marking for ERP research</article-title>
          .
          <source>PeerJ</source>
          ,
          <volume>9</volume>
          ,
          <fpage>e10700</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref37">
        <mixed-citation>
          [36]
          <string-name>
            <surname>Marcuse</surname>
            ,
            <given-names>L. V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fields</surname>
            ,
            <given-names>M. C.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Yoo</surname>
            ,
            <given-names>J. J.</given-names>
          </string-name>
          (
          <year>2015</year>
          ).
          <source>Rowan's Primer of EEG</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref38">
        <mixed-citation>
          [37]
          <string-name>
            <surname>Paszkiel</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Data acquisition methods for human brain activity</article-title>
          .
          <source>Analysis and Classification of EEG Signals for Brain-Computer Interfaces</source>
          ,
          <fpage>3</fpage>
          -
          <lpage>9</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>