<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>COLINS</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Developing an Application with Sensors in Smart Phones</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Atahan Tufekci</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Muhammet C. Colak</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Anar Gurbanov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Pinar Kirci</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Bursa Uludag University</institution>
          ,
          <addr-line>Gorukle, Bursa</addr-line>
          ,
          <country country="TR">Turkey</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>8</volume>
      <fpage>12</fpage>
      <lpage>13</lpage>
      <abstract>
        <p>In this study, the functions of the sensors in smartphone hardware and how data can be collected from them during the coding phase are examined, and healthy driving and fainting detection applications are developed and their results discussed. With the advances in smartphones, various sensors such as accelerometers, gyroscopes and magnetometers have been included in the hardware of these devices. Thanks to these sensors, various data can be obtained about the person using the device and its environment, and these data can be accessed with the libraries used during mobile software development. Applications have been developed by processing these data with mathematical methods or machine learning algorithms, producing results in many areas concerning smartphone users. In the age of technology we live in, smartphones are perhaps the devices that people use, spend time with and keep with them the most. At first glance, smartphones are thought to be used only for communication, but with the advancement of technology, new ways of using these devices have been discovered. Over time, as smartphone hardware developed, sensors such as accelerometers and gyroscopes (1) were added to these devices and new usage opportunities emerged. With the help of these sensors, various information can be obtained about the person using the phone and the changes in their environment. By analyzing the data collected from a person's activities and movements during the day with mathematical calculations or machine learning algorithms, results have emerged in the field of health, where people can detect and track their own fainting, driving, stepping, running, stopping and sitting (2), and it has been possible to develop applications on this subject.</p>
      </abstract>
      <kwd-group>
        <kwd>sensors</kwd>
        <kwd>smartphone</kwd>
        <kwd>driving</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <sec id="sec-1-1">
        <title>2.1. Accelerometer Sensor</title>
        <p>
          The accelerometer sensor in the smartphone hardware is used to measure the acceleration applied
to the device. The data obtained from this sensor gives the acceleration value affecting the
smartphone along the x, y and z axes, in the positive and negative directions. These values are in g (
          <xref ref-type="bibr" rid="ref1">1</xref>
          ). Based on these
data, the movements made by the person using the smartphone, as well as the shaking and
vibration of the device, can be analyzed (
          <xref ref-type="bibr" rid="ref3">3</xref>
          ), and certain inferences can be drawn from these
analyses. In the coding phase on Android, the Accelerometer sensor type and the variables and
functions of the SensorEventListener interface from the SensorManager library (
          <xref ref-type="bibr" rid="ref1">1</xref>
          ) can be used
to access the data collected by this sensor. In this way, accelerometer data can be obtained with
smartphones that people carry with them all day long, without the need for an extra
device or sensor (
          <xref ref-type="bibr" rid="ref4">4</xref>
          ).
        </p>
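<p>As a minimal sketch of the kind of processing described above (written in Python for illustration; the application itself uses the Java Android APIs), the x, y and z readings can be combined into a single magnitude and compared against a hypothetical shake threshold:</p>

```python
import math

def magnitude(x, y, z):
    """Combined acceleration magnitude from the three axes, in g."""
    return math.sqrt(x * x + y * y + z * z)

def is_shaking(samples, threshold=1.5):
    """Flag a shake when any sample's magnitude exceeds the
    (hypothetical) threshold; at rest the magnitude stays near 1 g."""
    return any(magnitude(x, y, z) > threshold for x, y, z in samples)

rest = [(0.0, 0.0, 1.0), (0.02, 0.01, 0.99)]
shake = [(1.2, 0.8, 1.4), (0.0, 0.0, 1.0)]
print(is_shaking(rest), is_shaking(shake))  # False True
```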
      </sec>
      <sec id="sec-1-2">
        <title>2.2. Gyroscope Sensor</title>
        <p>
          The gyroscope sensor in the smartphone hardware is used to measure the orientation angle of the
device in the x, y, z axes in the plus and minus directions (
          <xref ref-type="bibr" rid="ref5">5</xref>
          ). With the data obtained from this sensor,
the orientation state, rotation angle and angular velocity of the smartphone can be determined (
          <xref ref-type="bibr" rid="ref3">3</xref>
          ).
As with the accelerometer, the data collected from the gyroscope sensor can be analyzed to draw
inferences about the daily physical activities of the person using the smartphone. In order to access
the data collected by this sensor during the coding phase, the
variables and functions of the Gyroscope sensor type and SensorEventListener interface from the
SensorManager library (
          <xref ref-type="bibr" rid="ref1">1</xref>
          ) can be used in the Android system. In this way, gyroscope data can be
obtained from smartphones that people often carry with them without the need for an auxiliary
factor.
        </p>
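<p>A Python sketch of this idea (an illustration, not the application's Java code): integrating angular-velocity samples over time approximates the rotation angle about an axis.</p>

```python
def integrate_rotation(angular_velocities, dt):
    """Approximate the rotation angle (radians) about one axis by
    summing gyroscope angular-velocity samples (rad/s) taken at a
    fixed interval dt (seconds)."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt
    return angle

# Half a second of a constant 2 rad/s rotation, sampled at 10 Hz:
print(integrate_rotation([2.0] * 5, 0.1))  # approximately 1.0 rad
```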
      </sec>
    </sec>
    <sec id="sec-2">
      <title>3. Areas of Use of Sensors</title>
      <sec id="sec-2-1">
        <title>3.1. Step Detector</title>
        <p>
          One of the most important aspects of personal health is to stay active during the day. It is necessary
for a healthy life for people to get out of situations where they remain stationary and move, walk and
run. Thanks to the studies on the step detector, people have had the opportunity to track how many
steps they take during the day. Although different tools can be used for step detection and step
counting, accelerometer sensors are generally used. This process basically involves four stages:
data acquisition from the accelerometer sensor, noise reduction, step detection, and deciding
whether a detected event should be counted as a step (
          <xref ref-type="bibr" rid="ref6">6</xref>
          ). Although the most common method is to calculate the
magnitude of the x, y, z data from the accelerometer sensor and compare this value with a threshold
value, many different methods have also been used. In some studies, the x, y, z data from the
accelerometer sensor were sent into a mathematical function and the output value was used together
with the zero crossing method (
          <xref ref-type="bibr" rid="ref6">6</xref>
          ) (
          <xref ref-type="bibr" rid="ref7">7</xref>
          ). In some studies, the user's walking frequency was also included
in the calculations (
          <xref ref-type="bibr" rid="ref8">8</xref>
          ). In addition to the different methods used, studies have also taken into account
the activity in which the data from the accelerometer was obtained while the person was performing
the activity or the position in which the smartphone was in during the determination of stepping. In
a previous example study on this topic, the values of the three axes obtained from the accelerometer
sensor were processed with various formulas and a net magnitude value was calculated. In the next
step, a dynamically running peak detection algorithm and an algorithm for detecting false peaks and
real peaks were used using this calculated magnitude value. In the following steps, step detection was
performed using the start vector and end vector values obtained by these algorithms. Using the data
obtained with the algorithms and methods used, the step length was determined and the distance
traveled was estimated accordingly (
          <xref ref-type="bibr" rid="ref9">9</xref>
          ).
        </p>
        <p>
          In another study on step detection, values such as the sampling rate, the stepping rate and the
average of orthogonal accelerations are taken into account while determining the threshold value.
The algorithm was designed and tested at different walking speeds and different phone positions,
and the tests yielded very high success percentages (
          <xref ref-type="bibr" rid="ref10">10</xref>
          ).
        </p>
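<p>The most common approach mentioned above, comparing the magnitude of the x, y, z data with a threshold value, can be sketched as follows (Python for illustration; the 1.2 g threshold is a hypothetical value):</p>

```python
import math

def count_steps(samples, threshold=1.2):
    """Count steps by detecting upward crossings of a (hypothetical)
    magnitude threshold, the most common approach described above."""
    steps = 0
    above = False
    for x, y, z in samples:
        m = math.sqrt(x * x + y * y + z * z)
        if m > threshold:
            if not above:
                steps += 1
                above = True
        else:
            above = False
    return steps

# Two synthetic "step" spikes separated by quieter readings:
walk = [(0, 0, 1.0), (0, 0, 1.5), (0, 0, 1.0), (0, 0, 1.6), (0, 0, 1.0)]
print(count_steps(walk))  # 2
```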
      </sec>
      <sec id="sec-2-2">
        <title>3.2. Vehicle Use Behaviors</title>
        <p>
          Although not directly a health topic, a traffic accident can harm a person both physically and
psychologically. The majority of traffic accidents are caused by human error. To prevent or
minimize the negative outcomes of such accidents, it has been possible to make applications
using the sensors in smartphone hardware. In these applications, two issues related to driving
behavior can be examined: the behavior of the driver using the vehicle and the way the vehicle is used
(
          <xref ref-type="bibr" rid="ref11">11</xref>
          ). The driver's behavior can include whether the driver is focused on the road or whether the
driver is on the phone while driving. The manner in which the vehicle is driven can include
aggressively pressing the gas and brakes or making very sharp turns on bends. In a study on this topic,
the data obtained from the accelerometer and gyroscope sensors on the smartphone in the vehicle
were used, and the driver's behaviors such as aggressive cornering, aggressive brake pedal use and
aggressive accelerator pedal use were detected by comparing the sensor data using the DTW
algorithm. With a certain number of signals, the driver's maneuvers were identified (
          <xref ref-type="bibr" rid="ref12">12</xref>
          ). When
another study was examined, it was seen that classes such as aggressive driving, normal driving,
aggressive braking and normal braking were determined in order to detect driving behaviors with
the accelerometer sensor and datasets of the data obtained from the accelerometer sensor related to
these classes were created. In the following stages, classification processes were performed and
tested with the help of MLP (Multilayer Perceptron), RF (Random Forest), KNN (K-Nearest
Neighbors) and GNB (Gaussian Naïve Bayes) algorithms and methods.
        </p>
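<p>The DTW algorithm named above can be sketched in a few lines of Python (an illustrative implementation, not the cited study's code); it measures the distance between two signals while allowing them to stretch in time:</p>

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D signals,
    as used to compare maneuver templates with sensor traces."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A time-stretched copy of a signal stays close under DTW:
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0
```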
        <p>
          After the tests, F-scores for the classification of different event types were observed (
          <xref ref-type="bibr" rid="ref13">13</xref>
          ). Various
studies have been conducted with different classifications, methodologies and methods, such as using
the ANN (Artificial Neural Network) algorithm (
          <xref ref-type="bibr" rid="ref14">14</xref>
          ). Different classification algorithms and scenario classes have yielded varying success percentages.
        </p>
      </sec>
      <sec id="sec-2-3">
        <title>3.3. Activity Detection</title>
        <p>
          Another topic related to the field of smartphone sensors is activity detection. With the
accelerometer and gyroscope sensors detecting information such as vibration, rotation, acceleration,
deceleration, tilt, etc., the orientation of smartphone users' activities such as running, walking and
climbing can be determined. Thanks to the sensors of systems such as smartphones, it has been
possible to receive data, process the data and perform motion detection without the need to integrate
another external sensor (
          <xref ref-type="bibr" rid="ref15">15</xref>
          ). For activity detection, data is first collected from mobile sensors such
as accelerometers and gyroscopes during different activities. After collecting the data, certain features
such as speed and orientation are extracted for each activity. After these features are extracted, tests
are performed and the activity is detected according to the result of the test. For activity detection,
machine learning and classification algorithms such as kNN (
          <xref ref-type="bibr" rid="ref2">2</xref>
          ), Logistic Regression (
          <xref ref-type="bibr" rid="ref2">2</xref>
          ), SVM, Random
Forest, Naive Bayes, Bayesian Networks, Multilayer Neural Network, Ameva, K-Means Clustering (
          <xref ref-type="bibr" rid="ref16">16</xref>
          )
are important. In a study on this subject, activity detection was performed by classifying with SVM,
LR and J48 decision tree algorithms. The activities used were standing, walking, running, sitting, lying
down, getting up, and getting down.
        </p>
        <p>
          As a feature, mean, energy, entropy, standard deviation and correlation were used. Accelerometer
and gyroscope sensors were used to measure and collect the data. The highest percentage results for
all three algorithms were obtained for the lying down activity. The lowest result was recorded in the
detection of sitting activity with the SVM algorithm. In general, when looking at the average of all, the
LR algorithm gave the highest result (
          <xref ref-type="bibr" rid="ref17">17</xref>
          ). (
          <xref ref-type="bibr" rid="ref18">18</xref>
          ) compared the same six activities with SVM and
HFSVM (Hardware Friendly SVM). 789 test instances were evaluated with approximately equal
instances per class and it was observed that although the percentage of detection of up and down
activities was low in both algorithms, the overall detection percentage was high. In the study by (19),
XGB, SVM, NN, soft voting and hard voting algorithms were used together with the gyroscope sensor.
Two different numbers of features (195 and 304) were tested. Normal walking, fast walking, bag
walking, going down, going up, and sitting were used as activities. Testing was done for three different
situations. In the first case, bag walking and normal walking were combined into a single activity; in
the second case, normal, fast and bag walking were combined into a single activity. The results showed
that the soft voting approach had the highest detection results.
        </p>
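<p>Feature extraction of the kind described, computing the mean, energy and standard deviation of a window of sensor values, can be sketched as (Python, for illustration):</p>

```python
import math

def features(window):
    """Per-window features of the kind listed above: the mean, the
    energy and the standard deviation of a 1-D signal segment."""
    n = len(window)
    mean = sum(window) / n
    energy = sum(v * v for v in window) / n
    var = sum((v - mean) ** 2 for v in window) / n
    return mean, energy, math.sqrt(var)

print(features([1.0, 3.0, 1.0, 3.0]))  # (2.0, 5.0, 1.0)
```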
      </sec>
      <sec id="sec-2-4">
        <title>3.4. Condition Monitoring in Parkinson's Disease</title>
        <p>Among the many studies conducted in the field of health with integrated sensors used in
smartphones, one of the most important ones is the monitoring of the condition of Parkinson's
patients. Using accelerometer and gyroscope sensors in smartphone hardware, measurements such
as tremor, slow movement and loss of balance can be used to evaluate motor functions and diagnose
as well as severity (20). Neural networks trained with back-propagation are frequently used to
classify patient data. In addition, the Levenberg-Marquardt algorithm and conjugate gradient
algorithms have also been used (21). In a study on this
subject, during the data analysis and preprocessing process using Python, the sensor data of each
participant were organized by side (right or left), sensor mode, activity and session, and then
organized according to the timestamp with the BioStampRC application. Left (right) side sensor data
were matched to the corresponding side clinical scores for bradykinesia, tremor and dyskinesia, and
all sensor data obtained from the accelerometer and gyroscope were segmented into 5-second clips
with 50% overlap. To remove the limb orientation effect and to detect bradykinesia, high or low pass
filters were applied to the accelerometer and gyroscope data at specific frequencies. These filters
were found to be helpful in symptom detection. Each of the accelerometer and gyroscope data from a
total of 41,802 clips obtained in this process was matched with the corresponding patient ID, side,
activity, session and clinical score data (22).</p>
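<p>The segmentation into fixed-length clips with 50% overlap can be sketched as follows (a Python illustration; the clip length is given in samples rather than seconds):</p>

```python
def segment(samples, clip_len, overlap=0.5):
    """Split a sample stream into fixed-length clips with the given
    fractional overlap (0.5 reproduces the 50% overlap described)."""
    step = max(1, int(clip_len * (1 - overlap)))
    clips = []
    start = 0
    while len(samples) - start >= clip_len:
        clips.append(samples[start:start + clip_len])
        start += step
    return clips

# 10 samples, clips of 4 with 50% overlap: windows at offsets 0, 2, 4, 6
print(len(segment(list(range(10)), 4)))  # 4
```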
      </sec>
    </sec>
    <sec id="sec-3">
      <title>4. Project Presented</title>
      <sec id="sec-3-1">
        <title>4.1. Healthy Driving Practice</title>
        <p>
          Within the scope of healthy driving, our application aims to focus the driver on the road and driving
in situations where the driver is talking on the phone or interested in the phone. In this application, 3
classes have been determined and these classes include the state of the phone in the holder / in front
of the user while driving, the state of the phone in the pocket while driving, and the state of talking on
the phone while driving (23). In order to perform the classification process, KNN (K-Nearest
Neighbors) algorithm (
          <xref ref-type="bibr" rid="ref13">13</xref>
          ) was used with a dataset and classification was performed with different K
values such as 3, 5, 7, 9 and 11. In order to determine the success of the classification performed
with each K value, the classified scenarios were realized with the application and our testing
processes were carried out.
        </p>
        <p>For the data collection process to be used in the classification
process of our application, sample scenarios belonging to 3 classes were realized through the
application we wrote with Android Studio and Java, and the x, y, z axis data obtained from the
accelerometer sensor with a frequency of 0.2 Hz were collected and saved in a file to create our data
set in the specified format. Figures 1, 2 and 3 below show the graphs of the accelerometer data
collected for the sample scenarios that constitute our dataset.</p>
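<p>The KNN classification step can be sketched as follows (Python for illustration; the feature vectors and labels are hypothetical stand-ins for our accelerometer data and the classes described):</p>

```python
import math
from collections import Counter

def knn_predict(train, query, k=5):
    """Classify a query point by majority vote among its k nearest
    training points, using Euclidean distance."""
    dists = []
    for feats, label in train:
        dists.append((math.dist(feats, query), label))
    dists.sort(key=lambda t: t[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors for two of the classes:
train = [((0.0, 0.1), "holder"), ((0.1, 0.0), "holder"), ((0.0, 0.2), "holder"),
         ((2.0, 2.1), "pocket"), ((2.1, 2.0), "pocket"), ((1.9, 2.2), "pocket")]
print(knn_predict(train, (0.05, 0.1), k=3))  # holder
```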
      </sec>
      <sec id="sec-3-2">
        <title>4.2. Fainting Detection App</title>
        <p>
          Our fainting detection application aims to send notifications to the relevant people when the
user is detected to have fainted. In this application, 4
classes have been determined and these include the user's sitting state, the user's walking state, the
user's standing state and the user's fainting state (24). In order to perform the classification process,
KNN (K-Nearest Neighbors) algorithm (
          <xref ref-type="bibr" rid="ref10">10</xref>
          ) was used with a dataset and classification was performed
with different K values such as 3, 5, 7, 9, 11. In order to determine the success of the classification
process performed with different K values in the KNN algorithm, the classified scenarios were
realized with the application and our testing processes were carried out.
        </p>
        <p>For the data collection process to be used in the classification process of our application, we
realized sample scenarios of 4 classes through the application we wrote with Android Studio and Java
and collected data from the accelerometer and gyroscope sensors. The x, y, z axis data were
collected with a frequency of 0.2 Hz and saved in a file with an xls extension
in the specified format to form our data set. Figures 4 and 5 below show the graphs of the
accelerometer and gyroscope data collected for the sample fainting scenario that constitutes our
dataset.</p>
      </sec>
      <sec id="sec-3-3">
        <title>4.3. Step Counter App</title>
        <p>Our step counter application aims to detect the user's stepping motion with step detection
methods, display it with a counter, and accordingly display information such as distance, speed and
energy expenditure.</p>
        <p>
          Each time the data coming from the accelerometer with a certain frequency changes, we get a
result from the functions of formula (4.1) and formula (4.2) given below (
          <xref ref-type="bibr" rid="ref7">7</xref>
          ).
        </p>
        <p>(4.1)
(4.2)</p>
        <p>
          The step will be counted when we detect that the result obtained from the functions has
transitioned from negative to positive or from positive to negative using the zero crossing method. In
order to eliminate the problems that may arise from the detection sensitivity of the accelerometer
sensor (
          <xref ref-type="bibr" rid="ref9">9</xref>
          ), a step is considered valid only if it occurs at least 0.25 seconds after the previous valid
step (
          <xref ref-type="bibr" rid="ref7">7</xref>
          ).
        </p>
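<p>A Python sketch of this counting rule (illustrative; the input is assumed to be the already-processed output of formulas 4.1 and 4.2): a step is registered on a sign change, but only if at least 0.25 seconds have passed since the previous valid step.</p>

```python
def count_valid_steps(values, timestamps, min_gap=0.25):
    """Count zero crossings of the processed accelerometer signal,
    accepting a crossing as a step only if it occurs at least
    min_gap seconds after the previous valid step."""
    steps = 0
    last = None
    for i in range(1, len(values)):
        if 0 > values[i - 1] * values[i]:  # sign change, i.e. zero crossing
            t = timestamps[i]
            if last is None or t - last >= min_gap:
                steps += 1
                last = t
    return steps

# Four sign changes, one of which comes too soon after the previous step:
vals = [0.3, -0.2, 0.4, -0.3, 0.5]
times = [0.0, 0.1, 0.2, 0.5, 0.8]
print(count_valid_steps(vals, times))  # 3
```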
        <p>While the number of steps is being counted, the distance is calculated from the step count and the
height information we receive from the user, the average speed is calculated from the distance and the
elapsed time, and the amount of energy consumed is calculated using the weight information we
receive from the user (25). We calculate the average speed with the formula (4.3) given below.</p>
        <p>(4.3)</p>
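<p>These calculations can be sketched as follows (Python for illustration; the 0.415 step-length factor and the 0.53 kcal per kg per km constant are illustrative assumptions, not values from our application):</p>

```python
def walk_stats(steps, height_m, weight_kg, elapsed_s):
    """Distance from the step count and height, average speed as
    distance over time, and a rough energy estimate from weight.
    The 0.415 step-length factor and the 0.53 kcal/(kg*km) constant
    are illustrative assumptions only."""
    step_length = 0.415 * height_m          # common heuristic, assumed
    distance_m = steps * step_length
    speed_ms = distance_m / elapsed_s       # formula (4.3): distance / time
    kcal = 0.53 * weight_kg * distance_m / 1000.0
    return distance_m, speed_ms, kcal

d, v, e = walk_stats(steps=1000, height_m=1.80, weight_kg=70.0, elapsed_s=600.0)
print(round(d, 1), round(v, 2), round(e, 1))
```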
      </sec>
    </sec>
    <sec id="sec-4">
      <title>5. Research Results</title>
      <sec id="sec-4-1">
        <title>5.1. Step Counter App Results</title>
        <p>
          In the step detection part, which is the basis of our step counter application, the x, y, z values
obtained from the accelerometer sensor were calculated with the formulas and zero crossing method
we have previously mentioned (
          <xref ref-type="bibr" rid="ref7">7</xref>
          ). In order to determine the accuracy of our operations, we tested
the success of the step detector application in cases where the user holds the phone steady in his/her
hand, the user shakes the phone in his/her hand, and the user's phone is in the pocket (
          <xref ref-type="bibr" rid="ref10">10</xref>
          ) and the
following results were obtained.
        </p>
        <p>Success percentages of the step detection algorithm: phone held steady in hand, 99.0%; phone
shaken in hand, 87.0%; phone in pocket, 93.0%.</p>
        <p>In addition to step detection, the user can track and monitor the values of time, distance, speed and
energy expenditure with the information we receive from the user and other calculations we have made.</p>
      </sec>
      <sec id="sec-4-2">
        <title>5.2. Fainting Detection Implementation Results</title>
        <p>
          In our fainting detection application, we performed classification using the KNN (K-Nearest
Neighbor) algorithm (
          <xref ref-type="bibr" rid="ref10">10</xref>
          ) on the data we obtained. In order to find the appropriate K value to be used
in the KNN algorithm, the classification success of different K values was tested. With the data
obtained by the accelerometer sensor on the smartphone, classification was performed at a frequency
of 1 Hz with the algorithm we coded, and the results were analyzed by viewing them in the
application. The 4 classes we determined, the user's sitting, standing, walking and fainting states
(24), were classified with the application and our testing processes were carried out.
        </p>
        <p>The results we have obtained are shown in our success percentage graph displayed in Figure 6
below with the data visualization methods we have made through Python.
As a result of our tests, in order to determine the K value to be used in the KNN algorithm, the
average classification success of each K value was examined by us. As a result of our examination, we
found that when K is 3, there is an average success of 71.25%, when K is 5, there is an average success
of 76.67%, when K is 7, there is an average success of 77.92%, when K is 9, there is an average success
of 72.92%, when K is 11, there is an average success of 65.42%, and based on the results of our
observations, the value to be used in the algorithm was determined.</p>
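<p>The selection of K then reduces to taking the value with the highest average success among those tested, for example:</p>

```python
# Average classification success (%) for each K value tested above:
results = {3: 71.25, 5: 76.67, 7: 77.92, 9: 72.92, 11: 65.42}
best_k = max(results, key=results.get)
print(best_k)  # 7
```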
        <p>As a result of the data we collected and the classification we made with the KNN algorithm, if the
user is detected to be unconscious, an application that sends an SMS notification to a specific phone
number was written by us using Android Studio and Java.</p>
      </sec>
      <sec id="sec-4-3">
        <title>5.3. Healthy Driving Practice Results</title>
        <p>
          In our healthy driving application, the classification process was performed using the KNN
(K-Nearest Neighbor) algorithm (
          <xref ref-type="bibr" rid="ref13">13</xref>
          ) on the data we obtained.
        </p>
        <p>In order to find the appropriate K value to be used in the KNN algorithm, the classification success
of different K values was tested and our specific results were obtained. With the data obtained by the
accelerometer sensor on the smartphone, classification was performed at a frequency of 1 Hz with the
algorithm we coded, and the results were analyzed by displaying them in the application.</p>
        <p>Classification processes and classified scenarios for the 3 classes we have determined, which are
the state of having the phone in the holder/front while driving, the state of having the phone in the
pocket while driving, and talking on the phone while driving (23), were carried out together with the
application and our testing processes were carried out.</p>
        <p>The results obtained are shown in our success percentage graph displayed in Figure 7 below with
the data visualization methods we made through Python.</p>
        <p>As a result of our tests, in order to determine the K value to be used in the KNN algorithm, the
average classification success of each K value was examined by us. As a result of the examination, we
found that when K is 3, the average success rate is 54.44%, when K is 5, the average success rate is
68.89%, when K is 7, the average success rate is 65.56%, when K is 9, the average success rate is
60.55%, when K is 11, the average success rate is 59.44% and based on the results of our
observations, the value to be used in the algorithm was determined. As a result of our data set and the
classification process we performed with the KNN algorithm, the actions to be taken depending on
the user's use of the smartphone while driving are provided by the mobile application written by us
with Android Studio and Java.</p>
      </sec>
      <sec id="sec-4-4">
        <title>5.4. Application Overview</title>
        <p>Figure 8 below shows the interface of the step counter application, where the number of steps
taken by the users is determined and the users can observe various information as a result of the
other calculations we have made. When users specify their weight and height and open the
application, they will be able to access the relevant information on this interface.</p>
        <p>Figure 9 below shows an interface presenting the KNN algorithm we used, the classes we
determined and the data sets obtained from our data collection process; it allows us to observe the
success percentage achieved in the classification process, which is the basis of our application.</p>
        <p>An audible warning is given to indicate that people using our application are not focusing on the
road during driving activity. Again, when the fainting condition of the users of our application is
detected, a notification is sent to the mobile phone number specified in the application. Figures 10
and 11 below show the interfaces where information appears on the screen according to the
classifications made while the application is running.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>6. Conclusion</title>
      <p>
        In our study, different procedures from previous studies and information obtained from other
sources produced different results. The zero-crossing method and formulas (
        <xref ref-type="bibr" rid="ref7">7</xref>
        ) that we adopted
gave more successful and useful results than statically determining a limit value and comparing the
magnitudes of the axis values obtained from the accelerometer sensor against it, although the
zero-crossing method does not take the measurement accuracy of the accelerometer sensor into
account.
      </p>
      <p>The average success percentages of previous studies and our study are shown below for
comparison and evaluation purposes.</p>
      <p>
        Average success percentages of step detector studies: Average Success in our study is 93.00%,
Average Success in the study of (
        <xref ref-type="bibr" rid="ref7">7</xref>
        ) is 97.09%, Average Success in the study of (
        <xref ref-type="bibr" rid="ref10">10</xref>
        ) is 97.97%.
      </p>
      <p>Although the main purpose of our application is a step counter, we have created an application
where the user can observe the distance traveled, the average speed and the energy consumed, along
with the calculations we have made. Thus, it is possible to observe different features that users may
want to know.</p>
      <p>In the general scope of our fainting detection application, we believe that detecting the fainting of
the person using the smartphone, sending a notification with this detection and tracking the other
activities of the user is useful for people who need or may need help in these matters. In the general
scope of our healthy driving application, we believe that it would be useful to detect and warn the
user if the person using the smartphone is interested in the phone or talking on the phone while
driving.</p>
      <p>
        The KNN (K-Nearest Neighbor) algorithm was used for both applications in the classification
sections of our application. With the use of this algorithm, we tested the success of classification
scenarios with K values in a certain range (3, 5, 7, 9, 11). When we examined the success percentages
after our tests, we observed that some of the different K values and different classes had a high success
percentage, some had an average success percentage, and some had a low success percentage.
      </p>
      <p>Below, the highest success percentages of the classifications made in our faint detection and
healthy driving applications in different studies and in our study are given for comparison and
evaluation purposes.</p>
      <p>The highest classification successes of our healthy driving application were: Phone in Holder/Front
88.33%, Phone in Pocket 68.33%, Talking on the Phone 56.67%.</p>
      <p>The highest classification successes of our fainting detection application were: 76.67% in the sitting
state, 96.67% in the walking state, 98.33% in the standing state, 58.33% in the fainting state.</p>
      <p>
        In the study of (
        <xref ref-type="bibr" rid="ref17">17</xref>
        ); 94.2% in the Sitting state, 97.2% in the Walking state, 94.8% in the Standing
state, - in the Fainting state.
      </p>
      <p>
        In the study of (
        <xref ref-type="bibr" rid="ref18">18</xref>
        ); 96.4% in the Sitting state, 95.6% in the Walking state, 93.0% in the Standing
state, - in the Fainting state.
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>E.</given-names>
            <surname>Sagbas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ballı</surname>
          </string-name>
          ,
          <article-title>The Use of Smartphone Sensors and Access to Raw Sensor Data</article-title>
          ,
          <source>in: XVII. Akademik Bilisim Konferansi, AB 2015</source>
          , 4-6 February
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>E.</given-names>
            <surname>Sagbas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ballı</surname>
          </string-name>
          ,
          <article-title>Comparison of Logistic Regression and kNN Methods in Action Recognition with Smartphone Sensor Data</article-title>
          ,
          <source>in: 1st International Conference on Engineering Technology and Applied Science</source>
          <year>2016</year>
          , pp
          <fpage>21</fpage>
          -
          <lpage>22</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>R.</given-names>
            <surname>Kusuma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mukheerjee</surname>
          </string-name>
          ,
          <article-title>Health Monitoring with Smartphone Sensors and Machine Learning Techniques</article-title>
          ,
          <source>in: 2023 2nd International Conference on Applied Artificial Intelligence and Computing (ICAAIC)</source>
          , Salem, India,
          <year>2023</year>
          , pp.
          <fpage>576</fpage>
          -
          <lpage>581</lpage>
          , doi: 10.1109/ICAAIC56838.2023.10140210.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Aydemir</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Karslioglu</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          “
          <article-title>Gait Pattern Analysis with the Help of Accelerometer Sensor of Smartphones</article-title>
          .” Afyon Kocatepe University Journal of Science and Engineering Sciences,
          <volume>21</volume>
          (
          <issue>2</issue>
          ),
          <fpage>283</fpage>
          -
          <lpage>299</lpage>
          . https://doi.org/10.35414/akufemubid.856995
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          Alruban Abdulrahman, Al-obaidi Hind, Clarke Nathan, Li Fudong, “
          <article-title>Physical Activity Recognition by Utilizing Smartphone Sensor Signals</article-title>
          .” doi: 10.5220/0007271903420351,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Seo</given-names>
            <surname>Jungryul</surname>
          </string-name>
          , Chiang Yutsai, Laine Teemu, Khan Adil, “
          <article-title>Step counting on smartphones using advanced zero-crossing and linear regression</article-title>
          .”
          <source>ACM IMCOM 2015 - Proceedings</source>
          , doi: 10.1145/2701126.2701223.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S. Y.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. J.</given-names>
            <surname>Heo</surname>
          </string-name>
          and
          <string-name>
            <given-names>C. G.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <article-title>Accelerometer-based smartphone step detection using machine learning technique</article-title>
          , in: 2017 International Electrical Engineering Congress (iEECON), Pattaya, Thailand,
          <year>2017</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>4</lpage>
          , doi: 10.1109/IEECON.2017.8075875.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Kang</given-names>
            <surname>Xiaomin</surname>
          </string-name>
          , Huang Baoqi,
          <string-name>
            <given-names>Qi</given-names>
            <surname>Guodong</surname>
          </string-name>
          ,
          <article-title>A Novel Walking Detection and Step Counting Algorithm Using Unconstrained Smartphones</article-title>
          .
          <source>Sensors</source>
          (Basel, Switzerland).
          <volume>18</volume>
          . 10.3390/s18010297,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          Alabadleh Ahmad, Hawari Eshraq, Alkafaween Esra'a, Alsawalqah Hamad,
          <article-title>Step Detection Algorithm For Accurate Distance Estimation Using Dynamic Step Length</article-title>
          , pp.
          <fpage>324</fpage>
          -
          <lpage>327</lpage>
          , doi: 10.1109/MDM.2017.52,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>H.</given-names>
            <surname>Muhsen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Al-Amaydeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Al-Hamlan</surname>
          </string-name>
          ,
          <article-title>"Algorithm Design for Accurate Steps Counting Based on Smartphone Sensors for Indoor Applications"</article-title>
          ,
          <source>Advances in Science, Technology and Engineering Systems Journal</source>
          , vol.
          <volume>5</volume>
          , no.
          <issue>6</issue>
          , pp.
          <fpage>811</fpage>
          -
          <lpage>816</lpage>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Mantouka</surname>
            <given-names>Eleni</given-names>
          </string-name>
          , Barmpounakis Manos, Vlahogianni Eleni, Golias John, “
          <article-title>Smartphone Sensing for Understanding Driving Behavior: Current Practice and Challenges</article-title>
          .”
          <source>International Journal of Transportation Science and Technology</source>
          , vol.
          <volume>10</volume>
          , doi: 10.1016/j.ijtst.2020.07.001,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>D. A.</given-names>
            <surname>Johnson and M. M. Trivedi</surname>
          </string-name>
          ,
          <article-title>Driving style recognition using a smartphone as a sensor platform</article-title>
          ,
          <source>in: 2011 14th International IEEE Conference on Intelligent Transportation Systems (ITSC)</source>
          ,Washington, DC, USA,
          <year>2011</year>
          , pp.
          <fpage>1609</fpage>
          -
          <lpage>1615</lpage>
          , doi: 10.1109/ITSC.2011.6083078.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>M. R.</given-names>
            <surname>Carlos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. C.</given-names>
            <surname>González</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Wahlstrom</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Ramírez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Martínez</surname>
          </string-name>
          and
          <string-name>
            <given-names>G.</given-names>
            <surname>Runger</surname>
          </string-name>
          ,
          <article-title>How Smartphone Accelerometers Reveal Aggressive Driving Behavior? The Key is the Representation</article-title>
          ,
          <source>in: IEEE Transactions on Intelligent Transportation Systems</source>
          , vol.
          <volume>21</volume>
          , no.
          <issue>8</issue>
          , pp.
          <fpage>3377</fpage>
          -
          <lpage>3387</lpage>
          , Aug.
          <year>2020</year>
          , doi: 10.1109/TITS.2019.2926639.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>S. K.</given-names>
            <surname>Sonbhadra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Agarwal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Syafrullah</surname>
          </string-name>
          and
          <string-name>
            <given-names>K.</given-names>
            <surname>Adiyarta</surname>
          </string-name>
          ,
          <article-title>Aggressive driving behaviour classification using smartphone's accelerometer sensor</article-title>
          ,
          <source>in: 2020 7th International Conference on Electrical Engineering</source>
          ,
          <source>Computer Sciences and Informatics (EECSI)</source>
          , Yogyakarta, Indonesia,
          <year>2020</year>
          , pp.
          <fpage>77</fpage>
          -
          <lpage>82</lpage>
          , doi: 10.23919/EECSI50503.2020.9251913.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>K.</given-names>
            <surname>Erin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Bayilmis</surname>
          </string-name>
          , and
          <string-name>
            <surname>B. BORU</surname>
          </string-name>
          , “
          <article-title>Accelerometer and Internet of Things Based Real-Time Human Activity Detection"</article-title>
          ,
          <source>APJES</source>
          , vol.
          <volume>9</volume>
          ,
          <issue>sy</issue>
          . 1, pp.
          <fpage>194</fpage>
          -
          <lpage>198</lpage>
          ,
          <year>2021</year>
          , doi: 10.21541/apjes.809777
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Iskanderov</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guvensan</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          “
          <article-title>Activity recognition with smartphones and wearables: Classical approaches</article-title>
          , new solutions.” Pamukkale University Journal of Engineering Sciences,
          <volume>25</volume>
          (
          <issue>2</issue>
          ),
          <fpage>223</fpage>
          -
          <lpage>239</lpage>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>W. C.</given-names>
            <surname>Hung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Shen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.L.</given-names>
            <surname>Wu</surname>
          </string-name>
          , M.-K. Hor and
          <string-name>
            <given-names>C. Y.</given-names>
            <surname>Tang</surname>
          </string-name>
          ,
          <article-title>Activity Recognition with sensors on mobile devices</article-title>
          ,
          <source>in: 2014 International Conference on Machine Learning and Cybernetics</source>
          , Lanzhou,China,
          <year>2014</year>
          , pp.
          <fpage>449</fpage>
          -
          <lpage>454</lpage>
          , doi: 10.1109/ICMLC.2014.7009650.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Anguita</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ghio</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oneto</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parra</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reyes-Ortiz</surname>
            ,
            <given-names>J.L.</given-names>
          </string-name>
          <article-title>Human Activity Recognition on Smartphones Using a Multiclass Hardware-Friendly Support Vector Machine</article-title>
          , in: Bravo, J., Hervás, R., Rodríguez, M. (eds) Ambient Assisted Living and Home Care.
          <source>IWAAL 2012. Lecture Notes in Computer Science, vol 7657</source>
          . Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35395-6_30, 2012.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>