<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Studies: A Managed Solution for Standardized, Scalable, Privacy‑First Smartphone Sensor Data Collection for Indoor Positioning Research</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Kelvin Tsz Hei Choi</string-name>
          <email>sensorlogger@tszheichoi.com</email>
        </contrib>
      </contrib-group>
      <abstract>
        <p>Smartphone‑based indoor positioning research demands high‑quality, time‑synchronized sensor streams under diverse environmental and hardware conditions, yet many researchers face challenges in their data collection campaigns owing to device heterogeneity, sampling inconsistency, consent management, and privacy compliance. We introduce Studies, a fully managed, end‑to‑end Software as a Service (SaaS) framework built on top of the popular Sensor Logger app (iOS, Android, watchOS, Wear OS). Studies formalizes roles for investigators and participants, automates sensor configuration distribution across platforms, orchestrates data capture, and integrates built‑in consent and customizable questionnaires. Sensor Logger and Studies have already supported numerous research efforts presented at previous IPIN conferences, including pedestrian trajectory reconstruction via bidirectional Kalman filtering [1] and magnetometer calibration during SLAM [2]. In this paper, we present the system architecture, cross-platform implementation, managed workflows, and real-world case studies, demonstrating how Studies enables researchers to collect the right data at the right time, while ensuring participant privacy and data consistency.</p>
      </abstract>
      <kwd-group>
        <kwd>smartphone-based positioning</kwd>
        <kwd>crowd-sourcing</kwd>
        <kwd>reproducible research</kwd>
        <kwd>privacy compliance</kwd>
        <kwd>managed data workflows</kwd>
        <kwd>data collection</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Indoor positioning research relies on rich, multi‑sensor datasets to achieve high accuracy in complex
environments. However, researchers frequently encounter challenges including device heterogeneity
and sampling variation [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], consent and privacy concerns [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], and manual workflow overhead [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ],
especially in large-scale studies. To address these issues, we introduce Studies, a service‑oriented
framework within the Sensor Logger app that helps researchers specify precise sensor suites, orchestrate
recordings, and automate secure data uploads, allowing researchers to focus on experimental design and analysis.
Studies has already gained popularity and recognition in the IPIN community, with applications including enhanced
pedestrian trajectory reconstruction [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and SLAM magnetometer calibration [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], demonstrating its
adaptability and impact.
      </p>
      <p>In Section 2, we first outline the key challenges researchers face in data collection campaigns.
In Section 3, we review some existing solutions for data logging in research contexts. Section 4
introduces the Sensor Logger app, and from there we dive into the Studies architecture in Section 5
to demonstrate how it helps reduce friction for researchers. Finally, in Section 6, we showcase how
Studies has already been used in practice across academia.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Challenges for Researchers</title>
      <p>
        Collecting reliable, large‐scale smartphone sensor data involves overcoming numerous technical,
logistical, and ethical hurdles. Prior tools lack scalable, integrated workflows for crowd-sourced sensor
data collection with privacy control [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Existing crowd‑sourcing platforms also often require custom
development, which hinders reproducibility [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. In the following, we review the principal challenges
researchers face.
      </p>
      <sec id="sec-2-1">
        <title>2.1. Heterogeneous Devices</title>
        <p>
          Smartphones difer widely in sensor quality, sampling APIs, and coordinates / units definitions, leading
to non‐uniform data that undermines reproducibility and comparability, especially in large-scale studies
[
          <xref ref-type="bibr" rid="ref12 ref13">12, 13</xref>
          ]. To ensure consistency, researchers often resort to procuring a uniform device pool, adding tens of thousands of dollars
in hardware costs, or to writing custom code to mitigate differences,
taking time away from valuable research efforts [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ].
        </p>
      </sec>
      <sec id="sec-2-2">
        <title>2.2. Custom Sensing Logic</title>
        <p>Many data collection campaigns involve custom sensing logic – whether it is geo-fencing or rules
based on time of day or day of week. Incorporating custom sensing logic often necessitates bespoke
application development, increasing the technical burden on research teams.</p>
      </sec>
      <sec id="sec-2-3">
        <title>2.3. Ad-Hoc Data Transfer</title>
        <p>
          Crowd-sourced studies often rely on ad-hoc uploads to Google Drive folders or Dropbox links,
burdening participants with manual steps [
          <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
          ] and researchers with reminder overhead, not to mention the burden of
keeping track of files and their naming conventions. Moreover, these methods frequently lack robust
privacy protections, raising concerns about the secure handling of sensitive participant data.
More advanced research groups may use cloud object stores (e.g., Amazon S3) with buckets
via SDKs or REST APIs [
          <xref ref-type="bibr" rid="ref16">16</xref>
          ]. However, this comes with engineering overhead [
          <xref ref-type="bibr" rid="ref16">16</xref>
          ] and requires working with
DevOps tools that researchers may not be familiar with. Coordinating tens of participants is
perhaps feasible with manual oversight, but studies with thousands of contributors face
unbearable clerical bottlenecks, leading to delays in the research schedule [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ].
        </p>
      </sec>
      <sec id="sec-2-4">
        <title>2.4. Privacy, Consent, and Ethical Compliance</title>
        <p>
          Smartphone sensor streams (GPS, microphone, Bluetooth) can reveal sensitive personal behaviors,
creating high privacy risks [
          <xref ref-type="bibr" rid="ref25">25</xref>
          ]. Institutions enforce diverse ethics reviews (e.g., University of
Washington’s IRB, Stanford’s Human Subjects Committee), each with unique consent form requirements,
data retention policies, and anonymization standards [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ].
        </p>
      </sec>
      <sec id="sec-2-5">
        <title>2.5. Capturing Rich Contextual Data</title>
        <p>
          While raw sensor data (e.g., accelerometer, magnetometer, GPS) captures physical phenomena, it often
lacks essential contextual information such as user intent, environmental conditions, or semantic labels.
These are often critical to researchers. To address this, many researchers supplement sensor streams with
participant-reported data via additional surveys, daily diaries, or momentary assessments. However,
most resort to separate tools like Google Forms or Qualtrics to administer surveys [
          <xref ref-type="bibr" rid="ref22">22</xref>
          ], introducing
significant friction for participants who must switch between platforms or apps. This added complexity
can reduce response rates, break temporal alignment with sensor events, and increase dropout [
          <xref ref-type="bibr" rid="ref23">23</xref>
          ].
Furthermore, merging survey data with sensor logs typically requires post-hoc synchronization based
on timestamps, which is error-prone, especially when sensor and survey data are collected on separate
threads or apps with differing clock sources.
        </p>
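        <p>The fragility of timestamp-based merging can be illustrated with a small Python sketch. Everything here is invented for exposition (the field names, the 100 Hz rate, and the 25 ms clock skew); it simply aligns one survey answer to the nearest sensor sample and shows how a modest skew silently moves the label.</p>

```python
# Hypothetical illustration of post-hoc, timestamp-based merging of survey
# answers onto a sensor stream; field names, rates, and the 25 ms clock skew
# are invented for exposition.
from bisect import bisect_left

def nearest_index(times, t):
    """Return the index of the timestamp in `times` closest to t (times sorted)."""
    i = bisect_left(times, t)
    if i == 0:
        return 0
    if i == len(times):
        return len(times) - 1
    return i if times[i] - t < t - times[i - 1] else i - 1

# Sensor samples every 10 ms (UNIX epoch milliseconds).
sensor_times = [1_700_000_000_000 + 10 * k for k in range(5)]

survey_time = 1_700_000_000_020   # true moment the participant answered
skewed_time = survey_time + 25    # the survey app's clock runs 25 ms ahead

print(nearest_index(sensor_times, survey_time))  # index 2: correct alignment
print(nearest_index(sensor_times, skewed_time))  # index 4: skew shifts the label
```

        <p>Even a 25 ms skew moves the label by two samples at this rate; capturing surveys and sensors in the same app with one clock source avoids this class of error entirely.</p>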
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Existing Solutions</title>
      <p>A number of mobile sensing platforms have been developed to facilitate research using smartphone and
wearable data. We briefly review representative examples and highlight potential shortfalls with
respect to the challenges outlined in the previous section.</p>
      <sec id="sec-3-1">
        <title>3.1. AWARE Framework</title>
        <p>
          The AWARE Framework [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] provides a comprehensive SDK and server backend for Android devices,
enabling passive data logging of sensors such as accelerometers, location, and app usage. While AWARE
includes plugin support and centralized dashboards, it lacks seamless cross-platform support (very
limited iOS capabilities), and consent flows must be managed externally.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. RADAR-base</title>
        <p>
          RADAR-base [
          <xref ref-type="bibr" rid="ref28">28</xref>
          ] is an open-source platform originally developed for health-related sensing in clinical
studies. It offers integration with wearable sensors (e.g., Fitbit, Biovotion) and collects active and passive
data using a modular architecture. However, it requires significant DevOps setup, including Kafka and
PostgreSQL servers.
        </p>
      </sec>
      <sec id="sec-3-2b">
        <title>3.3. Beiwe</title>
        <p>
          The Beiwe Research Platform [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ] emphasizes privacy-preserving smartphone sensing in clinical trials.
It supports customizable surveys and sensor capture but is primarily Android-focused and requires
institutional review board (IRB) configuration before deployment.
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>3.4. SensingKit and OpenSensing</title>
        <p>
          Other toolkits such as SensingKit [
          <xref ref-type="bibr" rid="ref27">27</xref>
          ] and OpenSensing aim to simplify mobile sensor data access for
developers but remain limited in deployment management, standardization, and participant oversight.
They serve more as low-level libraries than managed platforms.
        </p>
      </sec>
      <sec id="sec-3-4">
        <title>3.5. GetSensorData</title>
        <p>
          GetSensorData is an Android-based sensor acquisition tool used in the IPIN community, particularly
for competition datasets [
          <xref ref-type="bibr" rid="ref29">29</xref>
          ]. It enables straightforward collection of raw smartphone sensor data
but lacks integrated participant management, and does not support cross-platform harmonization or
built-in privacy and consent workflows.
        </p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. The Sensor Logger App</title>
      <p>Sensor Logger is a free, versatile, cross-platform mobile application designed to simplify sensor logging
from smartphones, tablets, and wearables. As shown in Figure 1, it supports a comprehensive array of
on-device sensors—including accelerometers, gyroscopes, magnetometers, barometers, GPS, audio, and
camera streams—as well as device metadata such as battery level, network state, and screen brightness.
Of note to the IPIN community is support for WiFi and Bluetooth beacons.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Studies Architecture</title>
      <p>The Studies sub-system is designed to tackle the challenges of running data collection campaigns as
outlined in Section 2. Figure 2 shows the outline of the Studies system, which will be explained below.</p>
      <sec id="sec-5-1">
        <title>5.1. Participant–Investigator Model</title>
        <p>The cornerstone of Studies is the participant–investigator model. In this context, researchers play the
role of an Investigator, and Participants contribute data to the research. All parties use the same Sensor
Logger app on their respective smartphones throughout the life cycle of a Study.</p>
        <p>As shown in step (A) of figure 2, investigators begin by defining a Study directly within the
Sensor Logger app, controlling exactly how and what data should be collected. Each Study
configuration includes the following:
• Defined Sensor Suite: Accelerometer, gyroscope, magnetometer, barometer, GPS, audio, camera,
microphone, Bluetooth beacons, Wi‑Fi and any custom sensor decoder plugins.
• Defined Sampling Rules: sampling frequencies, time-based or geofence/location-based conditional
recording, and sensor-based triggers. Investigators can test these rules on their own devices and
iterate until satisfied.
• Export Formats: JSON, CSV, Excel, KML, SQLite, etc., to suit the researcher’s downstream
analysis scripts.
• Metadata &amp; Privacy: privacy statements, approved consent text, contact details. To help
investigators, templates are provided as part of the creation workflow.
• Contextual Questionnaire: Study-specific questionnaire templates (text, numeric, multiple-choice,
signature fields), which can be shown when joining a study and at the end of each recording
session. See figure ?? in the appendix.</p>
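        <p>As a rough sketch of what such a single source of truth might look like, a Study configuration can be thought of as one serializable document. All field names and values below are invented for exposition and do not reflect Sensor Logger’s actual schema.</p>

```python
# Illustrative sketch only: a Study configuration expressed as a plain,
# serializable document. All field names and values are invented and do
# not reflect Sensor Logger's actual schema.
import json

study_config = {
    "name": "Corridor Walk Pilot",
    "sensors": ["accelerometer", "gyroscope", "magnetometer", "wifi", "ble"],
    "sampling": {
        "imu_hz": 100,
        "triggers": [{"type": "geofence", "radius_m": 50}],
    },
    "export_formats": ["csv", "json"],
    "consent": {"required": True, "text": "Approved consent text goes here."},
    "questionnaire": [
        {"id": "floor", "type": "multiple-choice",
         "question": "Which floor were you on?", "options": ["G", "1", "2"]},
    ],
}

# Serializing the whole configuration is what makes a single cloud-side
# source of truth possible: every participant device receives the same document.
payload = json.dumps(study_config, indent=2)
roundtrip = json.loads(payload)
print(roundtrip["sampling"]["imu_hz"])  # 100
```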
        <p>The investigator submits this Study configuration to a remote server, also known as Sensor Logger
Cloud, as shown in step (B) of figure 2. A backend then validates the Study configuration to ensure
integrity and compliance. Thereafter, a single source of truth defining what and how data should be
collected exists in the cloud.</p>
      </sec>
      <sec id="sec-5-2">
        <title>5.2. One Tap Join</title>
        <p>Once a Study is created, a simple QR code (or link) is generated, shown as steps (C) and (D) of Figure 2.
This QR code points to the newly created Study. Participants can then enroll seamlessly by
simply scanning it with their own phones. To ensure privacy compliance, all participants can first preview
exactly what they will be collecting on behalf of the Study investigator before joining. Depending on
the Study configuration, participants may be asked to complete a consent form before joining. Once
joined, shown as step (E) of Figure 2, each participant automatically receives the exact configuration
defined in the previous section, regardless of platform. They can also join, pause, or switch studies
in-app, filter recordings by study, and remain anonymous by design. Note that once joined,
a participant’s device does not require further internet connectivity until they are ready to share their
recordings back with the investigator – this is important for participants with limited cellular data.</p>
      </sec>
      <sec id="sec-5-3">
        <title>5.3. Standardization &amp; Harmonization</title>
        <p>
          Since participants may use a variety of recording devices, the raw data collected can differ in units and
coordinate systems. For example, as shown in Figure 3, Android’s SensorManager and iOS’s Core Motion
define coordinate systems for acceleration data in opposite directions. In particular, note that iOS does
not follow the standard right-handed coordinate system due to legacy reasons. Additionally, Android
and iOS apply different strategies for location fusion, each making distinct trade-offs between
network-based and GNSS-based positioning. Studies employs a unified standardization mechanism to resolve
these inconsistencies. This standardization process (Step F in Figure 2) is performed automatically in
the background and requires no configuration or understanding from participants. Standardization
includes:
• Unit Normalization: Converts all measurements to SI units (m, m/s², rad/s, μT, hPa); GPS
coordinates in decimal degrees.
• Coordinate System Alignment: Unified definition of x, y and z coordinate systems for all
vector-valued sensor readings.
• Absolute Reference Frames: All GPS and fused location data are aligned to the WGS84 geodetic
reference system.
• Timestamp Synchronization: To resolve misalignment across sensors (e.g., gyroscope vs. GPS)
caused by internal platform scheduling jitter or buffer differences, Studies interpolates or aligns
samples to the nearest common sampling grid as defined by the investigator. All timestamps are
referenced to UNIX epoch in milliseconds and are corrected for known device clock skew where
possible.
• Metadata Standards: For traceability and reproducibility, all exported data is bundled with
device identifiers (model, manufacturer, OS version), sensor vendor details, and platform-specific
flags (e.g., fused vs. raw GNSS, estimated accuracy).
• BLE Decoding: A core component of Sensor Logger is its open-source initiative to provide a
unified interface for Bluetooth Low Energy (BLE) data acquisition and decoding. BLE plays a
crucial role in indoor positioning due to its widespread availability and fine-grained proximity
sensing capabilities. However, BLE data collection varies significantly between platforms, with
differences in scanning APIs, advertisement formats, and signal reporting. Studies fully leverages
this unified BLE interface to ensure consistent decoding [
          <xref ref-type="bibr" rid="ref30">30</xref>
          ].
        </p>
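        <p>The flavor of this unit and axis harmonization can be sketched in a few lines of Python, under a deliberately simplified model in which iOS reports acceleration in units of g with axes opposite to Android’s m/s² convention. The constants and the platform model here are illustrative; the transforms actually applied by Studies are the ones documented in [11].</p>

```python
# A minimal sketch of unit and axis harmonization, assuming a simplified
# model: iOS reports acceleration in units of g with axes opposite to
# Android's m/s^2 convention. Illustrative only; the real transforms
# applied by Studies are documented in [11].
G = 9.80665  # standard gravity in m/s^2, used for unit normalization

def standardize_accel(sample, platform):
    """Return (x, y, z) acceleration in m/s^2 in one shared axis convention."""
    x, y, z = sample
    if platform == "ios":
        # Convert g -> m/s^2 and flip every axis into the unified frame.
        return (-x * G, -y * G, -z * G)
    # Android is assumed here to already match the unified convention.
    return (x, y, z)

# A phone lying flat: iOS reports z = -1 g, Android reports z = +9.81 m/s^2;
# after standardization both agree that z is about +9.81 m/s^2.
print(standardize_accel((0.0, 0.0, -1.0), "ios"))
print(standardize_accel((0.0, 0.0, 9.81), "android"))
```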
        <p>
          All transformations performed as part of the standardization process are documented in
detail in [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ]. This ensures full transparency for reproducibility, debugging, and peer review.
        </p>
      </sec>
      <sec id="sec-5-4">
        <title>5.4. One Tap Upload</title>
        <p>With a single tap, participants can upload their recordings securely to Sensor Logger Cloud via an
encrypted connection (Step (G) of Figure 2). The backend infrastructure is hosted on industry-grade
providers such as Cloudflare and Backblaze, offering distributed object storage with configurable
redundancy, automatic deduplication, secure TLS channels for data in transit, and encryption at rest.
Data is stored in the cloud as per a configured data retention policy. During this period, the
investigator can download all contributed recordings securely from a web portal or via an API, as shown
in step (H) of Figure 2. Throughout the Study life cycle, the investigator can monitor the progress of the
Study through the app (Step (G) of Figure 2) – such as how many participants have enrolled and how
many recordings have been uploaded.</p>
        <p>While the current implementation emphasizes asynchronous, privacy-first uploads to preserve
participant control and minimize cellular data usage, Studies has been designed with real-time
streaming capabilities in mind. Preliminary internal testing supports low-latency MQTT-based data
relay for scenarios requiring continuous monitoring (e.g., infrastructure-aware SLAM, live mobility
tracking).</p>
      </sec>
      <sec id="sec-5-5">
        <title>5.5. Data Parsing and Analysis Tools</title>
        <p>
          To facilitate downstream data analysis, Studies exports sensor data in widely-used formats including
JSON, CSV, Excel, KML, and SQLite. Recognizing the importance of easy data ingestion for researchers,
we provide open-source parsing libraries and example scripts compatible with popular scientific
computing environments. For Python users, a repository is available on GitHub [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], which offers utilities
for loading, filtering, and synchronizing sensor streams, including handling of the standardization
metadata and timestamps.
        </p>
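        <p>As a minimal example of ingesting an exported CSV with only the Python standard library: the column layout shown below ("time", "x", "y", "z") is an assumption for illustration, not a confirmed export schema; the official parsing utilities live in the repository cited as [11].</p>

```python
# A hedged sketch of loading an exported CSV using only the standard
# library. The column layout is assumed for illustration; see [11] for
# the official parsing utilities that handle the real export schema.
import csv
import io

# Stand-in for the contents of an exported accelerometer CSV file.
raw = io.StringIO(
    "time,x,y,z\n"
    "1700000000000,0.01,-0.02,9.81\n"
    "1700000000010,0.00,-0.01,9.79\n"
)

rows = list(csv.DictReader(raw))
times = [int(r["time"]) for r in rows]   # UNIX epoch milliseconds
accel_z = [float(r["z"]) for r in rows]  # vertical acceleration, m/s^2

print(len(rows), times[0], accel_z[-1])  # 2 1700000000000 9.79
```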
      </sec>
      <sec id="sec-5-6">
        <title>5.6. Software as a Service</title>
        <p>
          The principle of Studies is Software as a Service (SaaS). In this paradigm, researchers do not configure or
manage storage, data transfer, or communication infrastructure for participants. Researchers simply
declare a study and pay for the resources (such as storage) they consume [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ]. This subscription-based
model reduces upfront costs and lowers barriers to entry for smaller labs or pilot studies – it is often free
for sufficiently small studies.
        </p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Case Studies &amp; Adoption</title>
      <p>
        Studies has been adopted by leading institutions (MIT Media Lab, ETH Zurich Robotics Systems Lab,
KU Leuven, University of Duisburg‑Essen, University of Cambridge) for applications including:
• Pedestrian trajectory reconstruction with bidirectional Kalman filters [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
• Retail footfall analytics using BLE/RSSI logs [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ].
• SLAM magnetometer calibration across devices [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
• Mental health biometrics in longitudinal clinical trials [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
• Agricultural incident detection via multi-sensor wearables [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
    </sec>
    <sec id="sec-7">
      <title>7. Conclusion</title>
      <p>We present Studies as a comprehensive, privacy‑first service that automates end‑to‑end sensor data
collection for indoor positioning research. The benefits of Studies compared to a typical manual workflow
for researchers are summarized in Table 1. By formalizing roles, standardizing configurations, and
offering dynamic scalability, Studies accelerates reproducible experiments. Future work includes
extending Studies to real-time streaming contributions and even more thorough cross-platform
harmonization, such as unified sensor fusion algorithms. Anyone can try Studies for free in the Sensor
Logger app. Learn more at https://www.tszheichoi.com/studies.</p>
      <table-wrap id="tbl-1">
        <label>Table 1</label>
        <caption>
          <p>Comparison of a typical manual workflow with the Studies workflow.</p>
        </caption>
        <table>
          <thead>
            <tr>
              <th>Typical Manual Workflow</th>
              <th>Studies Workflow</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td>Researcher writes custom scripts or apps per device/OS.</td>
              <td>Investigator declares study parameters in a single app.</td>
            </tr>
            <tr>
              <td>Create separate consent forms on external platforms (e.g., Google Forms, Qualtrics).</td>
              <td>In-app consent flows and questionnaires embedded in the study configuration.</td>
            </tr>
            <tr>
              <td>Participants install and configure manually.</td>
              <td>One-tap enrollment via QR code or short link applies the exact configuration.</td>
            </tr>
            <tr>
              <td>Participants record data manually; start/stop reminders require manual follow-up.</td>
              <td>Automated background recording with smart rules (time, geofence, motion) and on-device reminders.</td>
            </tr>
            <tr>
              <td>Participants upload files via email, Google Drive, or custom S3 scripts.</td>
              <td>One-tap, encrypted uploads to cloud; automated retry and reminder logic.</td>
            </tr>
            <tr>
              <td>Researcher merges, cleans, and standardizes heterogeneous exports offline.</td>
              <td>Data delivered in normalized, SI-unit JSON/CSV; metadata and transformations documented automatically.</td>
            </tr>
            <tr>
              <td>Maintain custom server infrastructure or ad-hoc storage buckets.</td>
              <td>Fully managed SaaS backend with dynamic scaling, SLAs, and REST/MQTT APIs.</td>
            </tr>
          </tbody>
        </table>
      </table-wrap>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>The author has not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Y.</given-names>
             
            <surname>Dong</surname>
          </string-name>
          et al.,
          <article-title>“Enhanced Pedestrian Trajectory Reconstruction Using Bidirectional Extended Kalman Filter</article-title>
          and Automatic Refinement,” in IPIN 
          <year>2024</year>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>I.</given-names>
             Vallivaara et al.,
            <surname>“Magnetometer Calibration During</surname>
          </string-name>
          <string-name>
            <surname>SLAM</surname>
          </string-name>
          ,” in IPIN 
          <year>2024</year>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J. </given-names>
            <surname>Smith</surname>
          </string-name>
          and
          <string-name>
            <surname>A.</surname>
          </string-name>
           Lee, “Sensor Heterogeneity in Mobile Devices,” Sensors,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>M.</surname>
          </string-name>
           Jones, “Privacy and Consent in Crowdsourced Data,” Sci. Direc.,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>K.</surname>
          </string-name>
           Nguyen, “Manual Workflows in Sensor Data Collection,” MDPI,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>P.</given-names>
             
            <surname>Barsocchi</surname>
          </string-name>
          et al., “Challenges in Crowdsourced Indoor Positioning,” Springer,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>L.</given-names>
             
            <surname>Zhao</surname>
          </string-name>
          et al., “Reproducibility Barriers in Mobile Sensing,”
          <string-name>
            <surname>Eurasip</surname>
            <given-names>J. Adv. Signal</given-names>
          </string-name>
          <string-name>
            <surname>Process</surname>
          </string-name>
          .,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>R.</surname>
          </string-name>
           Yonetani et al., “RetailOpt:
          <article-title>Opt‑in Easy‑to‑Deploy Trajectory Estimation from Smartphone Motion Data</article-title>
          and Retail Facility Information,” in ACM ISWC '
          <volume>24</volume>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>N.</given-names>
             
            <surname>Loecher</surname>
          </string-name>
          et al.,
          <article-title>“Assessing the Eficacy of a Self-Stigma Reduction Mental Health Program with Mobile Biometrics,”</article-title>
          <source>in IEEE FG '23</source>
          ,
          <year>2023</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>A.</surname>
          </string-name>
           J. Etienne et al.,
          <article-title>“Testing the Feasibility of Commercial Wearables in Agricultural Incident Detection</article-title>
          ,” J. 
          <source>Agric. Safety Health</source>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>T.</surname>
          </string-name>
           Choi, “Awesome Sensor Logger,” GitHub repository,
          <year>2025</year>
          . [Online]. Available: https://github. com/tszheichoi/awesome-sensor-logger.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>O.</given-names>
            <surname>Vallivaara</surname>
          </string-name>
          et al.,
          <article-title>“Comparative Assessment of Multimodal Sensor Data Quality …,”</article-title>
          <string-name>
            <surname>Sensors</surname>
          </string-name>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>S.</given-names>
            <surname>Joo</surname>
          </string-name>
          et al.,
          <article-title>“Ensemble Models for Human Activity Recognition Independent of Device Configuration</article-title>
          ,” Springer,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>AWARE</given-names>
            <surname>Framework</surname>
          </string-name>
          , “
          <string-name>
            <surname>Use</surname>
            <given-names>AWARE</given-names>
          </string-name>
          ,”
          <year>2025</year>
          . [Online]. Available: https://awareframework.com/ use-aware/.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>B.</given-names>
            <surname>Struminskaya</surname>
          </string-name>
          et al.,
          <article-title>“Augmenting Surveys With Data From Sensors and Apps: Opportunities</article-title>
          and Challenges,”
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>M. K. Boehnke</surname>
          </string-name>
          et al.,
          <source>“Collecting and Analyzing Smartphone Sensor Data for Health,” ACM</source>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>R.</given-names>
            <surname>Harari</surname>
          </string-name>
          et al.,
          <article-title>“Using Smartphones to Collect Behavioral Data in Psychological Studies,” Psychol</article-title>
          . Methods,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Wikipedia</surname>
          </string-name>
          , “Crowdsensing,”
          <year>2024</year>
          . [Online]. Available: https://en.wikipedia.org/wiki/ Crowdsensing.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>C.</given-names>
            <surname>West</surname>
          </string-name>
          et al.,
          <article-title>“Sharing Data Collected with Smartphone Sensors: Willingness, Participation, and Nonparticipation Bias,”</article-title>
          <source>Res. Methods</source>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>N.</given-names>
            <surname>Harari</surname>
          </string-name>
          et al.,
          <article-title>“Lessons from First‐Generation Smartphone Sensing Studies,”</article-title>
          <source>Social Sci. Comput. Rev.</source>
          ,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>J. D.</given-names>
            <surname>Biersdorfer</surname>
          </string-name>
          , “Google Drive Privacy,” New York Times,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>B.</given-names>
            <surname>Struminskaya</surname>
          </string-name>
          et al.,
          <article-title>“Augmenting Surveys With Data From Sensors and Apps: Opportunities and Challenges,”</article-title>
          in
          <source>Social Science Computer Review</source>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>C.</given-names>
            <surname>West</surname>
          </string-name>
          et al.,
          <article-title>“Sharing Data Collected with Smartphone Sensors: Willingness, Participation, and Nonparticipation Bias,”</article-title>
          in
          <source>Research Methods</source>
          ,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>G. M.</given-names>
            <surname>Harari</surname>
          </string-name>
          et al.,
          <article-title>“Lessons From First‐Generation Smartphone Sensing Studies,”</article-title>
          in
          <source>Social Science Computer Review</source>
          ,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <surname>Cisco</surname>
          </string-name>
          , “What Is SaaS?,”
          <year>2025</year>
          . [Online]. Available: https://www.cisco.com/c/en/us/products/software/what-is-software-as-a-service-saas.html
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>J.</given-names>
            <surname>Torous</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Kiang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lorme</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.-P.</given-names>
            <surname>Onnela</surname>
          </string-name>
          , “
          <article-title>The Beiwe platform: a high-throughput smartphone-based digital phenotyping tool,”</article-title>
          <source>JMIR Ment. Health</source>
          , vol.
          <volume>3</volume>
          , no.
          <issue>1</issue>
          , p.
          <fpage>e6434</fpage>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>C.</given-names>
            <surname>Perera</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Jayaraman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Zaslavsky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Christen</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Georgakopoulos</surname>
          </string-name>
          , “
          <article-title>SensingKit: Evaluating the Sensor Power Consumption in iOS Devices,”</article-title>
          <source>in Proc. IEEE Int. Conf. Intell. Sensors, Sensor Networks Inf. Process. (ISSNIP)</source>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>M.</given-names>
            <surname>Van Velthoven</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Mastellos</surname>
          </string-name>
          , and
          <string-name>
            <given-names>A.</given-names>
            <surname>Majeed</surname>
          </string-name>
          , “
          <article-title>RADAR-base: Open Source Mobile Health Platform for Collecting, Monitoring, and Analyzing Data Using Sensors, Wearables, and Mobile Devices,”</article-title>
          <source>J. Med. Internet Res.</source>
          , vol.
          <volume>23</volume>
          , no.
          <issue>7</issue>
          , p.
          <fpage>e21830</fpage>
          ,
          <year>2021</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>J.</given-names>
            <surname>Blanco</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Garcia</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Perez</surname>
          </string-name>
          , “
          <article-title>GetSensorData: A Smartphone Sensor Data Acquisition Tool for Indoor Positioning,”</article-title>
          <source>Pervasive and Mobile Computing</source>
          , vol.
          <volume>77</volume>
          ,
          <year>2022</year>
          . [Online]. Available: https://www.sciencedirect.com/science/article/pii/S2352711022001121
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>K. T. H.</given-names>
            <surname>Choi</surname>
          </string-name>
          , “Sensor Logger BLE Decoder,” GitHub repository,
          <year>2025</year>
          . [Online]. Available: https://github.com/tszheichoi/sensor-ble
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>