<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>November</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Development and Validation of an Instrument Assessing Physics Teachers' Beliefs, Competence, and Attitudes in Smartphone-Based Physics Experiments*</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Valentyna Pleskach</string-name>
          <email>v.pleskach64@gmail.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Frank Angelo Pacala</string-name>
          <email>frankpacala@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mary Jane Cinco</string-name>
          <email>maryjane.cinco@ssu.edu.ph</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Rezy Mendaño</string-name>
          <email>rezy.medano@ssu.edu.ph</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Edelyn Echapare</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>College of Arts and Sciences, Samar State University</institution>
          ,
          <addr-line>6700, Catbalogan City</addr-line>
          ,
          <country country="PH">Philippines</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Science and Mathematics Education Department, University of San Carlos</institution>
          ,
          <addr-line>Cebu City, 6000</addr-line>
          ,
          <country country="PH">Philippines</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Taras Shevchenko National University of Kyiv</institution>
          ,
          <addr-line>Bohdan Hawrylyshyn str. 24, Kyiv, 04116</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>2</volume>
      <fpage>0</fpage>
      <lpage>21</lpage>
      <abstract>
        <p>Much scientific research in physics education has been dedicated to using smartphone sensors in physics experiments. However, a dependable and valid instrument for evaluating physics teachers' beliefs, skills, knowledge, and perspectives regarding physics experiments that employ smartphone technology has been lacking. This research aims to develop a robust survey instrument that accurately measures these various dimensions of physics teachers' engagement with smartphone-based experiments. The instrument development utilized the Technology Acceptance Model (TAM) and the Technological Pedagogical Content Knowledge (TPACK) frameworks. Initially, 72 statements were developed and divided into four sections. The instrument was subjected to initial expert review and was cut down to 63 items. Then, the instrument was piloted in a high school with six science teachers (N = 6), and their comments were used to revise the instrument. The final administration was conducted in the Catbalogan City Division and Samar Division with 87 teachers (N = 87) as total participants. The data gathered were used to subject the instrument to content validity, internal consistency, and construct validity analyses. During the content validation, 20 statements were deemed appropriate by the panel of five experts, 26 were omitted, and 17 were revised. The questionnaire shows strong internal consistency and reliability, with an overall Cronbach's alpha of 0.951. Knowledge and competence statements were highly correlated under Factor 1, which had a high eigenvalue; these two sections were merged. The study has produced 34 valid and reliable statements to assess physics teachers' beliefs, competence, and attitudes toward smartphone-based physics experiments.</p>
      </abstract>
      <kwd-group>
        <kwd>Smartphone learning</kwd>
        <kwd>exploratory factor analysis</kwd>
        <kwd>content validity index</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>Incorporating smartphone technology into education is not just a trend; it is a potential
game-changer. It provides new ways to improve teaching and learning experiences. In physics education,
using smartphones for experiments offers a flexible and easy way to conduct hands-on activities,
promote interactive learning, and improve conceptual comprehension. This approach especially
benefits teachers looking to incorporate modern, technology-based techniques. However, the
effective use of smartphone-based physics experiments relies on teachers' attitudes, expertise, and
confidence in using these technologies, which can significantly impact the quality of physics
education.</p>
      <p>
        Studies on smartphone-based physics experiments are mostly related to the use of smartphone
sensors for physics experiments, for example, the review by Sirisathitkul and Sirisathitkul [1]. Some
literature also focused on determining the effect of smartphone-related physics experiments on
students' skills, for instance, the studies of Colt et al.
[2] and Kaps et al. [
        <xref ref-type="bibr" rid="ref27">3</xref>
        ]. The research of Lahme et al. [4] shed new light on the understudied physics
teachers' beliefs, competence, knowledge, and attitudes toward smartphone-based physics
experiments. They found that lab instructors generally had a favorable outlook on utilizing digital
technologies in physics laboratory classes, mainly because of their ability to conduct experiments
and enhance students' skills, motivation, and relevance. While developing digital competencies was
considered less crucial than traditional learning goals, the ability to collect and analyze data using
digital tools was identified as an essential skill for students to attain. On top of this, Seifert [5]
revealed that college instructors were amazed by the capabilities of technology. They viewed
augmented reality in smartphones as mesmerizing and recognized its potential, but they felt that
additional training was necessary before they could use it effectively. They were pragmatic and
utilized the workshop to strategize projects across different subjects that could leverage the benefits
of mobile technologies.
      </p>
      <p>Studies on physics teachers' beliefs, competence, knowledge, and attitudes toward
smartphone-based physics experiments are scarce. Understanding the factors influencing teachers' adoption and
effective use of smartphone-based experiments is necessary. The lack of comprehensive tools for
systematically measuring these multidimensional aspects is a gap that needs to be filled. Considering
the growing incorporation of digital resources in the educational realm, it is crucial to comprehend
educators' viewpoints and skills to facilitate the successful implementation and use of these tools. A
reliable and valid tool to capture these variables is essential. The study of Lahme et al. [4] did not focus
on the level of interest and the perception of self-efficacy. A valid and reliable instrument could offer a systematic and dependable
approach to evaluating physics teachers' beliefs, competence, knowledge, and attitudes toward
smartphone-based physics experiments, providing valuable information that can guide the
development of professional training initiatives and policy-making. The current study aims to bridge
this gap by developing and validating a survey instrument to assess physics teachers' beliefs,
knowledge, competence, and attitudes toward smartphone-based physics experiments.</p>
      <p>An instrument that measures physics teachers' beliefs, competence, knowledge, and attitudes
toward smartphone-based physics experiments is crucial for providing customized support and
training for teachers. It helps identify areas where teachers excel and where they need more
assistance, allowing educational leaders to create tailored professional development programs. One
prior study used teachers' perception of self-efficacy as a basis for creating a
continuing professional development (CPD) program. According to Ahmed [7], academic institutions
that have a deeper grasp of how students decide to adopt and utilize a particular technology and the
reasons behind the acceptance and use of mobile learning will be better equipped to implement
effective and original technology solutions. This approach improves physics education by enhancing
teachers' skills and knowledge and addressing any barriers to using smartphone-based experiments
effectively.</p>
      <p>Additionally, a standardized tool on physics teachers' beliefs, competence, knowledge, and
attitudes toward smartphone-based physics experiments adds to the extensive research on
integrating educational technology. Ahmed [7] argued that more comprehensive and systematic data
on mobile learning is needed to capture the full potential of smartphone-based experiments. Hence,
this present undertaking is a novel task that ventures into assessing physics teachers' beliefs,
competence, knowledge, and attitudes toward smartphone-based physics experiments. This survey's
dependable and valid data can help identify patterns, connections, and effects of smartphone-based
experiments on educational outcomes. This information can guide future research and advancements
in educational technology, promoting ongoing enhancements and adjustments to teaching methods
to address the changing needs of modern classrooms.</p>
      <p>This research aims to develop a robust survey tool that accurately measures the various
dimensions of physics teachers' engagement with smartphone-based experiments. The study
evaluated the survey's internal consistency, construct validity, and reliability. This study seeks to
provide a valuable resource for researchers and educators looking to enhance physics education
through innovative technological integration by ensuring that the survey instrument is valid and
reliable. The research questions are: What is the reliability of the instrument on physics teachers' beliefs,
competence, knowledge, and attitudes toward smartphone-based physics experiments?
How many factors can this instrument measure based on the exploratory factor analysis?</p>
    </sec>
    <sec id="sec-3">
      <title>2. Methodology</title>
    </sec>
    <sec id="sec-4">
      <title>2.1 Instrument Development</title>
      <p>The formulation of the instrument on physics teachers' beliefs, competence, knowledge, and attitudes
toward smartphone-based physics experiments was informed by both the Technology Acceptance
Model (TAM) and the Technological Pedagogical Content Knowledge (TPACK) framework, thereby
ensuring that robust theoretical principles underpinned the questionnaire. The TAM is a commonly
utilized framework for comprehending user adoption and utilization of technology. Moslehpour et
al. [8] explained that it encompasses fundamental constructs such as perceived usefulness (PU) and
perceived ease of use (PEOU), which impact users' perspectives toward technology and their
inclination to use it. The TPACK framework underscores the amalgamation of technological
knowledge (TK), pedagogical knowledge (PK), and content knowledge (CK) for the proficient use of
technology in teaching [9].</p>
      <p>Moreover, the teachers' beliefs included their overall attitudes towards smartphone-based physics
experiments, indicating their inclination and preparedness to incorporate them into their teaching.
This concept was founded on the TAM, highlighting the significance of attitudes in shaping
technology adoption.</p>
      <p>Competence was broadly defined to encompass various forms of knowledge, including
technological knowledge (TK), pedagogical knowledge (PK), content knowledge (CK), and the
integrated knowledge represented by TPACK. Technological knowledge refers to teachers' expertise
in smartphone-based physics experiments, while pedagogical knowledge pertains to their
comprehension of effective teaching methods. Content knowledge involves their understanding of
physics concepts, essential for crafting and executing relevant experiments. The TPACK construct
addresses the convergence of these knowledge domains, illustrating teachers' capacity to seamlessly
integrate technology into their teaching to improve student learning outcomes.</p>
      <p>Knowledge was categorized into two main areas: subject-specific and technological domains,
acknowledging teachers' need to be knowledgeable in both the content they teach and the
technological tools they utilize. This dual emphasis ensures teachers can proficiently utilize
smartphone technology to enhance their physics instruction. Attitudes stemming from the TAM
were assessed through perceived usefulness (PU) and perceived ease of use (PEOU). PU reflects
teachers' perspectives on the benefits of smartphone-based experiments for improving teaching and
learning. At the same time, PEOU evaluates their perceptions of the ease of implementing these
experiments in their classrooms. Together, these elements offered a comprehensive framework for
comprehending the various facets of physics teachers' involvement with smartphone-based
experiments, guiding the development of a reliable and comprehensive survey instrument.</p>
      <p>Table 1 is an example of a questionnaire statement and its corresponding construct. Initially, 72
statements were made. The survey statements were matched with corresponding theoretical
constructs based on the definitions in the Technology Acceptance Model (TAM) and the
Technological Pedagogical Content Knowledge (TPACK) framework. Each construct was matched
with a questionnaire statement that represented its core concept. Table 1 methodically aligns each
survey item with its corresponding construct, ensuring comprehensive coverage of teachers' beliefs,
competence, knowledge, and attitudes toward smartphone-based physics experiments.</p>
      <p>The total number of items in Table 2 is the final number of statements per construct after the
validity and reliability measures. Table 2 illustrates how the questionnaire items are distributed
across different constructs, revealing the focus on various aspects of teachers' involvement with
smartphone-based physics experiments. The construct with the highest number of items is PU, with
16 items. This emphasis is justified by the critical need to understand teachers' perceptions of the
effectiveness and benefits of using smartphones in physics education. Teachers' beliefs about the
usefulness of this technology are crucial for its acceptance and integration into teaching practices,
hence the larger number of items aimed at capturing these beliefs.</p>
      <p>PEOU and TK contain five items. PEOU is important because the easier it is for teachers to use
smartphone-based experiments, the more likely they are to adopt and consistently use them.
Similarly, Technological Knowledge is vital because teachers need to be skilled in using smartphone
technologies and applications to implement these experiments effectively. TPACK comprises eight
items, reflecting the significance of integrated knowledge for effectively combining technology,
pedagogy, and content. TPACK is a comprehensive construct that captures the intersection of
different knowledge areas, crucial for successful technology integration in teaching.</p>
      <p>PK and CK have fewer items, 2 and 1, respectively. While these constructs are important, they
represent more general and foundational aspects of teaching that may not require as many specific
items in a survey focusing on smartphone-based experiments. PK involves general teaching
strategies, and CK involves subject-specific knowledge; both are critical but might be less dynamic
in integrating new technology than PU and TPACK.</p>
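      <p>To make the distribution above concrete, the per-construct counts can be tabulated and summed. This is our own illustrative sketch; the construct names and counts are taken from the description of Table 2 in the text.</p>
      <preformat>
```python
# Per-construct item counts as reported for Table 2; the dict and the check
# are an illustrative sketch, not part of the original instrument.
item_counts = {
    "PU": 16,    # perceived usefulness
    "PEOU": 5,   # perceived ease of use
    "TK": 5,     # technological knowledge
    "TPACK": 8,  # integrated TPACK knowledge
    "PK": 2,     # pedagogical knowledge
    "CK": 1,     # content knowledge
}

total_items = sum(item_counts.values())
print(total_items)  # 37
```
      </preformat>
      <p>The counts sum to 37, matching the pool of items that entered the internal consistency analysis and EFA before three attitude items were later dropped.</p>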
      <p>Overall, 34 items remained after the validation and reliability analysis: 10 in Section 1 (beliefs), nine
in Section 2 (competence), eight in Section 3 (knowledge), and 10 in Section 4 (attitude). However, in the final instrument,
knowledge and competence were merged.</p>
    </sec>
    <sec id="sec-5">
      <title>2.2 Participants and Data Collection Procedure</title>
      <p>When the 72 items were assembled, they were sent to three-panel experts. These experts have
experience formulating and validating questionnaires. Upon initial review, the items were trimmed
down to 63.</p>
      <p>The revised instrument was then sent to a panel of five experts for further review (N = 5). The
five-member panel comprises an educational technologist, a physics education expert, a psychometrician,
a high school physics teacher, and an educational researcher. The educational technologist is skilled
in integrating technology into teaching and offers valuable insights into the practical application and
relevance of PU and PEOU-related items. The physics education expert has a strong background in
subject matter and innovative teaching methods. The psychometrician or survey methodologist has
a proven track record in developing and validating educational assessments, which was invaluable
for assessing the overall structure and wording of the items. Furthermore, a high school physics
teacher who regularly uses technology in their classroom can provide practical, real-world insights.
Lastly, an educational researcher specializing in teacher professional development and continuous
professional development programs can offer a comprehensive perspective. This time, their marks
and remarks on the questionnaire were used to validate the content validity. The content validity
index was used to determine the instrument's content validity. A more detailed discussion of this
content validity is in the analysis section.</p>
      <p>After the content validation, the items were piloted in a high school in Catbalogan City Division,
Philippines. Six science teachers participated (N = 6). They had at least two years of experience
teaching physics. These teachers provided comments at the end of the questionnaire, and their suggestions were used to revise the instrument.</p>
      <p>The final administration occurred in the Catbalogan City Division and Samar Division. Eighty-seven
teachers (N = 87) participated in this final administration and were chosen using a convenience
sampling technique. This strategy entails selecting participants based on their availability and ease
of access. When distributing the instrument to online teachers, the researcher targeted those who
are readily available or have voluntarily agreed to participate instead of randomly selecting from the
entire pool of physics teachers. A challenge for this study was locating the teachers; friends,
colleagues, and acquaintances helped the researcher find science teachers via online platforms.</p>
      <p>The instrument was encoded into a Google form and sent to the teachers via Facebook
Messenger. The first part of the instrument asked the participants to read and agree with the
instructions, confidentiality and voluntary clauses, an informed consent form, and a data security
form. Once they clicked "Yes," they agreed to join the study voluntarily.</p>
    </sec>
    <sec id="sec-6">
      <title>2.3 Data Analysis</title>
      <p>The validity and reliability of the instrument, called physics teachers' beliefs, competence,
knowledge, and attitudes toward smartphone-based physics experiments, were assessed through
several methods. Content validity was measured through the content validity index (CVI), and
construct validity was evaluated through exploratory factor analysis (EFA). Meanwhile, the internal consistency was measured using Cronbach's alpha.</p>
      <p>
        CVI was employed to ascertain content validity. CVI is a tool used to evaluate the relevance of
items in a questionnaire or test. According to Jeldres et al. [
        <xref ref-type="bibr" rid="ref17 ref2 ref22">10</xref>
        ], CVI assesses how well the items on
an instrument represent the construct being studied based on the experts' evaluation. It
quantitatively evaluates content validity, guaranteeing that the instrument effectively covers the
intended content domain. A group of experts assesses the relevance of each item using a 4-point
scale. A higher CVI score indicates strong content validity, suggesting that the items effectively
capture the intended construct. The tool utilized was modified from Waltz and Bausell [11]. It
comprises four categories: relevance, clarity, simplicity, and ambiguity. The expert assigned a rating
ranging from 1 to 4 for each category; for the relevance category, a rating of 1 meant not
relevant and 4 meant highly relevant. An item-level CVI was then computed for each of the four categories:
relevance, clarity, simplicity, and ambiguity. This involved summing up the ratings provided by all
five experts and then dividing by the total number of experts. The overall CVI was then derived from
the average of these four sections.
      </p>
      <p>Another method to ensure the study's instrument validity is the EFA. Exploratory Factor
Analysis is utilized in validating instruments to reveal the fundamental structure of data by
identifying the hidden factors that elucidate the correlations among observed variables [12]. It aids
in enhancing the instrument's precision by determining which items are related, thus confirming the
construct and improving the instrument's trustworthiness and validity.</p>
      <p>EFA provides valuable insights into the construct being measured, helping researchers refine
their measurement tools, develop theories, and guide future research efforts. The EFA and
reliability analyses were carried out in open-source statistical software, while the
descriptive statistics, such as mean and standard deviation, were obtained from Microsoft Excel.</p>
    </sec>
    <sec id="sec-7">
      <title>3. Results and Discussion</title>
    </sec>
    <sec id="sec-8">
      <title>3.1 Content Validity</title>
      <p>When evaluating the questionnaire's CVI with input from five experts, the researcher employed a
methodical approach to ensure the questionnaire effectively covered the intended content domain.
The researcher computed each item's CVI. The CVI is derived from the proportion of experts who
provide a rating of 3 or 4 for the item. For instance, if all five experts rate an item as 3 or 4, the CVI
for that item would be 1.0, demonstrating unanimous agreement on its relevance. Conversely, if only
four out of five experts rate it as 3 or 4, the CVI would be 0.8. This approach ensures that each item
receives an individual assessment for its relevance from the expert panel.</p>
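      <p>The item-level CVI computation described above is simple arithmetic; a minimal sketch follows (the function name is ours):</p>
      <preformat>
```python
def item_cvi(ratings):
    """Item-level CVI: proportion of experts rating the item 3 or 4
    on the 4-point relevance scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

print(item_cvi([4, 3, 4, 4, 3]))  # 1.0: unanimous agreement among five experts
print(item_cvi([4, 3, 3, 4, 2]))  # 0.8: four of five experts rate 3 or 4
```
      </preformat>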
      <p>Table 3 illustrates how the five experts rated the questionnaire items based on four criteria:
relevance, clarity, simplicity, and ambiguity. Most items received score ratings between 3.00 and 3.99
for relevance, clarity, and simplicity, indicating generally positive but not perfect scores. However,
a notable number of items scored low for ambiguity (1.00-1.99), highlighting potential issues with
clarity or confusion. Additionally, items scoring between 2.00 and 2.99 for simplicity suggest areas
that could benefit from further simplification to improve overall comprehension and effectiveness.</p>
      <p>Haron et al. [13] devised a strategy to analyze the CVI value. According to them, an item is
deemed appropriate if its CVI is higher than 0.79. If its CVI value falls between 0.70 and 0.79, it
requires revision. Items with a CVI less than 0.70 should be omitted.
Table 4 presents each questionnaire item's CVI and expert interpretation based on relevance and
ambiguity. Each item's relevance and ambiguity were rated on a scale from 1 to 4, and the CVI values
were calculated accordingly. Most items have a high CVI (0.80 or 1.00), indicating that experts
generally agree on the appropriateness and clarity of these items. However, one item (item 3) has a
lower CVI of 0.60 for relevance, suggesting that it may not be suitable and was omitted. The
consistently high CVI scores for ambiguity demonstrate that the items are generally clear and not
confusing to the experts. Of the 63 items sent for content validation, 26 were omitted, and 17
were revised. Only 37 items were subjected to the internal consistency analysis and EFA.</p>
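      <p>The Haron et al. [13] decision rule can be expressed directly; this sketch (names ours) mirrors the thresholds described above:</p>
      <preformat>
```python
def cvi_decision(cvi):
    """Decision rule from Haron et al. [13] as summarized in the text:
    appropriate above 0.79, revise in 0.70-0.79, omit below 0.70."""
    if cvi > 0.79:
        return "appropriate"
    if cvi >= 0.70:
        return "revise"
    return "omit"

print(cvi_decision(1.00))  # appropriate
print(cvi_decision(0.75))  # revise
print(cvi_decision(0.60))  # omit, as happened with item 3
```
      </preformat>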
    </sec>
    <sec id="sec-9">
      <title>3.2 Reliability of the Instrument</title>
      <p>The section alpha values were high (the largest being 0.976), indicating excellent reliability. Although the attitudes section has an
alpha slightly lower than the other sections, it still shows good
consistency and reliability for the questionnaire.</p>
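      <p>Cronbach's alpha, the internal consistency statistic used here, can be computed from a respondents-by-items score table. The following is a minimal standard-library sketch with made-up toy data, not the study's responses:</p>
      <preformat>
```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for rows of respondent scores across k items:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(rows[0])
    columns = list(zip(*rows))
    item_variance = sum(variance(col) for col in columns)
    total_variance = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_variance / total_variance)

# Toy data: four respondents answering three Likert items (illustrative only).
toy = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
print(round(cronbach_alpha(toy), 2))  # 0.98 for this highly consistent toy set
```
      </preformat>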
    </sec>
    <sec id="sec-10">
      <title>3.3 Construct Validity</title>
      <p>Two assumptions should be checked before an EFA is conducted: the Kaiser-Meyer-Olkin
(KMO) test and Bartlett's test of sphericity. The KMO test should yield a Measure of Sampling Adequacy
(MSA) value greater than 0.500, and Bartlett's sphericity test should have a p-value less than
the 0.05 significance level [15]. The EFA of this present questionnaire yielded an MSA of
0.770 (MSA = 0.770), above the threshold of 0.500 set by Almeida et al. [15]. Bartlett's sphericity
test found a p-value less than 0.001 (p &lt; 0.001), lower than the 0.05 significance level. Since
both assumptions were met, the EFA could proceed.</p>
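      <p>For reference, Bartlett's sphericity statistic is chi-square distributed with p(p-1)/2 degrees of freedom. A minimal sketch of the statistic follows; the two-item example and the assumed correlation r = 0.5 are ours, not the study's data:</p>
      <preformat>
```python
import math

def bartlett_chi2(n, p, det_r):
    """Bartlett's sphericity statistic: -(n - 1 - (2p + 5)/6) * ln|R|,
    where |R| is the determinant of the correlation matrix."""
    return -(n - 1 - (2 * p + 5) / 6) * math.log(det_r)

# For p = 2 items, |R| = 1 - r**2. With N = 87 and an assumed r = 0.5:
chi2 = bartlett_chi2(87, 2, 1 - 0.5 ** 2)
print(round(chi2, 2))  # well above the df = 1 critical value of 3.84
```
      </preformat>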
      <p>Table 6. Eigenvalues and Explained Variance for Unrotated and Rotated Solutions in Factor Analysis</p>
      <p>Factor | Eigenvalue | Unrotated Sum Sq Loadings | Unrotated Proportion Variance | Rotated Sum Sq Loadings | Rotated Proportion Variance
1 | 16.244 | 16.027 | 0.433 | 13.211 | 0.357
2 | 7.882 | 7.263 | 0.206 | 7.166 | 0.194
3 | 2.978 | 2.684 | 0.073 | 5.957 | 0.161</p>
      <p>
The data from Table 6 shows the variance explained by factors in a factor analysis before and
after rotation. This rotation process simplifies and clarifies the factor structure, making it easier to
interpret the underlying constructs measured by the factors by distributing the variance more evenly
across them. The eigenvalues indicate each factor's variance, with higher values representing a larger
contribution. In the unrotated solution, Factor 1 has an eigenvalue of 16.027, explaining 43.3% of the
variance, while Factor 2 and Factor 3 have eigenvalues of 7.263 (20.6% variance explained) and 2.684
(7.3% variance explained), respectively. Post-rotation, the eigenvalue for Factor 1 decreases to 13.211,
explaining 35.7% of the variance, signifying a redistribution of the explained variance. Factor 2
experiences minimal change, with its eigenvalue decreasing slightly to 7.166 (19.4% variance
explained). Conversely, Factor 3 demonstrates a notable increase in its eigenvalue to 5.957 (16.1%
variance explained).</p>
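      <p>The proportions in Table 6 are each factor's sum of squared loadings divided by the 37 items analyzed. This check is our own; the sum-of-squared-loadings values are taken from the table:</p>
      <preformat>
```python
# Sum-of-squared-loadings values from Table 6; dividing by the 37 analyzed
# items reproduces the reported proportions of variance.
n_items = 37
ss_loadings = {
    "Factor 1, unrotated": 16.027,
    "Factor 3, unrotated": 2.684,
    "Factor 1, rotated": 13.211,
    "Factor 2, rotated": 7.166,
}

proportions = {name: round(ss / n_items, 3) for name, ss in ss_loadings.items()}
print(proportions)
```
      </preformat>
      <p>Running the check yields 0.433 and 0.073 for the unrotated Factors 1 and 3, and 0.357 and 0.194 for the rotated Factors 1 and 2, matching the reported proportions.</p>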
      <p>The scree plot in Figure 1 affirms the three factors identified in the exploratory factor
analysis, both before and after rotation. Therefore, the analysis can only measure three factors, with Factor 1 explaining the largest share of the variance.
The factor loading cutoff used in this study is 0.60 since the number of participants is 87. This process
is based on Hair et al. [16]. The data in Table 7 displays the factor loadings and uniqueness values
for different statements in factor analysis. The statements primarily load onto one of three distinct
factors, illustrating the main dimension they represent. For instance, items KS5 to CS2 are mostly
associated with Factor 1, with strong loadings (e.g., KS5 with a loading of 0.988) and low uniqueness
values, indicating their strong representation of Factor 1. BS3 to BS2 mostly load onto Factor 2,
demonstrating significant loadings (e.g., BS3 with a loading of 0.933) and moderate uniqueness
values, suggesting a good fit for Factor 2. Likewise, items AS2 to AS1 load onto Factor 3 with high
loadings (e.g., AS2 with a loading of 0.873), while AS6 to AS8 exhibit high uniqueness values,
implying they might not align well with any of the three factors. The statements AS6, AS7, and AS8
were omitted.</p>
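      <p>The retention rule described above, keeping an item only if its strongest loading reaches the 0.60 cutoff, can be sketched as follows. The KS5, BS3, and AS2 loadings come from the text; the AS6 value is a hypothetical sub-threshold loading for illustration, not a figure from the paper:</p>
      <preformat>
```python
# Loadings for a few items; the AS6 value is an assumed, hypothetical
# sub-threshold loading used only to illustrate the rule.
loadings = {"KS5": 0.988, "BS3": 0.933, "AS2": 0.873, "AS6": 0.41}
CUTOFF = 0.60  # per Hair et al. [16] for a sample of 87 participants

retained = [item for item, loading in loadings.items() if loading >= CUTOFF]
print(retained)  # ['KS5', 'BS3', 'AS2']; AS6 falls below the cutoff
```
      </preformat>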
      <p>Notice that only three factors were recognized by the EFA, and four questionnaire sections exist.
The statements coming from knowledge and competence merge into one factor only. This merging
only means that competence and knowledge are highly correlated. The close relationship between
the assessment items for these two attributes may indicate that participants see competence and
knowledge as closely linked or that the measurement items are conceptually alike. According to
Taniredja and Abduh [17], competencies encompass a complex blend of knowledge, skills,
understanding, values, and affective attitudes demonstrated through actions in specific situations.
Bekere and Tekerel [18] found that knowledge is part of competence, attitude, awareness, and skills.
However, this study found that attitude is another factor. Therefore, in the final instrument, both
statements of knowledge and competencies were merged. The final number of statements in the
questionnaire is 34. The reliance on EFA alone is a
limitation of this research. It may be necessary to conduct additional analysis, such as confirmatory
factor analysis (CFA) or qualitative assessment, to gain a more comprehensive understanding of the
framework of the instrument and enhance its measurement characteristics.</p>
      <p>The main implication of this valid and reliable instrument is that it can now be used for a
continuing professional development (CPD) program. Effective professional development programs
can benefit from thoroughly assessing physics teachers' beliefs, competence, and attitudes toward
smartphone-based physics experiments. By pinpointing areas where teachers may lack confidence
or harbor negative attitudes, the program could customize its content to address these gaps and
bolster teachers' skills and confidence in utilizing smartphone technology in their teaching.
Additionally, Karabenick and Noda [19] argued that grasping teachers' beliefs can aid in aligning the
program with their values and teaching philosophies, promoting greater engagement and
applicability. Ultimately, targeted professional development initiatives can lead to enhanced
instructional methods and more advanced, technology-integrated physics education [20].</p>
    </sec>
    <sec id="sec-11">
      <title>4. Conclusion</title>
      <p>The development and validation of a new instrument specifically targeting the use of smartphones
in physics teaching may offer original evaluation criteria for future research. Such instruments could
cover general pedagogical excellence and specialized indicators of effective smartphone use.
Smartphones provide opportunities for conducting experiments using built-in sensors
(accelerometers, gyroscopes, etc.). Assessing teachers' ability and willingness to use such features
could become a new area of pedagogical research, emphasizing the importance of digital literacy in
physics teaching. This may offer new approaches to studying how teachers perceive smartphones in
education and their impact on the effectiveness of physics teaching, and may include an assessment of
the barriers and motivations that affect integrating such methods into the learning process. The
considered approach with smartphones may consist of new dimensions of professional evaluation of
the skills related explicitly to using digital tools in physics teaching.</p>
      <p>Much empirical research in physics education has focused on using smartphone sensors in
physics experiments. This phenomenon suggests the increasing importance of smartphone sensors
in classroom physics experiments. Despite this emergence, a reliable and valid tool to examine
physics teachers' beliefs, competence, knowledge, and attitudes toward smartphone-based physics
experiments has not been found. This gap is the focus of this research. The first step was to create
the questionnaire using the TAM and TPACK theories. The instrument was subjected to initial expert
review, pilot study, and final administration.</p>
      <p>The data from the final administration was collected for content validity, construct validity, and
internal consistency. The CVI is derived from the proportion of experts who provide a rating of 3 or
4 for the item. Most items have a high CVI (0.80 or 1.00), indicating that experts generally agree on
their appropriateness and clarity. Of the 63 items during the final administration, 26 were omitted,
and 17 were revised. Only 37 items were subjected to the internal consistency analysis and EFA.</p>
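      <p>As an illustration, the item-level CVI described above can be computed directly from an expert-rating matrix. The sketch below is a minimal example using hypothetical ratings on a 4-point relevance scale, not the study's data.</p>

```python
# Item-level content validity index (I-CVI): the proportion of experts
# who rate an item 3 or 4 on a 4-point relevance scale.

def item_cvi(ratings):
    """ratings: list of integer expert ratings (1-4) for one item."""
    agree = sum(1 for r in ratings if r >= 3)
    return agree / len(ratings)

# Hypothetical panel of five experts rating three items.
items = {
    "item_1": [4, 4, 3, 4, 3],   # all five rate 3 or 4 -> CVI = 1.00
    "item_2": [4, 3, 2, 4, 3],   # four of five agree   -> CVI = 0.80
    "item_3": [2, 3, 2, 4, 1],   # low agreement        -> CVI = 0.40
}

for name, ratings in items.items():
    print(name, round(item_cvi(ratings), 2))
```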
      <p>The alpha values represent the internal consistency measure for each statement of the 37-item
questionnaire. Cronbach's alpha across all sections is 0.951, demonstrating strong internal
consistency and reliability for the questionnaire.</p>
      <p>However, three statements under the attitude section were omitted because their factor loadings
fell below the threshold value of 0.600. Knowledge and competence correlated, as they merged into
factors with a very high eigenvalue; this indicates that they share similar concepts, and, according
to the literature, knowledge is part of competence. Attitude, by contrast, did not correlate with
competence, as widely cited in the literature, and was therefore treated as independent of
competence. Finally, the study developed 34 valid and reliable statements that assess physics
teachers' beliefs, competence, and attitudes toward smartphone-based physics
experiments.</p>
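      <p>The overall alpha reported above follows the standard Cronbach formula, which compares the sum of item variances with the variance of respondents' total scores. The sketch below is a minimal illustration with hypothetical Likert responses, not the study's data.</p>

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """rows: one list of item scores per respondent (equal lengths)."""
    k = len(rows[0])                      # number of items
    cols = list(zip(*rows))               # transpose: one column per item
    item_vars = sum(pvariance(c) for c in cols)
    totals = [sum(row) for row in rows]   # each respondent's total score
    total_var = pvariance(totals)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical Likert responses (4 respondents x 3 items).
data = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
]
print(round(cronbach_alpha(data), 3))  # -> 0.975
```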
    </sec>
    <sec id="sec-12">
      <title>Acknowledgments</title>
      <p>We want to express our profound appreciation to all the teachers who dedicated their time,
knowledge, and personal experiences to help validate our tool. Your involvement was precious, and
your input was crucial in improving this work. Additionally, we sincerely thank the respected team
of experts for their direction, input, and encouragement during this process. Your knowledge and
considered feedback played a crucial role in guaranteeing the thoroughness and applicability of our
tool.</p>
    </sec>
    <sec id="sec-13">
      <title>Declaration of Generative AI</title>
      <p>While preparing this research article, the authors used Grammarly to enhance the paragraphs'
grammar and spelling. Figure 1 was captured using the JASP software. The authors reviewed the
improved text and take full responsibility for the content of the publication.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] … review, International Journal of Electrical and Computer Engineering, 13 (2023), 651-657. doi: 10.11591/ijece.v13i1.pp651-657</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] … activities to real experiments in physics, Romanian Reports in Physics, 72 (2020), 905. URL: https://rrp.nipne.ro/2020/AN72905.pdf</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] … for physics courses at universities, Physics Education, 56 (2021), 035004. doi: 10.1088/1361-6552/abdee2</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] … Physical Review Physics Education Research, 19 (2023), 020159. doi: https://doi.org/10.1103/PhysRevPhysEducRes.19.020159</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] … teachers and pupils' perspectives, International Journal of Mobile and Blended Learning, 7 (2015), 1-16. URL: https://files.eric.ed.gov/fulltext/ED557222.pdf</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] … Education, 13 (2002), 189-220. doi: https://doi.org/10.1023/A:1016517100186</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] … University of Canterbury, New Zealand, 2006. doi: http://dx.doi.org/10.26021/9486</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] … Sustainability, 10 (2018), 234. doi: https://doi.org/10.3390/su10010234</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] … Sustainability, 16 (2024), 978. doi: https://doi.org/10.3390/su16030978</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] … validity in the social sciences, Frontiers in Education, 8 (2023), 1-8. doi: 10.3389/feduc.2023.1271335</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] C.F. Waltz &amp; B.R. Bausell, Nursing Research: Design, Statistics and Computer Analysis, F.A. Davis, Philadelphia, USA, 1981.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] V. Baradaran &amp; E. Ghorbani, Development of fuzzy exploratory factor analysis for designing an e-learning service quality assessment model, International Journal of Fuzzy Systems, 22 (2020), 1772-1785. doi: https://doi.org/10.1007/s40815-020-00901-1</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] S. Haron, A.S. Ariffin &amp; D. Idrus, Validating the development of instrument for measuring nurse's performance scale, Journal of Management Info, 6 (2019), 31-38. doi: 10.31580/jmi.v6i1.495</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] J.T. Shemwell, C.C. Chase &amp; D.L. Schwartz, Seeking the general explanation: A test of inductive activities for learning and transfer, Journal of Research in Science Teaching, 52 (2015), 58-83. doi: https://doi.org/10.1002/tea.21185</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] L.N. Almeida, M. Behlau, N. dos Santos Ramos, I.K. Barbosa &amp; A.A. Almeida, Factor analysis of …, Journal of Voice, 36 (2022), 736.e17. doi: https://doi.org/10.1016/j.jvoice.2020.08.033</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] J.F. Hair, W. Black, B. Babin &amp; R. Anderson, Multivariate Data Analysis, 7th ed., Prentice Hall, Upper Saddle River, New Jersey, 2009. URL: https://files.pearsoned.de/inf/ext/9781292035116</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] T. Taniredja &amp; M. Abduh, Pedagogical, personality, social and professional competence in … (SMPN 3 Purwokerto), Proceedings of the 2nd International Conference on Science, Technology, and Humanity, Arau, Malaysia, 2016, pp. 264-272. URL: http://hdl.handle.net/11617/7485</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] T.A. Bekere &amp; S.Z. Teketel, Validation of intercultural competence scale in the Ethiopian …, Journal of Education and Health Promotion, 13 (2024), 50. doi: 10.4103/jehp.jehp_838_23</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>[19] S.A. Karabenick &amp; P.A. Noda, Professional development implications of teachers' beliefs and attitudes toward English language learners, Bilingual Research Journal, 28 (2004), 55-75. doi: https://doi.org/10.1080/15235882.2004.10162612</mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>[20] E. Hofer &amp; A. Lembens, Putting inquiry-based learning into practice: How teachers changed …, Chemistry Teacher International, 1 (2019), 20180030. doi: https://doi.org/10.1515/cti-2018-0030</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>