<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>November</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Automatic Cluster Detection in Hyperspectral Images using Quantum Walrus Optimizers</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Tulika Dutta</string-name>
          <email>munai.tulika@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Siddhartha Bhattacharyya</string-name>
          <email>dr.siddhartha.bhattacharyya@gmail.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bijaya Ketan Panigrahi</string-name>
          <email>bijayaketanpanigrahi@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jan Platos</string-name>
          <email>jan.platos@vsb.cz</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ivan Zelinka</string-name>
          <email>ivan.zelinka@vsb.cz</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vaclav Snasel</string-name>
          <email>vaclav.snasel@vsb.cz</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Electrical Engineering</institution>
          ,
          <addr-line>IIT Delhi, New Delhi</addr-line>
          ,
          <country country="IN">India</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Electrical Engineering</institution>
          ,
          <addr-line>IIT Delhi, New Delhi</addr-line>
          ,
          <country country="IN">India</country>
          ,
          <institution>and Department of Information Technology, Manipal Institute of Technology Bengaluru, Manipal Academy of Higher Education</institution>
          ,
          <addr-line>Manipal</addr-line>
          ,
          <country country="IN">India</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>VSB Technical University of Ostrava</institution>
          ,
          <addr-line>Ostrava</addr-line>
          ,
          <country country="CZ">Czech Republic</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>2</volume>
      <fpage>8</fpage>
      <lpage>29</lpage>
      <abstract>
        <p>Determining the correct number of clusters in hyperspectral images is not a straightforward procedure, as the images feature complex spectral data, redundancy, and a lack of validation images. Hence, automatic clustering methods are often preferred in practical applications, as they can effectively handle the complexity of spectral data and the absence of validation images. This study introduces two novel algorithms, viz., the Qubit Walrus Optimizer (QbWaO) and the Qutrit Walrus Optimizer (QtWaO), which are inspired by the breeding behavior of walruses and leverage quantum principles to address the limitations of classical optimization techniques. QbWaO and QtWaO are designed to automatically detect clusters in hyperspectral images (HSI) by effectively balancing exploration and exploitation, making them particularly suited for high-dimensional clustering tasks in a real-time environment. The concept of quantum Hadamard gates is used to initialize the population and induce diversity. The optimal clusters are then determined using the Adjusted Rand Index as the fitness function. The F score and the F′ score are used to determine the quality of the grouping. The comparative results suggest that the proposed methods outperform classical approaches in most scenarios, demonstrating their effectiveness in automatically clustering hyperspectral images.</p>
      </abstract>
      <kwd-group>
        <kwd>walrus optimizer</kwd>
        <kwd>qutrit</kwd>
        <kwd>qubit</kwd>
        <kwd>Hadamard gate</kwd>
        <kwd>hyperspectral image</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The emergence of spectral cameras and sensors has significantly improved the ability to collect detailed
information about ground data in various fields, such as agriculture, law enforcement, geography, and
military applications [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Hyperspectral sensors capture data across hundreds of contiguous spectral
bands. This broader spectral and wide area coverage helps represent a combination of multiple materials,
each contributing to the pixel’s overall spectral signature with varying abundance levels. Hyperspectral
imaging (HSI) is a powerful technology, but the vast amount of data in contiguous spectral bands
presents a major computational challenge. Band Selection (BS) techniques are commonly employed as a
preprocessing step to identify unique spectral bands. A key challenge is designing effective criteria for
selecting informative bands while retaining important spectral information. Recently, information theory-based
methods [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] have demonstrated great potential in HSI band selection.
      </p>
      <p>
        One critical task in HSI processing is automatic cluster detection [
        <xref ref-type="bibr" rid="ref17 ref2">2</xref>
        ]. This aims to identify
underlying patterns, classes, or regions without ground-truth images. Traditional clustering algorithms,
such as k-means [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] or fuzzy c-means [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], often struggle with HSI’s high-dimensional, nonlinear
nature. Optimization-based clustering techniques have gained popularity to overcome these challenges.
Nature-inspired metaheuristic optimization techniques, such as Genetic Algorithms [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], Particle Swarm
Optimization [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], and Tabu Search [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], have emerged as powerful tools in HSI clustering. These
methods effectively explore the vast solution space, adapt to HSI data’s complex spectral signatures and
high-dimensional structure, and avoid local minima, thus offering superior performance over traditional
clustering algorithms. The Walrus Optimizer (WaO) [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] is a relatively new optimization algorithm
offering efficient global search capability while balancing exploration and exploitation of the search space.
WaO [
        <xref ref-type="bibr" rid="ref8">8</xref>
] can suffer from slow and premature convergence when solving complex problems.
The No Free Lunch Theorem affirms that no one-size-fits-all algorithm can effectively address all
NP-hard problems [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Researchers are creating improved metaheuristic algorithms to effectively address
the diverse array of real-world challenges. In recent years, quantum-based metaheuristic methods have
emerged as potential solutions to these problems [
        <xref ref-type="bibr" rid="ref17 ref2">2</xref>
        ]. These algorithms enhance search efficiency by
employing quantum computing principles such as superposition and entanglement. Qubit, or bi-level,
quantum metaheuristics have drawn substantial research attention, but developing
higher-order quantum metaheuristics remains a complex challenge [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. Moreover, applying
quantum-based approaches to complex clustering problems requires innovative designs and frameworks.
In this work, qubit and qutrit versions of the Walrus Optimizer algorithm, viz., the Qubit Walrus Optimizer
(QbWaO) and the Qutrit Walrus Optimizer (QtWaO), are introduced. These are inspired by the breeding
behavior of walruses, and quantum principles are used to overcome their limitations. QbWaO and QtWaO are
designed to automatically detect clusters in hyperspectral images by leveraging quantum-inspired
mechanisms to balance exploration and exploitation effectively. This makes them well suited for complex
high-dimensional HSI clustering. Furthermore, QbWaO and QtWaO incorporate breeding-inspired dynamics
and foraging behavior to maintain population diversity.
      </p>
      <p>The primary contributions are as follows.</p>
      <p>
        • Qubit and Qutrit-Based WaO: This is the first instance of developing and applying multilevel
quantum versions of the Walrus Optimizer algorithm for clustering hyperspectral images.
• Enhanced Exploration Phase with Global Best Selection: Instead of selecting a random walrus, the
global best solution, i.e., the best walrus, is utilized during the exploration phase. This effectively
guides the entire population towards the leader.
• Band Selection Based on Spectral Variability (BSSV) [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]: The proposed method selects the most
dissimilar bands by utilizing spectral variability similarity, ensuring the removal of redundant
information.
• Population Diversity Enhancement with the Qubit and Qutrit Hadamard Gates: The population is
initialized using either the qubit or the qutrit Hadamard gate, which equally distributes individuals across
the search space, increasing diversity in the population.
      </p>
      <p>The paper is organized as follows: Section 2 presents a brief review of the literature. Relevant concepts
are discussed in Section 3. Section 4 contains the details of the main proposed work. Section 5 thoroughly
describes the dataset used, the experimental design, and the result analysis. Finally, Section 6 offers a
concise conclusion of the proposed methodology.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Brief Survey of Related Works</title>
      <p>
        Hyperspectral imaging has revolutionized the way detailed information about ground targets is captured
and analyzed in various fields such as agriculture, geographical monitoring, and defense-based
applications. However, this vast amount of data leads to significant computational challenges, necessitating
effective band selection techniques to identify informative bands while preserving essential spectral
information [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Recent studies have shown that information theory-based methods offer promising
results in HSI band selection, allowing for the extraction of critical features from complex datasets [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
In [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], superpixel segmentation is performed to find the heterogeneous areas in region-specific
hypergraphs, and then a consensus matrix is built to select bands efficiently for hyperspectral imagery.
Clustering assessment metrics are employed to determine the optimal number of clusters. Adjusted
Rand Index (AIndex) [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], and PBM index [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] are some well-known metrics.
      </p>
      <p>
        Estimating optimal clusters in HSI is important as ground truth data is often absent [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. Many HSI
clustering methods are reviewed in [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. Although an effective process, the unsupervised Artificial
DNA Spectral Matching proposed in [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] takes a huge amount of time to execute. Optimization-based
clustering techniques have gained traction in addressing these challenges. The Walrus Optimizer
(WaO) [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] is a recent addition to this suite of optimization algorithms, distinguished by
its efficient global search capabilities and its balanced approach to exploration and exploitation of
the search space, but it suffers from slow and premature convergence. As no individual optimization
algorithm can universally solve all NP-hard problems [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], enhanced metaheuristic algorithms
that can adapt to a wide range of real-world challenges are needed.
      </p>
      <p>
        In light of these considerations, quantum-based metaheuristic methods have emerged as potential
solutions, harnessing quantum computing principles such as superposition, coherence, and
decoherence to improve search efficiency [
        <xref ref-type="bibr" rid="ref17 ref2">2</xref>
        ]. While qubit-based quantum metaheuristics have garnered
significant attention, advancing higher-order quantum metaheuristics, such as those utilizing qutrits,
remains a complex challenge [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. In [17], a qubit-based Grey Wolf Optimizer (QBGWO) is introduced,
incorporating quantum rotation and NOT gates to enhance solution quality. A memetic
quantum-inspired evolutionary algorithm was proposed in [18], which combines quantum genetic algorithms
with tabu search. It balances global exploration via quantum rotation gates and local exploitation
through directional mutations, achieving faster convergence and improved performance on benchmark
functions. In [19], a qubit-based Differential Evolution algorithm is developed for efficiently identifying
optimal clusters without prior knowledge, outperforming competitive algorithms in accuracy and
convergence speed.
      </p>
      <p>
        While qubit-based algorithms generally outperform classical metaheuristics, they remain constrained
by the No Free Lunch Theorem [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Consequently, there is significant research interest in developing
higher-order quantum metaheuristics, specifically qudit-based methods. A three-valued quantum, or
qutrit-based, Genetic Algorithm was developed in [20]. In [
        <xref ref-type="bibr" rid="ref17 ref2">2</xref>
        ], a Differential Evolution algorithm
has been introduced for the automated clustering of hyperspectral images. Additionally, six quantum
methodologies based on the Artificial Hummingbird Algorithm, Particle Swarm Optimization, and
Genetic Algorithms have been presented in [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], utilizing both bi-level and tri-level quantum logic for
unsupervised clustering of HSI data.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. Significant Related Concepts</title>
      <p>
        This section discusses several key concepts, viz., the band selection method used, the foundational principles
of quantum computing, and the WaO [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] algorithm.
      </p>
      <sec id="sec-3-1">
        <title>3.1. Band Selection Based on Spectral Variability [11]</title>
        <p>Spectral Information Divergence (SID) measures the difference between two spectral bands by evaluating
their probability distributions. To compute the SID, the probability vectors are defined first.
For two spectral bands B_i and B_j with components x_{il} and x_{jl}, the probability vectors are
p_l = x_{il} / Σ_{k=1}^{N} x_{ik},  l = 1, 2, . . . , N,
q_l = x_{jl} / Σ_{k=1}^{N} x_{jk},  l = 1, 2, . . . , N,
where N is the spectral dimension of the HSI dataset.</p>
        <p>The SID between the two bands B_i and B_j is defined using the Kullback-Leibler divergence (relative entropy) as
SID(B_i, B_j) = D(p ‖ q) + D(q ‖ p),
where the Kullback-Leibler divergence D(p ‖ q) is given by
D(p ‖ q) = Σ_{l=1}^{N} p_l log(p_l / q_l).</p>
        <p>The channel transition probability is defined as
P_{j|i} = SID(B_i, B_j) / Σ_{k=1}^{N} SID(B_i, B_k).
The bands with the highest SID values are selected as the distinct bands because they exhibit the least
similarity in spectral information.</p>
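<p>The SID computation and band ranking above can be sketched in Python. This is an illustrative sketch only: the function names, the (bands, height, width) cube layout, and the ranking-by-total-divergence rule are our assumptions, not the exact BSSV [11] procedure.</p>

```python
import numpy as np

def band_probability(band, eps=1e-12):
    """Normalize a band's pixel values into a probability vector."""
    flat = np.asarray(band, dtype=float).ravel() + eps
    return flat / flat.sum()

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q)."""
    return float(np.sum(p * np.log(p / q)))

def sid(band_i, band_j):
    """Spectral Information Divergence: symmetric sum of two KL terms."""
    p = band_probability(band_i)
    q = band_probability(band_j)
    return kl_divergence(p, q) + kl_divergence(q, p)

def select_bands(cube, k):
    """Rank bands of a (n_bands, H, W) cube by total pairwise SID and
    keep the k most dissimilar ones (largest total divergence)."""
    n = cube.shape[0]
    scores = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                scores[i] += sid(cube[i], cube[j])
    return np.argsort(scores)[-k:][::-1]
```

Note that SID is symmetric by construction and vanishes when a band is compared with itself.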
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Quantum Computing Principles</title>
        <p>
          Quantum-based machines represent a powerful computational paradigm that merges quantum
mechanics with the fundamentals of computing [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. The basic unit of a quantum machine is the qubit. A qubit has
two distinct basis states, |0⟩ and |1⟩ [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ], which can be expressed mathematically in the following
column vector notation [21].
        </p>
        <p>|0⟩ = (1, 0)^T, |1⟩ = (0, 1)^T.
A qubit can exist in a superposition of its basis states |0⟩ and |1⟩. The number of states that can be
represented by n qubits in superposition grows exponentially as 2^n, thus providing increased computational
efficacy with fewer resources. The superposition state of a qubit, denoted as |ψ⟩, is given by
|ψ⟩ = α_0|0⟩ + α_1|1⟩.
The probabilities that the qubit is in state |0⟩ or |1⟩ are given by the squared magnitudes of the complex
coefficients α_0 and α_1, respectively. These probabilities satisfy the following condition.</p>
        <p>The normalization condition for a qubit is expressed as follows [21].</p>
        <p>
          0 ⩽ |α_i|^2 ⩽ 1, for i = 0, 1, with |α_0|^2 + |α_1|^2 = 1.
Quantum systems can exhibit a wider range of discrete energy levels and exist in more than two states,
known as qudits [
          <xref ref-type="bibr" rid="ref17 ref2">2</xref>
          ]. The smallest multilevel quantum system is called a qutrit [
          <xref ref-type="bibr" rid="ref17 ref2">2</xref>
          ]. The qutrit basis states can similarly
be represented as
        </p>
        <p>
          |0⟩ = (1, 0, 0)^T, |1⟩ = (0, 1, 0)^T, |2⟩ = (0, 0, 1)^T.
Qudits offer better performance than qubits [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. They require fewer units to achieve superior results
compared with a larger number of qubits, thus reducing computation time. Mathematically, to represent the
same information as n qubits, only n/log_2(3) qutrits are needed, resulting in an efficiency increase of
approximately log_2(3) ≈ 1.6 [
          <xref ref-type="bibr" rid="ref17 ref2">2</xref>
          ]. This reduction significantly lowers decoherence in qudit-based
systems, enhancing their computational power. A qutrit can be expressed in a superposition state as
follows.
        </p>
        <p>|ψ⟩ = α_0|0⟩ + α_1|1⟩ + α_2|2⟩</p>
        <p>The normalization condition for a qutrit is given by |α_0|^2 + |α_1|^2 + |α_2|^2 = 1. Furthermore, the probabilities associated with the states of a qutrit must satisfy 0 ⩽ |α_i|^2 ⩽ 1 for i = 0, 1, 2.</p>
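<p>The normalization and measurement rules above can be illustrated with a small numpy sketch (an assumption-laden illustration, not part of the original formulation):</p>

```python
import numpy as np

# A qutrit state as an amplitude vector (alpha_0, alpha_1, alpha_2).
# An equal (Hadamard-style) superposition assigns 1/sqrt(3) to each basis state.
amps = np.array([1 / np.sqrt(3)] * 3)

probs = np.abs(amps) ** 2          # measurement probabilities |alpha_i|^2
assert np.isclose(probs.sum(), 1)  # normalization condition

# Observing the qutrit collapses it to one of the basis states 0, 1, or 2.
rng = np.random.default_rng(42)
outcome = rng.choice(3, p=probs)
```

The same pattern with a two-component amplitude vector reproduces the qubit case.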
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Walrus Optimizer Algorithm [8]</title>
        <p>
          The WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] algorithm is based on the behavior of walruses. Their ways of migrating, breeding,
roosting, and foraging are considered by Han et al. This approach models walruses' social structures
and role divisions, assuming populations interpret their surroundings through danger and safety signals. The
steps of WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] are briefly described in Algorithm 1.
        </p>
        <p>
          The WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] uses a population of "walruses" (agents) to explore and exploit the search space for optimal
solutions to optimization problems. The algorithm divides the population into adults (90%) and juveniles
(10%), with different roles for males, females, and juveniles.
        </p>
        <p>
          Key features of WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] include the danger and safety signals, which govern the exploration
and exploitation phases of the algorithm. In high-risk conditions, walruses migrate, promoting search
across the entire solution space, while in safer conditions they reproduce, exploiting the
search space. Migration is achieved by adjusting positions based on other walruses' locations, while
reproduction involves updating positions through male influence (using a Halton sequence) and female
influence from both the male and the best solution. Juveniles update their positions to avoid predators.
        </p>
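<p>A minimal sketch of this control flow is given below, assuming a simplified danger signal and simplified update rules; the names, constants, and moves are illustrative and are not the exact equations of WaO [8].</p>

```python
import numpy as np

def walrus_optimizer(fitness, dim=5, n=20, t_max=100, seed=0):
    """Simplified WaO-style loop: a danger signal |A*R| >= 1 triggers
    migration (exploration); otherwise agents move toward the best
    solution (exploitation). Details of the full algorithm are omitted."""
    rng = np.random.default_rng(seed)
    pop = rng.random((n, dim))
    best = min(pop, key=fitness).copy()
    for t in range(t_max):
        alpha = 1 - t / t_max                  # decreases from 1 to 0
        for i in range(n):
            danger = (2 * alpha) * (2 * rng.random() - 1)
            if abs(danger) >= 1:
                # Exploration: migrate near a randomly selected walrus
                j = rng.integers(n)
                pop[i] = pop[j] + (pop[j] - pop[i]) * rng.random()
            else:
                # Exploitation: drift toward the best walrus with small noise
                pop[i] = pop[i] + alpha * (best - pop[i]) + 0.1 * rng.normal(size=dim)
        cand = min(pop, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand.copy()
    return best

sphere = lambda x: float(np.sum(x ** 2))
```

On the sphere function the loop monotonically improves the recorded best solution, since the best walrus is only replaced by a fitter candidate.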
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Proposed Work</title>
      <p>The proposed methodology can be divided into two parts, viz., BSSV-based band reduction and the
Quantum Walrus Optimizer algorithms. Band minimization is done as stated in Section 3.1.</p>
      <sec id="sec-4-1">
        <title>4.1. Quantum Walrus Optimizers (QbWaO and QtWaO)</title>
        <p>The quantum versions, viz., the Qubit Walrus Optimizer and the Qutrit Walrus Optimizer algorithms, follow the
same steps as mentioned in Algorithm 1. The modifications introduced are as follows:
• Qubit or Qutrit Encoding and Observation [20]: The qubit walrus population is represented as a 2 × n amplitude matrix
[α_{01} α_{02} α_{03} . . . α_{0n}; α_{11} α_{12} α_{13} . . . α_{1n}],   (7)
with 0 ⩽ |α_{ij}|^2 ⩽ 1 for i = 0, 1. Similarly, the qutrit walrus population is represented as a 3 × n amplitude matrix
[α_{01} α_{02} α_{03} . . . α_{0n}; α_{11} α_{12} α_{13} . . . α_{1n}; α_{21} α_{22} α_{23} . . . α_{2n}],   (8)
with 0 ⩽ |α_{ij}|^2 ⩽ 1 for i = 0, 1, 2.</p>
        <p>As the Hadamard gate initializes all the states with equal amplitude values, qubit states are initialized
in the following manner:
ψ_{i,j} = (1/√2)|0⟩ + (1/√2)|1⟩,   (23)
where α_{0j} = 1/√2 and α_{1j} = 1/√2 for each individual qubit. Similarly, the following
initialization is used for a qutrit:
ψ_{i,j} = (1/√3)|0⟩ + (1/√3)|1⟩ + (1/√3)|2⟩.   (24)</p>
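<p>The equal-amplitude initialization and the subsequent observation of classical states can be sketched as follows. This is a hypothetical sketch: the function names and array layout are our assumptions, and the sampling loop merely stands in for the observation procedures of Algorithms 2 and 3.</p>

```python
import numpy as np

def init_population(n_walrus, dim, levels=2):
    """Equal-amplitude (Hadamard-style) initialization: every basis state of a
    qubit (levels=2) or qutrit (levels=3) gets amplitude 1/sqrt(levels)."""
    return np.full((n_walrus, levels, dim), 1.0 / np.sqrt(levels))

def observe(amplitudes, rng):
    """Collapse each quantum digit to a classical value (0/1 for qubits,
    0/1/2 for qutrits) by sampling from the squared amplitudes."""
    n, levels, dim = amplitudes.shape
    probs = amplitudes ** 2
    probs = probs / probs.sum(axis=1, keepdims=True)  # enforce normalization
    out = np.empty((n, dim), dtype=int)
    for i in range(n):
        for d in range(dim):
            out[i, d] = rng.choice(levels, p=probs[i, :, d])
    return out
```

After observation, each walrus becomes a binary (qubit) or ternary (qutrit) string of the kind shown below.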
        <sec id="sec-4-1-1">
          <title>Algorithm 1 Walrus Optimizer Algorithm [8]</title>
          <p>1: Input: Population X consisting of N walruses of dimension D, maximum iterations T_max; r1, r2, r3, r4, and r5 are random
numbers in (0,1); Step is the migration step; δ is a distress coefficient designated by a random number in (0,1).
2: X is divided into 45% Male Walruses (MW) with the top fitness values, 45% Female Walruses (FW), and 10% Young Walruses (YW) in the
exploitation phase. The best male walrus (BW1) has the highest fitness, and BW2 has the second highest fitness value.
3: Initialize the population X using X_{i,j} = random(0, 1).
4: Calculate the fitness value of each walrus.
5: while t ⩽ T_max do
6:   Calculate the danger signal Danger = A · R, where A = 2 × α, R = 2 × r1 − 1, and α = 1 − t/T_max.
7:   if |Danger| ⩾ 1 then
8:     Exploration phase: update the position of each walrus as X_{i,t+1} = X_{j,t} + Step, where X_{j,t} is a randomly
selected position and the migration step is
Step = (X_{j,t} − X_{i,t}) · β · r3,   (13)
with β a sigmoid-shaped weight that varies with the iteration count t.
9:   else (exploitation phase)
10:    Calculate the safety signal.
11:    if Safety signal ⩾ 0.5 then (breeding behavior)
12:      for each MW do: use the Halton sequence to update the position. end for
13:      for each FW do: update X_{f,t+1} = X_{f,t} + α · (X_{m,t} − X_{f,t}) + (1 − α) · (X_{BW1,t} − X_{f,t}). end for
14:      for each YW do: update X_{y,t+1} = (O − X_{y,t}) · P, where O = X_{BW1,t} + X_{y,t} · LF and LF is generated
using a Lévy distribution to imitate Lévy movement. end for
15:    else (foraging behavior)
16:      if |Danger| ⩾ 0.5 then (gathering behavior)
17:        Update positions as X_{i,t+1} = (X1_t + X2_t)/2, where
X1_{t+1} = X_{BW1,t} − a1 · b1 · |X_{BW1,t} − X_{i,t}|, X2_{t+1} = X_{BW2,t} − a2 · b2 · |X_{BW2,t} − X_{i,t}|,
with a_k = β · r5 − β and b_k = tan(θ_k), k = 1, 2.
18:      else (fleeing behavior)
19:        Update positions as X_{i,t+1} = X_{i,t} · R − |X_{BW1,t} − X_{i,t}| · r4.
20:      end if
21:    end if
22:  end if
23: end while</p>
          <p>Equations (23) and (24) satisfy the normalization principles stated in Section 3.2. Algorithms 2 and 3
are used for finding the probable classical state observations [20]. The notations
of N, D, and T_max are the same as specified in Algorithm 1. The resulting classical representations in
binary and ternary form after observation for each walrus are
[0 1 0 1 1 1 1 0 0 · · ·] and [0 2 0 1 1 2 1 0 0 · · ·], respectively.</p>
          <p>
            • Equation (13) is replaced by the equation given below:
Step = (X_{BW1,t} − X_{i,t}) · β · r3.   (25)
Instead of considering any random walrus, the best walrus with the highest fitness value enhances
the exploration phase.
• At the end of every iteration, it is checked whether the quantum normalization principles are
adhered to. Where this does not hold, Equations (23) and (24) are used to reinitialize those
walruses.
• In every iteration, a random count of zeros is incorporated into the population for which the
corresponding condition is true. The non-zero values designate the cluster centers; using the k-means++ [
            <xref ref-type="bibr" rid="ref3">3</xref>
            ] algorithm, the
appropriate clusters are constructed. The AIndex [
            <xref ref-type="bibr" rid="ref13">13</xref>
            ] is used as the Cluster Validity Index to
identify the optimal cluster numbers.
          </p>
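<p>The Adjusted Rand Index used as the fitness function can be computed with a minimal numpy sketch (the helper names are hypothetical; the formula follows the standard contingency-table definition [13]):</p>

```python
import numpy as np

def comb2(x):
    """Number of unordered pairs: C(x, 2), applied elementwise."""
    return x * (x - 1) / 2.0

def adjusted_rand_index(labels_true, labels_pred):
    """Adjusted Rand Index from the contingency table of two labelings."""
    labels_true = np.asarray(labels_true)
    labels_pred = np.asarray(labels_pred)
    classes, ci = np.unique(labels_true, return_inverse=True)
    clusters, cj = np.unique(labels_pred, return_inverse=True)
    table = np.zeros((classes.size, clusters.size))
    for i, j in zip(ci, cj):
        table[i, j] += 1
    sum_ij = comb2(table).sum()             # pairs together in both labelings
    sum_a = comb2(table.sum(axis=1)).sum()  # pairs together in labels_true
    sum_b = comb2(table.sum(axis=0)).sum()  # pairs together in labels_pred
    n_pairs = comb2(labels_true.size)
    expected = sum_a * sum_b / n_pairs
    max_index = (sum_a + sum_b) / 2.0
    return (sum_ij - expected) / (max_index - expected)
```

An ARI of 1 indicates a perfect match between the two labelings (up to a relabeling of clusters), while values near 0 indicate chance-level agreement.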
          <p>The complexity of the band minimization is O(B × M × N), where B bands are present, each of
dimension M × N pixels. For QbWaO, if V is a randomly chosen pixel intensity value, L represents the
length of an individual in the population, and T_max is the total number of iterations, then the
worst-case time complexity is O(2 × V × L × T_max). For QtWaO, the time complexity is
O(3 × V × L × T_max).</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Findings and Analysis</title>
      <p>This section presents the various parameters used, the dataset employed, the statistical tests conducted,
and the analysis of the experimental results.</p>
      <sec id="sec-5-1">
        <title>5.1. HSI Dataset</title>
        <p>The WHU-Hi-LongKou dataset (WHU) [22] [23] was collected in 2018, featuring scenes of Longkou
Town in Hubei province, China. It has 9 classes spread over an agricultural landscape with six types of
crops. It has a spatial dimension of 550 × 400 pixels and a spectral dimension of 270 bands. The
UAV-borne hyperspectral imagery has a spatial resolution of approximately 0.463 meters.</p>
      </sec>
      <sec id="sec-5-2">
        <title>5.2. Experimental Configurations and Analysis</title>
        <p>
          QbWaO and QtWaO are compared with their classical version WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] using the WHU-Hi-LongKou
dataset [22] [23]. To conduct an impartial study, all algorithms are executed for 50 times over 100
iterations. An Intel Core i7-8700 processor and a Windows 11 machine were used. All simulations were
conducted on MATLAB 2023b. In WaO [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], each Walrus has dimensions of 20 and 20 walruses were
considered. The dimensions were kept the same for both QbWaO and QtWaO. The number of walruses
taken was 16 for QbWaO and 14 for QtWaO, respectively. For QbWaO, 8 male and female walruses
and 2 young walruses were taken. In QtWaO, 7 male and female walruses and 2 young walruses were
taken. The proposed methods are also compared with Kmeans [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ] algorithm using taking the predefined
cluster number as 6.
        </p>
        <p>
          Various statistical tests like mean and standard deviation are recorded in Table 1 for all three algorithms.
The time required for the convergence of each algorithm is noted in Table 1. From Table 1 and Figure 1
it can be observed that QtWaO takes far lesser time to converge than the other two algorithms.
Table 2 contains the optimal AIndex [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ] values, automatically detected cluster numbers,  score [24]
and  ′ score [24] are presented. The  score [24] and  ′ score [24] help determine the clustering
quality. QtWaO produces 10 clusters nearest to the classes in the WHU Dataset [22] [23]. The clustered
images, along with the minimized image produced by using BSSV [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ] and the ground truth image of
WHU Dataset [22] [23] are presented in Figure 2.
        </p>
        <p>The null hypothesis is evaluated using the One-Way ANOVA test [25], which assesses whether the
results originate from the same probability distribution. The null hypothesis is rejected if the p value
is below the 1% significance level, indicating support for the alternative hypothesis. The results are
summarized in Table 3. Additionally, Tukey’s post hoc test [25] is conducted. The box plots for both
tests are shown in Figure 3.</p>
        <p>Based on the various tests conducted and the parameters evaluated, QtWaO generally outperforms
QbWaO in most cases. Also, QtWaO requires a smaller population and converges far faster than
both QbWaO and WaO.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Conclusion</title>
      <p>Qubit and Qutrit Walrus Optimizer algorithms significantly advance unsupervised clustering for
hyperspectral imagery. By integrating quantum principles and biological inspirations, QbWaO and QtWaO
effectively address the challenges associated with the slow and premature convergence of the Walrus
Optimizer algorithm. The enhanced exploration phase integrated with the Hadamard gate and band
selection strategies improves clustering performance and ensures the robustness of the algorithms
in high-dimensional spaces. The qutrit version performs better in almost all aspects, producing a
near-optimal number of clusters in less time. Future work could explore enhancements to these
quantum algorithms, including implementing more advanced quantum techniques and their application to
various domains requiring sophisticated data clustering solutions. Additionally, parameter reduction
can be implemented in the Walrus Optimizer.</p>
    </sec>
    <sec id="sec-7">
      <title>Acknowledgments</title>
      <p>Thanks to the developers of ACM consolidated LaTeX styles https://github.com/borisveytsman/acmart
and to the developers of Elsevier updated LATEX templates https://www.ctan.org/tex-archive/macros/
latex/contrib/els-cas-templates.</p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
      <p>[17] Y. Liu, C. Li, J. Xiao, Z. Li, W. Chen, X. Qu, J. Zhou, QEGWO: Energy-efficient clustering approach for industrial wireless sensor networks using quantum-related bioinspired optimization, IEEE Internet of Things Journal 9 (2022) 23691–23704.</p>
      <p>[18] A. S. Hesar, M. Houshmand, A memetic quantum-inspired genetic algorithm based on tabu search, Evolutionary Intelligence 17 (2024) 1837–1853.</p>
      <p>[19] A. Dey, S. Bhattacharyya, S. Dey, J. Platos, V. Snasel, A quantum inspired differential evolution algorithm for automatic clustering of real life datasets, Multimedia Tools and Applications 83 (2024) 8469–8498.</p>
      <p>[20] V. Tkachuk, Quantum genetic algorithm based on qutrits and its application, Mathematical Problems in Engineering 2018 (2018) 8614073.</p>
      <p>[21] T. Dutta, S. Bhattacharyya, S. Mukhopadhyay, Automatic clustering of hyperspectral images using qutrit exponential decomposition particle swarm optimization, in: 2021 IEEE International India Geoscience and Remote Sensing Symposium (InGARSS), 2021, pp. 289–292.</p>
      <p>[22] Y. Zhong, X. Wang, Y. Xu, S. Wang, T. Jia, X. Hu, J. Zhao, L. Wei, L. Zhang, Mini-UAV-borne hyperspectral remote sensing: From observation and processing to applications, IEEE Geoscience and Remote Sensing Magazine 6 (2018) 46–62.</p>
      <p>[23] Y. Zhong, X. Hu, C. Luo, X. Wang, J. Zhao, L. Zhang, WHU-Hi: UAV-borne hyperspectral with high spatial resolution (H2) benchmark datasets and classifier for precise crop identification based on deep convolutional neural network with CRF, Remote Sensing of Environment 250 (2020) 112012.</p>
      <p>[24] M. Borsotti, P. Campadelli, R. Schettini, Quantitative evaluation of color image segmentation results, Pattern Recognition Letters 19 (1998) 741–747.</p>
      <p>[25] E. Liberto, D. Bressanello, G. Strocchi, C. Cordero, M. R. Ruosi, G. Pellegrino, C. Bicchi, B. Sgorbini, HS-SPME-MS-Enose coupled with chemometrics as an analytical decision maker to predict in-cup coffee sensory quality in routine controls: Possibilities and limits, Molecules 24 (2019).</p>
      <p>(a) (b)
Figure 3: (a) One-Way ANOVA test [25], (b) Tukey’s post hoc test [25] - results for WHU Dataset [22] [23]</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>C.-I.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-M.</given-names>
            <surname>Kuo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-C.</given-names>
            <surname>Liang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. Y.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. F.</given-names>
            <surname>Hu</surname>
          </string-name>
          ,
          <article-title>Self-mutual information-based band selection for hyperspectral image classification</article-title>
          ,
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          <volume>59</volume>
          (
          <year>2021</year>
          )
          <fpage>5979</fpage>
          -
          <lpage>5997</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>T.</given-names>
            <surname>Dutta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Bhattacharyya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. K.</given-names>
            <surname>Panigrahi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Platos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Snasel</surname>
          </string-name>
          ,
          <article-title>Automatic hyperspectral image clustering using qutrit differential evolution</article-title>
          , in:
          <string-name>
            <given-names>Y.</given-names>
            <surname>Tan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Shi</surname>
          </string-name>
          (Eds.),
          <source>Advances in Swarm Intelligence</source>
          , Springer Nature Singapore, Singapore,
          <year>2024</year>
          , pp.
          <fpage>280</fpage>
          -
          <lpage>294</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>D.</given-names>
            <surname>Arthur</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Vassilvitskii</surname>
          </string-name>
          ,
          <article-title>k-means++: the advantages of careful seeding</article-title>
          , in:
          <source>Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms</source>
          , SODA '07, Society for Industrial and Applied Mathematics, USA,
          <year>2007</year>
          , pp.
          <fpage>1027</fpage>
          -
          <lpage>1035</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Bezdek</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Ehrlich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Full</surname>
          </string-name>
          ,
          <article-title>FCM: The fuzzy c-means clustering algorithm</article-title>
          ,
          <source>Computers &amp; Geosciences</source>
          <volume>10</volume>
          (
          <year>1984</year>
          )
          <fpage>191</fpage>
          -
          <lpage>203</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>J.</given-names>
            <surname>Holland</surname>
          </string-name>
          ,
          <article-title>Adaptation in natural and artificial systems: An introductory analysis with applications to biology, control, and artificial intelligence</article-title>
          , University of Michigan Press,
          <year>1975</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Shi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Eberhart</surname>
          </string-name>
          ,
          <article-title>A modified particle swarm optimizer</article-title>
          ,
          <source>in: 1998 IEEE International Conference on Evolutionary Computation Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98TH8360)</source>
          ,
          <year>1998</year>
          , pp.
          <fpage>69</fpage>
          -
          <lpage>73</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>F.</given-names>
            <surname>Glover</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Laguna</surname>
          </string-name>
          ,
          <article-title>Tabu Search</article-title>
          , Springer US, Boston, MA,
          <year>1998</year>
          , pp.
          <fpage>2093</fpage>
          -
          <lpage>2229</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>M.</given-names>
            <surname>Han</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Du</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. F.</given-names>
            <surname>Yuen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Yuan</surname>
          </string-name>
          ,
          <article-title>Walrus optimizer: A novel nature-inspired metaheuristic algorithm</article-title>
          ,
          <source>Expert Systems with Applications</source>
          <volume>239</volume>
          (
          <year>2024</year>
          )
          <fpage>122413</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>S.</given-names>
            <surname>Mirjalili</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Mirjalili</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Lewis</surname>
          </string-name>
          ,
          <article-title>Grey wolf optimizer</article-title>
          ,
          <source>Advances in Engineering Software</source>
          <volume>69</volume>
          (
          <year>2014</year>
          )
          <fpage>46</fpage>
          -
          <lpage>61</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>T.</given-names>
            <surname>Dutta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Bhattacharyya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. K.</given-names>
            <surname>Panigrahi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Zelinka</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Mrsic</surname>
          </string-name>
          ,
          <article-title>Multi-level quantum inspired metaheuristics for automatic clustering of hyperspectral images</article-title>
          ,
          <source>Quantum Machine Intelligence</source>
          <volume>5</volume>
          (
          <year>2023</year>
          )
          <fpage>1</fpage>
          -
          <lpage>35</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>C.-I.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <article-title>An information-theoretic approach to spectral variability, similarity, and discrimination for hyperspectral image analysis</article-title>
          ,
          <source>IEEE Transactions on Information Theory</source>
          <volume>46</volume>
          (
          <year>2000</year>
          )
          <fpage>1927</fpage>
          -
          <lpage>1932</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>B.</given-names>
            <surname>Fu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Cui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Shang</surname>
          </string-name>
          ,
          <article-title>Structure-preserved and weakly redundant band selection for hyperspectral imagery</article-title>
          ,
          <source>IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing</source>
          <volume>17</volume>
          (
          <year>2024</year>
          )
          <fpage>12490</fpage>
          -
          <lpage>12504</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>S.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-S.</given-names>
            <surname>Wong</surname>
          </string-name>
          ,
          <article-title>ARImp: A generalized adjusted Rand index for cluster ensembles</article-title>
          ,
          <source>in: 2010 20th International Conference on Pattern Recognition</source>
          ,
          <year>2010</year>
          , pp.
          <fpage>778</fpage>
          -
          <lpage>781</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>M. K.</given-names>
            <surname>Pakhira</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Bandyopadhyay</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Maulik</surname>
          </string-name>
          ,
          <article-title>Validity index for crisp and fuzzy clusters</article-title>
          ,
          <source>Pattern Recognition</source>
          <volume>37</volume>
          (
          <year>2004</year>
          )
          <fpage>487</fpage>
          -
          <lpage>501</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <article-title>Hyperspectral image clustering: Current achievements and future lines</article-title>
          ,
          <source>IEEE Geoscience and Remote Sensing Magazine</source>
          <volume>9</volume>
          (
          <year>2021</year>
          )
          <fpage>35</fpage>
          -
          <lpage>67</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>H.</given-names>
            <surname>Jiao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Zhong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <article-title>An unsupervised spectral matching classifier based on artificial DNA computing for hyperspectral remote sensing imagery</article-title>
          ,
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          <volume>52</volume>
          (
          <year>2014</year>
          )
          <fpage>4524</fpage>
          -
          <lpage>4538</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>Figure 2: (a) Ground Truth image of WHU Dataset [22] [23], (b) Resultant image using BSSV [11], (c) Clustered image using WaO [8], (d) Clustered image using QbWaO, (e) Clustered image using QtWaO with AIndex [13], (f) Results using k-means [3] (cluster number = 6) on WHU Dataset [22] [23].</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>