<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Information system for determining the priority of digital image quality factors</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alona Kudriashova</string-name>
          <email>alona.v.kudriashova@lpnu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vsevolod Senkivskyy</string-name>
          <email>vsevolod.m.senkivskyi@lpnu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Iryna Pikh</string-name>
          <email>iryna.v.pikh@lpnu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bohdan Durnyak</string-name>
          <email>bohdan.v.durnyak@lpnu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Taras Oliiarnyk</string-name>
          <email>taras.i.oliiarnyk@lpnu.ua</email>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Lviv Polytechnic National University</institution>
          ,
          <addr-line>Stepan Bandera Str., 12, Lviv, 79013</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
      <volume>000</volume>
      <fpage>0</fpage>
      <lpage>0002</lpage>
      <abstract>
        <p>The quality of digital images is determined by a set of factors, among which the key ones are resolution, color depth, color model, file format, file size, image size, compression, brightness, saturation, and sharpness. The paper presents a developed semantic model that represents the relationships between factors. Reachability matrices are formed for direct and indirect relationships. The rank and weight of each factor are determined. The calculation results are presented in a tabular format. The obtained results of the factor ranking confirm the hypothesis of the unequal influence of various parameters on the final quality of digital images and substantiate the feasibility of using the ranking method to determine the factor priority. An information system is developed to determine the factor priority using the ranking method based on semantic networks and reachability matrices. It provides a full cycle of analysis from input of primary data to visualization of final results. The constructed algorithm is implemented by software tools in the Python language using modular architecture and a graphical user interface. The experiment shows that the highest priority is given to parameters such as color model, file format, and resolution, while file size and sharpness have the lowest impact. The proposed system provides a comprehensive analysis and can be used to make informed decisions when processing and using digital images in various fields.</p>
      </abstract>
      <kwd-group>
        <kwd>Digital image</kwd>
        <kwd>information system</kwd>
        <kwd>quality factor</kwd>
        <kwd>semantic network</kwd>
        <kwd>ranking method</kwd>
        <kwd>factor priority</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        A digital image is a discrete form of visual information presentation based on a matrix
representation of a signal in the form of pixels with certain numerical values of brightness and
color. Its creation involves the discretization and quantization of a continuous visual signal, which
provides the ability to store, transmit and algorithmically process data with a high level of
reproducibility [
        <xref ref-type="bibr" rid="ref1 ref2 ref3">1–3</xref>
        ]. The main influence on the quality of digital images is exerted by such factors
as resolution, color depth, color model, file format, file size, image size, compression, brightness,
saturation, and sharpness.
      </p>
      <p>
        Resolution determines the number of pixels in an image or the density of pixels per unit area.
High-resolution images contain more information about small structures. High resolution is
especially important in areas where small details are of great importance (medical diagnostics,
security, video production, etc.) [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Color depth determines the number of color shades that can be
encoded in each pixel. Increasing the color depth enhances the number of possible gradations of
each channel brightness, i.e. the range of reproducible colors and the smoothness of their
transitions. This allows one to capture tone differences and avoid rapid posterization [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. The term
"color model" should be understood as a way to represent colors in digital images, usually through
coordinates in a selected color area. The most common color models are RGB, CMYK, HSV/HSL,
LAB [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. The image file format determines the way the data is encoded and compressed. Some
formats (e.g. PNG, TIFF) use lossless compression or store data without loss at all, while others
(JPEG, WebP, etc.) use lossy algorithms. Lossy compression reduces the file size by removing subtle
details, but can degrade the quality at high compression levels [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. The file size of an image
(number of bytes) is directly related to its size, resolution, and color depth [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. The image size
usually refers to its dimensions in pixels. The level of detail also depends on this: images with more
pixels capture more information. Thus, the optimal image size is selected according to the context: a
larger size allows one to see fine details but requires more memory and processing time [
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ].
Brightness characterizes the illumination level of the image. A proper brightness balance allows
one to convey details in both bright and dark areas. If the brightness is insufficient, the shadows
become excessively dark and details are lost. If the brightness is excessive, “burned-out” areas
appear with loss of information [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. Saturation describes how bright and “pure” the color is in
the image. Insufficient saturation makes the image pale and less informative, while excessively
high saturation can cause color distortion. Balanced saturation provides
naturalness and expressiveness of color, which positively affects the perception of the image
quality [
        <xref ref-type="bibr" rid="ref10 ref11">10, 11</xref>
        ]. The image sharpness is the clarity degree of textures and contours of various
details, which affects the information perception [
        <xref ref-type="bibr" rid="ref12 ref13">12, 13</xref>
        ].
      </p>
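As a worked illustration of the relation between file size, dimensions, and color depth noted above, the uncompressed size of a raster image follows directly from its pixel count, channel count, and bits per channel. The helper below is a sketch covering only uncompressed data; actual file sizes also depend on the format and compression:

```python
def uncompressed_size_bytes(width: int, height: int, bit_depth: int, channels: int = 3) -> int:
    """Approximate uncompressed raster size: pixels * channels * bits per channel / 8."""
    return width * height * channels * bit_depth // 8

# A 1920x1080 RGB image at 8 bits per channel takes about 6.2 MB before compression.
size = uncompressed_size_bytes(1920, 1080, 8)  # 6220800 bytes
```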
      <p>
        Modern research indicates that digital images are the basis for the functioning of computer
vision systems [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], automated object classification and segmentation [
        <xref ref-type="bibr" rid="ref15 ref16">15, 16</xref>
        ], as well as intelligent
data analysis systems [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. Their quality determines the accuracy and reliability of the processing
results. The quality of digital images is influenced by many factors, prioritizing of which
contributes to improving the information presentation, which indicates the relevance of the
research conducted, the main goal of which is to develop an information system for analyzing
relationships between factors and determining priorities using the ranking method. At the same
time, the main tasks for achieving this goal are: developing a semantic model of relationships
between factors of the digital image quality, constructing a reachability matrix of factors,
determining the factor priority using the ranking method, developing an algorithm for the
operation of an information system to determine the factor priority, and implementing this system.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Literature review</title>
      <p>A number of modern scientific works are devoted to image processing and the study of factors
affecting image quality. In [18], a multi-scale image quality transformer, MUSIQ, is presented, which
processes images in their native dimensions and captures quality at different scales and degrees of
detail. However, MUSIQ provides an integral quality assessment without explicit decomposition into
individual factors.</p>
      <p>The study [19] is devoted to no-reference assessment of image quality combining a Swin
Transformer with natural scene statistics. The model performs multi-scale feature extraction and
uses a feature improvement module. Natural scene statistics compensate for the information loss
caused by changes in the image size. The advantage of such solutions is the effective aggregation of
local and global information, which increases the stability of predictions on heterogeneous
distortions. The disadvantage is the low interpretability of the results without presenting
cause-and-effect relationships between factors.</p>
      <p>An important study [20] is devoted to a systematic analysis of current approaches to the quality
assessment of medical images. The authors summarize 42 studies and compare the image
assessment methods. They note that high-quality images provide improved visibility of anatomical
structures, anomalies and lesions, which leads to more accurate diagnosis. At the same time, noise,
resolution problems and artifacts are identified as the main problems of the image quality. The
need to develop consistent assessment procedures is indicated to improve diagnostic outcomes and
patient care.</p>
      <p>Human perceptual factors (age, experience, viewing context, content genre) are important
modulating variables in the quality interpretation, as evidenced by experimental studies [21].</p>
      <p>Single- and two-factor studies [22–24] (e.g., detailed analysis of the effects of blur [22],
brightness and structural characteristics [23], resolution and sharpness [24]) provide a deep
understanding of the influence mechanisms of individual characteristics and allow the creation of
specialized quality indicators for specific tasks. However, a narrow focus does not ensure the
assessment completeness. Instead, the use of multi-criteria analysis methods makes it possible to
identify cause-and-effect relationships between factors [25, 26]. In addition, methods aimed at
increasing the generalizability of quality representations and multitask approaches demonstrate
that additional related tasks (defect detection, semantic features, linguistic representations) can
significantly improve the quality of predictions [27, 28].</p>
    </sec>
    <sec id="sec-3">
      <title>3. Material and methods</title>
      <p>Visualization of relationships between factors is carried out using modeling based on the theory of
semantic networks. The conceptual essence of semantic networks is to represent a complex subject
area in the form of a structure formed by sets of nodes and arcs. Nodes correspond to individual
concepts or objects (in this case, digital image quality factors), and arcs denote the relationship
between them. Each relationship has a clearly defined semantic load that sets its interpretation.
That is, semantic networks enable a comprehensive analysis of interdependencies between
concepts. Let X = {x_1, x_2, ..., x_n} be a set of factors to be analyzed. Each factor x_i represents a
certain parameter or characteristic that can affect the final image quality. The semantic network is
presented in the form of a directed graph G = (V, E), where the set of vertices V corresponds to the
set of factors X, i.e. V = X. Thus, each node of the graph is equivalent to a specific factor. The set
of arcs E represents all existing relationships between factors. The existence of an arc
e = (x_i, x_j) ∈ E means that factor x_i influences factor x_j or, on the contrary, is in a relationship
of dependency with it [25, 26].</p>
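The graph formalism above can be sketched directly as an adjacency structure. The influence arcs below are hypothetical illustrations; the actual relationships are those defined by the semantic network in Fig. 1:

```python
# Sketch of the semantic network as a directed graph G = (V, E).
factors = [f"X{i}" for i in range(1, 11)]  # V = X = {X1, ..., X10}

influences = {
    "X3": {"X2", "X4"},  # assumed arcs: color model influences color depth and file format
    "X1": {"X5", "X6"},  # assumed arcs: resolution influences file size and image size
}

# E as a set of ordered pairs (x_i, x_j): x_i influences x_j
edges = {(src, dst) for src, dsts in influences.items() for dst in dsts}
```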
      <p>Based on the relationships between factors, the reachability matrix M = [m_ij] is constructed.
The reachability matrix is square, and its rows and columns correspond to the ordered set of
factors X = {x_1, x_2, ..., x_n}. The element m_ij, located at the intersection of row i and column j,
records the presence or absence of a reachability relationship from one factor to another: if the
relationship exists, a one is written; if it is absent, a zero. The main diagonal is always filled with
ones, which reflects the fundamental property of reflexivity: each factor has a reachability
relationship with itself [25, 28, 29].</p>
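Under these definitions, the direct matrix M_1 (reflexive) and the indirect, 2nd-order matrix M_2 can be sketched as follows; the three-factor arc set is purely hypothetical:

```python
import numpy as np

def direct_matrix(edges, n):
    """M1: direct-reachability matrix with ones on the diagonal (reflexivity)."""
    m = np.eye(n, dtype=int)
    for i, j in edges:
        m[i, j] = 1
    return m

def indirect_matrix(m1):
    """M2: 2nd-order reachability, i.e. pairs reachable through at most one intermediate factor."""
    return ((m1 @ m1) > 0).astype(int)

edges = [(0, 1), (1, 2)]   # hypothetical arcs: X1 -> X2, X2 -> X3
m1 = direct_matrix(edges, 3)
m2 = indirect_matrix(m1)   # m2[0, 2] == 1: X1 reaches X3 through X2
```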
      <p>The factor importance (priority) is determined by the ranking method. Four subsets are formed:
direct influences, indirect influences, direct dependencies and indirect dependencies. The number
of relationships of each type is denoted as h_1, h_2, h_3, h_4. The influences have positive weights,
because they enhance the factor significance. Dependencies are given negative weights, because
they reduce the factor autonomy. It is natural to assume that, in absolute value, the weights of
influence and dependency are the same [29, 30]. Let the weight values for influences and
dependencies be as follows: w_1 = 10, w_2 = 5, w_3 = −10, w_4 = −5. The final weight of each
factor is determined by the sum of all four components:</p>
      <p>X_{ij} = ∑_{i=1}^{4} ∑_{j=1}^{n} h_{ij} w_i. (1)</p>
      <p>Since some of the coefficients have positive values (w_1 &gt; 0, w_2 &gt; 0) and some have negative
values (w_3 &lt; 0, w_4 &lt; 0), the scale must be shifted to the positive area. This is achieved by
normalization using the formula:</p>
      <p>Δ_j = max|X_{3j}| + max|X_{4j}|, (j = 1, 2, ..., n). (2)</p>
      <p>Taking into account the expressions (1) and (2), the final formula for calculating the weight
values of factors is as follows:</p>
      <p>X_{Fj} = ∑_{i=1}^{4} ∑_{j=1}^{10} (x_{ij} w_i + Δ_j). (3)</p>
      <p>The factor that receives the highest weight value corresponds to the highest rank R_j. The
factor priority P_j is interpreted as the inverse of the rank; that is, the highest rank corresponds to
the first priority [25, 28].</p>
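A minimal sketch of this ranking procedure, using hypothetical relationship counts for three factors and the weights w_1 = 10, w_2 = 5, w_3 = −10, w_4 = −5 stated above:

```python
# Sketch of the ranking computation from Section 3. The relationship counts h
# below are hypothetical; the weights follow the paper.
weights = (10, 5, -10, -5)

def factor_scores(h):
    """Weighted sums per factor, shifted to the positive range by delta (formula (2))."""
    raw = [sum(hij * w for hij, w in zip(row, weights)) for row in h]
    delta = (max(abs(row[2] * weights[2]) for row in h)
             + max(abs(row[3] * weights[3]) for row in h))
    return [x + delta for x in raw]

def ranks_and_priorities(scores):
    """Highest score gets the highest rank; priority is the inverse of the rank."""
    n = len(scores)
    order = sorted(range(n), key=lambda j: scores[j])  # ascending by score
    rank = [0] * n
    for r, j in enumerate(order, start=1):
        rank[j] = r
    priority = [n + 1 - r for r in rank]               # rank n -> priority 1
    return rank, priority

h = [(3, 2, 1, 0), (1, 1, 2, 1), (4, 3, 0, 0)]         # hypothetical h1..h4 per factor
scores = factor_scores(h)                              # [55, 15, 80]
rank, priority = ranks_and_priorities(scores)
```

The third factor receives the highest shifted weight and therefore the highest rank and first priority.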
    </sec>
    <sec id="sec-4">
      <title>4. Experiment, results and discussion</title>
      <p>The main factors affecting the quality of digital images are: X 1 — resolution, X 2 — color depth, X 3
— color model, X 4 — file format, X 5 — file size, X 6 — image size, X 7 — compression, X 8 —
brightness, X 9 — saturation, X 10 — sharpness. The specified factors form the following set:
X ={X 1 , X 2 , X 3 , X 4 , X 5 , X 6 , X 7 , X 8 , X 9 , X 10}. The relationships between the factors are
demonstrated using the developed semantic network (Fig. 1).</p>
      <p>According to the semantic network (Fig. 1), reachability matrices are developed: M_1 is a matrix
of direct relationships, M_2 is a matrix of indirect relationships. The calculation results are
presented in tabular form, where h_{1j} is the number of direct influences on the analyzed factor;
h_{2j} is the number of 2nd-order (indirect) influences; h_{3j} is the number of direct dependencies;
h_{4j} is the number of 2nd-order dependencies; X_{1j}, X_{2j}, X_{3j} and X_{4j} are calculated
according to expression (1); X_{Fj} characterizes the overall assessment of factors taking into
account all categories of relationships and the corrective mechanism and is calculated according to
formula (3); the factor rank represents its number in the overall rating, where rank 1 corresponds
to the lowest priority; the factor priority is the inverse of the rank and indicates the factor
importance.</p>
      <p>Taking into account the theoretical principles of the ranking method presented in Section 3
and the experimental results, an information system algorithm is developed to determine the
factor priority (Fig. 2).</p>
      <p>The information system for determining the factor priority by the ranking method is
implemented using the object-oriented programming paradigm in the Python language. It
provides a full cycle of analysis from input of primary data to visualization of final results. The
software has a clearly structured modular architecture. The main class implements the
Model-View-Controller pattern, where the data model is represented by internal structures for
storing the information about the number of relationships for each factor and the calculation results. The
presentation is carried out using a graphical user interface based on the Tkinter library. The
controller provides the coordinated interaction between the software components.</p>
      <p>[Table: calculated X_{Fj} values for factors X_1–X_10 — 105, 90, 135, 110, 0, 70, 75, 40, 45, 30 — with the corresponding ranks and priorities.]</p>
      <p>The interface contains a parameter input component. Input data validation and warning about
possible input errors are performed. The dynamic table automatically adapts to the number of
input factors. The navigation between input fields is carried out using an extended system of
keyboard shortcuts. The calculation module contains an input data validator that checks the
correctness of the input values and their compliance with the mathematical requirements of the
algorithm.</p>
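The validation logic described above might be sketched as follows; the function name and error messages are illustrative assumptions, not the system's actual identifiers:

```python
def validate_factor_count(text: str) -> int:
    """Accept only a positive integer for the number of factors, as the GUI requires."""
    try:
        n = int(text.strip())
    except ValueError:
        raise ValueError("The number of factors must be an integer.")
    if n > 0:
        return n
    raise ValueError("The number of factors must be greater than zero.")
```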
      <p>The ranking and priority determination module provides sorting of factors by the value of the
integral indicator and assigning the corresponding ranks and priorities. The visualization module
includes a statistical diagram generator using the Matplotlib library. The adaptive scaling system
automatically adjusts the visualization parameters depending on the factor number. Logarithmic
scaling of the diagram width and dynamic calculation of the caption rotation angle are performed.</p>
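A minimal Matplotlib sketch of such a bar-chart module with adaptive scaling; the exact width and rotation formulas are assumptions, chosen only to illustrate logarithmic scaling with the factor count:

```python
import math
import matplotlib
matplotlib.use("Agg")              # headless backend for scripted use
import matplotlib.pyplot as plt

def plot_priorities(priorities):
    """Bar chart: factor numbers on the abscissa, priority levels on the ordinate.
    Figure width and label rotation grow logarithmically with the factor count
    (both formulas are illustrative assumptions)."""
    n = len(priorities)
    width = 4 + 2 * math.log2(max(n, 2))           # logarithmic width scaling
    rotation = min(90, 15 * math.log2(max(n, 2)))  # steeper labels for more factors
    fig, ax = plt.subplots(figsize=(width, 4))
    ax.bar([f"X{j}" for j in range(1, n + 1)], priorities)
    ax.set_xlabel("Factor")
    ax.set_ylabel("Priority")
    ax.tick_params(axis="x", rotation=rotation)
    fig.savefig("priorities.png")
    plt.close(fig)
    return width, rotation

w, r = plot_priorities([2, 3, 1])
```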
      <p>The software interface with an example of determining the priority of digital image quality
factors is presented in Fig. 3.</p>
      <p>The user enters the required number of factors and clicks the "Create a table" button. If the
data is entered incorrectly, an error message appears; only an integer greater than zero is accepted.
If the data is entered correctly, a table is created. The user enters the number of relationships in
columns h_1, h_2, h_3, h_4 and clicks the "Calculate" button. At the next stage, the user can click
the "Visualize factor priority" button to display a column chart. In this case, the ordinal numbers
of the factors are presented on the abscissa axis, and the priority levels are on the ordinate axis.</p>
      <p>The developed software product can be used in decision-making under conditions of
uncertainty and incomplete information.</p>
      <p>The obtained results of the factor ranking confirm the hypothesis of unequal influence of
different parameters on the final quality of digital images and substantiate the feasibility of using
the ranking method to determine the factor priority.</p>
      <p>The constructed semantic network (Fig. 1) makes it possible to formalize the complex
relationships between parameters that are traditionally considered separately. This provides a
systematic understanding of their contribution to the final quality. It is found that the dominant
role is played by the color model, which determines the way color information is represented in the
image and directly affects the ability to reproduce shades and transmit visual details.</p>
      <p>
        The second and third places in importance are the file format and resolution, which correlates
with the results of previous studies in the area of computer vision and digital processing [
        <xref ref-type="bibr" rid="ref4 ref7">4, 7</xref>
        ]. It is
these parameters that most often determine the image suitability for further analysis or use in
critical applications (for example, in medical diagnostics or security systems). Factors
characterizing color depth, compression, image size, saturation, and brightness receive
intermediate positions in the formed hierarchy. This indicates their importance for subjective
perception and quality assessment. However, they are inferior to the fundamental parameters of
data encoding and reproduction. Sharpness and file size demonstrate the lowest priority. This
result indicates that these factors are rather derived characteristics that do not determine the image
essence.
      </p>
      <p>The developed algorithm (Fig. 2) and its software implementation in the form of an information
system (Fig. 3) demonstrate the efficiency of the transition from a theoretical model to an
interactive analysis tool. The integration of visualization modules makes it possible not only to
obtain numerical estimates, but also to present the results in a visual form, which simplifies the
interpretation for users. Compared with existing non-reference methods for quality assessment [18,
19, 22], the proposed approach is characterized by higher interpretability, since it provides clear
tracking of cause-and-effect relationships between parameters.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>The study presents a comprehensive ranking of ten key factors that determine the quality of digital
images. To formalize the system of interdependencies between resolution, color depth, color model,
file format and size, geometric image parameters, compression level, brightness, saturation and
sharpness, a hierarchical structure of influences and dependencies is constructed in the form of
semantic networks and reachability matrices. This allows one to clearly represent the relationship
between the characteristics and transform them into a quantitative form for further analysis.</p>
      <p>Based on the ranking results, a hierarchy of the factor priority is established. The most
significant factor that has the greatest impact on the digital image quality is determined to be the
color model ( R j=10 ; P j=1), while the lowest significance is demonstrated by the file size
( R j=1 ; P j=10). The results represent a consistent set of weight values and provide a holistic
view of the role of each parameter in the image reproduction. The obtained data can be used in
further scientific research and practical applications that require improving the digital image
quality.</p>
      <p>To implement the proposed approach, an information system for determining the factor priority
is developed, which automates the process from data input to result visualization. The system is
developed using the object-oriented programming paradigm in the Python language and contains
three main modules: settings, calculated data and factor ranking, visualization of factor priority.</p>
      <p>At the same time, the study has certain limitations related to the possible subjectivity of the
expert assessments at the stage of determining the set of factors and formalizing their
relationships.</p>
      <p>Further research can be aimed at refining the weight coefficients using multi-criteria
optimization and expanding the model by introducing additional parameters and linguistic
descriptions of relationship types.</p>
      <p>The practical significance of the development is to create a ready-made tool for determining the
priority of digital image quality factors, which can be used in computer vision systems, automated
image processing procedures, as well as in industries where it is necessary to make informed
decisions about improving visual data. In addition, this system is a universal tool and can be used
to determine the importance of factors in any technological process.</p>
    </sec>
    <sec id="sec-6">
      <title>Declaration on Generative AI</title>
<p>The authors have not employed any Generative AI tools.</p>
      <p>[18] J. Ke, Q. Wang, Y. Wang, P. Milanfar, F. Yang, MUSIQ: Multi-scale image quality transformer, in: Proceedings of the IEEE/CVF International Conference on Computer Vision (2021) 5148–5157. URL: https://openaccess.thecvf.com/content/ICCV2021/papers/Ke_MUSIQ_MultiScale_Image_Quality_Transformer_ICCV_2021_paper.pdf.</p>
      <p>[19] Y. Yang, Z. Lei, C. Li, No-reference image quality assessment combining Swin-Transformer and natural scene statistics, Sensors 24(16) (2024) 5221. doi:10.3390/s24165221.</p>
      <p>[20] H. M. S. S. Herath, H. M. K. K. M. B. Herath, N. Madusanka, B.-I. Lee, A systematic review of medical image quality assessment, Journal of Imaging 11(4) (2025) 100. doi:10.3390/jimaging11040100.</p>
      <p>[21] V. Senkivskyy, S. Babichev, I. Pikh, A. Kudriashova, N. Senkivska, I. Kalynii, Forecasting the reader’s demand level based on factors of interest in the book, in: Proceedings of CITRisk’2021: 2nd International Workshop on Computational &amp; Information Technologies for Risk-Informed Systems, Kherson, Ukraine, 16–17 September (2021) 176–191. URL: https://ceur-ws.org/Vol-3101/Paper12.pdf.</p>
      <p>[22] Y. Yang, C. Liu, H. Wu, D. Yu, A distorted-image quality assessment algorithm based on a sparse structure and subjective perception, Mathematics 12 (2024) 2531. doi:10.3390/math12162531.</p>
      <p>[23] J. Chen, S. Li, L. Lin, J. Wan, Z. Li, No-reference blurred image quality assessment method based on structure of structure features, Signal Processing: Image Communication 118 (2023) 117008. doi:10.1016/j.image.2023.117008.</p>
      <p>[24] B. Dugonik, A. Dugonik, M. Marovt, M. Golob, Image quality assessment of digital image capturing devices for melanoma detection, Applied Sciences 10 (2020) 2876. doi:10.3390/app10082876.</p>
      <p>[25] I. Pikh, V. Senkivskyy, A. Kudriashova, N. Senkivska, Prognostic assessment of COVID-19 vaccination levels, Lecture Notes in Data Engineering, Computational Intelligence, and Decision Making (2022) 246–265. doi:10.1007/978-3-031-16203-9_15.</p>
      <p>[26] V. Senkivskyi, A. Kudriashova, I. Pikh, I. Hileta, O. Lytovchenko, Models of postpress processes designing, in: Proceedings of the 1st International Workshop on Digital Content &amp; Smart Multimedia (DCSMart 2019) (2019) 259–270. URL: https://ceur-ws.org/Vol-2533/paper24.pdf.</p>
      <p>[27] F. Chen, H. Fu, H. Yu, Y. Chu, No-reference image quality assessment based on a multitask image restoration network, Applied Sciences 13 (2023) 6802. doi:10.3390/app13116802.</p>
      <p>[28] V. Senkivskyy, I. Pikh, A. Kudriashova, N. Senkivska, L. Tupychak, Models of factors of the design process of reference and encyclopedic book editions, Lecture Notes in Computational Intelligence and Decision Making (ISDMCI 2021), Lecture Notes on Data Engineering and Communications Technologies, Springer 77 (2022) 217–229. doi:10.1007/978-3-030-82014-5_15.</p>
      <p>[29] O. Tymchenko, S. Vasiuta, O. Khamula, Optimization of the mathematical model of factors of composite design of infographic, in: Proceedings of the 13th International Scientific and Technical Conference on Computer Sciences and Information Technologies (IEEE 2018) 2 (2018) 58–61. doi:10.1109/STC-CSIT.2018.8526673.</p>
      <p>[30] O. Zhulkovskyi, I. Zhulkovska, H. Vokhmianin, A. Firsov, I. Tykhonenko, Application of SIMD-instructions to increase the efficiency of numerical methods for solving SLAE, Computer Systems and Information Technologies 4 (2024) 126–133. doi:10.31891/csit-2024-415.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <article-title>Compressive sensing in image/video compression: Sampling, coding, reconstruction, and codec optimization</article-title>
          ,
          <source>Information</source>
          <volume>15</volume>
          (
          <issue>2</issue>
          ) (
          <year>2024</year>
          )
          75. doi:10.3390/info15020075.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>M.</given-names>
            <surname>Frackiewicz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Palus</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Prandzioch</surname>
          </string-name>
          ,
          <article-title>Superpixel-based PSO algorithms for color image quantization</article-title>
          ,
          <source>Sensors</source>
          <volume>23</volume>
          (
          <issue>3</issue>
          ) (
          <year>2023</year>
          )
          1108. doi:10.3390/s23031108.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M.</given-names>
            <surname>Frackiewicz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Palus</surname>
          </string-name>
          ,
          <article-title>Efficient color quantization using superpixels</article-title>
          ,
          <source>Sensors</source>
          <volume>22</volume>
          (
          <issue>16</issue>
          ) (
          <year>2022</year>
          )
          6043. doi:10.3390/s22166043.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>D. Y.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. Y.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. Y.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <article-title>Improving medical image quality using a super-resolution technique with attention mechanism</article-title>
          ,
          <source>Applied Sciences</source>
          <volume>15</volume>
          (
          <issue>2</issue>
          ) (
          <year>2025</year>
          )
          867. doi:10.3390/app15020867.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>L.</given-names>
            <surname>Liao</surname>
          </string-name>
          , W. Liu, S. Liu,
          <article-title>Effect of bit depth on cloud segmentation of remote-sensing images</article-title>
          ,
          <source>Remote Sensing</source>
          <volume>15</volume>
          (
          <issue>10</issue>
          ) (
          <year>2023</year>
          )
          2548. doi:10.3390/rs15102548.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Su</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Tian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Fan</surname>
          </string-name>
          ,
          <article-title>Multi-scale fusion underwater image enhancement based on HSV color space equalization</article-title>
          ,
          <source>Sensors</source>
          <volume>25</volume>
          (
          <issue>9</issue>
          ) (
          <year>2025</year>
          )
          <fpage>2850</fpage>
          . doi:10.3390/s25092850.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>V. S.</given-names>
            <surname>Alfio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Costantino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Pepe</surname>
          </string-name>
          ,
          <article-title>Influence of image TIFF format and JPEG compression level in the accuracy of the 3D model and quality of the orthophoto in UAV photogrammetry</article-title>
          ,
          <source>Journal of Imaging</source>
          <volume>6</volume>
          (
          <issue>5</issue>
          ) (
          <year>2020</year>
          )
          <fpage>30</fpage>
          . doi:10.3390/jimaging6050030.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>V.-I.</given-names>
            <surname>Ungureanu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Negirla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Korodi</surname>
          </string-name>
          ,
          <article-title>Image-compression techniques: Classical and “Region-of-Interest-Based” approaches presented in recent papers</article-title>
          ,
          <source>Sensors</source>
          <volume>24</volume>
          (
          <issue>3</issue>
          ) (
          <year>2024</year>
          )
          <fpage>791</fpage>
          . doi:10.3390/s24030791.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>O.</given-names>
            <surname>Rukundo</surname>
          </string-name>
          ,
          <article-title>Effects of image size on deep learning</article-title>
          ,
          <source>Electronics</source>
          <volume>12</volume>
          (
          <issue>4</issue>
          ) (
          <year>2023</year>
          )
          <fpage>985</fpage>
          . doi:10.3390/electronics12040985.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>P.</given-names>
            <surname>Ling</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Tan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <article-title>Single image dehazing using saturation line prior</article-title>
          ,
          <source>IEEE Transactions on Image Processing</source>
          <volume>32</volume>
          (
          <year>2023</year>
          )
          <fpage>3238</fpage>
          -
          <lpage>3253</lpage>
          . doi:10.1109/TIP.2023.3279980.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>D.</given-names>
            <surname>Karakaya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Ulucan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Turkan</surname>
          </string-name>
          ,
          <article-title>Image declipping: Saturation correction in single images</article-title>
          ,
          <source>Digital Signal Processing</source>
          <volume>127</volume>
          (
          <year>2022</year>
          )
          <fpage>103537</fpage>
          . doi:10.1016/j.dsp.2022.103537.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Ke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Zhi</surname>
          </string-name>
          ,
          <article-title>Review: A survey on objective evaluation of image sharpness</article-title>
          ,
          <source>Applied Sciences</source>
          <volume>13</volume>
          (
          <issue>4</issue>
          )
          (
          <year>2023</year>
          )
          <fpage>2652</fpage>
          . doi:10.3390/app13042652.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>K.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ling</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>You</surname>
          </string-name>
          ,
          <article-title>Sharpness and brightness quality assessment of face images for recognition</article-title>
          ,
          <source>Scientific Programming</source>
          <volume>2021</volume>
          (
          <year>2021</year>
          )
          <fpage>4606828</fpage>
          . doi:10.1155/2021/4606828.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>N.</given-names>
            <surname>Manakitsa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. S.</given-names>
            <surname>Maraslidis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Moysis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. F.</given-names>
            <surname>Fragulis</surname>
          </string-name>
          ,
          <article-title>A review of machine learning and deep learning for object detection, semantic segmentation, and human action recognition in machine and robotic vision</article-title>
          ,
          <source>Technologies</source>
          <volume>12</volume>
          (
          <issue>2</issue>
          ) (
          <year>2024</year>
          )
          <fpage>15</fpage>
          . doi:10.3390/technologies12020015.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>P. J.</given-names>
            <surname>Navarro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Pérez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Weiss</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Egea-Cortines</surname>
          </string-name>
          ,
          <article-title>Machine learning and computer vision system for phenotype data acquisition and analysis in plants</article-title>
          ,
          <source>Sensors</source>
          <volume>16</volume>
          (
          <issue>5</issue>
          ) (
          <year>2016</year>
          )
          <fpage>641</fpage>
          . doi:10.3390/s16050641.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>H.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Xie</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <article-title>Object detection, segmentation and categorization in artificial intelligence</article-title>
          ,
          <source>Electronics</source>
          <volume>13</volume>
          (
          <issue>13</issue>
          ) (
          <year>2024</year>
          )
          <fpage>2650</fpage>
          . doi:10.3390/electronics13132650.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Rahman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. F.</given-names>
            <surname>Shahrior</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Iqbal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. A.</given-names>
            <surname>Abushaiba</surname>
          </string-name>
          ,
          <article-title>Enabling intelligent industrial automation: A review of machine learning applications with digital twin and edge AI integration</article-title>
          ,
          <source>Automation</source>
          <volume>6</volume>
          (
          <issue>3</issue>
          ) (
          <year>2025</year>
          )
          <fpage>37</fpage>
          . doi:10.3390/automation6030037.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>