<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Method of Immunohistochemical Slide Analysis</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Serhii Potapov</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vitaliy Gargin</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Kharkiv International Medical University</institution>
          ,
          <addr-line>38, Molochna, 61001, Kharkiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Kharkiv National Medical University</institution>
          ,
          <addr-line>4, Nauky, 61022, Kharkiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
      <abstract>
        <p>Immunohistochemical (IHC) studies provide additional information about the state of tissues, in particular the presence of a tumor process. Computer analysis of color digital images makes it possible to reduce or even completely eliminate the subjectivity of the study and to obtain reliable quantitative data, which makes IHC studies more objective when solving diagnostic, prognostic, and research tasks. The goal of our study was therefore to develop a method for objective analysis of IHC slides in order to improve the interpretation of the obtained results. The authors propose a two-stage, colorimetry-grounded pipeline for objective analysis of immunohistochemical slides. Images are acquired with a light microscope and processed in MATLAB; RGB values are converted to CIE XYZ and then to CIE Lab, after which K-means clustering is applied, first to segment marker, background, nuclei, and membranes, and then to stratify expression by lightness into weak, medium, and strong levels. The approach argues for hardware-independent color description and uses ΔE as the clustering metric, yielding a scalar relative positive area S intended to reduce observer subjectivity. Qualitative comparisons on testicular tumor exemplars indicate that grayscale and raw RGB segmentations confound marked and unmarked tissues, whereas Lab-space segmentation isolates immunopositive regions and grades expression. The suggested algorithm, whose first stage applies K-means clustering in CIE Lab space to split an image into four classes and whose second stage re-segments the “marker” class by lightness L* into three expression bands using fixed L* thresholds, reporting the relative positive area S as the quantitative readout (the implementation is in MATLAB and relies on an RGB-XYZ-Lab conversion and Euclidean ΔE in Lab as the distance metric), yields objective data about IHC microspecimens.</p>
      </abstract>
      <kwd-group>
        <kwd>pathology</kwd>
        <kwd>image</kwd>
        <kwd>analysis</kwd>
        <kwd>immunohistochemistry</kwd>
        <kwd>CIE Lab</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Morphological research has been a crucial part of medicine for many years [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        Immunohistochemical studies (IHCS) allow obtaining additional information about the state of
tissues, in particular the presence of a tumor process [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. Modern treatment protocols require
objective interpretation of microscopic preparations, especially in relation to the genitals [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ].
      </p>
      <p>
        Methods of computer analysis of color digital images make it possible to reduce or even
completely eliminate the subjectivity of the study, as well as obtaining reliable quantitative data,
which makes IHCS more objective when solving, for example, diagnostic, prognostic and research
tasks [
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ]. However, there are also problems of quantitative assessment of digital images, which
are associated with the limitations of IHCS preparation techniques [
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ]. These include the quality of the camera and microscope, variations in the thickness of
tissue sections, the duration of the visualization reaction, which is always selected empirically,
and the lack of standardized indicators and parameters for the quantitative assessment of IHCS [
        <xref ref-type="bibr" rid="ref10 ref11">10, 11</xref>
        ].
      </p>
      <p>The main differences between the images are as follows: the expression of markers can be
determined in different cellular and tissue structures (nuclei, cytoplasm of cells, intercellular
substance, membrane structures), and the relative area (S) of marker expression, as well as its
lightness (L), can vary significantly between tumors. The immunopositive areas in the images have
different sizes, the labeled areas are characterized by different structural and color properties,
and the only constant feature of such areas remains the “brown color” in subjective
perception.</p>
      <p>
        It should be noted that a feature of digital color images of histological preparations is the
significant instability of their color content. It is explained by the fact that in different preparations
under the influence of the same markers, chemical reactions proceed differently and the properties
of tissue structures in different preparations differ from each other, and these differences lead to
slightly different results of tissue reactions to markers. Therefore, in different digital images, the
color coordinates of the same types of tissues differ significantly. In addition, in the areas stained
with the marker, it is necessary to distinguish different levels of expression - from the lightest to
the darkest, and to determine the level L of these areas. Automated methods could be useful in this
area, as they have been developed for other medical applications [
        <xref ref-type="bibr" rid="ref12 ref13 ref14">12-15</xref>
        ].
      </p>
      <p>Based on the above, the goal of our study was to develop a method for objective analysis of
immunohistochemical micropreparations in order to improve the interpretation of the obtained
results.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Methods</title>
      <p>The proposed method separates the areas of labeled and unlabeled tissues in the image by
color features and determines the levels of marker expression intensity. The color is encoded in
the coordinates of the CIE Lab space, which is designed according to the principles of the human
visual system; the representation of the colors of labeled and unlabeled tissues in such
coordinates is therefore similar to human visual perception. Unlike representations in grayscale or
RGB values, the proposed method separates the areas of labeled and unlabeled tissues at the first
stage of segmentation and, during repeated segmentation, separates the labeled areas by the levels
of marker expression precisely by the CIE Lab color coordinates. Automatic digital image
segmentation at both stages avoids the subjective factor during the morphometric study, accelerates
the process of obtaining the result, reduces the number of errors associated with the subjectivity
of the researcher's perception, and supports the development of clear morphometric criteria for
evaluating the expression of particular markers.</p>
      <p>For segmentation of color digital images, one of the most widely used methods is k-means
clustering [16, 17].</p>
      <p>At its core, the k-means algorithm is based on minimizing an objective function equal to
the sum of the squares of the distances from all points of a cluster to its center. The objective
function, based on the least-squares criterion, is defined as follows (1):</p>
      <p>J = Σ(i=1..K) Σ(C∈Si) w(C) ‖C − C¯Si‖², (1)
where C¯Si is the center of cluster Si, K is the number of clusters, w(C) is the weight of point C,
and ‖C − C¯Si‖² is the squared norm used to calculate the distance between points.</p>
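As a concrete check of formula (1), the weighted objective can be evaluated directly. The following Python sketch is illustrative only (the paper's own implementation is in MATLAB and is not shown), with all names invented here:

```python
import numpy as np

def kmeans_objective(points, weights, labels, centers):
    # Formula (1): J = sum over clusters i of w(C) * ||C - C_Si||^2,
    # accumulated over every point C assigned to cluster i.
    J = 0.0
    for i, center in enumerate(centers):
        members = points[labels == i]
        w = weights[labels == i]
        J += float(np.sum(w * np.sum((members - center) ** 2, axis=1)))
    return J

# Two 2-D clusters with unit weights; every point lies 1 unit from its center.
pts = np.array([[0.0, 0.0], [0.0, 2.0], [10.0, 0.0], [10.0, 2.0]])
wts = np.ones(4)
lbl = np.array([0, 0, 1, 1])
ctr = np.array([[0.0, 1.0], [10.0, 1.0]])
print(kmeans_objective(pts, wts, lbl, ctr))  # 4.0
```

With unit weights the objective reduces to the familiar within-cluster sum of squared distances.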
      <p>In classical segmentation methods [15], local features are smoothed and represented as vectors
in a metric space, thus describing each image region by an averaged feature vector (center). The
square of the (weighted) Euclidean distance is most often used as a measure of difference in this
approach.</p>
      <p>Given a dataset X, the K-means algorithm minimizes the objective function iteratively. This
process consists of several steps:</p>
      <p>Step 1. Select K initial centers C¯S1, C¯S2, ..., C¯SK.</p>
      <p>Step 2. At the t-th iterative step, distribute the elements of the set X between the K
clusters taking into account relation (2):</p>
      <p>Si(t) = {C : ‖C − C¯Si(t)‖² ≤ ‖C − C¯Sj(t)‖²} for all j = 1, 2, ..., K, j ≠ i, (2)
where Si(t) denotes the set of points for which C¯Si(t) is the center of the cluster. In other
words, the cluster Si(t) is filled with the points that satisfy condition (3):</p>
      <p>C ∈ Si(t) if ‖C − C¯Si(t)‖² &lt; ‖C − C¯Sj(t)‖² for all j = 1, 2, ..., K, j ≠ i. (3)</p>
      <p>Step 3. Based on the results of step 2, new cluster centers C¯Si(t+1) are calculated such
that the objective function decreases. The new centers are formed according to relation (4):</p>
      <p>C¯Si(t+1) = (Σ(C∈Si(t)) w(C)·C) / (Σ(C∈Si(t)) w(C)). (4)</p>
      <p>Step 4. If none of the cluster centers has changed at the current iteration step, the
procedure stops; otherwise, go to step 2.</p>
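The four steps above can be sketched as follows. This is an illustrative Python re-implementation with unit weights w(C) = 1 and a naive choice of initial centers, not the authors' MATLAB code:

```python
import numpy as np

def kmeans(X, K, n_iter=100):
    # Step 1: take K initial centers (here simply the first K data points).
    centers = X[:K].astype(float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Step 2: assign each point to the nearest center (squared Euclidean).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Step 3: recompute each center as the mean of its assigned points
        # (an empty cluster keeps its previous center).
        new_centers = np.array([X[labels == i].mean(axis=0) if np.any(labels == i)
                                else centers[i] for i in range(K)])
        # Step 4: stop once no center moves; otherwise iterate again.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two well-separated point groups in the plane.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)), rng.normal(10.0, 0.1, (5, 2))])
labels, centers = kmeans(X, K=2)
```

On well-separated data such as this, the loop converges to the natural two-group partition regardless of which group contributes the initial centers.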
      <p>The behavior of this algorithm strongly depends on the value of K, the choice of cluster centers,
and the geometric properties of the input data. Nevertheless, the simplicity of the method has
provided it with wide application in pattern recognition, image processing, and machine vision
problems. An overview of the capabilities of the method and algorithms for its implementation can
be found, for example, in the work of S.M.A. Burney, H. Tariq [18].</p>
      <p>One of the main criteria of homogeneity for a group of pixels in a cluster when performing
image segmentation is color. Historically, the earliest approaches relied on the use of the RGB
space to describe color data [19, 20]. However, this color space poorly describes the features of
human color vision, so its application for segmentation problems is not always effective.
Consequently, many alternative color spaces have been proposed and applied [21-24].</p>
      <p>Among the color spaces used are HSV, YʼIʼQʼ, XYZ, L*U*V*, and LAB. Due to their different
features, they are all used in segmentation and further analysis of medical images. It should be
noted that the color coordinates of the HSV, YʼIʼQʼ spaces are derived from the RGB values of the
brightness of the image pixels [24]. RGB is a hardware-dependent space, that is, the values of the
RGB color coordinates depend on the type of device that reproduces the color. Therefore, the use of
the HSV, YʼIʼQʼ spaces reduces the accuracy of segmentation methods. The hardware-independent
color spaces XYZ, L*U*V*, and LAB do not have this drawback – they are constructed on
descriptions of the properties of a standard observer in the form of color matching functions and
are associated only with the features of the human visual system.</p>
      <p>XYZ values are calculated from spectrometric measurements of visual stimuli and, in turn, are
the basis for calculating the color coordinates L*U*V* and LAB. For color specification, the most
widespread is the LAB space, the coordinates of which are calculated using the following formulas
(5-7) [25]:</p>
      <p>L = 116 (Y / Y0)^(1/3) − 16, (5)</p>
      <p>a = 500 [(X / X0)^(1/3) − (Y / Y0)^(1/3)], (6)</p>
      <p>b = 200 [(Y / Y0)^(1/3) − (Z / Z0)^(1/3)], (7)</p>
      <p>where X, Y, Z are the coordinates of the specified colors and X0, Y0, Z0 are the coordinates
of the nominal white color stimulus of the standard lighting source.</p>
      <p>In this space, the difference between color stimuli is calculated using the Euclidean metric
(8) [25]:</p>
      <p>ΔE = [(ΔL)² + (Δa)² + (Δb)²]^(1/2). (8)</p>
      <p>This formula has undergone multiple improvements and modifications since its introduction in
1976, but in its current form it remains the principal standard in modern colorimetric technology
and is widely used to assess the accuracy of color reproduction. Therefore, the use of this color
difference formula as a metric in the color segmentation algorithm is natural and justified.</p>
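Under the simplified cube-root form of formulas (5)-(7) (the CIE piecewise-linear branch for very dark stimuli is omitted here for brevity), the conversion and the ΔE metric of formula (8) can be sketched in Python; the function names are illustrative, not from the paper:

```python
import numpy as np

def xyz_to_lab(X, Y, Z, white=(95.04, 100.0, 108.89)):
    # Cube-root form of formulas (5)-(7); X0, Y0, Z0 is the reference white.
    X0, Y0, Z0 = white
    fx, fy, fz = (X / X0) ** (1 / 3), (Y / Y0) ** (1 / 3), (Z / Z0) ** (1 / 3)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e(lab1, lab2):
    # CIE76 color difference, formula (8): Euclidean distance in Lab.
    return float(np.sqrt(sum((u - v) ** 2 for u, v in zip(lab1, lab2))))

# The reference white maps to L = 100, a = b = 0 by construction.
white_lab = xyz_to_lab(95.04, 100.0, 108.89)
```

The `delta_e` distance between two Lab triples is exactly the kind of metric the clustering stage uses to compare pixel colors.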
    </sec>
    <sec id="sec-3">
      <title>3. Results</title>
      <p>The method is carried out as follows. IHC-stained histological sections of the studied tissues are
recorded using a microscope and a digital camera. The obtained images are processed in the MATLAB
software package using standard digital image processing tools. First, the auxiliary CIE XYZ color
coordinates are calculated based on the brightness values of the RGB color channels in each pixel
of the original image, and then the CIE Lab color coordinates. Thus, the original digital image
corresponds to a three-dimensional array of CIE Lab color coordinates, one of which is L, the
values of which can vary within 0–100. Next, the primary automatic segmentation is performed
using the K-means method with the calculation of the values of the color differences between the
image pixels and the selection of areas of marker expression, background, nuclei and membranes.
After that, these areas are visually assessed to determine in which regions the target marker color
is present. At the second stage of repeated automatic segmentation for the selected area using the
K-means method, the values of the differences between pixels in the CIE Lab color space are
calculated, due to which the marked area is divided into three levels of marker expression: weak,
medium, and strong, with the determination of the marker expression value (S). For morphometric
measurement of S, which is occupied by immunopositive structures, the ratio of the number of
pixels of the digital image of the immunopositive reaction area to the total number of pixels in the
image, expressed as a percentage, is automatically calculated in the selected area.</p>
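The relative area S described above reduces to a ratio of pixel counts. A minimal Python sketch, assuming the segmentation stage has already produced a boolean marker mask (the input here is illustrative):

```python
import numpy as np

def relative_area_S(marker_mask):
    # S = (immunopositive pixels / all pixels in the analyzed area) * 100%.
    return 100.0 * marker_mask.sum() / marker_mask.size

mask = np.zeros((100, 100), dtype=bool)
mask[:25, :] = True           # 25 of 100 rows flagged immunopositive
print(relative_area_S(mask))  # 25.0
```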
      <p>Each pixel of the image in each of the three color channels of red – R, green – G and blue – B
has 256 possible intensity values, ranging from 0 (darkest) to 255 (lightest). To calculate the color
characteristics of each pixel of the image in the CIE Lab color space, the following steps are
performed:
1. Auxiliary CIE XYZ color coordinates are calculated using formulas (9-11):</p>
      <p>X = 0.49R + 0.31G + 0.20B; (9)
Y = 0.177R + 0.812G + 0.011B; (10)
Z = 0.010G + 0.990B; (11)
2. CIE Lab color coordinates are calculated using formulas (5-7);</p>
      <p>Where the coordinates X₀, Y₀, Z₀ correspond to the reference white color and are given as:
X0=95.04; Y0=100; Z0=108.89;</p>
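Formula (9) gives the first row of the standard CIE 1931 RGB-to-XYZ linear transform; the sketch below assumes the remaining rows of formulas (10)-(11) follow that standard matrix, since the paper's MATLAB code is not reproduced:

```python
import numpy as np

# First row is formula (9); the other rows are taken from the standard
# CIE 1931 RGB -> XYZ matrix to which formula (9) belongs (an assumption).
M = np.array([[0.490, 0.310, 0.200],
              [0.177, 0.812, 0.011],
              [0.000, 0.010, 0.990]])

def rgb_to_xyz(rgb):
    # Map an (R, G, B) triple of channel intensities to auxiliary XYZ.
    return M @ np.asarray(rgb, dtype=float)

x, y, z = rgb_to_xyz((255, 255, 255))  # a pure white pixel
```

Each matrix row sums to 1, so equal R = G = B inputs map to equal X = Y = Z values, as expected for a neutral gray.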
      <p>3. Using the K-means method, automatic segmentation of the digital image into four areas is
performed. Each area combines pixels that differ significantly in color, represented in CIE Lab
coordinates. These areas correspond to the marker, background, nuclei, and membranes in the
image. The areas are displayed on the screen for observation.</p>
      <p>4. The observer visually evaluates these areas and determines which one contains the desired
marker color; this group of pixels will then undergo further secondary segmentation and analysis;
5. At the second stage (secondary automatic segmentation), the identified image area with
labeled tissues is further segmented using the K-means method based on lightness (L) into three
expression levels: L=0–40 corresponds to a strong marker expression level; L=40–50 corresponds to
a medium marker expression level; L=50–100 corresponds to a weak marker expression level;
6. The identified pixel groups are displayed on the screen as images of areas with different
marker expression levels.</p>
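The fixed lightness bands in step 5 amount to simple thresholding. A minimal sketch; the assignment of the exact boundary values L = 40 and L = 50, where the stated ranges overlap, is an assumption:

```python
def expression_level(L):
    # Step 5 bands: L in [0, 40) strong, [40, 50) medium, [50, 100] weak.
    # Placing the shared edges 40 and 50 in the lighter band is an assumption.
    if L < 40:
        return "strong"
    if L < 50:
        return "medium"
    return "weak"

print([expression_level(L) for L in (25.0, 45.0, 70.0)])  # ['strong', 'medium', 'weak']
```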
      <p>To illustrate the effectiveness of the proposed approach, sets of images (digital photographs) of
various histological types of testicular tumors are presented. These images show the results of
staining of histological sections with different IHC markers. Photography was performed using an
Olympus BX-41TF microscope with the Olympus DP-Soft software (Version 3.1). Some images
contained structures marked by the markers, and the S of these structures varied across different
images. Examples of the original images are shown in Fig. 1.</p>
      <p>As already noted, the main differences between the images are as follows: the expressed
markers may be identified in different cellular and tissue structures; the S value of the marker, as
well as its L value, may also vary significantly across different tumors.</p>
      <p>It is evident that the objects in the images vary in size, while the marked areas are characterized
by different structural and color properties. The only consistent feature of such areas remains the
“brown color” in subjective perception. The proposed two-stage segmentation algorithm using
K-means in the Lab space, as well as single-stage K-means segmentation with grayscale and RGB
image representations, were applied to the analyzed images (Fig. 2–6).</p>
      <p>The presented example shows that the use of grayscale representation does not allow the
separation of marked tissues from unmarked ones (Fig. 2, 3). The RGB representation leads to
segmented image components that, although differing in brightness (and consequently in L), still
contain a mixture of marked and unmarked tissues (Fig. 4).</p>
      <p>
        In contrast, the proposed algorithm, at the first stage of segmentation, makes it possible to
separate marked and unmarked tissue areas in the image (Fig. 5), while at the secondary
segmentation stage, it allows the division of marked regions according to marker expression levels
(Fig. 6). As a result, the determined S of the marker expression areas provides an objective
assessment of the main biological properties of different tumors and other pathological processes
[
        <xref ref-type="bibr" rid="ref10 ref11 ref2 ref3 ref4 ref5 ref6 ref7 ref8 ref9">2-11</xref>
        ].
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>The suggested algorithm, whose first stage applies K-means clustering in CIE Lab space to
split an image into four classes and whose second stage re-segments the “marker” class by lightness
L* into three expression bands using fixed L* thresholds, reports the relative positive area S as
the quantitative readout (the implementation is in MATLAB and relies on an RGB-XYZ-Lab conversion
and Euclidean ΔE in Lab as the distance metric) and makes it possible to obtain objective data
about IHC micropreparations.</p>
    </sec>
    <sec id="sec-5">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
      <p>[14, continued] Experience of Designing and Application of CAD Systems (CADSM), Lviv,
Ukraine, Feb. 2021, pp. 11–15, doi: 10.1109/CADSM52681.2021.9385267.</p>
      <p>[15] V. V. Alekseeva, A. S. Nechiporenko, A. V. Lupyr, N. O. Yurevych, and V. V. Gargin, "A
method of complex evaluation of morphological structure of ostiomeatal complex components, lower
wall of maxillary and frontal sinuses," Wiad Lek., vol. 73, no. 12, pp. 2576–2580, 2020, doi:
10.36740/WLek202012104.</p>
      <p>[16] S. P. Lloyd, "Least squares quantization in PCM," IEEE Trans. Inf. Theory, vol. IT-28,
no. 2, pp. 129–137, Mar. 1982.</p>
      <p>[17] J. MacQueen, "Some methods for classification and analysis of multivariate
observations," in Proc. 5th Berkeley Symp. Math. Stat. Probab., vol. 1, Berkeley, CA: Univ.
California Press, 1967, pp. 281–297. [Online]. Available:
http://projecteuclid.org/euclid.bsmsp/1200512992</p>
      <p>[18] S. M. A. Burney and H. Tariq, "K-means cluster analysis for image segmentation," Int.
J. Comput. Appl., vol. 96, no. 4, pp. 1–8, 2014.</p>
      <p>[19] H. D. Cheng, X. H. Jiang, Y. Sun, and J. Wang, "Color image segmentation: Advances and
prospects," Pattern Recognit., vol. 34, no. 12, pp. 2259–2283, 2001.</p>
      <p>[20] L. Lucchese and S. K. Mitra, "Color image segmentation: A state of the art survey,"
Proc. Indian Nat. Sci. Acad., vol. 67, no. 2, pp. 207–221, 2001.</p>
      <p>[21] B. Azam, R. J. Qureshi, Z. Jan, and T. A. Khattak, "Color based segmentation of white
blood cells in blood photomicrographs using image quantization," Res. J. Recent Sci., vol. 3, no.
4, pp. 34–39, 2014.</p>
      <p>[22] P. J. Baldevbhai and R. S. Anand, "Color image segmentation for medical images using
L*a*b* color space," IOSR J. Electron. Commun. Eng. (IOSR-JECE), vol. 1, no. 2, pp. 24–45, 2012.</p>
      <p>[23] S. Mohapatra, D. Patra, and S. Satpathy, "Unsupervised blood microscopic image
segmentation and leukemia detection using color based clustering," Int. J. Comput. Inf. Syst. Ind.
Manag. Appl., vol. 4, pp. 477–485, 2012.</p>
      <p>[24] S. M. Praveena and I. Vennila, "Optimization fusion approach for image segmentation
using K-means algorithm," Int. J. Comput. Appl., vol. 2, no. 7, pp. 18–25, 2010.</p>
      <p>[25] M. D. Fairchild, Color Appearance Models, 3rd ed. Chichester, U.K.: Wiley, 2013, p.
439.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A.</given-names>
            <surname>Schabadasch</surname>
          </string-name>
          ,
          <article-title>"Intramurale nervengeflechte des darmrohrs,"</article-title>
          <string-name>
            <given-names>Z.</given-names>
            <surname>Zellforsch</surname>
          </string-name>
          ., vol.
          <volume>10</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>320</fpage>
          -
          <lpage>385</lpage>
          ,
          <year>1930</year>
          , doi: 10.1007/BF02450699.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>M.</given-names>
            <surname>Lyndin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Hyriavenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Sikora</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Lyndina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Soroka</surname>
          </string-name>
          ,
          <article-title>and</article-title>
          <string-name>
            <given-names>A.</given-names>
            <surname>Romaniuk</surname>
          </string-name>
          ,
          <article-title>"Invasive breast carcinoma of no special type with medullary pattern: morphological and immunohistochemical features," Turk Patoloji Derg</article-title>
          ., vol.
          <volume>38</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>205</fpage>
          -
          <lpage>212</lpage>
          ,
          <year>2022</year>
          , doi: 10.5146/tjpath.
          <year>2021</year>
          .
          <volume>01559</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>A.</given-names>
            <surname>Romaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lyndіn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Sikora</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Lyndina</surname>
          </string-name>
          , and
          <string-name>
            <given-names>K.</given-names>
            <surname>Panasovska</surname>
          </string-name>
          ,
          <article-title>"Histological and immunohistochemical features of medullary breast cancer," Folia Medica Cracoviensia</article-title>
          , vol.
          <volume>2</volume>
          , pp.
          <fpage>41</fpage>
          -
          <lpage>48</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>I. O.</given-names>
            <surname>Vynnychenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>О. Pryvalova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. I.</given-names>
            <surname>Vynnychenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. S.</given-names>
            <surname>Lуndіn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Sikora</surname>
          </string-name>
          ,
          <article-title>and</article-title>
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Romaniuk</surname>
          </string-name>
          ,
          <article-title>"PIK3CA-mutant circulating tumor DNA in patients with breast cancer,"</article-title>
          <source>Azerbaijan Medical Journal (АТJ)</source>
          ,
          <source>vol. 3</source>
          , pp.
          <fpage>79</fpage>
          -
          <lpage>88</lpage>
          ,
          <year>2020</year>
          , doi: 10.34921/amj.
          <year>2020</year>
          .
          <volume>3</volume>
          .010.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A.</given-names>
            <surname>Romaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Gyryavenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lyndin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Piddubnyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Sikora</surname>
          </string-name>
          ,
          <article-title>and</article-title>
          <string-name>
            <given-names>A.</given-names>
            <surname>Korobchanska</surname>
          </string-name>
          ,
          <article-title>"Primary cancer of the fallopian tubes: histological and immunohistochemical features," Folia Medica Cracoviensia</article-title>
          , vol.
          <volume>4</volume>
          , pp.
          <fpage>71</fpage>
          -
          <lpage>80</lpage>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>N.</given-names>
            <surname>Hyriavenko</surname>
          </string-name>
          et al.,
          <article-title>"Serous adenocarcinoma of fallopian tubes: histological and immunohistochemical aspects,"</article-title>
          <source>J. Pathol. Transl. Med</source>
          ., vol.
          <volume>53</volume>
          , pp.
          <fpage>236</fpage>
          -
          <lpage>243</lpage>
          ,
          <year>2019</year>
          , doi: 10.4132/jptm.
          <year>2019</year>
          .
          <volume>03</volume>
          .21.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>M.</given-names>
            <surname>Lyndin</surname>
          </string-name>
          et al.,
          <article-title>"COX2 effects on endometrial carcinomas progression,"</article-title>
          <source>Pathol. Res. Pract.</source>
          , vol.
          <volume>238</volume>
          , p.
          <fpage>154082</fpage>
          ,
          <year>2022</year>
          , doi: 10.1016/j.prp.
          <year>2022</year>
          .
          <volume>154082</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>N.</given-names>
            <surname>Hyriavenko</surname>
          </string-name>
          et al.,
          <article-title>"Neuroendocrine tumor of the fallopian tube and serous adenocarcinoma of the ovary: multicentric primary tumors," Turk Patoloji Derg</article-title>
          ., vol.
          <volume>39</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>161</fpage>
          -
          <lpage>166</lpage>
          ,
          <year>2023</year>
          , doi: 10.5146/tjpath.
          <year>2022</year>
          .
          <volume>01589</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>O. I. Kravtsova</surname>
          </string-name>
          et al.,
          <article-title>"The role of Hsp70 and Hsp90 in the endometrial carcinomas progression,"</article-title>
          <source>Azerbaijan Medical Journal (АТJ)</source>
          ,
          <source>vol. 3</source>
          , pp.
          <fpage>136</fpage>
          -
          <lpage>146</lpage>
          ,
          <year>2021</year>
          , doi: 10.34921/amj.
          <year>2021</year>
          .
          <volume>3</volume>
          .019.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>O.</given-names>
            <surname>Shevchenko</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Shevchenko</surname>
          </string-name>
          ,
          <article-title>"The role of inflammation in the development of dental diseases,"</article-title>
          <source>Kharkiv Dental Journal</source>
          , vol.
          <volume>2</volume>
          , no.
          <volume>1</volume>
          (
          <issue>32</issue>
          ), pp.
          <fpage>78</fpage>
          -
          <lpage>91</lpage>
          ,
          <year>2025</year>
          , doi: 10.26565/
          <fpage>3083</fpage>
          -5607- 2025-3-08.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>D. O.</given-names>
            <surname>Kovalchuk</surname>
          </string-name>
          and
          <string-name>
            <given-names>N. M.</given-names>
            <surname>Savielieva</surname>
          </string-name>
          ,
          <article-title>"Matrix metalloproteases and proinflammatory proteins in serum as markers for the efficiency of temporomandibular joint disorders therapy,"</article-title>
          <source>Kharkiv Dental Journal</source>
          , vol.
          <volume>1</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>134</fpage>
          -
          <lpage>142</lpage>
          ,
          <year>2024</year>
          , doi: 10.26565/
          <fpage>3083</fpage>
          -5607-2024-2-04.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>V.</given-names>
            <surname>Alekseeva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Nechyporenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Frohme</surname>
          </string-name>
          et al.,
          <article-title>"Intelligent decision support system for differential diagnosis of chronic odontogenic rhinosinusitis based on U-Net segmentation,"</article-title>
          <source>Electronics (Switzerland)</source>
          , vol.
          <volume>12</volume>
          , no.
          <issue>5</issue>
          ,
          <year>2023</year>
          , doi: 10.3390/electronics12051202.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>A.</given-names>
            <surname>Nechyporenko</surname>
          </string-name>
          et al.,
          <article-title>"Galvanic skin response and photoplethysmography for stress recognition using machine learning and wearable sensors,"</article-title>
          <source>Appl. Sci.</source>
          , vol.
          <volume>14</volume>
          , no.
          <issue>24</issue>
          , p.
          <fpage>11997</fpage>
          ,
          <year>2024</year>
          , doi: 10.3390/app142411997.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>A.</given-names>
            <surname>Nechyporenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Alekseeva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Nazaryan</surname>
          </string-name>
          , and
          <string-name>
            <given-names>V.</given-names>
            <surname>Gargin</surname>
          </string-name>
          ,
          <article-title>"Biometric recognition of personality based on spiral computed tomography data,"</article-title>
          <source>in Proc. IEEE 16th Int. Conf.</source>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>