<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
<article-title>Tumor Detection in Mammography Images using Discrete Wavelet Transform and Bayes Fusion Technique</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Abdelkader Zitouni</string-name>
          <email>a.zitouni@lagh-univ.dz</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Fatiha Benkouider</string-name>
          <email>fbenkouider@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Fatima Chouireb</string-name>
          <email>f.chouireb@lagh-univ.dz</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mourad Reggab</string-name>
          <email>m.reggab@lagh-univ.dz</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University Amar Telidji</institution>
          ,
          <addr-line>37G Route de Ghardaia, Laghouat, 03000</addr-line>
          ,
          <country country="DZ">Algeria</country>
        </aff>
      </contrib-group>
      <abstract>
<p>This research presents a supervised classification algorithm based on information fusion for detecting masses in mammography images. The discrete wavelet transform preserves information at both high and low frequencies and offers great discriminatory power between areas with strong similarities, which motivates us to use this type of feature to improve image segmentation. In the first stage, the suggested technique applies this feature extraction approach to mammography images in order to obtain additional information. In the second stage, the estimated feature vector of each pixel is sent to a neural network classifier for initial labeling. In the third stage, a Bayes fusion method is used to combine, within a sliding window, the scores obtained by the neural network for each pixel. The performance of the proposed segmentation algorithm was evaluated on mammography images from the Mammographic Image Analysis Society (MIAS) dataset. The classification results achieved by the proposed fusion system lead to higher precision in detecting masses in mammography images, which are one of the signs of breast cancer.</p>
      </abstract>
      <kwd-group>
<kwd>Image segmentation</kwd>
        <kwd>Masses Detection</kwd>
        <kwd>Breast Cancer</kwd>
        <kwd>Neural Network</kwd>
        <kwd>Wavelets</kwd>
        <kwd>Bayes fusion</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>Considering the advantages and disadvantages of the methods and algorithms developed by previous
researchers, in this paper we attempt to use one type of feature extraction technique for the purpose
of detecting masses in mammography images. This feature is a structural feature obtained from the
wavelet transform coefficients. For the pre-processing step, which aims to improve image quality,
for instance by contrast enhancement for better image visualization [1], there are different
types of filtering techniques. In this study the contrast of the mammogram images was regulated by
histogram adjustment, which improves the contrast of the output image by spreading out the intensity
values.</p>
<p>So, in the first stage of our work, we have used the discrete wavelet transform as a feature extraction
strategy in order to obtain more information enabling the classifiers to discriminate between the
different areas of the mammography image. In the second stage, an appropriate classification
algorithm is applied to the set of extracted features obtained from the previous stage. The
backpropagation artificial neural network classifier, introduced in [19], [20], is chosen among the
most well-known classifiers. The estimated feature vector of each pixel is sent to the neural
network classifier for initial labeling.</p>
<p>A sliding window, whose class is assigned to its central pixel, is used. However, this central pixel
also belongs to neighboring windows that may be classified into other classes. Consequently, in the
third stage, in order to obtain a more precise segmentation result, a Bayes fusion method is used for
each pixel to combine the scores of the several windows that contain this central pixel. The
performance of the proposed segmentation algorithm was verified on mammography images from the MIAS
dataset [21]. The obtained results lead to higher classification precision in detecting masses, which are
one of the signs of breast cancer.</p>
<p>The rest of this manuscript is divided into three sections. The first describes the background
theory of the techniques used in this paper. Section 2 then details the
suggested segmentation process and the performance reached by the proposed fusion technique.
Finally, in section 3, we conclude and suggest possibilities for future work.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Feature Extraction Algorithm and Fusion Theory</title>
      <sec id="sec-2-1">
        <title>2.1 The wavelet analysis</title>
<p>Since the work of Grossman and Morlet [22], the wavelet transform has emerged as a powerful
tool for solving problems in different applications. The wavelet transform decomposes the input signal
into a series of wavelet functions ψ_{a,b} derived from a mother wavelet ψ by dilation
(factor a) and translation (factor b) operations. Figure 1 illustrates some examples of wavelets that are
generally used in image processing [23].</p>
<p>The most important advantage of wavelets compared to other frequency methods, such as the Fourier
transform, is that they offer both frequency and spatial locality [25]. In 1989, Mallat [26] suggested a
multi-resolution decomposition algorithm based on the wavelet transform. The algorithm decomposes an
input image into a set of detail images and an approximation image using a filter bank comprising a
high-pass filter (HP) and a low-pass filter (LP). At each decomposition level the size of the
transformed images is reduced by a factor of two [24]. The discrete wavelet transform of a 2-D image
can be obtained by performing the filtering consecutively along the horizontal and vertical directions
(separable filter bank) [27]. Four images are then created at each level. Figure 2 shows an example of
a one-level decomposition of the image.</p>
<p>In Figure 2, the DWT decomposes the image into four orthogonal sub-bands: low-low (LL), high-low
(HL), low-high (LH), and high-high (HH), consisting of approximation, horizontal, vertical, and
diagonal information. The approximation image is the smoothed version of the original image; it
contains global information similar to the original image, with the number of rows and the
number of columns being half those of the original image. The horizontal, vertical, and diagonal images
contain the details and represent the fluctuations of the pixel intensity in the horizontal, vertical, and
diagonal directions; they consist mostly of low-intensity areas, whereas high-intensity areas are only
found on the edges of the image objects.</p>
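<p>This one-level separable decomposition can be sketched in a few lines. The sketch below uses a Haar wavelet for simplicity (the paper itself uses a second-order biorthogonal spline wavelet); the function name is ours.</p>

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D DWT into the four sub-bands LL, LH, HL, HH.
    Haar filters are used for simplicity; the separable filter bank
    applies low/high-pass filtering along rows, then along columns."""
    # Row-wise filtering: low-pass = scaled pairwise sum, high-pass = scaled difference
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Column-wise filtering of each half
    LL = (lo[0::2, :] + lo[1::2, :]) / np.sqrt(2)  # approximation image
    LH = (lo[0::2, :] - lo[1::2, :]) / np.sqrt(2)  # detail sub-band
    HL = (hi[0::2, :] + hi[1::2, :]) / np.sqrt(2)  # detail sub-band
    HH = (hi[0::2, :] - hi[1::2, :]) / np.sqrt(2)  # detail sub-band
    return LL, LH, HL, HH

img = np.arange(16, dtype=float).reshape(4, 4)
LL, LH, HL, HH = haar_dwt2(img)
# Each sub-band has half the rows and half the columns of the input.
```

<p>Because this transform is orthonormal, the total energy of the image is preserved across the four sub-bands, which is what makes the sub-band coefficients usable as discriminative features.</p>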
        <p>The values of transformed coefficients in detail and approximation images (sub-band images)
represent the necessary features that capture useful discrimination information for masses
segmentation [28].</p>
        <p>1. Wavelet’s Choice: In our research, we have used a second order biorthogonal spline wavelet.
This wavelet is used for masses analysis due to its excellent location in the frequency and spatial
domains and its sensitivity to local singularity and correlation of the image [24].</p>
        <p>2. Indices’ Calculation: One of the most used indices for characterizing the masses in the
spatiofrequency plane is the measurement of energy. Because the transformed images have different
frequencies, scales and orientations, the energy index is a local measure of the wavelet coefficient
distribution according to the scale, the orientation and the frequency. It has been used successfully for
segmentation and classification of masses [24].</p>
<p>The expression of the energy is given by [24]:
E = (1/N) ∑_{(i,j)∈R} C(i,j)²   (1)
where C(i,j) denotes the wavelet coefficient at position (i,j).</p>
<p>The second index, used in conjunction with the energy, is the measurement of the local mean of the
wavelet coefficients, given by [24]:
M = (1/N) ∑_{(i,j)∈R} |C(i,j)|   (2)
where N denotes the number of pixels, designated by the indices (i,j), enclosed in the area R.</p>
        <p>The calculation of these indices was done on a sliding window W. The local mean and the energy
on the sliding window are calculated from the resultant sub-band images. So the feature vector of
each window is made of eight parameters V= [ELL, ELH, EHL, EHH, MLL, MLH, MHL, MHH], as seen in
Figure 3.</p>
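<p>The eight-parameter feature vector can be sketched as follows (function and variable names are ours; the four sub-band coefficient arrays are assumed to be available, e.g. from a one-level DWT):</p>

```python
import numpy as np

def window_features(LL, LH, HL, HH, r, c, w=11):
    """Energy and local mean of the wavelet coefficients inside the
    w x w sliding window centred at (r, c) of each sub-band, giving
    V = [ELL, ELH, EHL, EHH, MLL, MLH, MHL, MHH]."""
    h = w // 2
    energies, means = [], []
    for C in (LL, LH, HL, HH):
        R = C[r - h:r + h + 1, c - h:c + h + 1]  # window region
        N = R.size
        energies.append((R ** 2).sum() / N)      # energy index (eq. 1)
        means.append(np.abs(R).sum() / N)        # local-mean index (eq. 2)
    return np.array(energies + means)

subbands = [np.random.rand(64, 64) for _ in range(4)]
V = window_features(*subbands, r=30, c=30)  # 8-element feature vector
```

<p>With the 11×11 window retained below, each pixel thus contributes one such eight-element vector to the classifier.</p>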
<p>Several tests were carried out on a series of window sizes ranging from 5×5 to 25×25. The highest
correct classification rate was reached for a window of dimension 11×11. The feature vector produced
for each window is used as input to the neural network classifier for a primary labeling, and the
score delivered by the neural network for the window is assigned to its central pixel.</p>
      </sec>
      <sec id="sec-2-2">
        <title>2.2 Theory of Bayes’ Fusion</title>
        <p>One of the first techniques used to combine images with decision-making was the Bayes fusion.
This model was chosen by several authors because it has a very well-defined context with known
mathematical properties [29].</p>
<p>Consider H1, H2, ..., HN to be a collection of mutually exclusive hypotheses. They satisfy the
following conditions: Hi ∩ Hj = ∅ for i ≠ j, and H1 ∪ H2 ∪ ... ∪ HN = Θ,
where Θ represents the hypothesis space (that is to say, the set of fused image classes). The
hypotheses are mutually exclusive and form a partition of Θ.</p>
<p>Consider x1 and x2 to be two characteristic primitives from two different images, representing
the same object, or the same hypothesis Hi. Bayesian theory computes the likelihood of the
hypothesis Hi given the two measures x1 and x2 through Bayes' rule [30]:
P(Hi | x1, x2) = P(x1, x2 | Hi) P(Hi) / ∑_k P(x1, x2 | Hk) P(Hk)   (6)
where P(x1, x2 | Hi) represents the joint probability of having both measures (x1, x2) once the
hypothesis Hi is realized, and P(Hi) is the prior probability of the hypothesis Hi, which expresses the
possibility of occurrence of the hypothesis in the general case [30].</p>
<p>If x1 and x2 are two independent random variables, the conditional probability P(x1, x2 | Hi),
also called the likelihood function, becomes a separable function of the two variables x1 and x2:
P(x1, x2 | Hi) = P(x1 | Hi) P(x2 | Hi)   (7)
Hence equation (<xref ref-type="bibr" rid="ref6">6</xref>) takes the following form:
P(Hi | x1, x2) = P(x1 | Hi) P(x2 | Hi) P(Hi) / ∑_k P(x1 | Hk) P(x2 | Hk) P(Hk)   (8)</p>
<p>Consequently, in order to determine the posterior probabilities P(Hi | x1, x2), we first need to
compute the prior probabilities P(Hi) for all hypotheses Hi, i going from 1 to N, and the likelihood
functions P(x | Hi) for each image primitive and for each hypothesis. To model the likelihood
functions, we work under the Gaussian hypothesis [30]:
P(x | Hi) = (1 / (σi √(2π))) exp(−(x − x̄i)² / (2σi²))   (9)
where x̄i denotes the mean and σi is the standard deviation of the Gaussian expression.</p>
<p>
          Once the probabilities have been combined via equation (
          <xref ref-type="bibr" rid="ref8">8</xref>
          ), we have to select a decision
criterion to decide which hypothesis should be chosen according to the posterior
probabilities. Many criteria are suggested in the literature; the maximum of posterior probability is
the most commonly used criterion, selecting the hypothesis having the highest probability
P(Hi | x1, x2) [31].
        </p>
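<p>The chain from Gaussian likelihoods through equation (8) to the maximum of posterior probability can be sketched as follows; the two-class parameters at the end are purely hypothetical illustration values.</p>

```python
import math

def gaussian(x, mean, std):
    """Gaussian likelihood P(x | Hi) under the Gaussian hypothesis (eq. 9)."""
    return math.exp(-(x - mean) ** 2 / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def bayes_fuse(x1, x2, classes):
    """Posterior P(Hi | x1, x2) for two independent measures (eq. 8),
    followed by the maximum-a-posteriori decision. `classes` maps a
    label to (prior, (mean1, std1), (mean2, std2))."""
    post = {}
    for label, (prior, (m1, s1), (m2, s2)) in classes.items():
        # Separable likelihood: P(x1|Hi) * P(x2|Hi), weighted by the prior
        post[label] = prior * gaussian(x1, m1, s1) * gaussian(x2, m2, s2)
    Z = sum(post.values())                      # denominator of eq. (8)
    post = {k: v / Z for k, v in post.items()}
    return max(post, key=post.get), post        # MAP criterion

# Hypothetical two-class example: "mass" vs "normal" tissue scores
classes = {
    "mass":   (0.3, (0.8, 0.1), (0.7, 0.15)),
    "normal": (0.7, (0.2, 0.1), (0.3, 0.15)),
}
decision, posteriors = bayes_fuse(0.75, 0.65, classes)
```

<p>Normalizing by Z makes the posteriors sum to one, so the MAP decision is simply the class with the largest fused score.</p>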
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. The Proposed Segmentation Algorithm</title>
<p>Data obtained from mammographic images is often noisy, incomplete, inconsistent, and of
low contrast. Therefore, pre-processing is needed in medical image processing to improve image
quality, remove unwanted noise, preserve the edges within an image, and make the feature extraction
phase more reliable [1].</p>
<p>There are different types of filtering techniques for pre-processing. So, in the first step of our
work, the contrast of the mammogram images was regulated by histogram adjustment, which improves
the contrast of the output image by spreading out the intensity values.</p>
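<p>A minimal sketch of such a histogram adjustment by spreading out the intensity values (the percentile bounds and function name are our own choices, not taken from the paper):</p>

```python
import numpy as np

def stretch_contrast(img, low_pct=1, high_pct=99):
    """Histogram adjustment: map the [low, high] percentile range of the
    intensities linearly onto the full [0, 255] range, clipping outliers."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (img.astype(float) - lo) / (hi - lo)   # spread the intensity values
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

# Low-contrast stand-in image: intensities confined to [60, 120]
img = np.linspace(60, 120, 64, dtype=np.uint8).reshape(8, 8)
enhanced = stretch_contrast(img)
# The enhanced image spans the full 0-255 intensity range.
```
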
<p>Then, in the second step, the proposed segmentation method uses the wavelet transform as a
feature extraction strategy on the mammography images in order to obtain more information from this data set.
The parameters of the feature set were selected as described in the previous sections. After
feature extraction, each estimated feature vector of each pixel is sent to the neural network classifier
for primary labeling. The MLP neural network classifier, introduced in [19], [20], is chosen among the
most well-known classifiers. For the choice of the number of hidden layers and the number of
neurons in each layer, we adopt the rule proposed by [34], since there is no general rule other than
rules of thumb such as those proposed in [35], [36]. The size of the hidden layer is 75% of the input layer.</p>
      <p>For the transfer functions, we retain the most used in the literature, namely the logistic function
and the hyperbolic tangent function. The gradient backpropagation algorithm is used for the training
of neural networks.</p>
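<p>Under these choices (hidden layer sized at 75% of the input layer, hyperbolic tangent and logistic transfer functions), the forward pass can be sketched as follows; training by gradient backpropagation is omitted and all names are ours:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

n_in = 8                      # 8 wavelet features per window
n_hidden = int(0.75 * n_in)   # hidden layer sized at 75% of the input layer
n_out = 2                     # e.g. mass / background scores

# Randomly initialized weights; in practice these are learned by
# gradient backpropagation on labeled training windows.
W1, b1 = rng.normal(size=(n_hidden, n_in)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_out, n_hidden)), np.zeros(n_out)

def forward(v):
    """Forward pass of the MLP for one feature vector v."""
    h = np.tanh(W1 @ v + b1)                      # hyperbolic tangent transfer
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))   # logistic transfer

scores = forward(rng.normal(size=n_in))   # per-window scores in (0, 1)
```
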
<p>Using a sliding window, the class of the window is assigned to its central pixel. However, this
central pixel also belongs to neighboring windows that may be classified into other classes.
Consequently, in order to achieve a more precise segmentation result, a Bayes fusion
method is used for each pixel to combine the scores of the several windows that contain this central pixel:</p>
<p>Consider S to be the segmented image comprising the scores S(i,j) of each pixel (the output of the
neural network classifier), with i = 1, ..., n and j = 1, ..., m, where n and m represent the sizes of
the mammography image.</p>
      <p>We scanned the images using a sliding window of size w×w, so that every pixel is
surrounded by w²−1 pixels. Each central pixel of a window, with its score, belongs to the
surrounding windows before the classification process. However, the central pixels of these
windows produced different scores.</p>
      <p>For example, in the case of a 3×3 window, the central pixel P3,3 with score S33 is surrounded
by eight pixels that are the centers of the eight windows to which the pixel P3,3 belonged
(see Figure 4).</p>
<p>From the above example, we joined the scores produced by the current block and its eight
neighboring ones for the wavelet features: {S33, S32, S34, S23, S24, S22, S42, S44, S43}
(see Figure 5). In this work, a sliding window of size 9×9 is used, so the
central pixel is surrounded by 80 pixels that are the centers of the 80 windows to which pixel P5,5
belonged.</p>
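<p>Gathering the scores to be fused for one central pixel can be sketched as follows (names are ours; for the 9×9 window used in the paper this yields the pixel's own score plus its 80 neighbouring window-center scores):</p>

```python
import numpy as np

def neighbor_scores(S, i, j, w=9):
    """Collect the score of the central pixel (i, j) together with the
    scores of the w*w - 1 surrounding window centers from the score
    image S (the neural-network outputs), ready for Bayes fusion."""
    h = w // 2
    block = S[i - h:i + h + 1, j - h:j + h + 1]  # w x w neighborhood
    return block.flatten()

S = np.random.rand(20, 20)          # stand-in score image
scores = neighbor_scores(S, 10, 10)
# 81 scores in total: the central pixel plus its 80 neighbors.
```
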
      <p>The essential fusion algorithm steps are summarized as follows:
Step1: Pre-processing of the mammography image.</p>
      <p>Step2: Feature extraction via DWT.</p>
      <sec id="sec-3-1">
        <title>Step3:</title>
        <p>Neuronal classification of the estimated
feature vector. We obtain:</p>
      </sec>
      <sec id="sec-3-2">
        <title>Step4: do</title>
<p>Score1: the scores for the DWT features.</p>
        <p>while the recognition rate changes do
for each pixel do
Bayesian fusion of Score1 and the neighboring scores.
end
Calculate the recognition rate.
end</p>
<p>The performance of the proposed algorithm for segmenting mammography images is assessed
using several images from the MIAS (Mammographic Image Analysis Society) database [21], which
contains 322 mammograms sized 1024 × 1024 pixels. The images are arranged in pairs: those with
even numbers correspond to left MLO (medio-lateral oblique) mammograms and those with
odd numbers are right MLO.</p>
        <p>For the learning phase, we have used image mdb028 of the MIAS database, and to test our
algorithm we have taken randomly the following MIAS images:
mdb025 and mdb132, for Well-defined/circumscribed masses (CIRC) [32].
mdb184, for Spiculated masses (SPIC) [32].
mdb134, mdb271 and mdb274, for Other, ill-defined masses (MISC) [32].
mdb136 and mdb310, for Normal breast (NORM) [32].</p>
        <p>Figure 6 illustrates the obtained results (detected masses are displayed in cyan color) compared to
expert decision (masses centers coordinates and radiuses shown in blue color). Table 1 demonstrates
the Jaccard index that was achieved by utilizing this fusion algorithm.</p>
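<p>For reference, the Jaccard index reported in Table 1 measures the overlap between the detected mass region and the expert annotation; a minimal sketch (masks and values are illustrative only):</p>

```python
import numpy as np

def jaccard_index(pred, truth):
    """Jaccard index |A ∩ B| / |A ∪ B| between a predicted mass mask
    and the expert ground-truth mask (both boolean arrays)."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

pred = np.zeros((8, 8), bool); pred[2:6, 2:6] = True    # 16 detected pixels
truth = np.zeros((8, 8), bool); truth[3:7, 3:7] = True  # 16 annotated pixels
score = jaccard_index(pred, truth)  # overlap 9, union 23 → 9/23
```
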
<p>So, as shown in Figure 6 and Table 1, the results obtained on these images taken from the MIAS
database are promising and show the effectiveness of our fusion algorithm for mass
segmentation in mammography images.</p>
<p>We can see clearly that our algorithm gives a good result whatever the class of abnormality
present (CIRC, SPIC, etc.), and we do not have any false detections in the cases of normal
breast. So, the proposed method has the potential to identify the presence of masses in the
mammogram image.</p>
        <p>(Figure 6 panels: (a) and (b) views of images mdb184, mdb132 and mdb274.)</p>
<p>Figure 7 illustrates a comparison of our proposed approach, on MIAS images mdb184 and
mdb028, with other unsupervised techniques proposed by Kanta Maitra et al. [33], based on a divide-and-conquer
algorithm, and Boulehmi Hela et al. [2], based on the generalized Gaussian density.</p>
<p>As seen in Figure 7, the proposed approach has the advantage of being simple and precise; we have
detected exactly the shape of the masses present.</p>
        <p>The outcomes of our contribution demonstrate that it is possible to reach excellent fusion
performance by neatly selecting the best fusion method. We also note that by our fusion method, the
segmentation results of the mammography images are much improved as compared to other works.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusion</title>
<p>In this article, we have presented and discussed a new approach for the segmentation of
mammography images based on information fusion. We started by extracting the features
using the wavelet transform. Then, the estimated feature vector of every pixel was sent to the
neural network classifier for primary labeling. Next, a new fusion model for improving
decision-making was used; it consists of combining the scores of each pixel within a sliding window. The
proposed fusion algorithm was tested on mammography images from the MIAS dataset.</p>
<p>This research has shown that this method is very effective for the automatic detection of
abnormalities in digital mammograms.</p>
<p>As a perspective, we will complete the mass detection system by classifying abnormal
mammographic images into benign and malignant.</p>
    </sec>
    <sec id="sec-5">
      <title>5. References</title>
      <p>[18] K. J. Geras, S. Wolfson, N. W. Y. Shen, S.G. Kim, E. Kim, L. Heacock, U. Parikh, L. Moy, K.</p>
      <p>Cho, High-resolution breast cancer screening with multi-view deep convolutional neural
networks, in: Proceedings of the Computer Vision and Pattern Recognition, 2017.
[19] J. Hérault, Ch. Jutten, Réseaux neuronaux et traitement du signal, Hermès, Paris, 1994.
[20] J. F. Jodouin, Les réseaux de neurones. Principes et définitions, Hermès, Paris, 1994.
[21] J. Suckling, J. Parker, D. Dance, S. Astley, I. Hutt, C. Boggis, I. Ricketts et al, Mammographic
Image Analysis Society (MIAS) database v1.21 [Dataset], 2015. URL:
https://www.repository.cam.ac.uk/handle/1810/250394
[22] A. Grossman, J. Morlet, Decomposition of Hardy Functions into Square Integrable Wavelets of</p>
      <p>
        Constant Shape, SIAM Journal on Mathematical Analysis, 15, (
        <xref ref-type="bibr" rid="ref4">4</xref>
        ), 1984, pp. 723-736.
[23] M. H. Sahbani, K. Hamrouni, Segmentation d’images texturées par transformée en ondelettes et
classification C-moyenne floue, Proc. 3rd Int. Conf. Sciences of Electronic, Technologies of
Information and Telecommunications, Tunisia, March 27-31, 2005.
[24] T. Iftene, A. Safia, Comparaison Entre La Matrice De Cooccurrence Et La Transformation En
Ondelettes Pour La Classification Texturale Des Images Hrv (Xs) De Spot, Télédétection, 4, (
        <xref ref-type="bibr" rid="ref1">1</xref>
        ),
(2004), pp. 39–49.
[25] Ch. Anibou, M. N. Saidi, D. Aboutajdine, Classification of Textured Images Based on Discrete
      </p>
      <p>
        Wavelet Transform and Information Fusion, J. Inf. Process. Syst, 11, (
        <xref ref-type="bibr" rid="ref3">3</xref>
        ), (2015), pp. 421-437.
[26] S. Mallat, A theory of multiresolution signal decomposition: the wavelet representation, IEEE
      </p>
      <p>
        Trans. Pattern Analysis and Machine Intelligence, 11, (
        <xref ref-type="bibr" rid="ref7">7</xref>
        ), (1989), pp. 674-693.
[27] P. Scheunders, S. Livens, G. Van de Wouwer et al, Wavelet-based Texture Analysis, Int. J.
      </p>
      <p>Computer Science and Information Management, (1997).
[28] S. Arivazhagan, L. Ganesan, Texture segmentation using wavelet transform, Pattern Recognition</p>
      <p>
        Letters, 24, (
        <xref ref-type="bibr" rid="ref16">16</xref>
        ), (2003), pp. 3197–3203.
[29] A. Zitouni, F. Benkouider, F. Chouireb, M. Belkheiri, Classification of Textured Images Based
on New Information Fusion Methods, IET Image Processing, vol.13, issue 9, (2019), pp. 1540
1549.
[30] A. Dromigny-Badin, Fusion d’images par la théorie de l’évidence en vue d’applications
médicales et industrielles, PhD. Dissertation, Institut National des Sciences Appliquées de Lyon,
1998.
[31] A. Zitouni, Image Processing Methodology and Textures Analysis for their Segmentation, PhD.
      </p>
      <p>Dissertation, University Amar Telidji of Laghouat, Algeria, 2020.
[32] The mini-MIAS database of mammograms. URL: http://peipa.essex.ac.uk/info/mias.html.</p>
      <p>Accessed 20 Dec 2020.
[33] I. Kanta, S. Maitra, S. Nag, K. Bandyopadhyay, Detection of Abnormal Masses using Divide and</p>
      <p>
        Conquer Algorithm in Digital Mammogram, Int. J. Emerg. Sci., 1(
        <xref ref-type="bibr" rid="ref4">4</xref>
        ), (2011), 767-786.
[34] B. Wierenga, J. Kluytmans, Neural nets versus marketing models in time series analysis: a
simulation study, in: Proceedings of the 23rd annual conference "European marketing
association", Maastricht, 1994, pp. 1139-1153.
[35] V. Venugopal, W. Baets, Neural networks and statistical techniques in marketing research: A
conceptual comparison, Marketing Intelligence and Planning, vol. 12, no. 7, (1994), pp. 30-38.
[36] D, Shepard, The new direct marketing, Chapter Business one Irwin Homewood IL, Journal of
Direct Marketing, vol. 6, (1992), pp. 52-53.
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M.</given-names>
            <surname>Wisudawati Lulu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sarifuddin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Wibowo Eri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Abdullah</surname>
          </string-name>
          <string-name>
            <surname>Arman</surname>
          </string-name>
          ,
          <article-title>Feature Extraction Optimization with Combination 2D-Discrete Wavelet Transform and Gray Level Co-Occurrence Matrix for Classifying Normal and Abnormal Breast Tumors</article-title>
          ,
          <source>Modern Applied Science</source>
          <volume>14</volume>
          (
          <issue>15</issue>
          ):
          <volume>5</volume>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>H.</given-names>
            <surname>Boulehmi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Mahersia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Hamrouni</surname>
          </string-name>
          ,
<article-title>Unsupervised Masses Segmentation Technique Using Generalized Gaussian Density</article-title>
          ,
          <source>International Journal of Image Processing and Graphics (IJIPG)</source>
          ,
          <volume>1</volume>
          , (
          <issue>2</issue>
          ) (
          <year>2013</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>B. U.</given-names>
            <surname>Fahnun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. B.</given-names>
            <surname>Mutiara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. P.</given-names>
            <surname>Wibowo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Arlan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Latief</surname>
          </string-name>
          ,
          <article-title>Filtering techniques for noise reduction in liver ultrasound images</article-title>
          ,
          <source>Phd thesis</source>
          ,
          <source>Program Doktor Teknologi Informasi Universitas Gunadarma</source>
          , (
          <year>2018</year>
          ),
          <fpage>261</fpage>
          -
          <lpage>266</lpage>
          . URL: https://doi.org/10.1109/EIConCIT.
          <year>2018</year>
          .8878547
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
<string-name>
            <given-names>M. M.</given-names>
            <surname>Abdelsamea</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. H.</given-names>
            <surname>Mohamed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Bamatraf</surname>
          </string-name>
          ,
          <article-title>Automated classification of malignant and benign breast cancer lesions using neural networks on digitized mammograms</article-title>
          ,
          <source>Cancer Informatics</source>
          ,
          <volume>18</volume>
          , (
          <year>2019</year>
          ),
          <fpage>1</fpage>
          -
          <lpage>3</lpage>
          .
SAGE Journals
          . URL: https://doi.org/10.1177/1176935119857570PMid:31244522 PMCid:PMC6580711
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>D.</given-names>
            <surname>Lestari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Madenda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Massich</surname>
          </string-name>
          ,
<article-title>A Segmentation Algorithm for Breast Lesion Based on Active Contour Model and Morphological Operations</article-title>
          ,
          <source>Advanced Science, Engineering and Medicine</source>
          ,
          <volume>7</volume>
          ,
          <fpage>920</fpage>
          -
          <lpage>924</lpage>
          .
          <fpage>10</fpage>
          .1166/asem. (
          <year>2015</year>
          ).1786. URL: https://doi.org/10.1166/asem.
          <year>2015</year>
          .1786
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Al-antari</surname>
          </string-name>
          , M. A. Al-masni, SU. Park et al,
          <article-title>An Automatic Computer-Aided Diagnosis System for Breast Cancer in Digital Mammograms via Deep Belief Network</article-title>
          ,
          <source>J. Med. Biol. Eng</source>
          .
          <volume>38</volume>
          , (
          <year>2018</year>
          ),
          <fpage>443</fpage>
          -
          <lpage>456</lpage>
          . URL: https://doi.org/10.1007/s40846-017-0321-6
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>M.</given-names>
            <surname>Pratiwi</surname>
          </string-name>
          , Alexander,
          <string-name>
            <given-names>J.</given-names>
            <surname>Harefa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Nanda</surname>
          </string-name>
          ,
          <article-title>Mammograms classification using gray-level cooccurrence matrix and radial basis function neural network</article-title>
          , (
          <year>2015</year>
          ). URL: https://doi.org/10.1016/j.procs.
          <year>2015</year>
          .
          <volume>07</volume>
          .340
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>R.</given-names>
            <surname>Biswas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Nath</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Roy</surname>
          </string-name>
          ,
          <article-title>Mammogram classification using gray-level co-occurrence matrix for diagnosis of breast cancer</article-title>
          , (
          <year>2016</year>
          ),
          <fpage>161</fpage>
          -
          <lpage>166</lpage>
          . URL: https://doi.org/10.1109/ICMETE.
          <year>2016</year>
          .85
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>S.</given-names>
            <surname>Ergin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Esener</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Yuksel</surname>
          </string-name>
          ,
          <article-title>A genuine glcm based feature extraction for breast tissue classification on mammograms</article-title>
          ,
          <source>International Journal of Intelligent Systems and Applications in Engineering</source>
          , (
          <year>2016</year>
          ),
          <fpage>124</fpage>
          -
          <lpage>124</lpage>
          . URL: https://doi.org/10.18201/ijisae.269453
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>K.</given-names>
            <surname>Ucar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. E.</given-names>
            <surname>Kocer</surname>
          </string-name>
          ,
          <article-title>Breast cancer classification with wavelet neural network</article-title>
          ,
          <source>International Artificial Intelligence and Data Processing Symposium (IDAP)</source>
          , (
          <year>2017</year>
          ),
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          . URL: https://doi.org/10.1109/IDAP.2017.8090347
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Putra</surname>
          </string-name>
          ,
          <article-title>Mammogram classification scheme using 2d discrete wavelet and local binary pattern for detection of breast cancer</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          , (
          <year>2018</year>
          ). URL: https://doi.org/10.1088/1742-6596/1008/1/012004
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Pawar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Talbar</surname>
          </string-name>
          ,
          <article-title>Local entropy maximization-based image fusion for contrast enhancement of mammogram</article-title>
          ,
          <source>Journal of King Saud University Computer and Information Sciences</source>
          , (
          <year>2018</year>
          ). URL: https://doi.org/10.1016/j.jksuci.2018.02.008
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>M.</given-names>
            <surname>Pawar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Talbar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Dudhane</surname>
          </string-name>
          ,
          <article-title>Local binary patterns descriptor based on sparse curvelet coefficients for false-positive reduction in mammograms</article-title>
          ,
          <source>Journal of healthcare engineering</source>
          , (
          <year>2018</year>
          ). URL: https://doi.org/10.1155/2018/5940436. PMid:30356422; PMCid:PMC6178513
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Shachor</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Greenspan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Goldberger</surname>
          </string-name>
          ,
          <article-title>A mixture of views network with applications to the classification of breast microcalcifications</article-title>
          ,
          <source>Computer Vision and Pattern Recognition</source>
          , (
          <year>2018</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Bekker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Shalhon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Greenspan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Goldberger</surname>
          </string-name>
          ,
          <article-title>Multi-view probabilistic classification of breast microcalcifications</article-title>
          ,
          <source>IEEE Transactions on Medical Imaging</source>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>N.</given-names>
            <surname>Dhungel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Carneiro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. P.</given-names>
            <surname>Bradley</surname>
          </string-name>
          ,
          <article-title>Fully automated classification of mammograms using deep residual neural networks</article-title>
          ,
          <source>in: Proceedings of the 14th International Symposium on Biomedical Imaging (ISBI 2017)</source>
          , IEEE,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Cao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <article-title>A survey of computer-aided detection of breast cancer with mammography</article-title>
          ,
          <source>J. Health Med. Inform</source>
          .
          <volume>7</volume>
          (
          <issue>4</issue>
          ) (
          <year>2016</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>