<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Machine Learning for Melanoma Management in a Clinical Setting</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Paul Walsh</string-name>
          <email>paul.walsh@nsilico.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jennifer Lynch</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Brian Kelly</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Timothy Manning</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Cork Institute of Technology</institution>, <addr-line>Bishopstown, Cork</addr-line>, <country country="IE">Ireland</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>NSilico Life Science</institution>, <addr-line>Dublin</addr-line>, <country country="IE">Ireland</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2019</year>
      </pub-date>
      <volume>31</volume>
      <fpage>784</fpage>
      <lpage>785</lpage>
      <abstract>
        <p>This paper describes work in progress on a melanoma management platform known as Simplicity MDT, which is in use in major hospitals for the collection, management and analysis of clinical data. It includes a facility for uploading and managing high resolution digital colour photographs of melanoma lesions. As the data managed by this platform is structured and annotated, it is proposed that this could serve as a basis for supervised training datasets for machine learning. Machine learning models trained using this data can serve the wider community for screening, diagnostic and prognostic purposes. The proposed machine learning architecture includes integration with a model zoo, which provides networks that are pre-trained on publicly available datasets. An overview of the current system is presented and a roadmap for future developments is outlined.</p>
      </abstract>
      <kwd-group>
        <kwd>Machine Learning</kwd>
        <kwd>Melanoma</kwd>
        <kwd>Connected Health</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>The motivation for developing a machine learning driven melanoma
management platform stems from engagement with clinical end users: doctors, oncologists
and other clinical specialists who are searching for innovative solutions to improve the
treatment of melanoma. Melanoma is a malignant tumour of melanocytes, with about
160,000 new cases diagnosed annually and a high prevalence among Europeans. This
is a serious health issue, and the aim of this research is to improve the diagnosis of the
disease by integrating machine vision-based classification, built on
deep learning technology, with the myriad information sources in electronic health
records captured through multi-disciplinary team discussions.</p>
      <p>
        Multidisciplinary care is currently accepted as best practice in the delivery of
high-quality cancer management in Ireland and internationally. Multidisciplinary Team
Meetings (MDTMs) take place at regular intervals, where a team of medical
experts from the relevant disciplines comes together to discuss patient cases,
review treatment, and plan further care [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Care delivered to patients through a
multidisciplinary approach yields positive outcomes, especially in terms of diagnosis,
treatment planning, patient satisfaction and improved survival rates. Participating in
MDTMs also benefits clinicians, centred around the opportunity for
education, improved communication and better working relationships. While MDTMs
enhance healthcare, they mostly rely on antiquated paper-based systems, and
integrated software management tools are sorely lacking. Moreover, there is a missed
opportunity to allow patients to participate in the care pathway. The melanoma
management platform Simplicity MDT provides a significant opportunity for gathering
clinically labelled digital images of melanoma lesions (see the screenshot of the system in Figure
1). The system has already managed over 8,000 cancer cases, including over 1,000
melanoma cases with digital imaging. This rich data set provides a basis for
machine-based assessment by allowing health professionals to gather valuable supervised
training data that can be used to augment existing deep learning models and enhance clinical
assessment.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Machine Learning for Melanoma Applications</title>
      <p>
        The term machine learning refers to the field of computer science concerned
with developing algorithms that can mine datasets to build statistical models [2].
Machine learning systems rely on training data that can be either labelled or unlabelled [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
Labelled data allows researchers to apply “supervised” learning algorithms, which
progressively train models to find associations between input patterns and their
expected labels [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Labelling data can be extremely expensive and time consuming, so
any framework that facilitates the collection and labelling of data can serve as a useful
basis for building supervised machine learning models [5]. Unsupervised machine
learning models, on the other hand, do not require labelled data, but in turn cannot be used
to generalize label mappings to new data, instead finding use in more niche
applications, such as anomaly detection.
      </p>
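      <p>To make the supervised learning idea above concrete, the following minimal sketch trains a nearest-centroid classifier on labelled feature vectors. The features and labels are toy values for illustration only; they are not the algorithms or data used by Simplicity MDT.</p>

```python
# Minimal illustration of supervised learning: a nearest-centroid
# classifier fitted on labelled (feature_vector, label) examples.
# All values here are toy numbers, not real lesion data.

def train(examples):
    """examples: list of (feature_vector, label). Returns per-class centroids."""
    sums, counts = {}, {}
    for x, y in examples:
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda y: dist(centroids[y]))

# Toy labelled data: two hypothetical image features per example.
labelled = [([0.9, 0.8], "malignant"), ([0.8, 0.9], "malignant"),
            ([0.1, 0.2], "benign"), ([0.2, 0.1], "benign")]
model = train(labelled)
print(predict(model, [0.85, 0.9]))  # classifies a new, unseen example
```

      <p>The essential point is that the label mapping is generalized from the labelled examples to new inputs, which is exactly what unlabelled data cannot support.</p>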
      <p>
        Deep neural networks are a class of machine learning model that come in a variety of architectures,
but all are characterized by a large number of layers, which detect
low-level features and map these to increasingly complex representations in the higher layers
of the network [6]. Deep neural networks have had a significant impact on
the classification of skin cancer, as outlined in the Nature publication from Esteva et al.
[7], where they note that automated classification of skin lesions from images is a
challenging task owing to the fine-grained variability in the appearance of skin lesions. A
class of deep learning networks, known as convolutional neural networks (CNNs), has
demonstrated impact in challenging visual classification tasks in a variety of domains
[8] [9] [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. Esteva et al. developed a CNN for the classification of skin
lesions from digital images, using only raw pixels and disease labels as inputs, and
tested its performance against 21 qualified dermatologists on clinically assessed images
with the following binary classification use cases:
• Keratinocyte carcinomas (the most common skin cancer) versus benign
seborrheic keratoses.
• Malignant melanomas (the most fatal skin cancer) versus benign nevi.
      </p>
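      <p>The convolution operation that gives CNNs their name can be illustrated in a few lines: a small filter slides over the image and responds to a local pattern, the kind of low-level feature that early network layers learn automatically. The toy image and hand-chosen edge kernel below are illustrative assumptions, not real dermoscopy data or learned weights.</p>

```python
# Sketch of a single CNN building block: valid-mode 2D cross-correlation
# of an image with a small kernel, here a hand-picked vertical-edge filter.
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, as used inside CNN layers."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
vertical_edge = np.array([[-1.0, 1.0]])  # responds where intensity jumps
response = conv2d(image, vertical_edge)
print(response)  # strongest response in the column where the edge sits
```

      <p>In a trained CNN, many such kernels are learned from data rather than hand-picked, and their responses are composed layer by layer into the increasingly complex representations described above.</p>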
      <p>Their system performed as well as the experts across both tasks, indicating that
machine learning can achieve a level of accuracy comparable to a dermatologist for skin
cancer classification. Esteva et al. proposed that deep neural networks, deployed to
devices such as smartphones, could bring dermatology assessment to a wider
audience and possibly deliver universal access to diagnostic care.</p>
      <p>
        We plan to take up this challenge by designing an integrated cloud-based mobile
solution that can enhance automated skin cancer classification by leveraging mobile
edge computing (MEC), which provides cloud computing capabilities at the edge of the
network [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] and optimises performance by bringing computing resources to the data.
Furthermore, recent advances in edge computing are expected to have profound impacts on
healthcare systems [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
      <p>
        Connected healthcare has already been enhanced by the proliferation of
technologies [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ], and mobile hardware is having a positive impact on healthcare, e.g. the
monitoring of hypertensive patients with connected blood pressure monitors. Similarly,
a number of smartphone apps have been developed for the detection of melanoma
[
        <xref ref-type="bibr" rid="ref18">18</xref>
        ], and while they provide an array of features including information, education,
classification, risk assessment and change monitoring, they offer only limited
processing power due to the current limitations of mobile devices.
      </p>
      <p>
        Smart mobile networks offer increased processing through mobile edge computing,
which offers low latency, high bandwidth and localized cloud computing capabilities
[
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. A major benefit of 5G technology is the provision of ad hoc local cloud instances
via a mobile network [20], which allows participating institutions to “securely share
patient genomic, imaging and clinical data for potentially lifesaving discoveries. It will
enable large amounts of data from sites all around the world to be analyzed in a
distributed way, while preserving the privacy and security of patient data at each site”
[21].
      </p>
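      <p>The distributed, privacy-preserving analysis quoted above can be sketched in the style of federated aggregation: each site computes a summary statistic on its own data and shares only aggregate parameters, never patient records. The sites and values below are hypothetical, and a real deployment would aggregate model updates rather than a simple mean.</p>

```python
# Sketch of federated aggregation: hospitals share only per-site
# summaries (mean, count); the coordinator never sees raw records.

def local_mean(site_data):
    """Each site computes a summary (here a mean) on its own data."""
    return sum(site_data) / len(site_data), len(site_data)

def federated_mean(site_summaries):
    """The coordinator combines summaries weighted by site size."""
    total = sum(n for _, n in site_summaries)
    return sum(m * n for m, n in site_summaries) / total

# Hypothetical per-site measurements (never pooled centrally).
sites = [[0.2, 0.4, 0.6], [0.8, 1.0], [0.1, 0.3, 0.5, 0.7]]
summaries = [local_mean(s) for s in sites]
print(federated_mean(summaries))  # equals the pooled mean of all 9 values
```

      <p>The weighted combination reproduces exactly the statistic that pooling the raw data would give, which is the sense in which large amounts of data can be analyzed in a distributed way while preserving privacy at each site.</p>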
    </sec>
    <sec id="sec-3">
      <title>Future Work</title>
      <p>
        We aim to develop Simplicity MDT as a system for gathering high-quality clinically
labelled digital images, and to integrate this system with advances in both mobile edge
computing and deep learning to provide a universal platform for assessing suspect skin
cancer lesions. The system will have the following features:
• Use of state-of-the-art pre-trained melanoma machine learning models, based on
evaluations of results using ImageNet-trained architectures such as VGG19, ResNet-50 [22], Inception [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]
and newer architectures, including PolyNet [23], ResNeXt [24], DenseNet [25],
SENets [26] and DualPathNet [27] [22].
• Machine learning implemented via mobile edge technology in a secure and
federated environment [28]. The use of mobile edge local cloud computing will allow
clinicians in distributed locations to securely review, edit and discuss cancer
cases in real time within multi-disciplinary teams.
• Ongoing expert “human-in-the-loop” involvement to continually label, validate and update
models [29] [30]. Our Simplicity MDT software platform serves as a basis for
collecting numerous multivariate labels, which can be used to help build
prediction and classification models for melanoma images and other datasets.
      </p>
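      <p>The reuse of pre-trained models listed above amounts to transfer learning: a frozen backbone supplies features, and only a small new head is fitted on locally labelled data. In the sketch below the backbone is a stand-in function, not a real model-zoo network such as ResNet-50, and the training data are toy values.</p>

```python
# Sketch of transfer learning: fit a logistic-regression "head" on top
# of a frozen, pre-trained feature extractor (here a stand-in function).
import math

def pretrained_features(x):
    """Stand-in for a frozen model-zoo backbone mapping input to features."""
    return [x[0] + x[1], x[0] * x[1]]

def train_head(data, lr=0.5, epochs=200):
    """Fit logistic-regression weights over the frozen features by SGD."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            p = 1.0 / (1.0 + math.exp(-(w[0] * f[0] + w[1] * f[1] + b)))
            g = p - y  # gradient of the log-loss w.r.t. the score
            w = [w[0] - lr * g * f[0], w[1] - lr * g * f[1]]
            b -= lr * g
    return w, b

def predict(w, b, x):
    f = pretrained_features(x)
    return 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0

# Toy locally labelled data: 1 = suspect lesion, 0 = benign.
toy = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]
w, b = train_head(toy)
print(predict(w, b, [0.9, 0.9]))
```

      <p>Because only the head is trained, far fewer labelled examples are needed than for training a full network, which is why a platform that gathers clinically labelled images can usefully augment models pre-trained on public datasets.</p>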
      <p>In addition, a vast range of techniques can be used to enhance classification and
object detection. A network of partners, collaborators and consumers
provides access to a wealth of resources for creating high-quality labelled data sets.</p>
    </sec>
    <sec id="sec-4">
      <title>Acknowledgements</title>
      <p>PW is supported by funding through Science Foundation Ireland (SFI) Multi-Gene
Assay Cloud Computing Platform - (16/IFA/4342), BK, JL, PW are funded under H2020
RISE project SemAntically integrating Genomics with Electronic health records for
Cancer CARE (SageCare), grant number 644186.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>B.</given-names>
            <surname>Kane</surname>
          </string-name>
          , S. Luz, D. S. O'Briain and R. McDermott,
          <article-title>"Multidisciplinary team meetings and their impact on workflow in radiology and pathology departments,"</article-title>
          <source>BMC Medicine</source>
          , vol.
          <volume>5</volume>
          , no.
          <issue>1</issue>
          , p.
          <fpage>15</fpage>
          ,
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A. L.</given-names>
            <surname>Samuel</surname>
          </string-name>
          ,
          <article-title>"Some Studies in Machine Learning Using the Game of Checkers,"</article-title>
          <source>IBM Journal of Research and Development</source>
          , vol.
          <volume>3</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>210</fpage>
          -
          <lpage>229</lpage>
          ,
          <year>1959</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>C. M.</given-names>
            <surname>Bishop</surname>
          </string-name>
          ,
          <source>Pattern Recognition and Machine Learning</source>
          , Springer,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>L.</given-names>
            <surname>Deng</surname>
          </string-name>
          and
          <string-name>
            <given-names>D.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <article-title>"Deep Learning: Methods and Applications,"</article-title>
          <source>Foundations and Trends® in Signal Processing</source>
          , vol.
          <volume>7</volume>
          , no.
          <issue>3-4</issue>
          , pp.
          <fpage>197</fpage>
          -
          <lpage>387</lpage>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>H.</given-names>
            <surname>Greenspan</surname>
          </string-name>
          , B. van Ginneken and R. M. Summers,
          <article-title>"Guest Editorial Deep Learning in Medical Imaging: Overview and Future Promise of an Exciting New Technique,"</article-title>
          <source>IEEE Transactions on Medical Imaging</source>
          , vol.
          <volume>35</volume>
          , no.
          <issue>5</issue>
          , pp.
          <fpage>1153</fpage>
          -
          <lpage>1159</lpage>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Y.</given-names>
            <surname>LeCun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bengio</surname>
          </string-name>
          and
          <string-name>
            <given-names>G.</given-names>
            <surname>Hinton</surname>
          </string-name>
          ,
          <article-title>"Deep learning,"</article-title>
          <source>Nature</source>
          , vol.
          <volume>521</volume>
          , pp.
          <fpage>436</fpage>
          -
          <lpage>444</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>A.</given-names>
            <surname>Esteva</surname>
          </string-name>
          , B. Kuprel, R. A. Novoa, J. Ko, S. M. Swetter, H. M. Blau and S. Thrun,
          <article-title>"Dermatologist-level classification of skin cancer with deep neural networks,"</article-title>
          <source>Nature</source>
          , vol.
          <volume>542</volume>
          , pp.
          <fpage>115</fpage>
          -
          <lpage>118</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>O.</given-names>
            <surname>Russakovsky</surname>
          </string-name>
          , J. Deng, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang,
          <string-name>
            <given-names>A.</given-names>
            <surname>Karpathy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Khosla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Bernstein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. C.</given-names>
            <surname>Berg</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Fei-Fei</surname>
          </string-name>
          ,
          <article-title>"ImageNet Large Scale Visual Recognition Challenge,"</article-title>
          <source>International Journal of Computer Vision</source>
          , vol.
          <volume>115</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>211</fpage>
          -
          <lpage>252</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>A.</given-names>
            <surname>Krizhevsky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Sutskever</surname>
          </string-name>
          and
          <string-name>
            <given-names>G. E.</given-names>
            <surname>Hinton</surname>
          </string-name>
          ,
          <article-title>"ImageNet Classification with Deep Convolutional Neural Networks,"</article-title>
          in
          <source>Advances in Neural Information Processing Systems</source>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S.</given-names>
            <surname>Ioffe</surname>
          </string-name>
          and
          <string-name>
            <given-names>C.</given-names>
            <surname>Szegedy</surname>
          </string-name>
          ,
          <article-title>"Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift,"</article-title>
          arXiv,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>C.</given-names>
            <surname>Szegedy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Vanhoucke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ioffe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Shlens</surname>
          </string-name>
          and
          <string-name>
            <given-names>Z.</given-names>
            <surname>Wojna</surname>
          </string-name>
          ,
          <article-title>"Rethinking the Inception Architecture for Computer Vision,"</article-title>
          <source>in The IEEE Conference on Computer Vision and Pattern Recognition(CVPR)</source>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>C.</given-names>
            <surname>Szegedy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Jia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Sermanet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Reed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Anguelov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Erhan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Vanhoucke</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Rabinovich</surname>
          </string-name>
          ,
          <article-title>"Going Deeper With Convolutions,"</article-title>
          <source>in The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</source>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>K.</given-names>
            <surname>He</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ren</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <article-title>"Deep Residual Learning for Image Recognition,"</article-title>
          <source>in The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</source>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>M.</given-names>
            <surname>Patel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Naughton</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Chan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Sprecher</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Abeta</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Neal</surname>
          </string-name>
          ,
          <article-title>"Mobile-Edge Computing - Introductory Technical White Paper,"</article-title>
          ETSI,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>A. H.</given-names>
            <surname>Sodhro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Luo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. K.</given-names>
            <surname>Sangaiah</surname>
          </string-name>
          and
          <string-name>
            <given-names>S. W.</given-names>
            <surname>Baik</surname>
          </string-name>
          ,
          <article-title>"Mobile edge computing based QoS optimization in medical healthcare applications,"</article-title>
          <source>International Journal of Information Management</source>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>O.</given-names>
            <surname>Salman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Elhajj</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kayssi</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Chehab</surname>
          </string-name>
          ,
          <article-title>"Edge computing enabling the Internet of Things,"</article-title>
          in
          <source>IEEE 2nd World Forum on Internet of Things (WF-IoT)</source>
          , Milan,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>L.</given-names>
            <surname>Miller</surname>
          </string-name>
          ,
          <article-title>"The Journey to 5G,"</article-title>
          2 3 2018. [Online]. Available: https://www.healthcareitnews.com/news/journey-5g.
          [Accessed 2
          <year>2019</year>
          ].
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>A.</given-names>
            <surname>Kassianos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Emery</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Murchie</surname>
          </string-name>
          and
          <string-name>
            <given-names>F.</given-names>
            <surname>Walter</surname>
          </string-name>
          ,
          <article-title>"Smartphone applications for melanoma detection by community, patient and generalist clinician users: a review,"</article-title>
          <source>British Journal of Dermatology</source>
          , vol.
          <volume>172</volume>
          , no.
          <issue>6</issue>
          , pp.
          <fpage>1507</fpage>
          -
          <lpage>1518</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>Y. C.</given-names>
            <surname>Hu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Patel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Sabella</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Sprecher</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Young</surname>
          </string-name>
          ,
          <article-title>"Mobile Edge Computing A key technology towards 5G," 9</article-title>
          <year>2015</year>
          . [Online]. Available:
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>