<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Graph Neural Network For Fuzzy Twitter Graphs</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Georgios Drakopoulos</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Eleanna Kafeza</string-name>
          <email>eleana.kafeza@zu.ac.ae</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Phivos Mylonas</string-name>
          <email>fmylonas@ionian.gr</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Spyros Sioutas</string-name>
          <email>sioutas@ceid.upatras.gr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>College of Technological Innovation</institution>
          ,
          <addr-line>Dubai Academic City, E-L1-108, UAE</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Computer Engineering and Informatics Department, University of Patras</institution>
          ,
          <addr-line>Patras 26504, Hellas</addr-line>
        </aff>
      </contrib-group>
      <abstract>
<p>Social graphs abound with information which can be harnessed for numerous behavioral purposes, including online political campaigns, digital marketing operations such as brand loyalty assessment and opinion mining, and determining public sentiment regarding an event. In such scenarios the efficiency of the deployed methods depends critically on three factors, namely the account behavioral model, the social graph topology, and the nature of the information collected. A prime example is Twitter, which is especially known for its lively activity and intense conversations. Here an extensible computational methodology is proposed, based on a graph neural network operating on an edge fuzzy graph constructed from a combination of structural, functional, and emotional Twitter attributes. These graphs constitute a strong algorithmic cornerstone for engineering cases where a properly formulated potential or uncertainty functional is linked to each edge. Starting from the ground truth in each individual vertex, the graph neural network progressively computes in an unsupervised manner a global graph state which can in turn be subject to further processing. The results, obtained using as a benchmark a recent similar graph neural network architecture along with two Twitter graphs, are promising.</p>
      </abstract>
      <kwd-group>
<kwd>Fuzzy graphs</kwd>
        <kwd>graph mining</kwd>
        <kwd>graph neural networks</kwd>
        <kwd>behavioral analytics</kwd>
        <kwd>emotional polarity</kwd>
        <kwd>Twitter</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. Previous Work</title>
      <p>GNNs operate on irregular domains expressing relationships. Heterogeneous GNN architectures are examined in [1] and representative GNNs designed to complete versatile tasks in [2]. Edge labeling is proposed in [3] in the context of few-shot learning for GNNs. The technique of aggregated neural paths in conjunction with machine learning (ML) tasks is described in [4]. The emotional coherency of Twitter graphs with GNNs is explored in [5], whereas guidelines for social recommendation based on GNNs are given in [6].</p>
      <p>Graph mining is a mainstay of current ML [7]. In a graph signal processing (GSP) context adjacency matrices are considered as two-dimensional signals and signal processing techniques are then employed to extract patterns of interest [8]. An overview of the connections to deep learning is given in [9]. In [10] a tensor stack network (TSN) is trained to estimate the topological correlation of graph pairs compressed with the two-dimensional discrete cosine transform (DCT2), while the same architecture evaluates graph resiliency in [11]. Flow-based GSP is examined in [12]. The basic operations of GSP, such as shifting and sampling, are defined in [13]. A graph version of the LMS adaptive filtering algorithm is presented in [14]. A versatile and space efficient data structure for persistent graphs is described in [15].</p>
      <p>Behavioral attributes have recently emerged as an integral part of many recent computational systems [16]. The connection between behavioral systems and data driven analysis is explored in [17]. Digital trust is a paramount factor for recruiting candidates from LinkedIn [18]. Clustering fMRI images with tensor distances for emotion recognition is explored in [19], while gamification strategies are examined in [20]. An overview of behavioral systems is given in [21].</p>
      <sec id="sec-2-1">
        <title>2.1. Fundamental concepts</title>
        <p>In order to describe the proposed architecture a few basic concepts must first be revised or defined. First, the class of edge fuzzy graphs is introduced in definition 1.</p>
        <p>Definition 1 (Edge fuzzy graph). An edge fuzzy graph is a combinatorial object represented by the ordered triplet shown in equation (1):</p>
        <p>G ≜ (V, E, h) (1)</p>
        <p>The elements in (1) have the following meaning:</p>
        <p>• The vertex set V. In the context of this work each vertex corresponds to a single Twitter account through a bijection.</p>
        <p>• The set of fuzzy edges E, where E ⊆ V × V. The connectivity patterns therein reflect the underlying graph dynamics.</p>
        <p>• The functional h : E → [0, 1] maps each edge to a probability drawn from a single distribution. These probabilities result from graph semantics and functionality.</p>
        <p>In the general case the digital account behavior for any online social network is given in definition 2.</p>
        <p>Definition 2 (Account behavior). The online behavior of an account consists of the total peer interaction over all possible ways offered by the given social medium.</p>
        <p>The above definition can be readily extended to the case where two or more accounts are connected over multiple social media, thus expanding the interaction potential. However, this is outside the scope of this work.</p>
        <p>In this work the online behavior of Twitter accounts has three distinct components, namely the follow relationships, the retweet patterns, and the emotional polarity with respect to a reference hashtag set.</p>
      </sec>
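Definition 1 can be made concrete with a small container type. The following is a minimal Python sketch, assuming a dictionary-based representation of the membership functional h; the class and method names are illustrative, not from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class EdgeFuzzyGraph:
    """Edge fuzzy graph G = (V, E, h): each directed edge carries a
    membership value h(e) in [0, 1], read as an edge existence probability."""
    vertices: set = field(default_factory=set)   # V: Twitter accounts
    h: dict = field(default_factory=dict)        # E -> [0, 1]

    def add_edge(self, u, v, prob):
        """Insert a directed fuzzy edge (u, v) with membership value prob."""
        if not 0.0 <= prob <= 1.0:
            raise ValueError("membership value must lie in [0, 1]")
        self.vertices.update((u, v))
        self.h[(u, v)] = prob

g = EdgeFuzzyGraph()
g.add_edge("alice", "bob", 0.8)   # alice -> bob with membership 0.8
```

The range check mirrors the requirement h : E → [0, 1] of definition 1; any richer attribute set could be attached per edge instead of a single scalar.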
    </sec>
    <sec id="sec-3">
      <title>3. Proposed Architecture</title>
      <p>In this section the proposed GNN architecture as well as the notions underlying it are described. The intuition behind the selection of the three behavioral components is as follows:</p>
      <p>• The follow relationships capture the structural aspect of the Twitter graph, since they constitute the core of its edges.</p>
      <p>• The retweet patterns are an integral part of the functionality taking place, bridging accounts in a different way.</p>
      <p>• The emotional coherency is a factor evaluating the similarity of sentiments towards selected topics.</p>
      <p>[Figure 1: Account behavior as the combination of follow patterns, retweet patterns, and emotional polarity.]</p>
      <p>However, it should be noted that for larger benchmark graphs or for dynamic ones alternative criteria should be sought in order to avoid the overwhelming complexity of determining a vertex cover.</p>
      <p>3.2. Architecture</p>
      <p>The proposed GNN architecture relies on the fundamental fact that edges are fuzzy, namely that they belong to the graph with a certain probability which in the general case depends on an attribute set. The latter is frequently strictly local or a function of a small neighborhood and rarely global, since updating such a set is costly and prone to dependency bottlenecks. In the context of this conference paper the probability p_{i,j} for the edge e_{i,j} between vertices v_i and v_j is computed as in equation (4):</p>
      <p>prob{e_{i,j}} ≜ p_{i,j} = α f_{i,j} + β r_{i,j} + γ c_{i,j} (4)</p>
      <p>In (4) three factors are taken into consideration:</p>
      <p>• Whether there is a directed follow link from the i-th account to the j-th one, denoted by the binary indicator f_{i,j}.</p>
      <p>• The ratio r_{i,j} of the number of retweets of the i-th account coming from the j-th one to the total number of retweets in the graph.</p>
      <p>• The signed correlation factor c_{i,j} expressing the emotional coherency of the i-th and j-th accounts with respect to the reference hashtag set.</p>
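The linear estimator of equation (4) can be sketched directly. This is a hedged illustration: the weight values α, β, γ below are placeholders (the paper only states they follow the semantic strength of each factor), and the clipping to [0, 1] is an added safeguard not stated in the text:

```python
def edge_probability(f_ij, r_ij, c_ij, alpha=0.5, beta=0.3, gamma=0.2):
    """Linear estimator of the edge existence probability, equation (4):
    p_ij = alpha * f_ij + beta * r_ij + gamma * c_ij.
    c_ij is the only signed term, so the result is clipped to [0, 1]
    (an assumption added here for safety, not part of the paper)."""
    p = alpha * f_ij + beta * r_ij + gamma * c_ij
    return min(1.0, max(0.0, p))

# follow link present, 10% of total retweets, mildly coherent sentiment
p = edge_probability(f_ij=1, r_ij=0.1, c_ij=0.4)   # 0.5 + 0.03 + 0.08 = 0.61
```

A strongly anti-correlated sentiment (c_ij close to -1) can drive the estimate down, which is exactly the reinforcing/weakening behavior the text attributes to the signed term.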
      <p>The sentiment s_i[t] of the i-th account during iteration t consists of a vector containing an emotional polarity score, namely the percentages of how positive, neutral, or negative the i-th account feels towards the reference hashtag set, as shown in (5). Initially the ground truth vector of the i-th account holds the respective average percentages of positively, neutrally, or negatively charged words in the tweets containing at least one hashtag from the reference set.</p>
      <p>s_i[t] ≜ [s_{i,1}[t], s_{i,2}[t], s_{i,3}[t]]^T (5)</p>
      <p>Given the iteration-dependent value of s_i[t], the value of the correlation factor c_{i,j} should be computed during each iteration as shown in (6). It should be highlighted that c_{i,j} is the only term of (4) which is signed, thereby reinforcing or weakening the strength between two accounts depending on whether they have similar sentiments towards the reference hashtag set.</p>
      <p>c_{i,j}[t] ≜ ∑_{k=1}^{3} (s_{i,k}[t] − 1/3)(s_{j,k}[t] − 1/3) / ( √(∑_{k=1}^{3} (s_{i,k}[t] − 1/3)^2) √(∑_{k=1}^{3} (s_{j,k}[t] − 1/3)^2) ) (6)</p>
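The correlation factor can be computed per vertex pair as below. Note this follows the reconstruction of (6) used here, a Pearson-style correlation of the two polarity vectors centered at the uniform value 1/3; the function name is illustrative:

```python
import math

def correlation_factor(s_i, s_j):
    """Signed emotional coherency of two 3-component sentiment vectors,
    per the centered-correlation reading of equation (6).
    Returns a value in [-1, 1]; 0 when either vector is uniform."""
    ci = [x - 1.0 / 3.0 for x in s_i]   # center at the uniform polarity 1/3
    cj = [x - 1.0 / 3.0 for x in s_j]
    num = sum(a * b for a, b in zip(ci, cj))
    den = math.sqrt(sum(a * a for a in ci)) * math.sqrt(sum(b * b for b in cj))
    return num / den if den > 0 else 0.0

# identical polarity profiles correlate perfectly
c = correlation_factor([0.6, 0.3, 0.1], [0.6, 0.3, 0.1])
```

Centering at 1/3 makes a perfectly uncommitted account (equal thirds) carry no signal, which matches the role of c_{i,j} as a signed reinforcement term.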
      <p>The weights α, β, and γ express the relative contribution of each term and in our experiments follow the semantic strength of the respective factor. This means that α is higher, since a follow link denotes a high degree of coherency between the two accounts. Along a similar line of reasoning, frequent retweets between two accounts indicate a somewhat strong connection between them. Moreover, a consistent emotional coherency between two accounts may well suggest a behavioral link between them.</p>
      <p>The weights of the linear combination in (4) encode the sign and relative strength of each factor contribution, namely how much each factor participates in the edge existence probability and whether such participation reinforces or weakens said probability respectively. Moreover, they ensure the numerical stability of p_{i,j}. Intuitively speaking, equation (4) is a linear estimator of the true edge existence probability.</p>
      <p>Additionally, the weight w_{i,j} assigned to each edge is a function of the strength of the corresponding edge:</p>
      <p>w_{i,j} ≜ w(p_{i,j}) (7)</p>
      <p>The weight function w(·) of (7) is the same for each edge and it is directly or at least indirectly linked to the semantics of the underlying graph. One of the most common options is that shown in (8):</p>
      <p>w_{i,j} ≜ 1 / p_{i,j} (8)</p>
      <p>However, the weight selection of (8) has the disadvantage of being almost singular close to zero, thus generating excessive weight values. A viable alternative is the inverse linear weight function of (9):</p>
      <p>w_{i,j} ≜ 1 / (1 + p_{i,j}) (9)</p>
      <p>Another option for the weight function is the inverse square function of equation (10). The latter typically expresses a potential function in various applications:</p>
      <p>w_{i,j} ≜ 1 / (1 + p_{i,j}^2) (10)</p>
      <p>In figure 2 the weight functions of (9) and (10) are shown for their entire range. It can be immediately inferred that they are strictly decreasing and everywhere smooth, expressing the fact that the more likely an edge is to belong to the graph, the easier it is to cross it.</p>
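The three candidate weight functions of (8)-(10) are one-liners; the sketch below only illustrates their behavior on [0, 1] (function names are illustrative):

```python
def w_inverse(p):
    """Equation (8): w = 1/p; almost singular as p approaches zero."""
    return 1.0 / p

def w_inverse_linear(p):
    """Equation (9): w = 1/(1 + p); bounded in [0.5, 1] for p in [0, 1]."""
    return 1.0 / (1.0 + p)

def w_inverse_square(p):
    """Equation (10): w = 1/(1 + p^2); a typical potential function."""
    return 1.0 / (1.0 + p * p)

# Both bounded options decrease smoothly from 1 (p = 0) to 0.5 (p = 1)
ws = [(w_inverse_linear(k / 10), w_inverse_square(k / 10)) for k in range(11)]
```

Evaluating (8) at small p shows the singularity the text warns about, while (9) and (10) stay within [0.5, 1] over the whole probability range.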
      <p>At the core of the proposed GNN architecture is the update mechanism of (11). For the i-th vertex the state s_i[t+1] is computed as in (11). Therein the index j ranges over all inbound neighbors of the i-th vertex and thus it depends on local connectivity patterns. However, since the state vector of its neighbors depends recursively on that of its own vector, this mechanism is essentially a higher order status computation. During an update it may be possible that certain neighbors have already had their own state vectors updated, whereas others have not. Thus, the iteration indicator t* will be used. This process terminates when the state vectors remain unchanged under a threshold of ε_0 for three consecutive iterations.</p>
      <p>s_i[t+1] = σ( (a_0/2) s_i[t−1] + (a_0/2) ∑_j (w_{j,i} / Δ_i) s_j[t*] ) (11)</p>
      <p>The hyperparameter a_0 scales input to a practical domain for the sigmoid function σ(·), w_{j,i} is the weight of the corresponding edge, and Δ_i is the sum of the edge weights of the inbound neighbors. In (11) the sigmoid function is defined as in (12), which is differentiable and smooth everywhere:</p>
      <p>σ(x; a_0) ≜ 1 / (1 + exp(−x/a_0)) (12)</p>
      <p>The derivative of the sigmoid function is given in (13):</p>
      <p>σ′(x; a_0) = (1/a_0) σ(x; a_0) (1 − σ(x; a_0)) = (1/a_0) σ(x; a_0) σ(−x; a_0) (13)</p>
      <p>The last form of (13) comes from the fundamental property of the sigmoid function described in (14) below:</p>
      <p>σ(x; a_0) + σ(−x; a_0) = 1 (14)</p>
      <p>The preceding properties ensure that σ(·) is smooth enough to prevent divergence in most cases for a broad spectrum of distributions.</p>
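A single application of the update mechanism can be sketched as below. This is a hedged reading of the reconstructed (11) and (12): the a_0/2 split between the self term and the neighbor term, and the componentwise application of the sigmoid, are assumptions; all names are illustrative:

```python
import math

def sigmoid(x, a0=1.0):
    """Scaled sigmoid of (12); a0 scales the input to a practical domain."""
    return 1.0 / (1.0 + math.exp(-x / a0))

def update_state(s_prev, neighbor_states, neighbor_weights, a0=1.0):
    """One update of (11) for a single vertex: blend the vertex's previous
    state with the weighted average of its inbound neighbors' states
    (normalized by Delta_i), then squash each component with the sigmoid."""
    delta = sum(neighbor_weights)            # Delta_i: total inbound edge weight
    new_state = []
    for k in range(len(s_prev)):
        agg = sum(w * s[k] for w, s in zip(neighbor_weights, neighbor_states))
        new_state.append(sigmoid((a0 / 2) * s_prev[k] + (a0 / 2) * agg / delta, a0))
    return new_state

# one polarity-vector update with two equally weighted inbound neighbors
s_new = update_state([0.5, 0.3, 0.2],
                     [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]],
                     [1.0, 1.0])
```

The stopping rule of the text would wrap this in a loop that exits once successive states differ by less than ε_0 for three consecutive iterations.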
    </sec>
    <sec id="sec-4">
      <title>4. Results</title>
      <p>The results of the proposed GNN methodology are presented in this section along with intuition about them. They are divided into four groups, one for each possible combination of benchmark graph (1821 / US2020) and weight function (inverse linear / inverse square).</p>
      <sec id="sec-4-1">
        <title>4.1. Dataset</title>
        <p>The two benchmark graphs used in the experiments were taken from [5]. They represent two characteristic cases of social graphs, namely a relatively quiet and coherent one (1821) and one containing heated conversations and a considerable degree of dissonance (US2020). The Twitter sampling interval was 8/2020-10/2020.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Number of iterations</title>
        <p>In table 3 the parameters used in the experimental setup of this work are shown. This allows for the easy exploration of the parameter space. Observe that the actual values of these parameters are in accordance with the strength of the respective factor.</p>
        <p>Table 4 contains the normalized number of iterations as a function of the hyperparameter a_0 of (11) for the two benchmark graphs of table 2 and for the two possible weight functions shown in equations (9) and (10). Normalization takes place per graph and per weight function in order to show the comparative effect of a_0 in each case. In order to demonstrate the effect of the emotional attribute c_{i,j} of (4) on the convergence rate, the same GNN is run with the latter removed from the initial ground truth vectors at the vertices.</p>
        <p>[Figure 2: Weight functions vs edge probability. Both the inverse linear and the inverse square weight functions decrease smoothly from 1 towards 0.5 over the edge probability range [0, 1].]</p>
        <p>From table 4 it follows immediately that the inclusion of the behavioral factor in (4) leads to quicker convergence of the proposed GNN architecture. This can be attributed to the reasons outlined in the following section.</p>
        <p>Regarding the total sentiment, table 5 shows the average emotional distribution before and after the GNN execution in each case, using the value of the hyperparameter a_0 which leads to the quickest convergence in each case. There it can be seen that the US2020 graph yields, for both weight choices, slightly different results from the initial distribution when the emotional factor is excluded but considerably different ones when it is included. Thus, it is a graph with a heavy emotional charge. On the other hand, the 1821 graph tends to yield similar results in every case, signifying thus greater coherency.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions And Future Work</title>
      <p>• Information enrichment: From an algorithmic perspective, the behavioral factor adds an independent dimension to the profile of each vertex. Hence, the new vertex profile space can differentiate adequately between dissimilar vertices while keeping similar ones close enough.</p>
      <p>• Numerical variation: The above is enhanced by having more diversified edge weights. Besides the additional factor, the behavioral term is also signed. In turn this expands the range of weights, thus increasing the possible number of values.</p>
      <p>The above factors suggest that variability in the weight space as well as in the vertex profile increases the flexibility of the update mechanism of (11). This is in accordance with the standard pattern recognition maxim stating that mapping data to a space of higher dimensionality facilitates their clustering. On the other hand, the curse of dimensionality imposes a limit on how big this new space can get. As both spaces used in this work are low dimensional, however, this does not constitute a problem.</p>
      <p>This conference paper focuses on a graph neural network architecture for discovering community structure in large Twitter graphs. In this approach said structure is formed using a Twitter account behavioral model which results from fusing structural and functional attributes with emotional ones. The proposed model can be naturally extended to include additional features from these categories, or even ones belonging to different categories, as long as they can be expressed in a numerical scale where normalization does not influence semantics. In our experiments the inclusion of behavioral attributes leads consistently to quicker GNN convergence.</p>
      <p>This work can be extended in a number of ways. First, multiple weight functions can map each edge to a weight vector and hence to a multidimensional weight space where each dimension has its own semantics. Then the fundamental parameters of candidate distributions describing this space can be derived through signal estimation techniques. Second, alternative behavioral models depending only on local properties or on local estimates of global ones should be developed, as this would be most appealing for a distributed implementation. Third, models for computing or estimating the edge existence probability which reflect the underlying graph semantics should be research objectives. Finally, given the evolving nature of online social networks, an architecture for non-stationary graphs should also be developed. In this case certain transient global graph states can be used to obtain intermediate status results regarding clustering, flow, degree distribution, or any other global property. These transient states can well serve as starting points for new partitioning techniques.</p>
      <p>Acknowledgments: This conference paper was supported by the Research Incentive Fund (RIF) Grant R18087 provided by Zayed University, UAE.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [10]
          <string-name><given-names>G.</given-names> <surname>Drakopoulos</surname></string-name>,
          <string-name><given-names>E.</given-names> <surname>Kafeza</surname></string-name>,
          <string-name><given-names>P.</given-names> <surname>Mylonas</surname></string-name>,
          <string-name><given-names>L.</given-names> <surname>Iliadis</surname></string-name>,
          <article-title>Transform-based graph topology similarity metrics</article-title>,
          <source>NCAA</source> <volume>33</volume> (<year>2021</year>) <fpage>16363</fpage>-<lpage>16375</lpage>. doi:10.1007/s00521-021-06235-9.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [11]
          <string-name><given-names>G.</given-names> <surname>Drakopoulos</surname></string-name>,
          <string-name><given-names>P.</given-names> <surname>Mylonas</surname></string-name>,
          <article-title>Evaluating graph resilience with tensor stack networks: A Keras implementation</article-title>,
          <source>NCAA</source> <volume>32</volume> (<year>2020</year>) <fpage>4161</fpage>-<lpage>4176</lpage>. doi:10.1007/s00521-020-04790-1.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [12]
          <string-name><given-names>M. T.</given-names> <surname>Schaub</surname></string-name>,
          <string-name><given-names>S.</given-names> <surname>Segarra</surname></string-name>,
          <article-title>Flow smoothing and denoising: Graph signal processing in the edge-space</article-title>, in: GlobalSIP, IEEE, <year>2018</year>, pp. <fpage>735</fpage>-<lpage>739</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref4a">
        <mixed-citation>
          [13]
          <string-name><given-names>A.</given-names> <surname>Gavili</surname></string-name>,
          <string-name><given-names>X.-P.</given-names> <surname>Zhang</surname></string-name>,
          <article-title>On the shift operator, graph frequency, and optimal filtering in graph signal processing</article-title>,
          <source>IEEE Transactions on Signal Processing</source> <volume>65</volume> (<year>2017</year>) <fpage>6303</fpage>-<lpage>6318</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [14]
          <string-name><given-names>F.</given-names> <surname>Hua</surname></string-name>,
          <string-name><given-names>R.</given-names> <surname>Nassif</surname></string-name>,
          <string-name><given-names>C.</given-names> <surname>Richard</surname></string-name>,
          <string-name><given-names>H.</given-names> <surname>Wang</surname></string-name>,
          <string-name><given-names>A. H.</given-names> <surname>Sayed</surname></string-name>,
          <article-title>A preconditioned graph diffusion LMS for adaptive graph signal processing</article-title>, in: EUSIPCO, IEEE, <year>2018</year>, pp. <fpage>111</fpage>-<lpage>115</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [15]
          <string-name><given-names>S.</given-names> <surname>Kontopoulos</surname></string-name>,
          <string-name><given-names>G.</given-names> <surname>Drakopoulos</surname></string-name>,
          <article-title>A space efficient scheme for graph representation</article-title>, in: ICTAI, IEEE,
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [6]
          <string-name><given-names>W.</given-names> <surname>Fan</surname></string-name>,
          <string-name><given-names>Y.</given-names> <surname>Ma</surname></string-name>,
          <string-name><given-names>Q.</given-names> <surname>Li</surname></string-name>,
          <string-name><given-names>Y.</given-names> <surname>He</surname></string-name>,
          <string-name><given-names>E.</given-names> <surname>Zhao</surname></string-name>,
          <string-name><given-names>J.</given-names> <surname>Tang</surname></string-name>,
          <string-name><given-names>D.</given-names> <surname>Yin</surname></string-name>,
          <article-title>Graph neural networks for social recommendation</article-title>, in: The WWW conference, <year>2019</year>, pp. <fpage>417</fpage>-<lpage>426</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref7a">
        <mixed-citation>
          [7]
          <string-name><given-names>A.</given-names> <surname>Ortega</surname></string-name>,
          <string-name><given-names>P.</given-names> <surname>Frossard</surname></string-name>,
          <string-name><given-names>J.</given-names> <surname>Kovačević</surname></string-name>,
          <string-name><given-names>J. M.</given-names> <surname>Moura</surname></string-name>,
          <string-name><given-names>P.</given-names> <surname>Vandergheynst</surname></string-name>,
          <article-title>Graph signal processing: Overview, challenges, and applications</article-title>,
          <source>Proceedings of the IEEE</source> <volume>106</volume> (<year>2018</year>) <fpage>808</fpage>-<lpage>828</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref7b">
        <mixed-citation>
          [20]
          <string-name><given-names>S.</given-names> <surname>Kim</surname></string-name>,
          <string-name><given-names>K.</given-names> <surname>Song</surname></string-name>,
          <string-name><given-names>B.</given-names> <surname>Lockee</surname></string-name>,
          <string-name><given-names>J.</given-names> <surname>Burton</surname></string-name>,
          <article-title>What is gamification in learning and education?</article-title>, in: Gamification in learning and education, Springer, <year>2018</year>, pp. <fpage>25</fpage>-<lpage>38</lpage>.
        </mixed-citation>
      </ref>
      <ref id="ref7c">
        <mixed-citation>
          [21]
          <string-name><given-names>N.</given-names> <surname>Wilkinson</surname></string-name>,
          <string-name><given-names>M.</given-names> <surname>Klaes</surname></string-name>,
          <article-title>An introduction to behavioral economics</article-title>, Macmillan International Higher Education, <year>2017</year>.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>G.</given-names>
            <surname>Mateos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Segarra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. G.</given-names>
            <surname>Marques</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ribeiro</surname>
          </string-name>
          ,
          <article-title>Connecting the dots: Identifying network structure via graph signal processing</article-title>
          ,
          <source>IEEE Signal Processing Magazine</source>
          <volume>36</volume>
          (
          <year>2019</year>
          )
          <fpage>16</fpage>
          -
          <lpage>43</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Cheung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Shi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Wright</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. Y.</given-names>
            <surname>Jiang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. M.</given-names>
            <surname>Moura</surname>
          </string-name>
          ,
          <article-title>Graph signal processing and deep learning: Convolution, pooling, and topology</article-title>
          ,
          <source>IEEE Signal Processing Magazine</source>
          <volume>37</volume>
          (
          <year>2020</year>
          )
          <fpage>139</fpage>
          -
          <lpage>149</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>