=Paper=
{{Paper
|id=Vol-3052/paper24
|storemode=property
|title=A graph neural network for fuzzy Twitter graphs
|pdfUrl=https://ceur-ws.org/Vol-3052/paper24.pdf
|volume=Vol-3052
|authors=Georgios Drakopoulos,,Eleanna Kafeza,,Phivos Mylonas,,Spyros Sioutas
|dblpUrl=https://dblp.org/rec/conf/cikm/DrakopoulosKMS21
}}
==A graph neural network for fuzzy Twitter graphs==
Georgios Drakopoulos¹, Eleanna Kafeza², Phivos Mylonas¹ and Spyros Sioutas³

¹ Department of Informatics, Ionian University, Tsirigoti Sq. 7, Kerkyra 49100, Hellas
² College of Technological Innovation, Dubai Academic City, E-L1-108, UAE
³ Computer Engineering and Informatics Department, University of Patras, Patras 26504, Hellas
Abstract
Social graphs abound with information which can be harnessed for numerous behavioral purposes including online political
campaigns, digital marketing operations such as brand loyalty assessment and opinion mining, and determining public
sentiment regarding an event. In such scenarios the efficiency of the deployed methods depends critically on three factors,
namely the account behavioral model, the social graph topology, and the nature of the information collected. A prime example
is Twitter, which is especially known for its lively activity and intense conversations. Here an extensible computational
methodology is proposed based on a graph neural network operating on an edge fuzzy graph constructed by a combination
of structural, functional, and emotional Twitter attributes. These graphs constitute a strong algorithmic cornerstone for
engineering cases where a properly formulated potential or uncertainty functional is linked to each edge. Starting from the
ground truth in each individual vertex, the graph neural network progressively computes in an unsupervised manner a global
graph state which can in turn be subject to further processing. The results, obtained using as a benchmark a recent similar
graph neural network architecture along with two Twitter graphs, are promising.
Keywords
Fuzzy graphs, graph mining, graph neural networks, behavioral analytics, emotional polarity, Twitter
1. Introduction

Graph mining is an integral part of the interconnected era since it lays the groundwork for numerous applications across a wide array of financial and technological fields including among others social network analysis, database query optimization, graph signal processing (GSP), supply chain and logistics networks, and brain circuit analysis. In this context modeling a graph in terms of vertices, connectivity patterns, and associated features is tantamount to data model selection. Edge fuzzy graphs extend classical graphs with probabilities drawn from a single distribution which may well have unknown parameters to be estimated. Said distribution is closely linked to the semantics and functional nature of the underlying graph. For instance, in a transportation network edge existence probabilities can show how likely a specific road is to be blocked from snow in winter months, whereas in a computer network they may model the chance of a virus being propagated along a given link.

In order to compute an estimation of the global graph state which allows not only a higher level overview but also subsequent processing, in this work a graph neural network (GNN) architecture will be used. GNNs constitute a class of unsupervised neural networks where each vertex, representing a processing node, starts with a local ground truth information vector and iteratively a global status is derived based on the fundamental fact that graphs inherently contain higher order information in a distributed manner. The resulting graph global state can be subsequently further processed in order to derive global properties such as community discovery.

The primary research objective of this conference paper is the development of a GNN architecture designed for edge fuzzy Twitter graphs constructed by incorporating structural, functional, and behavioral features. The proposed methodology can be inherently extended to other possible attribute types, making it thus appropriate for mining graphs originating from social media or evolving computational ecosystems for that matter. This work differentiates itself from previous ones in two aspects, namely the fusion of various heterogeneous attributes and the induced edge fuzzy topology.

The remainder of this work is structured as follows. In section 2 the recent scientific literature regarding GNNs, graph mining, and computational behavioral science is briefly reviewed. The proposed methodology along with the relevant intuition are given in section 3. The results obtained from the experiments are the focus of section 4. Future research directions are given in section 5. Technical acronyms are defined the first time they are encountered in the text. Finally, the notation of this conference paper is summarized in table 1.

CIKM’21: 30th ACM International Conference on Information and Knowledge Management, November 01–05, 2021, Virtual Event, QLD, Australia
c16drak@ionio.gr (G. Drakopoulos); eleana.kafeza@zu.ac.ae (E. Kafeza); fmylonas@ionian.gr (P. Mylonas); sioutas@ceid.upatras.gr (S. Sioutas)
ORCID: 0000-0002-0975-1877 (G. Drakopoulos); 0000-0001-9565-2375 (E. Kafeza); 0000-0002-6916-3129 (P. Mylonas)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073.
Georgios Drakopoulos et al. CEUR Workshop Proceedings 1–7
Table 1
Notation of this conference paper.

Symbol              Meaning                                 First in
≜                   Definition or equality by definition    Eq. (1)
{𝑠1 , . . . , 𝑠𝑛 }  Set with elements 𝑠1 , . . . , 𝑠𝑛       Eq. (2)
(𝑡1 , . . . , 𝑡𝑛 )  Tuple with elements 𝑡1 , . . . , 𝑡𝑛     Eq. (1)
|𝑆|                 Set cardinality functional              Eq. (3)
prob {Ω}            Probability of event Ω occurring        Eq. (4)
2. Previous Work

GNNs operate on irregular domains expressing relationships. Heterogeneous GNN architectures are examined in [1] and representative GNNs designed to complete versatile tasks in [2]. Edge labeling is proposed in [3] in the context of few-shot learning for GNNs. The technique of aggregated neural paths in conjunction with machine learning (ML) tasks is described in [4]. The emotional coherency of Twitter graphs with GNNs is explored in [5], whereas guidelines for social recommendation based on GNNs are given in [6].

Graph mining is a mainstay of current ML [7]. In a graph signal processing (GSP) context adjacency matrices are considered as two-dimensional signals and signal processing techniques are then employed to extract patterns of interest [8]. An overview of the connections to deep learning is given in [9]. In [10] a tensor stack network (TSN) is trained to estimate the topological correlation of graph pairs compressed with the two-dimensional discrete cosine transform (DCT2), while the same architecture evaluates graph resiliency in [11]. Flow-based GSP is examined in [12]. The basic operations of GSP such as shifting and sampling are defined in [13]. A graph version of the LMS adaptive filtering algorithm is presented in [14]. A versatile and space efficient data structure for persistent graphs is described in [15].

Behavioral attributes have recently emerged as an integral part of many computational systems [16]. The connection between behavioral systems and data driven analysis is explored in [17]. Digital trust is a paramount factor for recruiting candidates from LinkedIn [18]. fMRI images are clustered with tensor distances for emotion recognition in [19], while gamification strategies are explored in [20]. An overview of behavioral systems is given in [21].

3. Proposed Architecture

In this section the proposed GNN architecture as well as the notions underlying it are described.

3.1. Fundamental concepts

In order to describe the proposed architecture a few basic concepts must first be revised or defined. First the class of edge fuzzy graphs is introduced in definition 1.

Definition 1 (Edge fuzzy graph). An edge fuzzy graph is a combinatorial object represented by the ordered triplet shown in equation (1).

𝐺 ≜ (𝑉, 𝐸, ℎ)    (1)

The elements in (1) have the following meaning:

• The vertex set 𝑉. In the context of this work each vertex corresponds to a single Twitter account through a bijection.
• The set of fuzzy edges 𝐸 where 𝐸 ⊆ 𝑉 × 𝑉. The connectivity patterns therein reflect the underlying graph dynamics.
• The functional ℎ : 𝐸 → [0, 1] maps each edge to a probability drawn from a single distribution. These result from graph semantics and functionality.

In the general case the digital account behavior for any online social network is given in definition 2.

Definition 2 (Account behavior). The online behavior of an account consists of the total peer interaction over all possible ways offered by the given social medium.

The above definition can be readily extended to the case where two or more accounts are connected over multiple social media, expanding thus the interaction potential. However, this is outside the scope of this work.

In this work the online behavior of Twitter accounts has three distinct components, namely the follow relationships, retweet patterns, and emotional polarity with respect to a reference hashtag set. The intuition behind their selection is as follows:

• The follow relationships capture the structural aspect of the Twitter graph since they constitute the core of its edges.
• The retweet patterns are an integral part of the functionality taking place, bridging accounts in a different way.
• The emotional coherency is a factor evaluating the similarity of sentiments towards selected topics expressed as hashtags.

The above are also shown in figure 1.

[Figure 1: Behavioral model — account behavior comprises follow patterns, retweet patterns, and emotional polarity.]

In order to model the behavioral aspects of the Twitter accounts, a set of the most common hashtags from each graph is selected. The inspiration for the selection of such a set is the concept of vertex cover. Definition 3 clarifies it.

Definition 3 (Reference hashtag set). Let 𝐻0 be the set of hashtags in a Twitter graph 𝑇. A hashtag ℎ ∈ 𝐻0 also belongs to the reference hashtag set if and only if the accounts who have used ℎ constitute a vertex cover for 𝑇.

Let 𝐻 be the set of hashtags satisfying definition 3.

𝐻 ≜ {ℎ1 , . . . , ℎ𝑝 }    (2)

The ratio of the cardinality of 𝐻 to that of 𝐻0 can be taken as a measure of the important information existing in the underlying Twitter graph as in (3):

𝜌 ≜ |𝐻| / |𝐻0 |    (3)

Using the notion of the vertex cover to find popular hashtags has the following advantages:

• Selecting hashtags ℎ does not depend on any hyperparameters or on any thresholds whatsoever.
• The widespread use of a hashtag ℎ is a clear indication of its popularity.
• Although vertex cover is an NP-hard problem, approximation algorithms for it exist.

However, it should be noted that for larger benchmark graphs or for dynamic ones alternative criteria should be sought in order to avoid the overwhelming complexity of determining a vertex cover.

3.2. Architecture

The proposed GNN architecture relies on the fundamental fact that edges are fuzzy, namely that they belong to the graph with a certain probability which in the general case depends on an attribute set. The latter is frequently strictly local or a function of a small neighborhood and rarely global, since updating such a set is costly and prone to dependency bottlenecks. In the context of this conference paper the probability 𝑝𝑖,𝑗 for edge 𝑒𝑖,𝑗 between vertices 𝑣𝑖 and 𝑣𝑗 is computed as in equation (4):

prob {𝑒𝑖,𝑗 } ≜ 𝑝𝑖,𝑗 = 𝑤𝑓 𝐹𝑖,𝑗 + 𝑤𝑟 (𝑟𝑖,𝑗 / 𝑅) + 𝑤𝑐 𝑐𝑖,𝑗    (4)

In (4) three factors are taken into consideration:

• Whether there is a directed follow link from the 𝑖-th account to the 𝑗-th one, denoted by the binary indicator 𝐹𝑖,𝑗.
• The ratio of the number 𝑟𝑖,𝑗 of retweets of the 𝑖-th account coming from the 𝑗-th one to the total retweets 𝑅 in the graph.
• The signed correlation factor 𝑐𝑖,𝑗 expressing the emotional coherency of the 𝑖-th and 𝑗-th accounts with respect to the reference hashtag set.

The sentiment 𝑙𝑖[𝑡] of the 𝑖-th account during iteration 𝑡 is a vector containing an emotional polarity score, namely the percentage of how positive, neutral, or negative the 𝑖-th account feels towards the reference hashtag set, as shown in (5). Initially the ground truth vector of the 𝑖-th account holds the respective average percentages of positively, neutrally, or negatively charged words in the tweets containing at least one hashtag from the reference set.

𝑙𝑖[𝑡] ≜ [𝑛𝑝,𝑖  𝑛𝑛,𝑖  𝑛𝑔,𝑖 ]𝑇    (5)

Given the iteration-dependent value of 𝑙[𝑡], the value of the correlation factor 𝑐𝑖,𝑗 should be computed during each iteration as shown in (6). It should be highlighted that 𝑐𝑖,𝑗 is the only term of (4) which is signed, thereby reinforcing or weakening the strength between two accounts depending on whether they have similar sentiments towards the reference hashtag set.

𝑐𝑖,𝑗 = Σ𝑘=1..3 (𝑙𝑖[𝑡][𝑘] − 1/3) (𝑙𝑗[𝑡][𝑘] − 1/3) / ( √(Σ𝑘=1..3 (𝑙𝑖[𝑡][𝑘] − 1/3)²) · √(Σ𝑘=1..3 (𝑙𝑗[𝑡][𝑘] − 1/3)²) )    (6)

The weights of the linear combination in (4) encode the sign and relative strength of each factor contribution, namely how much each factor participates in the edge existence probability and whether such participation reinforces or weakens said probability respectively. Moreover, they ensure the numerical stability of 𝑝𝑖,𝑗.

Intuitively speaking, equation (4) is a linear estimator of the true edge existence probability. The weights 𝑤𝑓, 𝑤𝑟, and 𝑤𝑐 express the relative contribution of each term and in our experiments follow the semantic strength
of the respective factor. This means that 𝑤𝑓 is higher since the follow denotes a high degree of coherency between the two accounts. Along a similar line of reasoning, frequent retweets between two accounts indicate a somewhat strong connection between them. Moreover, a consistent emotional coherency between two accounts may well suggest a behavioral link between them.

Additionally, the weight 𝛿𝑖,𝑗 assigned to each edge is a function of the strength of the corresponding edge.

𝛿𝑖,𝑗 ≜ 𝑓 (𝑝𝑖,𝑗 )    (7)

The weight function 𝑓 (·) of (7) is the same for each edge and it is directly or at least indirectly linked to the semantics of the underlying graph. One of the most common options is that shown in (8).

𝛿𝑖,𝑗 ≜ 1 / 𝑝𝑖,𝑗    (8)

However, the weight selection of (8) has the disadvantage of being almost singular close to zero, generating thus excessive weight values. A viable alternative is the inverse linear weight function of (9).

𝛿𝑖,𝑗 ≜ 1 / (1 + 𝑝𝑖,𝑗 )    (9)

Another option for the weight function is the inverse square function of equation (10). The latter typically expresses a potential function in various applications.

𝛿𝑖,𝑗 ≜ 1 / (1 + 𝑝𝑖,𝑗 ²)    (10)

In figure 2 the weight functions of (9) and (10) are shown for their entire range. It can be immediately inferred that they are strictly decreasing and everywhere smooth, expressing the fact that the more likely an edge is to belong to the graph, the easier it is to cross.

At the core of the proposed GNN architecture is the update mechanism of (11). For the 𝑖-th vertex the state vector 𝑙𝑖[𝑡+1] is computed as in (11). Therein the index 𝑗 ranges over all inbound neighbors of the 𝑖-th vertex and thus it depends on local connectivity patterns. However, since the state vector of its neighbors depends recursively on that of its own vectors, this mechanism is essentially a higher order status computation. During an update it may be possible that certain neighbors have already had their own state vectors updated, whereas others have not. Thus, the iteration indicator * will be used. This process terminates when the state vectors remain unchanged under a threshold of 𝜂0 for three consecutive iterations.

𝑙𝑖[𝑡+1] = 𝜙 ( (𝛽0 /2) 𝑙𝑖[𝑡−1] + (𝛽0 /2) Σ𝑗 (𝛿𝑖,𝑗 /∆) 𝑙𝑗[*] )    (11)

The hyperparameter 𝛽0 scales the input to a practical domain for the sigmoid function 𝜙 (·), 𝛿𝑖,𝑗 is the weight of the edge, and ∆ is the sum of the edge weights of the inbound neighbors. In (11) the sigmoid function is defined as in (12); it is differentiable and smooth everywhere.

𝜙 (𝑠; 𝜎0 ) ≜ 1 / (1 + exp (−𝜎0 𝑠))    (12)

The derivative of the sigmoid function is given in (13).

𝜕𝜙 (𝑠; 𝜎0 ) / 𝜕𝑠 = 𝜎0 𝜙 (𝑠; 𝜎0 ) 𝜙 (−𝑠; 𝜎0 ) = 𝜎0 𝜙 (𝑠; 𝜎0 ) (1 − 𝜙 (𝑠; 𝜎0 ))    (13)

The last form of (13) comes from the fundamental property of the sigmoid function described in (14) below:

𝜙 (𝑠; 𝜎0 ) + 𝜙 (−𝑠; 𝜎0 ) = 1    (14)

The preceding properties ensure that 𝜙 (·) is smooth enough to prevent divergence in most cases for a broad spectrum of distributions.

4. Results

The results of the proposed GNN methodology are presented in this section along with intuition about them. They are divided into four groups, one for each possible combination of benchmark graph (1821 / US2020) and weight function (inverse linear / inverse square).

4.1. Dataset

The two benchmark graphs used in the experiments were taken from [5]. They represent two characteristic cases of social graphs, namely a relatively quiet and coherent one (1821) and one containing heated conversations and a considerable degree of dissonance (US2020). The Twitter sampling interval was 8/2020–10/2020.

4.2. Number of iterations

In table 3 the parameters used in the experimental setup of this work are shown. This allows for the easy exploration of the parameter space. Observe that the actual values of these parameters are in accordance with the strength of the respective factor.

Table 4 contains the normalized number of iterations as a function of the hyperparameter 𝛽0 of (11) for the two benchmark graphs of table 2 and for the two possible weight functions shown in equations (9) and (10). Normalization takes place per graph and per weight function in order to show the comparative effect of 𝛽0 in each case. In order to demonstrate the effect of the emotional attribute 𝑐𝑖,𝑗 of (4) on the convergence rate the same
[Figure 2: Weight functions. The inverse linear (9) and inverse square (10) weight functions plotted against edge probability over [0, 1]; both decrease smoothly from 1 to 0.5.]
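The two weight functions plotted in figure 2 are simple enough to verify numerically; a minimal sketch (the function names are ours, for illustration only):

```python
def inverse_linear_weight(p: float) -> float:
    """Inverse linear weight of equation (9)."""
    return 1.0 / (1.0 + p)

def inverse_square_weight(p: float) -> float:
    """Inverse square weight of equation (10)."""
    return 1.0 / (1.0 + p * p)

# Both map an edge probability p in [0, 1] to a weight in [0.5, 1]:
# the likelier the edge, the smaller the weight, i.e. the easier to cross.
print(inverse_linear_weight(0.0), inverse_linear_weight(1.0))  # 1.0 0.5
print(inverse_square_weight(0.0), inverse_square_weight(1.0))  # 1.0 0.5
```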
Table 2
Dataset synopsis (from [5]).

Property                                         1821 graph          US2020 graph
Number of vertices                               132.317             147.881
Number of edges                                  2.225.177           2.447.224
Density / Log-density                            16.8170 / 1.2393    16.5486 / 1.2357
Completeness / Log-completeness                  2.54e−4 / 0.6196    2.38e−4 / 0.6173
Number of triangles                              446.513             489.773
Number of squares                                215.387             218.633
Number of cliques of size four                   102.044             125.806
Graph diameter                                   10                  11
Percentage of vertices reachable at diameter−1   95.33%              98.17%
Percentage of vertices reachable at diameter−2   93.26%              96.44%
Percentage of vertices reachable at diameter−3   89.11%              91.22%
Percentage of vertices reachable at diameter−4   84.73%              87.47%
Number of favorites                              36.994.815          42.114.509
Number of tweets                                 17.465.844          22.773.674
Table 3
Parameters of the experiments.

Parameter   Meaning                            Value
𝑤𝑓          Edge follow weight                 0.5
𝑤𝑟          Edge retweet weight                0.25
𝑤𝑐          Edge hashtag emotional coherence   0.25
𝜂0          State vector equality threshold    0.05
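With the weights of table 3, the edge existence estimator of equation (4) can be transcribed directly; a hedged sketch (the helper and argument names are ours, not from the paper):

```python
def edge_probability(f_ij: int, r_ij: int, R: int, c_ij: float,
                     w_f: float = 0.5, w_r: float = 0.25,
                     w_c: float = 0.25) -> float:
    """Linear edge existence estimator of equation (4).

    f_ij: binary follow indicator (1 if account i follows account j),
    r_ij: retweets of account i coming from account j,
    R: total retweets in the graph,
    c_ij: signed emotional correlation in [-1, 1].
    The default weights follow table 3.
    """
    return w_f * f_ij + w_r * (r_ij / R) + w_c * c_ij

# A followed pair with 2% of the graph retweets and mildly coherent sentiment:
p = edge_probability(f_ij=1, r_ij=20, R=1000, c_ij=0.4)
print(p)  # approximately 0.605
```

Note that since 𝑐𝑖,𝑗 is signed, the three terms pull the estimate in both directions; the table 3 weights keep the result within a practical range, as the paper notes.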
Table 4
Normalized number of iterations as a function of the hyperparameter 𝛽0.

𝛽0     Graph    inv.linear   inv.linear+beh   inv.square   inv.square+beh
0.5    1821     1.49         1.26             1.37         1.38
0.7    1821     1.45         1.24             1.32         1.23
0.8    1821     1.39         1.16             1.27         1.15
0.9    1821     1.33         1.08             1.22         1.07
1      1821     1.28         1                1.19         1
0.5    US2020   1.41         1.17             1.33         1.13
0.7    US2020   1.42         1.14             1.29         1.11
0.8    US2020   1.37         1.09             1.25         1.08
0.9    US2020   1.29         1.05             1.21         1.03
1      US2020   1.24         1                1.18         1
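The iteration counts above arise from repeatedly applying the update rule of (11) until the state vectors settle. The loop below is a condensed NumPy sketch under simplifying assumptions: synchronous updates in place of the per-vertex iteration indicator *, and a dense weight matrix. It illustrates the mechanism and the stopping rule of section 3.2; it is not the authors' implementation.

```python
import numpy as np

def sigmoid(s, sigma0=1.0):
    # Equation (12); smooth and differentiable everywhere.
    return 1.0 / (1.0 + np.exp(-sigma0 * s))

def run_updates(L, delta, beta0, eta0=0.05, patience=3, max_iter=1000):
    """Iterate equation (11) on the n x 3 state matrix L.

    delta[i, j] is the weight of the inbound edge (j -> i), zero otherwise.
    Terminates once the states change by less than eta0 for `patience`
    consecutive iterations, mirroring the stopping rule of section 3.2.
    """
    Delta = delta.sum(axis=1, keepdims=True)  # sum of inbound edge weights
    Delta[Delta == 0.0] = 1.0                 # guard isolated vertices
    calm = 0
    for t in range(1, max_iter + 1):
        aggregated = (delta @ L) / Delta      # sum_j (delta_ij / Delta) l_j
        L_new = sigmoid(0.5 * beta0 * L + 0.5 * beta0 * aggregated)
        calm = calm + 1 if np.abs(L_new - L).max() < eta0 else 0
        L = L_new
        if calm >= patience:
            break
    return L, t
```

Because the sigmoid is a contraction for moderate 𝛽0, the loop settles quickly, which is consistent with fewer iterations being reported for 𝛽0 closer to 1 in table 4.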
Table 5
Emotional distributions (pos/neu/neg) computed with the best value of 𝛽0.

Graph    init             i.linear         i.linear+beh     i.square         i.square+beh
1821     0.64/0.24/0.12   0.68/0.14/0.18   0.73/0.15/0.12   0.69/0.15/0.16   0.74/0.14/0.12
US2020   0.28/0.23/0.49   0.21/0.22/0.57   0.17/0.16/0.67   0.22/0.22/0.56   0.16/0.16/0.68
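The shifts visible in these distributions are driven by the signed correlation factor of equation (6), which centers each polarity vector around the uniform value 1/3. A direct transcription (the helper name is ours):

```python
import math

def emotional_correlation(l_i, l_j):
    """Signed correlation factor c_ij of equation (6) for two 3-component
    polarity vectors (positive, neutral, negative fractions)."""
    di = [x - 1.0 / 3.0 for x in l_i]
    dj = [x - 1.0 / 3.0 for x in l_j]
    num = sum(a * b for a, b in zip(di, dj))
    den = math.sqrt(sum(a * a for a in di)) * math.sqrt(sum(b * b for b in dj))
    return num / den if den else 0.0

# Identical profiles correlate maximally, opposed ones negatively:
print(round(emotional_correlation([0.6, 0.3, 0.1], [0.6, 0.3, 0.1]), 6))  # 1.0
print(round(emotional_correlation([1.0, 0.0, 0.0], [0.0, 0.0, 1.0]), 6))  # -0.5
```

The guard for a zero denominator covers the perfectly uniform profile (1/3, 1/3, 1/3), for which the centered vector vanishes and the factor is taken as neutral.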
GNN is run with the latter removed from the initial local ground truth vectors at the vertices.

From table 4 it follows immediately that the inclusion of the behavioral factor in (4) leads to quicker convergence of the proposed GNN architecture. This can be attributed to the following reasons:

• Information enrichment: From an algorithmic perspective, the behavioral factor adds an independent dimension to the profile of each vertex. Hence, the new vertex profile space can differentiate adequately between dissimilar vertices while maintaining close enough similar ones.
• Numerical variation: The above is enhanced by having more diversified edge weights. Besides the additional factor, the behavioral term is also signed. In turn this expands the range of weights, increasing thus the possible number of values.

The above factors suggest that variability in the weight space as well as in the vertex profile increases the flexibility of the update mechanism of (11). This is in accordance with the standard pattern recognition maxim stating that mapping data to a space of higher dimensionality facilitates their clustering. On the other hand, the curse of dimensionality imposes a limit on how big this new space can get. As both spaces used in this work are low dimensional, however, this does not constitute a problem.

Regarding the total sentiment, table 5 shows the average emotional distribution before and after the GNN execution in each case, using the value of hyperparameter 𝛽0 which leads to the quickest convergence in each case. There it can be seen that the US2020 graph yields for both weight choices slightly different results from the initial distribution when the emotional factor is excluded, but considerably different ones when it is included. Thus, it is a graph with a heavy emotional charge. On the other hand, the 1821 graph tends to yield similar results in every case, signifying thus greater coherency.

5. Conclusions And Future Work

This conference paper focuses on a graph neural network architecture for discovering community structure in large Twitter graphs. In this approach said structure is formed using a Twitter account behavioral model which results from fusing structural and functional attributes with emotional ones. The proposed model can be naturally extended to include additional features from these categories or even ones belonging to different categories, as long as they can be expressed in a numerical scale where normalization does not influence semantics. In our experiments the inclusion of behavioral attributes leads consistently to quicker GNN convergence.

This work can be extended in a number of ways. First, multiple weight functions can map each edge to a weight vector and hence to a multidimensional weight space where each dimension has its own semantics. Then the fundamental parameters of candidate distributions describing this space can be derived through signal estimation techniques. Second, alternative behavioral models depending only on local properties or on local estimates of global ones should be developed, as this would be
most appealing for a distributed implementation. Third, models for computing or estimating the edge existence probability which reflect the underlying graph semantics should be research objectives. Finally, given the evolving nature of online social networks, an architecture for non-stationary graphs should also be developed. In this case certain transient global graph states can be used to obtain intermediate status results regarding clustering, flow, degree distribution, or any other global property. These transient states can well serve as starting points for new partitioning techniques.

Acknowledgments

This conference paper was supported by the Research Incentive Fund (RIF) Grant R18087 provided by Zayed University, UAE.

References

[1] C. Zhang, D. Song, C. Huang, A. Swami, N. V. Chawla, Heterogeneous graph neural network, in: ICDM, 2019, pp. 793–803.
[2] C. Yu, Y. Liu, C. Gao, C. Shen, N. Sang, Representative graph neural network, in: ECCV, Springer, 2020, pp. 379–396.
[3] J. Kim, T. Kim, S. Kim, C. D. Yoo, Edge-labeling graph neural network for few-shot learning, in: CVPR, 2019, pp. 11–20.
[4] X. Fu, J. Zhang, Z. Meng, I. King, MAGNN: Metapath aggregated graph neural network for heterogeneous graph embedding, in: WWW, 2020, pp. 2331–2341.
[5] G. Drakopoulos, I. Giannoukou, P. Mylonas, S. Sioutas, A graph neural network for assessing the affective coherence of Twitter graphs, in: IEEE Big Data, IEEE, 2020, pp. 3618–3627. doi:10.1109/BigData50022.2020.9378492.
[6] W. Fan, Y. Ma, Q. Li, Y. He, E. Zhao, J. Tang, D. Yin, Graph neural networks for social recommendation, in: The WWW conference, 2019, pp. 417–426.
[7] A. Ortega, P. Frossard, J. Kovačević, J. M. Moura, P. Vandergheynst, Graph signal processing: Overview, challenges, and applications, Proceedings of the IEEE 106 (2018) 808–828.
[8] G. Mateos, S. Segarra, A. G. Marques, A. Ribeiro, Connecting the dots: Identifying network structure via graph signal processing, IEEE Signal Processing Magazine 36 (2019) 16–43.
[9] M. Cheung, J. Shi, O. Wright, L. Y. Jiang, X. Liu, J. M. Moura, Graph signal processing and deep learning: Convolution, pooling, and topology, IEEE Signal Processing Magazine 37 (2020) 139–149.
[10] G. Drakopoulos, E. Kafeza, P. Mylonas, L. Iliadis, Transform-based graph topology similarity metrics, NCAA 33 (2021) 16363–16375. doi:10.1007/s00521-021-06235-9.
[11] G. Drakopoulos, P. Mylonas, Evaluating graph resilience with tensor stack networks: A Keras implementation, NCAA 32 (2020) 4161–4176. doi:10.1007/s00521-020-04790-1.
[12] M. T. Schaub, S. Segarra, Flow smoothing and denoising: Graph signal processing in the edge-space, in: GlobalSIP, IEEE, 2018, pp. 735–739.
[13] A. Gavili, X.-P. Zhang, On the shift operator, graph frequency, and optimal filtering in graph signal processing, IEEE Transactions on Signal Processing 65 (2017) 6303–6318.
[14] F. Hua, R. Nassif, C. Richard, H. Wang, A. H. Sayed, A preconditioned graph diffusion LMS for adaptive graph signal processing, in: EUSIPCO, IEEE, 2018, pp. 111–115.
[15] S. Kontopoulos, G. Drakopoulos, A space efficient scheme for graph representation, in: ICTAI, IEEE, 2014, pp. 299–303. doi:10.1109/ICTAI.2014.52.
[16] T. M. Cihon, M. A. Mattaini, Emerging cultural and behavioral systems science, Perspectives on Behavior Science 42 (2019) 699–711.
[17] I. Markovsky, F. Dörfler, Behavioral systems theory in data-driven analysis, signal processing, and control, Annual Reviews in Control (2021).
[18] G. Drakopoulos, E. Kafeza, P. Mylonas, H. Al Katheeri, Building trusted startup teams from LinkedIn attributes: A higher order probabilistic analysis, in: ICTAI, IEEE, 2020, pp. 867–874. doi:10.1109/ICTAI50040.2020.00136.
[19] G. Drakopoulos, I. Giannoukou, P. Mylonas, S. Sioutas, On tensor distances for self organizing maps: Clustering cognitive tasks, in: DEXA, volume 12392 of Lecture Notes in Computer Science, Springer, 2020, pp. 195–210. doi:10.1007/978-3-030-59051-2_13.
[20] S. Kim, K. Song, B. Lockee, J. Burton, What is gamification in learning and education?, in: Gamification in Learning and Education, Springer, 2018, pp. 25–38.
[21] N. Wilkinson, M. Klaes, An Introduction to Behavioral Economics, Macmillan International Higher Education, 2017.