Plenary Speakers

Jennifer Neville
Prof. of Computer Science and Statistics
Purdue University
USA 
Towards Relational AI: the good, the bad, and the ugly of learning over networks
In the last 20 years, there has been a great deal of research on machine learning methods for graphs, networks, and other types of relational data. By moving beyond the independence assumptions of more traditional ML methods, relational models are now able to successfully exploit the additional information that is often observed in relationships among entities. Specifically, network models are able to use relational information to improve predictions about user interests, behavior, and interactions, particularly when individual data is sparse. The tradeoff, however, is that the heterogeneity, partial observability, and interdependence of large-scale network data can make it difficult to develop efficient and unbiased methods, due to several algorithmic and statistical challenges. In this talk, I will discuss these issues while surveying several general approaches used for relational learning in large-scale social and information networks. In addition, to reflect on the movement toward pervasive use of these models in personalized online systems, I will discuss potential implications for privacy, polarization of communities, and spread of misinformation.

Naoki Saito
Mathematics
University of California, Davis
USA 
The First Steps toward Building Natural Graph Wavelets
For the development and theory of discrete wavelets on regular lattices in R^d, the Fourier series and transforms have played a significant role. Hence, when attempting to develop wavelet theory naturally tailored for graphs and networks, some researchers have used graph Laplacian eigenvalues and eigenvectors in place of the frequencies and complex exponentials, respectively. While it is tempting to do so, there are several fundamental problems with this viewpoint. One of them is the intricate relationship between the frequencies and the Laplacian eigenvalues. For undirected and unweighted paths (or cycles), the Laplacian eigenvectors are the discrete cosine (or Fourier) basis vectors, and the corresponding eigenvalues are the squares of their frequencies. Consequently, on those simple graphs, one can precisely develop the classical wavelets using the Littlewood-Paley theory. However, as soon as a graph becomes even slightly more complicated (e.g., a discretized thin rectangle in 2D), the situation completely changes: we cannot view the eigenvalues as a simple monotonic function of frequency anymore. Hence, the first step toward building natural graph wavelets is to sort and organize Laplacian eigenfunctions without using the eigenvalues and to create a dual-domain graph. In this talk, I will discuss this important problem further and explain my effort using the Earth Mover's (Wasserstein) distance to measure natural distances between eigenfunctions, followed by embedding the resulting distance matrix into an appropriate Euclidean domain. Then, I will discuss how to combine the "close" eigenfunctions to form natural graph wavelet frames and bases that contain localized basis functions at various nodes of the input graph. This latter part is joint work with Haotian Li (UCD) and Alex Cloninger (UCSD).
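The path-graph facts mentioned in the abstract admit a quick numerical check. The sketch below (an illustration of the standard result, not material from the talk itself) builds the combinatorial Laplacian L = D - A of an unweighted path on N vertices and verifies that its eigenvectors are DCT-II basis vectors with eigenvalues 4 sin^2(pi k / 2N), a monotone function of the frequency pi k / N that behaves like its square at low frequencies:

```python
import numpy as np

# Unweighted path graph on N vertices: adjacency A and Laplacian L = D - A.
N = 8
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)  # ascending eigenvalues

# Closed-form path-graph eigenvalues: lambda_k = 4 sin^2(pi k / (2N)),
# approximately (pi k / N)^2 for small k, i.e. the squared frequency.
k = np.arange(N)
closed_form = 4 * np.sin(np.pi * k / (2 * N)) ** 2
assert np.allclose(eigvals, closed_form)

# Each eigenvector matches a DCT-II basis vector up to sign:
# phi_k[n] = cos(pi k (n + 1/2) / N), normalized.
n = np.arange(N)
for kk in range(N):
    phi = np.cos(np.pi * kk * (n + 0.5) / N)
    phi /= np.linalg.norm(phi)
    assert np.allclose(abs(eigvecs[:, kk] @ phi), 1.0)
```

On a 2D grid (the "discretized thin rectangle" of the abstract), the analogous eigenvalues mix two frequency indices, which is why sorting eigenvectors by eigenvalue alone no longer recovers a meaningful frequency ordering.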

Gonzalo Mateos
Electrical and Computer Engineering
University of Rochester
USA 
Digraph Signal Processing: Orthonormal Transforms and Network Inference

Gal Mishne
Applied Math
Yale University
USA 
Graph Signal Processing on Tensors

Alejandro Ribeiro
Electrical and Systems Engineering
University of Pennsylvania
USA 
Graph Neural Networks