Internship, PhD thesis, and postdoc offers

M2 Internship - Graph Neural Networks with Optimal Transport



Expected starting date: May / February 2023 (6 months)

Supervision: Mokhtar Z. Alaya (elmokhtar.alaya@utc.fr) and Muhammet Balcilar (muhammetbalcilar@gmail.com)

Key-words. Deep learning; Graph neural networks; Optimal transport; Wasserstein distance; Gromov-Wasserstein distance

Context. Over the past decade, deep learning, and more specifically Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), has had a strong impact on various applications of machine learning, such as image recognition [10] and speech analysis [8]. These successes have mostly been achieved on sequences or images, i.e. on data defined on grid structures that benefit from linear algebra operations in Euclidean spaces. However, in many domains the data (e.g. social networks, molecules, knowledge graphs) cannot be trivially encoded into a Euclidean domain but are naturally represented as graphs. This explains the recent challenge tackled by the machine learning community, which consists in transposing the deep learning paradigm to the world of graphs. The objective is to revisit neural networks so that they operate on graph data and benefit from their representation learning ability. In this context, many Graph Neural Networks (GNNs) have recently been proposed in the geometric learning literature [12, 9]. GNNs are neural networks that compute hidden representations of nodes using information carried by the whole graph. In contrast to conventional neural networks, where the architecture is tied to the known and invariant topology of the data (e.g. a 2-D grid for images), GNNs propagate node features according to the graph topology; such models are called Message Passing (Graph) Neural Networks (MPNNs). Recent works have studied the spectral expressive power of MPNNs, showing that the majority of them act as low-pass filters [4], and have proposed ways to improve these expressiveness limits [3].

When comparing two point clouds that may contain different numbers of points, a well-established approach is the Earth Mover's distance, which can be computed by optimal transport [11, 1, 2, 6]. Instead of summarizing the node representations into a fixed-length vector, [5] proposes to compute the optimal transport distance between any graph's learned node features and the node representations of a learnable anchor graph, and to use that distance as the output of the graph readout layer. However, since the anchor graph's nodes are learnable, this approach can overfit easily and needs heavy regularization. Another application of OT to graph-based learning is to build kernels where pairwise graph distances are computed by optimal transport [7, 13]. However, both of these methods use the given raw node features to compute the optimal transport distance and do not take advantage of learning new, more meaningful node representations for the task at hand.
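To make the comparison of node sets of different sizes concrete, the short sketch below (not part of the original proposal) computes the optimal transport distance between the node-feature clouds of two toy graphs using the POT library; the graph sizes, feature dimension, and random features are illustrative assumptions.

# A minimal sketch: comparing two graphs through their node-feature clouds
# with an optimal transport distance, using POT (pip install pot).
# Graph sizes and features below are toy values chosen for illustration.
import numpy as np
import ot  # Python Optimal Transport

rng = np.random.default_rng(0)
X1 = rng.normal(size=(7, 16))   # node features of graph 1 (7 nodes, 16-dim)
X2 = rng.normal(size=(12, 16))  # node features of graph 2 (12 nodes, 16-dim)

# Uniform weights over nodes: each graph is viewed as an empirical distribution.
a, b = ot.unif(X1.shape[0]), ot.unif(X2.shape[0])

# Ground cost between every pair of nodes (squared Euclidean by default).
M = ot.dist(X1, X2)

# Exact optimal transport cost, well defined even though the two graphs
# have different numbers of nodes.
ot_cost = ot.emd2(a, b, M)
print(f"OT distance between the two node clouds: {ot_cost:.4f}")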

Scientific objectives and expected achievements.

In our proposal, we would like to use an optimal transport distance to build a kernel-based graph-level prediction model. In this model, new node representations will be learned by a Graph Neural Network. The distances between these new representations will then be computed by optimal transport in order to build a better kernel for the prediction model. The prediction itself would be done by a Support Vector Machine or a Gaussian Process, which are fully differentiable.
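As an illustration of the intended pipeline, the sketch below uses placeholder node embeddings (standing in for representations a GNN would learn) to compute pairwise optimal transport distances between graphs, turns them into a kernel matrix, and trains an SVM with a precomputed kernel. The toy dataset, labels, and gamma value are hypothetical; in the actual project the embeddings would come from a differentiable GNN encoder, and a Gaussian Process could replace the SVM.

# A minimal sketch of the proposed pipeline, with placeholder embeddings
# standing in for GNN-learned node representations: pairwise OT distances
# between graphs are turned into a kernel and fed to an SVM with a
# precomputed Gram matrix. Uses POT and scikit-learn; data are illustrative.
import numpy as np
import ot
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy "dataset": 20 graphs with 5 to 15 nodes and 16-dim node embeddings each.
graphs = [rng.normal(size=(rng.integers(5, 16), 16)) for _ in range(20)]
labels = rng.integers(0, 2, size=20)  # binary graph-level labels

def ot_distance(X, Y):
    """Exact OT cost between two node-embedding clouds with uniform weights."""
    M = ot.dist(X, Y)
    return ot.emd2(ot.unif(X.shape[0]), ot.unif(Y.shape[0]), M)

# Pairwise OT distance matrix between all graphs.
n = len(graphs)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = ot_distance(graphs[i], graphs[j])

# Turn distances into a similarity kernel (not guaranteed positive
# semi-definite; in practice an indefinite-kernel correction may be needed).
gamma = 0.1
K = np.exp(-gamma * D)

# Graph-level prediction with an SVM on the precomputed kernel.
clf = SVC(kernel="precomputed").fit(K, labels)
print("Training accuracy on toy data:", clf.score(K, labels))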

Research environment/Location. The research will take place within the LMAC laboratory located in Compiègne, France. The internship will be supervised by Mokhtar Z. Alaya (LMAC - UTC) and Muhammet Balcilar (InterDigital, Rennes).

Candidate profile. Candidates are expected to hold a degree in computer science, machine learning, signal/image processing, or applied mathematics/statistics, and to show an excellent academic record. The internship subject requires skills in Python development tools, especially TensorFlow or PyTorch.

For more details. Feel free to get in touch by sending an email to Mokhtar Z. Alaya (elmokhtar.alaya@utc.fr) and to Muhammet Balcilar (muhammetbalcilar@gmail.com).