RoMaDS - Past events

Go back to upcoming events.

 

2023

25.09.2023 - Seminar: Marco Carfagnini (University of California San Diego), Spectral gaps via small deviations
15h00-16h00, Department of Mathematics, Aula Dal Passo. Link to Teams.
Abstract

In this talk we will discuss spectral gaps of second order differential operators and their connection to limit laws such as small deviations and Chung's laws of the iterated logarithm. The main focus is on hypoelliptic diffusions such as the Kolmogorov diffusion and horizontal Brownian motions on Carnot groups. If time permits, we will discuss spectral properties and existence of spectral gaps on general Dirichlet metric measure spaces. This talk is based on joint works with Maria (Masha) Gordina and Alexander (Sasha) Teplyaev.


15.09.2023 - Workshop: A day on Statistical Physics for Machine Learning
09h30-17h00, Department of Mathematics, Aula Gismondi

Here is the link to the webpage of the event.


17.05.2023 - Seminar: Daniele Calandriello (Google DeepMind, Paris), Efficient exploration in stochastic environments
Aula 2001.
Abstract

Machine learning has seen explosive growth recently, driven mostly by breakthroughs in classification and generative models. However, ML applications in decision-making settings are much more limited, as the cost of data collection is much higher and ML models must be sufficiently robust and accurate to deal with unforeseen consequences and avoid worst-case scenarios. In this talk we will introduce some classical results for online decision making in stochastic linear spaces, with applications to active learning, bandit/Bayesian optimization and deep learning. Starting from a rigorous analysis of the noise propagation we can formulate provably robust (i.e. no-regret) algorithms, and then create variants that can scale to modern ML data regimes without sacrificing safety. If time suffices, we will highlight how these approaches inspired a new wave of exploration techniques enabling reinforcement learning agents to solve extremely long-horizon tasks.
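As a reading aid only (not material from the talk), here is a minimal NumPy sketch of a classical optimistic no-regret strategy for stochastic linear bandits in the LinUCB style; all names, parameters and the reward interface are our own illustrative assumptions.

    import numpy as np

    def linucb(features, pull_arm, T, lam=1.0, beta=1.0):
        """Optimistic (UCB-style) arm selection for a stochastic linear bandit.
        features: (n_arms, d) array of arm feature vectors.
        pull_arm: callable returning a noisy reward for the chosen arm index."""
        d = features.shape[1]
        A = lam * np.eye(d)              # regularized design matrix
        b = np.zeros(d)                  # sum of reward-weighted features
        for _ in range(T):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b            # ridge estimate of the unknown parameter
            bonus = np.sqrt(np.einsum('ij,jk,ik->i', features, A_inv, features))
            arm = int(np.argmax(features @ theta + beta * bonus))  # optimism
            reward = pull_arm(arm)
            A += np.outer(features[arm], features[arm])
            b += reward * features[arm]
        return theta

The exploration bonus shrinks as the design matrix grows, which is the mechanism behind the no-regret guarantees mentioned above.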


24.05.2023 - Seminar: Rongfeng Sun (NUS, Singapore), A new correlation inequality for Ising models with external fields
Aula 2001, 14h00. Link to Teams
Abstract

We study ferromagnetic Ising models on finite graphs with an inhomogeneous external field. We show that the influence of boundary conditions on any given spin is maximised when the external field is identically 0. One corollary is that spin-spin correlation is maximised when the external field vanishes. In particular, the random field Ising model on Z^d, d ≥ 3, exhibits exponential decay of correlations in the entire high temperature regime of the pure Ising model. Another corollary is that the pure Ising model on Z^d, d ≥ 3, satisfies the conjectured strong spatial mixing property in the entire high temperature regime. Based on joint work with Jian Ding and Jian Song.
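In symbols (our notation, added as a reading aid), the correlation corollary says that for a ferromagnetic Ising model on a finite graph with external field $h = (h_v)$,
$$\langle \sigma_x \sigma_y \rangle_h - \langle \sigma_x \rangle_h \langle \sigma_y \rangle_h \;\le\; \langle \sigma_x \sigma_y \rangle_0 - \langle \sigma_x \rangle_0 \langle \sigma_y \rangle_0,$$
i.e. the truncated two-point function is largest at zero field.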


13.03-09.06.2023 - Course: Luciano Gualà (Università Tor Vergata), Advanced Topics on Algorithms
Mondays 16h00-18h00 (aula 3 PP2)
Wednesdays 16h00-18h00 (aula T7, Sogene).
See here for all the details of the course.


26.04.2023 - Seminar: Federico Ricci Tersenghi (La Sapienza), Phase transitions and algorithmic thresholds for optimization and inference problems on sparse random graphs
Aula 2001, 14h00.
Abstract

Focusing on some fundamental constraint satisfaction problems defined on sparse random graphs (e.g. random k-SAT, random q-coloring), I will start by summarizing the rich phase diagram of the solution space, which has been derived using powerful techniques from the statistical mechanics of disordered systems. I will then discuss the implications of some of these phase transitions for the behaviour of smart algorithms searching for solutions. In the second part, I will consider the problem of inferring a signal from noisy and/or incomplete data within the Bayesian framework known as the teacher-student scenario. I will discuss the corresponding phase diagrams and how phase transitions may affect the performance of inference algorithms based on message passing and Monte Carlo sampling.


29.03.2023 - Seminar: Francesco Grotto (Università di Pisa), Random Waves, Oscillatory Integrals and Random Walks
Aula Dal Passo, 11h00.
Abstract

We consider Random Wave models, that is probability distributions on Laplacian eigenfunctions, on homogeneous spaces such as Euclidean spaces, Hyperspheres and Hyperbolic spaces. Determining the asymptotic behavior at large frequency of functionals of these objects is complicated by the oscillatory nature of their covariance functions. Oscillatory integrals appearing in evaluating variances of said functionals turn out to be closely related to integral representations of densities of uniform random walks in Euclidean spaces, and this connection can be exploited to deduce results on fluctuations of integral functionals of Random Waves.


15.02.2023 - Seminar: Francesco Vaccarino (Politecnico di Torino), Hodge-Shapley game: a Laplacian-based Shapley-like associated game for eXplainable AI
Aula Dal Passo, 14h00.
Abstract

In cooperative game theory, a set of players or decision-makers should negotiate to decide how to allocate the worth gained by the coalition composed of all the players. A value is a solution concept that suggests the outcome of the negotiation among players. Among the many existing alternative solution concepts, the Shapley value is the most prevalent. Its popularity also derives from the property of being a fair allocation, where fairness is described by a set of desirable properties or axioms. The axioms characterize the Shapley value in the sense that it is the unique value satisfying those properties; at the same time, the axioms allow one to derive a simple explicit combinatorial formula to compute the Shapley value. In our approach, coalitions are the main subjects of cooperation, instead of single players, and, inspired by the Shapley value, the goal is to derive a fair associated game, i.e. an allocation to coalitions satisfying a set of desirable properties. The methodology is based on the Hodge decomposition of the simplicial complex associated with the partially ordered set of the subsets of the set of players ordered by inclusion. We will motivate this investigation within the framework of Explainable Artificial Intelligence (XAI).
Joint work with Antonio Mastropietro (Eurecom, France)
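For reference (our notation, added as a reading aid), the explicit combinatorial formula mentioned above is the classical one: for a game $v$ on the player set $N$ with $|N| = n$, the Shapley value of player $i$ is
$$\phi_i(v) \;=\; \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(n-|S|-1)!}{n!}\,\bigl(v(S \cup \{i\}) - v(S)\bigr).$$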


08.02.2023 - Seminar: Francesco Tudisco (Gran Sasso Science Institute), Efficient training of low-rank neural networks
Aula Dal Passo, 14h00. Slides.
Abstract

Neural networks have achieved tremendous success in a variety of applications. However, their memory footprint and computational demand can render them impractical in application settings with limited hardware or energy resources. At the same time, overparametrization seems to be necessary in order to overcome the highly nonconvex nature of the optimization problem. An optimal trade-off is then to be found in order to reduce networks' dimensions while maintaining high performance. Popular approaches in the literature are based on pruning techniques that look for "winning tickets", smaller subnetworks achieving approximately the initial performance. However, these techniques are not able to reduce the memory footprint of the training phase and can be unstable with respect to the input weights. In this talk, we will present a training algorithm that looks for "low-rank lottery tickets" by interpreting the training phase as a continuous ODE and by integrating it within the manifold of low-rank matrices. The low-rank subnetworks and their ranks are determined and adapted during the training phase, allowing the overall time and memory resources required by both training and inference phases to be reduced significantly. We will illustrate the efficiency of this approach on a variety of fully connected and convolutional networks.
The talk is based on:
S Schotthöfer, E Zangrando, J Kusch, G Ceruti, F Tudisco
Low-rank lottery tickets: finding efficient low-rank neural networks via matrix differential equations
NeurIPS 2022
https://arxiv.org/pdf/2205.13571.pdf
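As a toy illustration of the memory argument only (this is not the dynamical low-rank training algorithm of the paper above), the following NumPy sketch shows how storing a layer's weight matrix in factored form $W \approx U V^\top$ with small rank $r$ reduces the parameter count from $mn$ to $r(m+n)$; all names and shapes are illustrative.

    import numpy as np

    def low_rank_dense(x, U, V, bias):
        """Dense layer whose (m x n) weight matrix is kept factored as U @ V.T,
        so only r*(m + n) parameters are stored instead of m*n."""
        return np.maximum(x @ V @ U.T + bias, 0.0)   # ReLU activation

    m, n, r = 512, 784, 16                           # output dim, input dim, rank
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, r)) / np.sqrt(r)
    V = rng.standard_normal((n, r)) / np.sqrt(n)
    y = low_rank_dense(rng.standard_normal((8, n)), U, V, np.zeros(m))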


01.02.2023 - Seminar: Giovanni Conforti (École Polytechnique, Paris), A probabilistic approach to exponential convergence of Sinkhorn's algorithm
Aula Dal Passo, 14h00. Slides.
Abstract

The entropic optimal transport problem (EOT) is obtained by adding an entropic regularisation term to the cost function of the Monge-Kantorovich problem and is nowadays regularly employed in machine learning applications as a more tractable and numerically more stable version of the optimal transport problem. On the other hand, E. Schrödinger asked back in 1931 the question of finding the most likely evolution of a cloud of independent Brownian particles conditionally on observations. The mathematical formulation of his question through large deviations theory is known as the Schrödinger problem and turns out to be fully equivalent to EOT. In this talk, I shall illustrate both viewpoints and then move on to sketch the ideas of a probabilistic method to show exponential convergence of Sinkhorn's algorithm, whose application is at the heart of the recent successful applications of EOT in statistical machine learning and beyond. In particular, we shall discuss how the proposed method opens new perspectives for showing exponential convergence for marginal distributions that are not compactly supported.
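For readers unfamiliar with the algorithm (a standard textbook version, not code from the talk), here is a minimal NumPy sketch of the Sinkhorn iteration for discrete entropic optimal transport between histograms mu and nu with cost matrix C and regularization eps; the variable names are ours.

    import numpy as np

    def sinkhorn(mu, nu, C, eps=0.1, n_iter=500):
        """Alternate scaling of the Gibbs kernel until both marginals match.
        Returns an (approximately) optimal entropic coupling pi."""
        K = np.exp(-C / eps)                 # Gibbs kernel
        u, v = np.ones_like(mu), np.ones_like(nu)
        for _ in range(n_iter):
            u = mu / (K @ v)                 # enforce the first marginal
            v = nu / (K.T @ u)               # enforce the second marginal
        return u[:, None] * K * v[None, :]

    # tiny example: two histograms on 5 points with |i - j| cost
    rng = np.random.default_rng(0)
    mu = rng.random(5); mu /= mu.sum()
    nu = rng.random(5); nu /= nu.sum()
    C = np.abs(np.subtract.outer(np.arange(5.), np.arange(5.)))
    pi = sinkhorn(mu, nu, C)

The exponential convergence discussed in the talk concerns exactly these alternating scaling steps.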


10-13.01.2023 - Mini-course: Boris Hanin (Princeton University), Neural Networks and Gaussian Processes
Aula Dal Passo, Tue 10th 14h00-16h00, Wed 11th 14h00-16h00, Fri 13th 10h00-12h00.

Lecture notes.

Video of Lecture 1.
Video of Lecture 2.
Video of Lecture 3.

Abstract

Lecture 1. Introduction to Neural Networks 

  • Definition and Examples
  • Typical use of neural networks for supervised learning
  • Big Questions: optimization, generalization with overparameterized interpolation, feature learning
Lecture 2. Neural Networks at Finite Depth and Infinite Width 
  • Gaussian process behavior at initialization (see the formula sketch after this outline)
  • NTK/linear dynamics in optimization
Lecture 3. Neural Networks at Large Depth and Finite Width 
  • Beyond the NTK/GP regime
  • Higher order cumulants at initialization
  • Tuning to criticality
  • Open problems
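As a reading aid for the infinite-width statement in Lecture 2 (a standard formulation in our notation, not the lecturer's notes): for a one-hidden-layer network at random initialization,
$$ f(x) \;=\; \frac{1}{\sqrt{n}} \sum_{i=1}^{n} a_i\, \sigma(w_i \cdot x), \qquad a_i, w_i \ \text{i.i.d.},$$
the field $f$ converges, as the width $n \to \infty$, to a centered Gaussian process with covariance $K(x,x') = \mathbb{E}[a^2]\, \mathbb{E}[\sigma(w \cdot x)\, \sigma(w \cdot x')]$.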


2022

22-25.11.2022 - Mini-course: Dario Fasino and Enrico Bozzo (Università di Udine), Applications of numerical linear algebra to the study of networks and complex systems
Biblioteca Storica, Tue 14h00-18h00, Wed 10h00-14h00, Thu 14h00-18h00. Download the slides from the first, second, third lecture of Enrico Bozzo. Download the slides from the first, second, third, fourth, fifth part of Dario Fasino's lecture.
Abstract

Enrico Bozzo: Linear dynamical systems on graphs
Graphs and matrices: connectivity concepts, adjacency matrix, nonnegative matrices, primitive matrices, Perron-Frobenius theory. Stochastic and substochastic matrices, the consensus problem. Laplacian and Metzler matrices, equilibrium points and consensus in the continuous-time case, with a brief mention of compartmental systems.
Dario Fasino: Matrix methods in the analysis of complex networks
A brief overview of network science. Classical centrality concepts based on shortest paths. Centrality, similarity and node-distance measures based on spectral techniques and matrix functions. Discrete-time Markov chains: classical and non-backtracking random walks. Matrix techniques for locating clusters and core-periphery or quasi-bipartite structures. Introduction to second-order random walks: stochastic tensors, nonlinear PageRank.
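To make one of the listed topics concrete (an illustrative sketch, not course material), here is a NumPy power iteration for the classical PageRank vector of an adjacency matrix A; the damping parameter alpha and the handling of dangling nodes are our own choices.

    import numpy as np

    def pagerank(A, alpha=0.85, n_iter=100):
        """Power iteration for PageRank: r = alpha * P^T r + (1 - alpha)/n,
        where P is the row-stochastic transition matrix of the random walk."""
        n = A.shape[0]
        out_deg = A.sum(axis=1)
        P = np.where(out_deg[:, None] > 0,
                     A / np.maximum(out_deg, 1)[:, None],   # normalize rows
                     1.0 / n)                               # dangling nodes: uniform
        r = np.full(n, 1.0 / n)
        for _ in range(n_iter):
            r = alpha * (P.T @ r) + (1 - alpha) / n
        return r / r.sum()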


15.12.2022 - Lecture: Andrea Clementi (Tor Vergata), Mining data streams
Aula 22, 16h30.
The lecture is intended for master and PhD students in Mathematics and Computer Science with no particular background on the subject. Here is a syllabus of the lecture.

16.11.2022 - Seminar: Cesare Molinari (IIT and Università di Genova), Iterative regularization for convex regularizers
Aula Dal Passo, 14h00. Slides.
Abstract

Iterative regularization exploits the implicit bias of an optimization algorithm to regularize ill-posed problems. Constructing algorithms with such built-in regularization mechanisms is a classic challenge in inverse problems but also in modern machine learning, where it provides both a new perspective on algorithm analysis and significant speed-ups compared to explicit regularization. In this talk, we propose and study the first iterative regularization procedure able to handle biases described by nonsmooth and not strongly convex functionals, prominent in low-complexity regularization. Our approach is based on a primal-dual algorithm, of which we analyze convergence and stability properties, even in the case where the original problem is infeasible. The general results are illustrated considering the special case of sparse recovery with the ℓ1 penalty. Our theoretical results are complemented by experiments showing the computational benefits of our approach.
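To illustrate the sparse-recovery special case (a standard primal-dual iteration of Chambolle-Pock type in our notation, not necessarily the exact scheme of the talk), here is a minimal NumPy sketch for basis pursuit, min ||x||_1 subject to Ax = b, where stopping the iteration early plays the role of the regularizer.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def primal_dual_l1(A, b, n_iter=200):
        """Primal-dual iterates for min ||x||_1 s.t. Ax = b.
        With noisy data, early stopping acts as iterative regularization."""
        L = np.linalg.norm(A, 2)            # spectral norm of A
        tau = sigma = 0.9 / L               # step sizes with tau * sigma * L**2 < 1
        m, n = A.shape
        x = np.zeros(n); x_bar = np.zeros(n); y = np.zeros(m)
        for _ in range(n_iter):
            y = y + sigma * (A @ x_bar - b)                      # dual ascent
            x_new = soft_threshold(x - tau * (A.T @ y), tau)     # prox of ||.||_1
            x_bar = 2.0 * x_new - x                              # extrapolation
            x = x_new
        return x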


09.11.2022 - Seminar: Alessandra Cipriani (UCL), Topological data analysis: vineyards for metallic structures
Aula Dal Passo, 14h00. Slides.
Abstract

Modeling microstructures is a problem that interests material science as well as mathematics. The most basic model for steel microstructure is the Poisson-Voronoi diagram. It has mathematically attractive properties and has been used in the approximation of single-phase steel microstructures. We would like to present methods that can be used to assess whether a real microstructure can be approximated by such a model. In this talk, we construct tests that use data coming from serial sectioning (multiple 2D sections) of a 3D metallic structure. The proposed statistics exploit tools from topological data analysis such as persistence diagrams and (a modified version of) persistence vineyards.


02.11.2022 - Seminar: Guillaume Poly (University of Rennes 1), Around total variation for Breuer-Major Theorem and nodal volume of Gaussian fields
Aula Dal Passo, 14h00. Link to Teams.
Abstract

In this talk, I will revisit the Breuer-Major Theorem from the perspective of the total variation metric and will introduce some known results recently established in this framework. I will then explain how to break the limitations of these results and establish unconditional criteria of convergence in total variation by using a specific gradient in Malliavin calculus (the sharp operator). Next, I will explain how this kind of idea may be used in the framework of the nodal volume of Gaussian fields in order to establish CLTs in the total variation topology. This talk is mainly based on ongoing research with J. Angst and F. Dalmao.


26.10.2022 - Seminar: Solesne Bourguin (Boston University), Regularity of forward-backward SDEs via PDE techniques
Aula Dal Passo, 14h00.
Abstract

The study of the regularity of the law of solutions to SDEs is an important and classical topic in stochastic analysis. This was for instance Malliavin's motivation for the development of the stochastic calculus of variations in order to prove a probabilistic version of Hörmander's sum-of-squares theorem. The object of the present work is to study the regularity of solutions to forward-backward SDEs via a novel combination of the Malliavin calculus with PDE techniques such as backward uniqueness and strong unique continuation. We obtain new conditions for the existence of densities of solutions to backward SDEs that not only include all existing results as particular cases, but also allow us to deal with multidimensional forward components. Applications to finance and mathematical biology will be discussed if time permits.


15-16.09.2022 - Workshop: Topology of Data in Rome
Here is the link to the webpage of the event.
List of speakers

Katherine Benjamin University of Oxford, UK
Ryan Budney University of Victoria, Canada
Wojtek Chacholski KTH, Sweden
Pawel Dłotko Dioscuri Centre of TDA, Poland
Barbara Giunti Graz University of Technology, Austria
Kelly Maggs École Polytechnique Fédérale de Lausanne, Switzerland
Anibal Medina Max Planck Institut für Mathematik, Germany
Bastian Rieck AIDOS Lab, Germany


02.09.2022 - Seminar: Elisa Alòs (Universitat Pompeu Fabra, Barcelona), On the skew and curvature of implied and local volatilities
Aula Dal Passo, 14h00.
Abstract

In this talk, we study the relationship between the short-end of the local and the implied volatility surfaces. Our results, based on Malliavin calculus techniques, recover the recent $\frac{1}{H+3/2}$ rule (where $H$ denotes the Hurst parameter of the volatility process) for rough volatilities (see Bourgey, De Marco, Friz, and Pigato (2022)), which states that the short-time skew slope of the at-the-money implied volatility is $\frac{1}{H+3/2}$ times the corresponding slope for local volatilities. Moreover, we see that the at-the-money short-end curvature of the implied volatility can be written in terms of the short-end skew and curvature of the local volatility and vice versa, and that this relationship depends on $H$.


13.06.2022 - Seminar: Davide Bianchi (Harbin Institute of Technology, Shenzhen), Asymptotic spectra of large graphs with a uniform local structure
Aula Dal Passo, 15h00.
Abstract

We are concerned with sequences of graphs with a uniform local structure. The underlying sequence of adjacency matrices has a canonical eigenvalue distribution, in the Weyl sense, and it has been shown that we can associate to it a symbol f, [2]. The knowledge of the symbol and of its basic analytical features provides key information on the eigenvalue structure in terms of localization, spectral gap, clustering, and global distribution. We discuss different applications and provide numerical examples in order to underline the practical use of the developed theory, [1]. In particular, we show how the knowledge of the symbol f
• can benefit iterative methods to solve Poisson equations on large graphs;
• provides insight into the recurrence/transience properties of random walks on graphs.
References
[1] A. Adriani, D. Bianchi, P. Ferrari, and S. Serra-Capizzano. Asymptotic spectra of large (grid) graphs with a uniform local structure (part II): numerical applications. 2021. arXiv: 2111.13859.
[2] A. Adriani, D. Bianchi, and S. Serra-Capizzano. “Asymptotic spectra of large (grid) graphs with a uniform local structure (part I): theory”. In: Milan Journal of Mathematics 88 (2020), pp. 409–454.


11.05.2022-30.06.2022 - Course: Dmitri Koroliouk (Kiev), Introduction into Neural Networks and Deep Learning
Download the program of the course. Links to the first lecture and second lecture.

02.05.2022-29.05.2022 - Course: Vlad Bally, Introduction to rough paths
See the page of the course for schedule and abstract.

30.05.2022 - Conference: A Day on Random Graphs
Here is the link to the webpage of the event.

20.05.2022 - Seminar: Giacomo Giorgio & Edoardo Lombardo
Aula 1200, 14h00.
Giacomo Giorgio, Convergence in Total Variation for nonlinear functionals of random hyperspherical harmonics
Abstract

Random hyperspherical harmonics are Gaussian Laplace eigenfunctions on the unit d-dimensional sphere (d ≥ 2). We study the convergence in Total Variation distance for their nonlinear statistics in the high energy limit, i.e., for diverging sequences of Laplace eigenvalues. Our approach takes advantage of a recent result by Bally, Caramellino and Poly (2020): combining the Central Limit Theorem in Wasserstein distance obtained by Marinucci and Rossi (2015) for Hermite-rank 2 functionals with new results on the asymptotic behavior of their Malliavin-Sobolev norms, we are able to establish second order Gaussian fluctuations in this stronger probability metric as soon as the functional is regular enough. Our argument requires some novel estimates on moments of products of Gegenbauer polynomials that may be of independent interest, which we prove via the link between graph theory and diagram formulas.

Edoardo Lombardo, High order approximations for the Cox-Ingersoll-Ross process using random grids
Abstract

We present new high order approximation schemes for the Cox-Ingersoll-Ross (CIR) process that are obtained by using a recent technique developed by Alfonsi and Bally (2021) for the approximation of semigroups. The idea consists in using a suitable combination of discretization schemes calculated on different random grids to increase the order of convergence. This technique, coupled with the second order scheme proposed by Alfonsi (2010) for the CIR, leads to weak approximations of order $2k$ for all $k \in \mathbb{N}$. Despite the singularity of the square-root volatility coefficient, we rigorously show this order of convergence under some restrictions on the volatility parameters. We illustrate numerically the convergence of these approximations for the CIR process and for the Heston stochastic volatility model.


19.05.2022 - Lecture: Andrea Clementi, Finding Similar Items in Large Data Sets
Aula G2B, 11h00.
The lecture is intended for master and PhD students in Mathematics and Computer Science with no particular background on the subject. Here is a syllabus of the lecture.

09.05.2022 - Inauguration Colloquium: Lorenzo Rosasco (University of Genova), A guided tour of machine learning (theory)
Aula Dal Passo, 15h00. Slides of the seminar.
Bio of Lorenzo Rosasco

Lorenzo Rosasco is a professor at the University of Genova. He is also a visiting professor at the Massachusetts Institute of Technology (MIT) and an external collaborator of the Italian Institute of Technology (IIT). He coordinates the Machine Learning Genova center (MaLGa) and leads the Laboratory for Computational and Statistical Learning, focused on theory, algorithms and applications of machine learning. He received his PhD in 2006 from the University of Genova, after being a visiting student at the Center for Biological and Computational Learning at MIT, the Toyota Technological Institute at Chicago (TTI-Chicago) and the Johann Radon Institute for Computational and Applied Mathematics. Between 2006 and 2013 he was a postdoc and research scientist at the Brain and Cognitive Sciences Department at MIT. He is the recipient of a number of grants, including a FIRB and an ERC Consolidator Grant.

Abstract

In this talk, we will provide a basic introduction to some of the fundamental ideas and results in machine learning, with an emphasis on mathematical aspects. We will begin by contrasting the modern data-driven approach to modeling with classic mechanistic approaches. Then, we will discuss basic elements of machine learning theory connected to approximation theory, probability and optimization. Finally, we will discuss the need for new theoretical advances in light of recent empirical observations made with deep neural networks.



05.05.2022 - Seminar: Riccardo Maffucci (EPFL), Distribution of nodal intersections for random waves
Aula De Blasi, 16h00.
Abstract

This is work in collaboration with Maurizia Rossi. Random waves are Gaussian Laplacian eigenfunctions on the 3D torus. We investigate the length of the intersection between the zero (nodal) set and a fixed surface. The expectation and the variance in a general scenario are prior work. In the generic setting we prove a CLT. We will discuss the (smaller order) variance and the (non-Gaussian) limiting distribution in the case of 'static' surfaces (e.g. the sphere). Under a certain assumption, there is asymptotic full correlation between the intersection length and the nodal area.


29.04.2022 - Seminar: Antonio Lerario (Sissa), The zonoid algebra
Aula De Blasi, 14h00.
Abstract

In this seminar I will discuss the so-called "zonoid algebra", a construction introduced in a recent work (joint with Breiding, Bürgisser and Mathis) which allows one to put a ring structure on the set of zonoids (i.e. Hausdorff limits of Minkowski sums of segments). This framework gives a new perspective on classical objects in convex geometry, and it allows one to introduce new functionals on zonoids, in particular generalizing the notion of mixed volume. Moreover, this algebra plays the role of a probabilistic intersection ring for compact homogeneous spaces. Joint work with P. Breiding, P. Bürgisser and L. Mathis.


01.04.2022 - Seminar: Alessia Caponera, Nonparametric Estimation of Covariance and Autocovariance Operators on the Sphere
Aula 1200, 15h30.
Abstract

We propose nonparametric estimators for the second-order central moments of spherical random fields within a functional data context. We consider a measurement framework where each field among an identically distributed collection of spherical random fields is sampled at a few random directions, possibly subject to measurement error. The collection of fields could be i.i.d. or serially dependent. Though similar setups have already been explored for random functions defined on the unit interval, the nonparametric estimators proposed in the literature often rely on local polynomials, which do not readily extend to the (product) spherical setting. We therefore formulate our estimation procedure as a variational problem involving a generalized Tikhonov regularization term. The latter favours smooth covariance/autocovariance functions, where the smoothness is specified by means of suitable Sobolev-like pseudo-differential operators. Using the machinery of reproducing kernel Hilbert spaces, we establish representer theorems that fully characterize the form of our estimators. We determine their uniform rates of convergence as the number of fields diverges, both for the dense (increasing number of spatial samples) and sparse (bounded number of spatial samples) regimes. We moreover validate and demonstrate the practical feasibility of our estimation procedure in a simulation setting.
Authors: Alessia Caponera, Julien Fageot, Matthieu Simeoni and Victor M. Panaretos