
Events
If you want to receive updates on RoMaDS activities, write an email to salvi(at)mat.uniroma2.it to be added to our mailing list.
Upcoming events
04.03-22.04.2025 - Mini-course: Emanuele Natale (Université Côte d'Azur), Parallel Machine Learning in Julia
Department of Mathematics, Aula 13.
Schedule: Tuesday 4, 11, 18 March and 1, 8, 15, 22 April at 16:00. Friday 21 March at 14:00.
Abstract
This course provides a practical introduction to some of the fundamental ideas behind modern machine learning. Assuming basic programming and machine learning knowledge, the course will equip the student with the skills to implement and benchmark parallel and distributed machine learning algorithms using Julia, a modern language excelling in scientific computing. We'll delve into the Julia language and its machine learning ecosystem, focusing on examples that showcase the implementation of modern deep learning architectures on hardware such as GPUs. We will also briefly touch upon automatic differentiation, a fundamental concept that has enabled the success of modern machine learning.
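As a small taste of the automatic differentiation mentioned above, here is a minimal sketch of forward-mode automatic differentiation with dual numbers (written in Python rather than Julia, not taken from the course material; all names are illustrative):

    # Minimal forward-mode automatic differentiation with dual numbers.
    # Illustrative sketch only; not part of the course material.
    import math

    class Dual:
        """A number a + b*eps with eps**2 = 0; the dual part b carries the derivative."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)

        __rmul__ = __mul__

    def sin(x):
        # Chain rule for sin applied to a dual number.
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)

    def derivative(f, x):
        # Seed the dual part with 1 and read off f'(x).
        return f(Dual(x, 1.0)).der

    # d/dx [x * sin(x)] at x = 2 equals sin(2) + 2*cos(2)
    print(derivative(lambda x: x * sin(x), 2.0))

Deep learning frameworks rely on the same principle, typically in its reverse-mode form, to compute gradients with respect to millions of parameters.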
31.03.2025 - Seminar/mini-course: Francesco Preta (Stip AI, Berkeley), Basics of Natural Language Processing and applications of Large Language Models
14:00-17:00, Department of Mathematics, Aula Dal Passo.
Abstract
This seminar provides an introduction to Natural Language Processing (NLP), covering fundamental concepts such as tokenization, Named Entity Recognition (NER), and sentiment analysis. We will show how many NLP tasks can be translated into regular data science problems through the use of embeddings, and introduce different embedding strategies for words and sentences developed at different stages of the field's history. Finally, we will explore current industrial applications of Large Language Models (LLMs), including prompt engineering strategies, RAG, and chatbots. Attendees will gain insights into how LLMs work, their advantages, and real-world use cases.
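To illustrate how embeddings turn a language task into a regular data science problem, here is a minimal Python sketch, not taken from the seminar (the four-dimensional vectors are made-up toy values; real embedding models produce hundreds of dimensions), measuring semantic similarity as cosine similarity between word vectors:

    # Toy example: semantic similarity as cosine similarity of word embeddings.
    # The vectors below are hypothetical; a real embedding model would supply them.
    import numpy as np

    embeddings = {
        "king":  np.array([0.9, 0.8, 0.1, 0.3]),
        "queen": np.array([0.9, 0.7, 0.2, 0.9]),
        "apple": np.array([0.1, 0.2, 0.9, 0.4]),
    }

    def cosine(u, v):
        # 1.0 means the vectors point in the same direction, 0.0 means orthogonal.
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1
    print(cosine(embeddings["king"], embeddings["apple"]))  # noticeably smaller

Once texts are mapped to vectors, tasks such as classification, clustering or retrieval reduce to standard numerical computations of exactly this kind.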
07-11.04.2025 - Mini-course: Johannes Schmidt-Hieber (University of Twente), Statistical theory of deep learning
Department of Mathematics, Aula Dal Passo.
Schedule: Mon 14:00-16:30, Wed 14:00-16:30, Fri 14:00-16:30.
Abstract
Lecture 1. Intro and theory for shallow networks
- Perceptron convergence theorem
- Universal approximation theorem
- Approximation rates for shallow neural networks
- Barron spaces
- Advantages of additional hidden layers
- Deep ReLU networks
- Misclassification error for image deformation models
- Optimization in machine learning
- Weight balancing phenomenon
- Analysis of dropout
- Benign overfitting
- Grokking
May-June 2025 - PhD course: Quentin Berger (Université Sorbonne Paris Nord), Statistical mechanics and disordered systems
13:30-16:00 on May 26, 29 and June 5, 9, 12, 16, 19. Department of Mathematics, Aula Dal Passo.
Abstract
In the first part of the course, I will review some models of statistical mechanics and discuss a central question in the study of disordered systems, the so-called disorder relevance. This question is quite broad and mainly consists in determining whether the properties of a system are stable under small (random) perturbations or whether they are affected by the presence of a small noise.
For the rest of the course I will develop one example at length: the Directed Polymer Model. The Directed Polymer Model is based on a simple random walk interacting with a random environment, and it has seen incredible activity (and important progress) over the past decade. In fact, even though the model is very simply defined (it is a disordered version of the simple random walk), it exhibits a wide range of behaviours and in particular it undergoes a phase transition as the intensity of the disorder varies.
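For concreteness, a standard formulation of the model (written in a common notation that is not necessarily the one used in the lectures) reweights the paths of the random walk according to the energy they collect in the environment:

    % Standard definition of the directed polymer measure and partition function;
    % the notation is a common convention, not necessarily that of the course.
    \[
      \mathrm{d}\mathbf{P}_{N,\beta}^{\omega}(S)
      = \frac{1}{Z_{N,\beta}^{\omega}}
        \exp\Bigl(\beta \sum_{n=1}^{N} \omega_{n,S_n}\Bigr)\,
        \mathrm{d}\mathbf{P}(S),
      \qquad
      Z_{N,\beta}^{\omega}
      = \mathbf{E}\Bigl[\exp\Bigl(\beta \sum_{n=1}^{N} \omega_{n,S_n}\Bigr)\Bigr],
    \]
    where $(S_n)_{n\ge 0}$ is a simple random walk on $\mathbb{Z}^d$ with law $\mathbf{P}$,
    $(\omega_{n,x})$ is an i.i.d.\ family of random variables modelling the environment,
    and $\beta \ge 0$ tunes the intensity of the disorder.

In dimension d ≥ 3 a critical value of β separates a weak-disorder regime, where the walk remains diffusive, from a strong-disorder regime where the polymer measure concentrates on favourable regions of the environment; in dimensions 1 and 2 disorder is relevant for every β > 0.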
I will first describe important features of this model, for instance showing the presence of a phase transition and discussing the role of the dimension in the question of disorder relevance. I will then present very recent results on several fronts:
• the recent characterization of the phase transition in dimension d≥3,
• the construction of a disordered scaling limit and its relation to the Stochastic Heat Equation,
• the case of the critical dimension d=2.
