Schools and Workshops
Upcoming
Mathematical methods for high-dimensional data, Sapienza University of Rome (2025)
Summer school at the Department of Mathematics, Sapienza University of Rome, September 8-12, 2025.
I will give a mini-course on Dynamics and learning in recurrent neural networks, discussing the statistical mechanical theory of deterministic and stochastic recurrent networks.
Topics to be discussed:
- Equilibrium and dynamics of attractor network models
- Dynamical mean field theory for rate networks
- Learning algorithms for RNNs
Find a brief list of suggested readings at this link.
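To give a flavor of the rate-network dynamics covered in the course, here is a minimal simulation sketch. It is my own illustrative example, not part of the course material: it integrates the standard rate equations dx_i/dt = -x_i + Σ_j J_ij tanh(x_j) with random Gaussian couplings of variance g²/N, the classic setting in which dynamical mean field theory predicts a transition to chaotic rate fluctuations at g = 1.

```python
# Minimal sketch (illustrative only): Euler integration of a random rate network,
# dx_i/dt = -x_i + sum_j J_ij * tanh(x_j), with Gaussian couplings of variance g^2/N.
# For g > 1 the network shows chaotic rate fluctuations, as analyzed with
# dynamical mean field theory.
import numpy as np

rng = np.random.default_rng(0)
N, g = 1000, 1.5                                    # network size and coupling gain
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))    # random connectivity matrix

dt, T = 0.1, 200.0
steps = int(T / dt)
x = rng.normal(0.0, 1.0, size=N)                    # random initial condition

traj = np.empty((steps, 3))                         # record a few example units
for t in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))
    traj[t] = x[:3]

print("std of firing rates at final time:", np.tanh(x).std())
```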
Past schools and workshops
Molecular Biophysics at the transition state: from statistical mechanics to artificial intelligence, StatPhys29 Satellite Meeting, Trento, July 2024
An event that brought together young researchers and leading experts in computational biophysics, statistical mechanics, and machine learning to discuss the challenges and crossroads facing biomolecular simulations in the machine learning era.
Co-organized with Raffaello Potestio and his lab.
Junior Scientists Workshop on Recent Advances in Theoretical Neuroscience, ICTP, June 2024
A workshop we organized at the ICTP together with Francesca Mastrogiuseppe (SISSA), Sebastian Goldt (SISSA), and Agostina Palmigiano (Gatsby Computational Neuroscience Unit).
A great program featuring invited lectures and contributed talks by emerging scientists in Theoretical Neuroscience. The workshop showcased advances in dynamics, plasticity, and computation in neuronal circuits, with an emphasis on both mathematical tools and biological implications.
Latin American Summer School in Computational Neuroscience (LACONEU), Valparaiso (2023)
A summer school aimed at promoting the field of Computational Neuroscience in Latin America.
A fantastic teaching and learning experience with students from all over Latin America; I taught alongside Cristina Savin, Enzo Tagliazucchi, Alessandro Treves, and many others.
Find the Jupyter notebooks used in the tutorials for my lectures “The shortest possible introduction to learning in recurrent neural networks” at this link.