Summer school at Sapienza University of Rome, September 8-12, 2025
Short course on Dynamics and learning in recurrent neural networks
Lecture 1: Equilibrium theory of binary attractor networks
Based on:
- Amit, Modeling brain function: The world of attractor neural networks (1989)
- Coolen, Statistical mechanics of recurrent neural networks I — Statics (2001)
Recommended readings:
- Mézard, Parisi, Virasoro, Spin Glass Theory and Beyond: An Introduction to the Replica Method and Its Applications (1986)
- Castellani, Cavagna, Spin-glass theory for pedestrians (2005)
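The equilibrium theory covered in Lecture 1 centers on the Hopfield model: binary neurons with Hebbian couplings whose attractors are the stored patterns. A minimal retrieval sketch (network size, pattern count, and noise level are illustrative choices, not values from the course):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5  # network size and number of stored patterns (illustrative)

# Random binary patterns xi^mu in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Start from a corrupted copy of pattern 0 (flip 15% of the spins)
s = xi[0].copy()
s[rng.random(N) < 0.15] *= -1

# Zero-temperature asynchronous dynamics: s_i <- sign(h_i), h_i = sum_j J_ij s_j
for _ in range(10 * N):
    i = rng.integers(N)
    s[i] = 1 if J[i] @ s >= 0 else -1

# Overlap m = (1/N) sum_i xi_i^0 s_i approaches 1 when retrieval succeeds
m = (xi[0] @ s) / N
print(f"overlap with stored pattern: {m:.3f}")
```

At this low load (P/N = 0.025, well below the critical capacity ≈ 0.138 discussed in Amit's book), the corrupted state flows back to the stored pattern.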
Lecture 2: Dynamics of binary attractor networks
Based on:
Recommended readings:
- Fischer, Hertz, Spin Glasses (1993)
- Eissfeller, Opper, Mean-field Monte Carlo approach to the Sherrington-Kirkpatrick model with asymmetric couplings (1994)
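The dynamics in Lecture 2 are typically formulated as stochastic single-spin (Glauber) updates at finite temperature, which at zero noise reduce to the deterministic rule above. A minimal sketch tracking the overlap under noisy dynamics (parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, beta = 200, 5, 2.0  # size, patterns, inverse temperature (illustrative)

# Same Hebbian network as in the equilibrium setting
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

s = xi[0].copy()  # initialize on pattern 0
overlaps = []
for sweep in range(50):
    for _ in range(N):  # one sweep = N single-spin Glauber updates
        i = rng.integers(N)
        h = J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))  # P(s_i = +1 | local field h)
        s[i] = 1 if rng.random() < p_up else -1
    overlaps.append((xi[0] @ s) / N)

print(f"overlap after {len(overlaps)} sweeps: {overlaps[-1]:.3f}")
```

Below the critical temperature the overlap fluctuates around a value close to the mean-field solution m = tanh(beta * m) rather than decaying to zero.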
Lecture 3: Chaotic dynamics in random neural networks
Based on:
- Sompolinsky, Crisanti, Sommers, Chaos in Random Neural Networks (1988)
- Rajan, Abbott, Sompolinsky, Stimulus-dependent suppression of chaos in recurrent neural networks (2010)
- Helias, Dahmen, Statistical Field Theory for Neural Networks (2020)
Recommended readings:
- Hertz, Roudi, Sollich, Path integral methods for the dynamics of stochastic and disordered systems (2016)
- Roy, Biroli, Bunin, Cammarota, Numerical implementation of dynamical mean field theory for disordered systems: application to the Lotka–Volterra model of ecosystems (2019)
- Clark, Abbott, Litwin-Kumar, Dimension of Activity in Random Neural Networks (2023)
- Galla, Generating-functional analysis of random Lotka-Volterra systems: A step-by-step guide (2024)
- Clark, Abbott, Theory of Coupled Neuronal-Synaptic Dynamics (2024)
- Clark, Marschall, van Meegen, Litwin-Kumar, Connectivity structure and dynamics of nonlinear recurrent neural networks (2025)
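The Sompolinsky-Crisanti-Sommers model of Lecture 3 is a rate network dx/dt = -x + J tanh(x) with i.i.d. Gaussian couplings of variance g²/N; for gain g > 1 the zero fixed point destabilizes and the network is chaotic. A minimal Euler-integration sketch (size, gain, and step are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, dt, steps = 500, 1.5, 0.05, 2000  # size, gain, Euler step, step count

# Gaussian couplings with variance g^2 / N
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

x = rng.normal(0.0, 1.0, size=N)  # random initial condition
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))  # Euler step of dx/dt = -x + J tanh(x)

# For g > 1, activity stays O(1) instead of decaying to the trivial fixed point
q = np.mean(x**2)  # population-averaged squared activity
print(f"mean squared activity: {q:.3f}")
```

Rerunning with g < 1 makes q decay toward zero, which is the transition the dynamical mean-field analysis in the readings characterizes.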
Lecture 4: Training recurrent networks
Based on:
- Jaeger, Tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the echo state network approach (2002)
- Sussillo, Abbott, Generating Coherent Patterns of Activity from Chaotic Neural Networks (2009)
- Goodfellow, Bengio, Courville, Deep learning (2016)
Recommended readings:
- Sussillo, Barak, Opening the Black Box: Low-Dimensional Dynamics in High-Dimensional Recurrent Neural Networks (2013)
- Couillet, Wainrib, Tiomoko Ali, Sevi, A Random Matrix Approach to Echo-State Neural Networks (2016)
- Rivkind, Barak, Local Dynamics in Trained Recurrent Neural Networks (2017)
- Mastrogiuseppe, Ostojic, Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks (2018)
- Schuessler, Mastrogiuseppe, Dubreuil, Ostojic, Barak, The interplay between randomness and structure during learning in RNNs (2020)
- Fanthomme, Monasson, Low-Dimensional Manifolds Support Multiplexed Integrations in Recurrent Neural Networks (2021)
- Fournier, Urbani, Statistical physics of learning in high-dimensional chaotic systems (2023)
- Bordelon, Cotler, Pehlevan, Zavatone-Veth, Dynamically Learning to Integrate in Recurrent Neural Networks (2025)
- Hakim, Karma, Theory of Temporal Pattern Learning in Echo State Networks (2025)
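The echo-state approach covered in Lecture 4 keeps the recurrent weights fixed and trains only a linear readout, e.g. by ridge regression on collected reservoir states. A minimal sketch on a delayed-reconstruction task (reservoir size, spectral radius, delay, and regularizer are illustrative, not taken from the readings):

```python
import numpy as np

rng = np.random.default_rng(3)
N, T, washout = 300, 2000, 200  # reservoir size, samples, discarded transient

# Fixed random reservoir, rescaled to spectral radius 0.9 (echo-state condition)
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0.0, 1.0, size=N)

t = np.arange(T)
u = np.sin(0.1 * t)         # input signal
y = np.sin(0.1 * (t - 15))  # target: a delayed copy of the input

# Drive the reservoir and collect its states
X = np.zeros((T, N))
x = np.zeros(N)
for k in range(T):
    x = np.tanh(W @ x + w_in * u[k])
    X[k] = x

# Train the linear readout by ridge regression on post-washout states
Xw, yw = X[washout:], y[washout:]
lam = 1e-6  # ridge regularizer
w_out = np.linalg.solve(Xw.T @ Xw + lam * np.eye(N), Xw.T @ yw)

err = np.sqrt(np.mean((Xw @ w_out - yw) ** 2))
print(f"training RMSE: {err:.4f}")
```

FORCE learning (Sussillo, Abbott, 2009) replaces this batch regression with a recursive least-squares update applied online while the output is fed back into the network.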