Session 7
Date and Time:
Wednesday, February 25, 2026 - 9:00am to 10:30am
Location:
Fields Institute, Room 230
Abstract:
- Distribution-Dependent Rates for Multi-Distribution Learning
  Rafael Hanashiro, Patrick Jaillet
- From Continual Learning to SGD and Back: Better Rates for Continual Linear Models
  Itay Evron, Ran Levinstein, Matan Schliserman, Uri Sherman, Tomer Koren, Daniel Soudry, Nathan Srebro
- Beyond Discrepancy: A Closer Look at the Theory of Distribution Shift
  Robi Bhattacharjee, Nicholas Rittler, Kamalika Chaudhuri
- Efficient and Provable Algorithms for Covariate Shift
  Deeksha Adil, Jaroslaw Blasiok
- Multi-distribution Learning: From Worst-Case Optimality to Lexicographic Min-Max Optimality
  Guanghui Wang, Umar Syed, Robert E. Schapire, Jacob Abernethy
- PAC-Bayesian Analysis of the Surrogate Relation between Joint Embedding and Supervised Downstream Losses
  Theresa Wasserer, Maximilian Fleissner, Debarghya Ghoshdastidar
- Bridging Lifelong and Multi-Task Representation Learning via Algorithm and Complexity Measure
  Zhi Wang, Chicheng Zhang, Ramya Korlakai Vinayak