Agenda
• 9:00 to 9:55 – Fast Rates for Prediction with Costly or Limited Expert Advice (Gilles Blanchard - Université Paris-Saclay)
• 10:00 to 10:55 – Learning Dynamical Systems via Koopman Operator Regression in Reproducing Kernel Hilbert Spaces (Vladimir R. Kostic - Italian Institute of Technology)
• 10:55 to 11:30 – Coffee break
• 11:30 to 12:25 – Online Learning and Potential Functions (Yoav Freund - UCSD)
• 12:30 to 13:30 – Lunch Time at BCAM
• 13:30 to 14:25 – A Theory of Weak Supervision and Zero-Shot Learning (Eli Upfal - Brown University)
• 14:30 to 15:25 – The Role of Convexity in Data-Driven Decision-Making (Peyman Mohajerin Esfahani - Delft University of Technology)
• 9:00 to 9:55 – On how PAC-Bayesian Bounds Help to Better Understand (and Improve) Bayesian Machine Learning (Andrés Masegosa - Aalborg University)
• 10:00 to 10:55 – Convergence of Nearest Neighbour Classification (Sanjoy Dasgupta - UCSD)
• 10:55 to 11:30 – Coffee break
• 11:30 to 12:25 – Majorizing Measures, Codes, and Information (Maxim Raginsky - University of Illinois at Urbana-Champaign)
• 12:30 to 14:00 – Lunch Time at Second Floor - Iberdrola
• 14:00 to 14:55 – Beyond Empirical Risk Minimization (Santiago Mazuelas - BCAM)
• 15:00 to 15:55 – Generalization Bounds via Convex Analysis (Gergely Neu - Universitat Pompeu Fabra)
• 9:00 to 9:55 – Nonparametric Multiple-Output Center-Outward Quantile Regression (Eustasio del Barrio - Universidad de Valladolid)
• 10:00 to 10:55 – Robust and Fair Multisource Learning (Christoph Lampert - Institute of Science and Technology Austria)
• 10:55 to 11:30 – Coffee break
• 11:30 to 12:25 – Towards Collective Intelligence in Heterogeneous Learning (Krikamol Muandet - CISPA)
• 12:30 to 14:00 – Lunch Time at Second Floor - Iberdrola
• 14:00 to 14:55 – Stochastic Optimization for Large-Scale Machine Learning: Variance Reduction, Acceleration, and Robustness to Noise (Julien Mairal - Inria Grenoble)
• 15:00 to 15:55 – Non-convex Min-Max Optimization: Fundamental Limits, Acceleration, and Adaptivity (Niao He - ETH Zurich)
• 9:00 to 9:55 – Fast Rates for Noisy Interpolation Require Rethinking the Effects of Inductive Bias (Fanny Yang - ETH Zurich)
• 10:00 to 10:55 – Hyperplane Arrangement Classifiers in Overparameterized and Interpolating Settings (Clayton Scott - University of Michigan)
• 10:55 to 11:30 – Coffee break
• 11:30 to 12:25 – E is the New P: the E-value as a Generic Tool for Robust, Anytime Valid Hypothesis Testing and Confidence Intervals (Peter Grünwald - CWI & Leiden University)
• 12:30 to 13:25 – Parameter Identifiability and Structure Learning for Linear Gaussian Graphical Models (Carlos Amendola - Berlin Institute of Technology)
• 13:30 to 14:30 – Lunch Time at BCAM