

Event

Liu Ziyin (University of Tokyo): The Probabilistic Stability and Low-Rank Bias of SGD

Location: MPI für Mathematik in den Naturwissenschaften Leipzig, video broadcast

Video broadcast: Math Machine Learning seminar (MPI MIS + UCLA)

Conventionally, the stability of stochastic gradient descent (SGD) is understood through linear stability analysis, in which the mean and variance of the parameters or the gradients are examined to determine whether SGD is stable close to a stationary point. In this seminar, we discuss the limitations of linear stability theories and motivate a new notion of stability, which we call probabilistic stability. We first explain why this notion is especially suitable for understanding SGD at a large learning rate and a small batch size in toy problems. With this new notion of stability, we then study the implicit bias of SGD and show that SGD at a large learning rate converges to low-rank saddles in matrix factorization problems.

The talk is mainly based on the following two works:
[1] Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda. SGD with a Constant Large Learning Rate Can Converge to Local Maxima. ICLR 2022.
[2] The Probabilistic Stability of SGD. (tentative title, in preparation)
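A minimal NumPy sketch of the distinction, with assumed numbers rather than anything taken from the talk or the papers: SGD samples one of two quadratic losses per step. The variance-based linear criterion declares the stationary point unstable, while the sign of the Lyapunov exponent E[log|1 - eta*h|] predicts convergence in probability, even though the point is a local maximum of the averaged loss, in the spirit of [1].

import numpy as np

rng = np.random.default_rng(0)

# Toy construction (illustrative, not from the talk): SGD on per-sample
# losses L_i(theta) = 0.5 * h_i * theta**2, sampling one of two curvatures
# per step.  The averaged curvature is (0.9 - 2.0)/2 = -0.55 < 0, so
# theta = 0 is a local MAXIMUM of the full (averaged) loss.
eta = 1.0
curvatures = np.array([0.9, -2.0])
multipliers = 1.0 - eta * curvatures   # an SGD step multiplies theta by one of these

# Linear stability analysis tracks moments:
# E[m^2] = (0.1^2 + 3.0^2)/2 = 4.505 > 1, so the variance of theta
# explodes and theta = 0 looks unstable.
print("E[m^2]    =", np.mean(multipliers**2))

# Probabilistic stability tracks the Lyapunov exponent instead:
# E[log|m|] = (log 0.1 + log 3.0)/2, about -0.60 < 0, so theta -> 0
# in probability despite the diverging variance.
print("E[log|m|] =", np.mean(np.log(np.abs(multipliers))))

# Monte Carlo check: almost every trajectory collapses onto the maximum.
theta = np.ones(10_000)
for _ in range(200):
    h = rng.choice(curvatures, size=theta.shape)
    theta *= 1.0 - eta * h             # one SGD step per trajectory
print("fraction with |theta| < 1e-6:", np.mean(np.abs(theta) < 1e-6))

The point of the construction is that the sign of E[log|m|], not the size of E[m^2], governs convergence in probability; the two criteria disagree exactly in the large-learning-rate regime the talk targets.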

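The low-rank claim can be probed with a similar toy experiment. The sketch below is an assumption-laden harness, not the speakers' code: single-entry SGD on a random 10 x 10 matrix factorization target, swept over learning rates, reporting the numerical rank of U V^T. The dimensions, initialization scale, step count, and learning-rate grid are all guesses; the talk's prediction is that sufficiently large learning rates settle at saddles of reduced rank, with problem-dependent thresholds.

import numpy as np

rng = np.random.default_rng(1)

# Toy harness (illustrative assumptions throughout): batch-size-1 SGD on
# the entrywise factorization loss 0.5 * (u_i . v_j - A_ij)^2.
d, r = 10, 10
A = rng.normal(size=(d, d))            # generic full-rank target

def rank_after_sgd(lr, steps=50_000, tol=1e-3):
    U = 0.1 * rng.normal(size=(d, r))
    V = 0.1 * rng.normal(size=(d, r))
    for _ in range(steps):
        i, j = rng.integers(d), rng.integers(d)
        err = U[i] @ V[j] - A[i, j]    # residual on one sampled entry
        if not np.isfinite(err):       # very large lr can simply diverge
            return "diverged"
        # simultaneous update of row i of U and row j of V
        U[i], V[j] = U[i] - lr * err * V[j], V[j] - lr * err * U[i]
    if not (np.isfinite(U).all() and np.isfinite(V).all()):
        return "diverged"
    s = np.linalg.svd(U @ V.T, compute_uv=False)
    return int((s > tol).sum())        # numerical rank of the learned product

# Small lr should fit A (full rank); large enough lr should be attracted
# to a low-rank saddle.  Thresholds are problem-dependent, hence the sweep.
for lr in (0.01, 0.1, 0.3, 0.6, 1.0):
    print(f"lr={lr}: rank(U V^T) =", rank_after_sgd(lr))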

Start: Jan. 26, 2023, 5 p.m.

End: Jan. 26, 2023, 6:30 p.m.