Strange springs in many dimensions: how parametric resonance can explain divergence under covariate shift.

  • Author / Creator
    Banman, Kirby
  • Abstract
    Most convergence guarantees for stochastic gradient descent with momentum (SGDm) rely on independently and identically distributed (iid) data sampling. Yet SGDm is often used outside this regime, in settings with temporally correlated inputs such as continual learning and reinforcement learning. Existing work has shown that SGDm with a decaying step-size can converge under Markovian temporal correlation. In this work, we show that SGDm with a fixed step-size can be unstable and diverge under covariate shift. In particular, we show that SGDm under covariate shift is a parametric oscillator, and so can suffer from a phenomenon known as resonance. We characterize the learning system as a time-varying system of ordinary differential equations (ODEs), and leverage existing theory to characterize learning-system divergence/convergence as resonant/nonresonant modes of the ODE system. The theoretical result is limited to the linear setting with periodic covariate shift, so we supplement it empirically to show that resonance phenomena persist across other problem settings having non-periodic covariate shift, nonlinear dynamics with neural networks, and optimizers other than SGDm. [An illustrative numerical sketch of the resonance mechanism appears after this record.]

  • Subjects / Keywords
  • Graduation date
    Fall 2021
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-dt9j-ac65
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
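
For intuition about the mechanism the abstract describes, here is a minimal numerical sketch (hypothetical code with parameter values chosen for illustration, not the thesis experiments): heavy-ball SGDm minimizing the one-dimensional quadratic 0.5 * (x_t * w)^2 while the covariate scale x_t oscillates periodically. The iterate recursion is then a discrete damped parametric oscillator, so for some shift frequencies the iterates decay toward the minimizer while for others the envelope of |w_t| can grow despite the fixed step-size.

    import numpy as np

    def sgdm_trajectory(alpha=0.1, beta=0.99, omega=0.6, steps=2000):
        """Heavy-ball SGDm on the 1-D quadratic 0.5 * (x_t * w)**2 with a
        periodically shifting covariate scale x_t. The iterate obeys
            w_{t+1} = w_t - alpha * x_t**2 * w_t + beta * (w_t - w_{t-1}),
        a discrete damped parametric oscillator (Mathieu-type recursion).
        All parameter values here are illustrative, not from the thesis."""
        w_prev = w = 1.0
        norms = np.empty(steps)
        for t in range(steps):
            x = 1.0 + 0.5 * np.sin(omega * t)   # periodic covariate shift
            grad = (x ** 2) * w                 # gradient of 0.5 * (x * w)**2
            w, w_prev = w - alpha * grad + beta * (w - w_prev), w
            norms[t] = abs(w)
        return norms

    # Sweep the shift frequency: near a resonant frequency the envelope of
    # |w_t| grows geometrically (divergence) despite the fixed step-size;
    # away from resonance it decays. Which frequency bands resonate depends
    # on alpha, beta, and the modulation depth of the covariate shift.
    for omega in (0.1, 0.3, 0.6, 1.0):
        tail = sgdm_trajectory(omega=omega)[-100:]
        print(f"omega = {omega:3.1f}  mean |w| over last 100 steps: {tail.mean():.3e}")

Note that each "frozen" system (any fixed x) is a stable, contracting update for these parameter choices; growth, when it occurs, comes purely from the time variation of the covariate, which is the resonance effect the thesis analyzes.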