This decommissioned ERA site remains active temporarily to support our final migration steps to https://ualberta.scholaris.ca, ERA's new home. All new collections and items, including Spring 2025 theses, are at that site. For assistance, please contact erahelp@ualberta.ca.
Subject facets:
- Machine Learning (2)
- Centered Mirror Descent (1)
- Dynamic Regret (1)
- Meta-descent (1)
- Meta-gradient (1)
- Non-stationarity (1)
Fall 2024
Over the last decade, machine learning (ML) has led to advances in many fields, such as computer vision, online decision-making, robotics, natural language processing, and many others. The algorithms driving these successes typically have one or more user-specified free variables called...
Fall 2019
In this thesis, we investigate different vector step-size adaptation approaches for continual, online prediction problems. Vanilla stochastic gradient descent can be considerably improved by scaling the update with a vector of appropriately chosen step-sizes. Many methods, including AdaGrad,...
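The abstract above describes scaling the stochastic gradient descent update with a vector of per-parameter step-sizes, as methods like AdaGrad do. A minimal sketch of a diagonal AdaGrad-style update is below; the function and variable names are illustrative, not taken from the thesis itself:

```python
import numpy as np

def adagrad_update(w, grad, accum, lr=0.1, eps=1e-8):
    """One AdaGrad-style step: each parameter gets its own effective step-size.

    accum is the running sum of squared gradients; dividing the base
    learning rate by its square root yields a *vector* of step-sizes,
    so frequently-updated coordinates take smaller steps.
    """
    accum = accum + grad ** 2               # per-coordinate squared-gradient sum
    step = lr / (np.sqrt(accum) + eps)      # vector of per-parameter step-sizes
    w = w - step * grad                     # elementwise-scaled gradient step
    return w, accum

# Usage: minimize f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([1.0, -2.0])
accum = np.zeros_like(w)
for _ in range(100):
    w, accum = adagrad_update(w, w, accum)
```

Because each coordinate's accumulated squared gradient grows at its own rate, the effective step-size adapts per parameter rather than using a single global scalar.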