Fall 2019
In this thesis, we investigate different vector step-size adaptation approaches for continual, online prediction problems. Vanilla stochastic gradient descent can be considerably improved by scaling the update with a vector of appropriately chosen step-sizes. Many methods, including AdaGrad,...
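The idea of scaling the gradient update with a vector of per-coordinate step-sizes can be sketched with an AdaGrad-style rule, which divides a base step-size by the root of each coordinate's accumulated squared gradients. This is a minimal illustrative sketch, not the thesis's method; the function names and the toy quadratic objective are assumptions.

```python
import numpy as np

def adagrad_step(w, grad, accum, base_lr=0.1, eps=1e-8):
    """One AdaGrad-style update: accumulate squared gradients and
    scale each coordinate's step-size by 1/sqrt(accumulated sum)."""
    accum = accum + grad ** 2                    # per-coordinate squared-gradient history
    step = base_lr / (np.sqrt(accum) + eps)      # vector of step-sizes, one per parameter
    return w - step * grad, accum

# Toy problem: minimize 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([1.0, -2.0])
accum = np.zeros_like(w)
for _ in range(100):
    w, accum = adagrad_step(w, w, accum)
```

Coordinates with a history of large gradients receive smaller step-sizes, while rarely updated coordinates keep larger ones, which is the kind of vector adaptation the abstract refers to.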