
Parameter Search Transfer Learning

  • Author / Creator
    Singamsetti, Mohan Sai
  • Abstract
    Deep learning approaches have recently succeeded in many domains, particularly those with large amounts of training data. However, some domains lack a sufficient quantity of training data, or the training data that is available is of insufficient quality. Transfer learning can help in such low-data problems, but it still tends to assume access to sufficient source-domain data and a sufficient signal for transfer. In this work, we propose a novel transfer learning approach called Parameter Search Transfer Learning (PSTL), which searches directly over the parameters of a neural network in order to minimize the impact of scarce training samples in both the source and target domains. Across reinforcement learning (RL), regression, and classification tasks, we demonstrate that PSTL meets or exceeds the performance of transfer learning baselines, which we hypothesize is due to its ability to identify a better gradient.

  • Subjects / Keywords
  • Graduation date
    Fall 2023
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-34hf-ht23
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
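The abstract describes searching directly over a network's parameters, starting from source-domain weights, rather than relying on gradient descent alone. The sketch below illustrates the general idea with a generic random local search on a toy linear model; it is an assumption-laden illustration, not the PSTL algorithm from the thesis (the function names, the hill-climbing acceptance rule, and the linear stand-in for a neural network are all hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(w, X, y):
    # Loss of a linear model y ~ X @ w (a toy stand-in for a small network).
    return float(np.mean((X @ w - y) ** 2))

def parameter_search(w_source, X, y, steps=200, sigma=0.1):
    """Generic hill-climbing search over parameters, initialized at the
    source-domain weights. Illustrative only: the thesis does not specify
    its search procedure here, so this is NOT the actual PSTL method."""
    w_best, loss_best = w_source.copy(), mse(w_source, X, y)
    for _ in range(steps):
        # Propose a random perturbation of the current best parameters.
        candidate = w_best + sigma * rng.standard_normal(w_best.shape)
        loss = mse(candidate, X, y)
        if loss < loss_best:
            # Greedy acceptance: keep only strict improvements.
            w_best, loss_best = candidate, loss
    return w_best, loss_best

# Hypothetical low-data target task, with imperfect source-domain weights.
w_true = np.array([2.0, -1.0])
X = rng.standard_normal((20, 2))   # only 20 target-domain samples
y = X @ w_true
w_source = w_true + 0.5            # transferred weights, slightly off
w_final, loss_final = parameter_search(w_source, X, y)
```

By construction the search never accepts a worse candidate, so `loss_final` is at most the loss of the transferred source weights; whether such a search actually beats gradient-based fine-tuning in low-data regimes is the question the thesis investigates empirically.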