Theses and Dissertations
This collection contains theses and dissertations by graduate students of the University of Alberta. It includes a large number of electronically available theses granted from 1947 to 2009, about 90% of theses granted from 2009 to 2014, and 100% of theses granted from April 2014 to the present (except theses under temporary embargo by agreement with the Faculty of Graduate and Postdoctoral Studies). IMPORTANT NOTE: To conduct a comprehensive search of all University of Alberta theses held in University of Alberta Libraries collections, search the library catalogue at www.library.ualberta.ca - you may search by Author, Title, Keyword, or Department.
To retrieve all theses and dissertations associated with a specific department from the library catalogue, choose 'Advanced' and run a keyword search such as "university of alberta dept of english" OR "university of alberta department of english" (for example). Past graduates who wish to have their thesis or dissertation added to this collection can contact us at erahelp@ualberta.ca.
Items in this Collection
- Machine Learning (2)
- Car-Parrinello Molecular Dynamics (1)
- Computational Chemistry (1)
- Domain adaptation (1)
- Long-Short Term Memory (1)
- Medical applications (1)
- Machine learning for medical applications with limited data: Incorporating domain expertise and addressing domain-shift
  Fall 2022
… labels used during training, and (3) differences between the distributions that generated the training and test data. This dissertation focuses on strategies for effectively applying machine learning under these circumstances. For learning models from a limited number of labeled instances, we propose …
… the labels, we use probabilistic graphical models. Instead of providing a point estimate, probabilistic models predict an entire probability distribution, which accounts for the uncertainty in the data. Probabilistic models are a key component of the probabilistic labels mentioned above, and they also …
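The point-estimate versus distributional-prediction distinction in the excerpt above can be sketched with a minimal Gaussian summary of noisy observations (a generic illustration only; it is not the dissertation's graphical models, and all names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy observations of an unknown quantity
# (e.g. a label collected from several raters)
obs = rng.normal(2.0, 0.5, size=30)

# Point estimate: a single number, with no notion of confidence
point = obs.mean()

# Distributional summary: mean and standard error of a Gaussian,
# assuming a Gaussian likelihood (a sketch of the general idea)
mu = obs.mean()
sigma = obs.std(ddof=1) / np.sqrt(len(obs))

# The distribution lets us quantify uncertainty, e.g. a 95% interval
lo, hi = mu - 1.96 * sigma, mu + 1.96 * sigma
print(f"point estimate: {point:.2f}")
print(f"95% interval:   [{lo:.2f}, {hi:.2f}]")
```

The interval, unlike the bare point estimate, tells a downstream consumer how much to trust the prediction — the property the excerpt attributes to probabilistic models.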
… the data used during inference -- in particular, the test set might not follow the same probability distribution that generated the training data. This means that a predictor learned from one dataset might do poorly when applied to a second dataset. This problem is known as batch effects or dataset …
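The dataset-shift problem the excerpt describes can be illustrated with a toy threshold classifier (a hypothetical sketch, not the dissertation's method): a decision boundary fit on one distribution degrades sharply when the test data are shifted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: class 0 ~ N(0, 1), class 1 ~ N(3, 1)
X_train = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])
y_train = np.concatenate([np.zeros(500), np.ones(500)])

# Nearest-mean classifier: threshold halfway between the class means
threshold = (X_train[y_train == 0].mean() + X_train[y_train == 1].mean()) / 2

def predict(x):
    return (x > threshold).astype(int)

def accuracy(x0, x1):
    # Balanced accuracy over a class-0 sample and a class-1 sample
    return (np.mean(predict(x0) == 0) + np.mean(predict(x1) == 1)) / 2

# In-distribution test set: drawn from the same distributions
acc_iid = accuracy(rng.normal(0, 1, 500), rng.normal(3, 1, 500))

# Shifted test set: both classes shifted by +2 (a "batch effect")
acc_shift = accuracy(rng.normal(2, 1, 500), rng.normal(5, 1, 500))

print(f"in-distribution accuracy: {acc_iid:.2f}")
print(f"shifted-domain accuracy:  {acc_shift:.2f}")
```

The learned threshold sits near 1.5, so after the +2 shift most class-0 points fall on the wrong side of it — the same predictor, unchanged, loses most of its accuracy.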
- Spring 2024
… locations. A binary relevance 3D CNN-LSTM autoencoder, employing different loss functions, showed marginal improvement but struggled to predict probability locations over a large horizon. Models trained on principal component analysis (PCA)-transformed and dynamic PCA (DPCA)-transformed data showed promise in training but failed in testing. Models trained on PDFs without "dead voxels" (zero-probability voxels independent of time) and atomic Cartesian coordinates performed well during training but encountered challenges in testing due to teacher forcing. Teacher forcing is a training method that can …
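The train/test gap attributed to teacher forcing in the excerpt above can be sketched with a toy autoregressive model (a generic illustration, not the thesis's CNN-LSTM architecture): training always conditions on the true previous step, so an error that is tiny one step ahead compounds when the model must consume its own predictions over a long rollout:

```python
import numpy as np

# Toy one-step model: predict x[t+1] from x[t] with a linear coefficient.
# True dynamics: x[t+1] = 0.9 * x[t]
seq = 0.9 ** np.arange(20)  # true trajectory starting at 1.0

# "Training" with teacher forcing: the model always sees the TRUE x[t],
# so a least-squares fit recovers the coefficient (~0.9)
a = np.sum(seq[:-1] * seq[1:]) / np.sum(seq[:-1] ** 2)

# Pretend the fit was slightly off, as it would be with noisy data
a_noisy = a + 0.05

# Inference WITHOUT teacher forcing: feed predictions back in
x = seq[0]
rollout = [x]
for _ in range(len(seq) - 1):
    x = a_noisy * x  # model consumes its own previous output
    rollout.append(x)
rollout = np.array(rollout)

# One-step error (teacher-forced conditions) stays small ...
one_step_err = np.max(np.abs(a_noisy * seq[:-1] - seq[1:]))
# ... but the free-running rollout error grows over the horizon
rollout_err = np.abs(rollout[-1] - seq[-1])
print(f"max one-step error: {one_step_err:.3f}")
print(f"rollout error at horizon: {rollout_err:.3f}")
```

Under teacher forcing the worst one-step error is 0.05, yet the free-running rollout drifts several times further from the true trajectory by the end of the horizon — the same mismatch between training and testing conditions the excerpt reports.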
… This work focuses on the development of machine learning (ML) models as proxy models for Car-Parrinello molecular dynamics (CPMD) metadynamics simulations in condensed-phase biomass reactions. Explicit-solvation CPMD metadynamics simulation data of HMF undergoing protonation in a solution of dimethyl …