Theses and Dissertations

This collection contains theses and dissertations by graduate students of the University of Alberta. It includes a large number of electronically available theses granted from 1947 to 2009, 90% of theses granted from 2009 to 2014, and 100% of theses granted from April 2014 to the present (excluding theses under temporary embargo by agreement with the Faculty of Graduate and Postdoctoral Studies). IMPORTANT NOTE: To conduct a comprehensive search of all UofA theses granted and held in University of Alberta Libraries collections, search the library catalogue at www.library.ualberta.ca; you may search by Author, Title, or Keyword, or search by Department.
To retrieve all theses and dissertations associated with a specific department from the library catalogue, choose 'Advanced' and run a keyword search for "university of alberta dept of english" OR "university of alberta department of english" (for example). Past graduates who wish to have their thesis or dissertation added to this collection can contact us at erahelp@ualberta.ca.

Items in this Collection

Results for "Probability Distributions on a Circle"

  • Fall 2022

    Vega Romero, Roberto Ivan

labels used during training, and (3) differences between the distributions that generated the training and test data. This dissertation focuses on strategies for effectively applying machine learning under these circumstances. For learning models from a limited number of labeled instances, we propose

    the labels, we use probabilistic graphical models. Instead of providing a point-estimate, probabilistic models predict an entire probability distribution, which accounts for the uncertainty in the data. Probabilistic models are a key component of the probabilistic labels mentioned above, and they also

the data used during inference -- in particular, the test set might not follow the same probability distribution that generated the training data. This means that a predictor learned from one dataset might perform poorly when applied to a second dataset. This problem is known as batch effects or dataset

  • Spring 2024

    Mao, Yiren

    locations. A binary relevance 3D CNN-LSTM autoencoder, employing different loss functions, showed marginal improvement but struggled to predict probability locations over a large horizon. Models trained on principal component analysis (PCA)-transformed and dynamic PCA (DPCA)-transformed data showed promise

in training but failed in testing. Models trained on PDFs without "dead voxels" (zero-probability voxels independent of time) and atomic Cartesian coordinates performed well during training but encountered challenges in testing due to teacher forcing. Teacher forcing is a training method that can

This work focuses on the development of machine learning (ML) models as proxy models for Car-Parrinello molecular dynamics (CPMD) metadynamics simulations in condensed-phase biomass reactions. Explicit solvation CPMD metadynamics simulation data of HMF undergoing protonation in a solution of dimethyl

1 - 2 of 2