Theses and Dissertations

This collection contains theses and dissertations of graduate students of the University of Alberta. It includes a large number of electronically available theses granted from 1947 to 2009, 90% of theses granted from 2009 to 2014, and 100% of theses granted from April 2014 to the present (except theses under temporary embargo by agreement with the Faculty of Graduate and Postdoctoral Studies). IMPORTANT NOTE: To conduct a comprehensive search of all University of Alberta theses granted and held in University of Alberta Libraries collections, search the library catalogue at www.library.ualberta.ca by Author, Title, Keyword, or Department.
To retrieve all theses and dissertations associated with a specific department from the library catalogue, choose 'Advanced' and keyword search "university of alberta dept of english" OR "university of alberta department of english" (for example). Past graduates who wish to have their thesis or dissertation added to this collection can contact us at erahelp@ualberta.ca.

Items in this Collection


Results for "Probability Distributions on a Circle"

  • Fall 2016

    Feng, Chi

    analysis of single-cell multi-user massive MIMO downlink. Perfect channel state information (CSI) is assumed at the BS and maximum ratio transmission (MRT) precoding scheme is adopted. We first investigate the distribution of the interference power and derive its probability density function (pdf) by

central limit theory. After that, analytical results on the outage probability and the sum-rate are derived. Unlike existing work that uses the law of large numbers to derive the asymptotic deterministic signal-to-interference-plus-noise ratio (SINR), the randomness of the interference in the SINR is

kept intact in our work, which allows the derivation of the outage probability. We further extend the analysis to networks with a per-antenna power constraint. A modified MRT precoding scheme is proposed and its performance is analyzed. Our work shows that the modified MRT precoding can achieve

  • Fall 2022

    Vega Romero, Roberto Ivan

labels used during training, and (3) Differences between the distributions that generated the training and test data. This dissertation focuses on strategies for effectively applying machine learning under these circumstances. For learning models from a limited number of labeled instances, we propose

    the labels, we use probabilistic graphical models. Instead of providing a point-estimate, probabilistic models predict an entire probability distribution, which accounts for the uncertainty in the data. Probabilistic models are a key component of the probabilistic labels mentioned above, and they also

    the data used during inference -- in particular, the test set might not follow the same probability distribution that generated the training data. This means that a predictor learned from one dataset might do poorly when applied to a second dataset. This problem is known as batch effects or dataset

  • Fall 2013

    Macciotta, Renato

    through a Monte Carlo simulation technique, and the outcome of the analysis is a probability distribution of the estimated risk. This methodology shows the potential for evaluating the uncertainties related to risk estimations. The full potential of the risk management framework is best met with the

carried out for two case histories, where the analyses' input parameters are specified as probability distributions rather than fixed values. The probability distributions of the input parameters cover the range of values believed realistic for each input parameter. The risk is then estimated

establishment of risk evaluation criteria. The other objective of this work focuses on the development of risk evaluation criteria. It is not the intention of this work to develop case-specific criteria, as this responsibility should lie with owners and regulators, but to propose a framework for developing the

  • Fall 2023

    Kuan, Li-Hao

    survival time for some patients. In general, an ISD model maps each patient x to his/her survival distribution, which is the probability that patient x will survive until time t, for each t > 0. We focus on discrete-time ISD models, which partition the future time into multiple time intervals and then

    Given a patient's description, a survival prediction model estimates that patient's survival time. We consider the challenge of learning an individual survival distribution (ISD) model from a dataset that includes censored training instances – i.e., data that provides only the lower bound of the

    apply machine learned regressors to estimate the survival probability in each time interval. These discrete-time ISD models can usually use fewer parameters than continuous models to describe different shapes of survival distributions by discretizing the survival time. We compare four survival models

  • Spring 2016

    Usman, Iram

simulation studies to investigate right censoring. Different datasets from the exponential, Weibull, log-normal, and gamma probability distributions have been generated in order to test the robustness of the SSSs. Three differential censoring settings were imposed on the generated datasets to test the

time to events if the SSS uses an appropriate distribution. Other authors have proposed the exponential and Weibull distributions for the event times. We have established the log-Weibull distribution as a new and alternative approach for the SSS, and compared and contrasted the three distributions through

    The spatial scan statistic (SSS) has been used for the identification of geographical clusters of higher than expected numbers of cases of a condition such as an illness. Disease outbreaks in a geographic area are a typical example. These statistics can also identify geographic areas with longer

  • Fall 2014

    Soliman, Samy Soliman Shokry Botros

developed to obtain novel, exact analytical expressions for the probability density function (PDF) and the cumulative distribution function (CDF) of the instantaneous end-to-end signal-to-noise ratio (SNR) of variable gain AF relaying systems operating over Rayleigh, Nakagami-m and Rician fading

    framework for exact analysis of generic multihop cooperative relaying systems. This framework is valid for any modulation scheme, any fading channel distribution and any number of relays. The GTCF method is used in the thesis to obtain exact solutions for the ergodic capacity, outage probability and the

    average symbol error probability of multihop AF relaying systems. A strength of the GTCF approach is that it can be used with tractable computational effort. The thesis shows the cases where the strength of the GTCF method is paramount, and identifies as well the cases where the use of the GTCF method

  • Fall 2011

    Gong, Jiafen

    model from a birth-death process. The calculation of this NTCP model provides an alternative proof to a formula derived by Hanin (Hanin, 2004) to compute the probability distribution of the tumor size from its generating function. My formula is computationally more efficient, compared to Hanin’s

    , used for quantifying normal tissue complication. In this thesis, I begin with a simple Poisson TCP based on mean cell population dynamics. Optimal treatment schedules are obtained by maximizing this TCP while constraining the CRE under a given threshold. Some of the optimal results suggest the usage

    Cancer is one of the major causes of death in the world. In the field of Oncology, clinical trials form the crux of medical effort to find better treatment schedules. These trials are expensive, time consuming, and carry great risks for the patients involved. Mathematical models provide a

  • Fall 2019

    Rajabpour Ashkiki, Alireza

of particle size distribution and composition, on the trommel's screening performance during full-scale operation throughout the year. Also investigated was the impact of clogging of screen apertures on the screening of material. The second set of objectives was defined to characterize the operation

performance of the waste processing system, with a primary focus on the trommel, using system analysis methods including system availability, maintainability and throughput. A two-stage trommel with 5 cm and 23 cm screens, respectively, was evaluated in this study. The trommel design capacity was 55 tonnes

    compositional analyses. Separation efficiency and recovery results verified that the performance of the first stage varied seasonally, primarily due to changes in the particle size distribution of the feedstock; secondly, because of a greater feed rate. The seasonal variation in the compostable fraction of the

  • Fall 2024

    Kammammettu, Sanjula

    years. The optimal transport problem seeks to transport probability mass from one probability distribution to another at the least total cost. This thesis uses this underlying concept in three main ways. Firstly, the optimal transport distance is used as a measure of similarity between probability

    ambiguity on multimodal uncertainty that is modeled as a Gaussian mixture. An optimal transport variant for Gaussian mixtures is further used to construct an ambiguity set of distributions around this reference model, and a tractable formulation is presented. The superior performance of this proposed

    formulation is contrasted with the established Wasserstein method on an illustrative study, as well as on a portfolio optimization problem. The thesis then uses the proposed formulation to tackle chance-constrained optimization in a distributionally robust setting, wherein the worst-case expected constraint

  • Spring 2016

    Rastpour, Amir

    algorithm such that we can calculate the stationary probabilities with a desired error tolerance---current methods do not provide bounds on the stationary probabilities. Essay 3: We propose a tool to accurately predict the number of heart attack patients in sufficiently small geographical areas of Alberta

    handle new emergency calls. We propose a simple recursion to calculate the expected duration of ambulance shortage periods and validate our recursion with data from Calgary, Canada, EMS. We develop analytical results for the second and higher moments, distribution, and Laplace transform of the shortage

probability mass in the truncated upper tail is guaranteed to be smaller than a pre-specified value. This method can potentially replace the currently used heuristics exploited within algorithms that truncate the system first and then calculate its performance measures. (2) We extend an existing

Results 11–20 of 92