Directly Learning Predictors on Missing Data with Neural Networks

  • Author / Creator
    Awwal, Alvina
  • Abstract
    The problem of missing data is omnipresent in a wide range of real-world datasets. When learning and predicting on such data with neural networks, the typical strategy is to fill in or complete the missing values, an approach called impute-then-regress. It is much less common to learn neural networks directly on the missing data, without imputing; one such approach, NeuMiss, introduces a novel layer into the network but can be finicky to train. In this work, we explore two simple augmentations that allow standard neural network architectures to be used: concatenating a missingness indicator to the input, and introducing synthetic missingness. Synthetic missingness involves masking additional input attributes; this simple data augmentation technique expands the dataset but, surprisingly, has not been explored. We show that both augmentations improve prediction performance across several datasets and levels of missingness.

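The following is a minimal NumPy sketch of the two input augmentations described in the abstract, not the thesis's actual implementation: the zero fill value, the 20% extra masking rate, and the function names are illustrative assumptions.

```python
# Illustrative sketch (not the thesis's code) of the two augmentations:
# (1) concatenating a missingness indicator, (2) synthetic missingness.
import numpy as np

rng = np.random.default_rng(0)


def concat_missingness_indicator(X):
    """Replace NaNs with a placeholder (zero here) and append a binary mask
    of which entries were observed, so a standard network can consume it."""
    mask = (~np.isnan(X)).astype(X.dtype)   # 1 = observed, 0 = missing
    X_filled = np.nan_to_num(X, nan=0.0)    # placeholder fill for missing entries
    return np.concatenate([X_filled, mask], axis=1)


def add_synthetic_missingness(X, extra_rate=0.2):
    """Data augmentation: randomly mask additional observed attributes
    (completely at random, at a hypothetical 20% rate) to expand the
    training data with extra missingness patterns."""
    X_aug = X.copy()
    drop = rng.random(X.shape) < extra_rate
    X_aug[drop & ~np.isnan(X)] = np.nan
    return X_aug


# Toy usage: a batch where some values are already missing.
X = np.array([[1.0, np.nan, 3.0],
              [4.0, 5.0, np.nan]])
X_aug = add_synthetic_missingness(X)        # extra masked entries
X_in = concat_missingness_indicator(X_aug)  # network input: values + mask
print(X_in.shape)                           # (2, 6)
```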
  • Subjects / Keywords
  • Graduation date
    Fall 2023
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-vnf4-2143
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.