A Universal Approximation Theorem for Tychonoff Spaces with Application to Spaces of Probability and Finite Measures
-
- Author / Creator
- Richard, Daniel
-
Universal approximation refers to the ability of a class of functions to approximate every continuous function to arbitrary accuracy. Past literature has shown that neural networks are dense in the continuous functions on compact subsets of finite-dimensional spaces; this thesis extends those results to non-compact and infinite-dimensional spaces using homeomorphism methods. The first result is a universal approximation theorem for Tychonoff spaces, in which the input to the neural network is drawn from a Tychonoff space. The theorem shows that neural networks can approximate, arbitrarily well in the sup metric, functions that are uniformly continuous with respect to a unique uniformity. The proof relies on constructing a homeomorphism from a collection of real-valued functions defined on the space that collectively separate and strongly separate points. When only countably many such functions are required, the Tychonoff space is shown to be metrizable. The second result, obtained as a consequence of the Tychonoff result, is a universal approximation theorem for spaces of positive finite measures. The motivation for this second result comes from particle filtering, where the goal is to make a decision based on the state distribution. We also show that neural networks on positive finite measures are a generalization of deep sets.
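For context, one standard formulation of the compact-case result that the abstract refers to is the following (cf. Cybenko 1989, Hornik 1991, and Leshno et al. 1993; the exact statement used in the thesis may differ):

  \begin{theorem}[Classical universal approximation]
  Let $K \subset \mathbb{R}^n$ be compact and let
  $\sigma : \mathbb{R} \to \mathbb{R}$ be a continuous, non-polynomial
  activation. Then for every continuous $f : K \to \mathbb{R}$ and every
  $\varepsilon > 0$ there exist $N \in \mathbb{N}$,
  $c_i, b_i \in \mathbb{R}$, and $w_i \in \mathbb{R}^n$ such that
  \[
    \sup_{x \in K} \left| f(x) - \sum_{i=1}^{N} c_i \,
        \sigma\!\left( \langle w_i, x \rangle + b_i \right) \right|
    < \varepsilon .
  \]
  \end{theorem}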
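To illustrate the deep-sets connection mentioned in the abstract: a deep-sets network computes rho(sum_i phi(x_i)), and that sum can be read as integrating phi against the counting measure of the input set; replacing the counting measure with an arbitrary finite positive measure (e.g. weighted particles from a particle filter) is one way to see networks on measures as the more general object. A minimal sketch, assuming generic NumPy code with toy fixed weights (all names here are illustrative, not from the thesis):

  import numpy as np

  def phi(x):
      # Illustrative per-element feature map (a tiny one-layer network).
      W = np.array([[1.0, -0.5], [0.3, 0.8], [-0.2, 0.1]])  # toy weights
      b = np.array([0.1, 0.0, -0.1])
      return np.tanh(W @ x + b)

  def rho(z):
      # Illustrative readout applied to the aggregated features.
      v = np.array([0.5, -1.0, 0.25])
      return float(v @ z)

  def deep_sets(points):
      # Deep sets: f(X) = rho(sum_i phi(x_i)); permutation-invariant
      # by construction.
      return rho(sum(phi(x) for x in points))

  def measure_network(points, weights):
      # Generalization sketch: integrate phi against the finite positive
      # measure mu = sum_i weights[i] * delta_{x_i} (e.g. particle-filter
      # weights), i.e. f(mu) = rho(integral of phi d(mu)). With unit
      # weights this reduces to the deep-sets sum above.
      return rho(sum(w * phi(x) for w, x in zip(weights, points)))

  X = [np.array([0.2, -0.4]), np.array([1.0, 0.5]), np.array([-0.3, 0.7])]
  print(deep_sets(X))                         # counting measure
  print(measure_network(X, [1.0, 1.0, 1.0]))  # same value as deep_sets(X)
  print(measure_network(X, [0.5, 0.3, 0.2]))  # general positive measure

-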
- Graduation date
- Fall 2022
-
- Type of Item
- Thesis
-
- Degree
- Master of Science
-
- License
- This thesis is made available by the University of Alberta Library with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.