




This file is in the following communities:

Faculty of Graduate Studies and Research


This file is in the following collections:

Theses and Dissertations

Numeric Tensor Framework: Toward a New Paradigm in Technical Computing (Open Access)


Subject/keyword
Computer Vision
Technical Computing
High-Dimensional Data
Thomas S. Kuhn
Einstein Notation
Tensor Algebra
Tensor Computations
Type of item
Thesis
Degree grantor
University of Alberta
Author or creator
Harrison, Adam P
Supervisor and department
Joseph, Dileepan (Electrical and Computer Engineering)
Examining committee member and department
Sadayappan, P. (Computer Science and Engineering, Ohio State University)
Zhao, H. Vicky (Electrical and Computer Engineering)
Boulanger, Pierre (Computing Science)
Jagersand, Martin (Computing Science)
Department
Department of Electrical and Computer Engineering
Specialization
Signal and Image Processing
Date accepted
Graduation date
Degree
Doctor of Philosophy
Degree level
Doctoral
Abstract
Technical computing is a cornerstone of modern scientific practice. Within technical computing, the matrix-vector (MV) framework, composed of MV algebra and MV software, dominates the discipline in representing and manipulating linear mappings applied to vectors. Indeed, prominent technical computing packages, e.g., MATLAB, revolve around the MV framework. Applying Thomas S. Kuhn's theory of paradigms, the MV framework is technical computing's paradigm. One may then reasonably ask whether the MV paradigm imposes significant restrictions on technical computing's practice. This question may be answered by synthesising the literature on widespread and disparate research efforts on frameworks beyond the MV paradigm. Two categories of anomalous practice emerge, namely special linear mappings, i.e., high-dimensional and entrywise linear mappings, and mappings beyond linear, i.e., polynomial and multilinear mappings. To tackle these anomalies, a framework for numeric tensors (NTs), i.e., high-dimensional data invested with arithmetic operations, proves well-equipped. The proposed NT framework uses an NT algebra that exploits and extends the storied Einstein notation, offering unmatched capabilities, e.g., N-dimensional operators, associativity, commutativity, entrywise products, and linear invertibility, complemented by distinct ease-of-use. This expressiveness is comprehensively supported by innovative NT software, embodied by open-source C++ and MATLAB libraries. Novelties include a lattice data structure, which can execute or invert any NT product, of any dimensions, using optimised algorithms. Regarding sparse NT computations, which are essential to address the curse of dimensionality, the software takes new approaches for data storage, rearrangement, and multiplication. Moreover, the software performs competitively on representative benchmarks, matching or surpassing leading competitors, including the MATLAB Tensor Toolbox, NumPy, FTensor, and Blitz++, while providing a more general set of arithmetic operations. To illustrate these contributions, two original problems from computer vision are solved using the NT framework. The selected exemplars, concerning image segmentation and depth-map estimation, involve high-dimensional differential operators, linking them to the partial-differential equations found in countless other disciplines. Returning to Kuhn, the contributions of this thesis, literature review included, help make a case that technical computing is experiencing a revisionary period. As such, the NT framework, with its expressive algebra and innovative software, represents a timely and significant contribution to the evolution of technical computing's paradigm.
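To make concrete the distinction the abstract draws between Einstein-notation contraction and entrywise products, the following is a minimal, self-contained C++ sketch. It is an illustration of the underlying arithmetic only: the Tensor2 type and the contract and entrywise helpers are hypothetical names introduced here, and they do not reflect the API of the thesis's open-source C++ or MATLAB libraries.

// Illustrative sketch only: a plain C++ rendering of two operations the
// abstract attributes to the NT algebra -- an Einstein-notation contraction
// C_ik = A_ij B_jk and an entrywise product D_ij = A_ij E_ij.
// Not the thesis's library API; Tensor2, contract, and entrywise are
// hypothetical helpers invented for this example.
#include <cstddef>
#include <iostream>
#include <vector>

// Dense row-major matrix stored in a flat vector.
struct Tensor2 {
    std::size_t rows, cols;
    std::vector<double> data;
    Tensor2(std::size_t r, std::size_t c) : rows(r), cols(c), data(r * c, 0.0) {}
    double& operator()(std::size_t i, std::size_t j) { return data[i * cols + j]; }
    double operator()(std::size_t i, std::size_t j) const { return data[i * cols + j]; }
};

// Einstein-style contraction over the shared index j: C_ik = A_ij B_jk.
Tensor2 contract(const Tensor2& A, const Tensor2& B) {
    Tensor2 C(A.rows, B.cols);
    for (std::size_t i = 0; i < A.rows; ++i)
        for (std::size_t j = 0; j < A.cols; ++j)      // repeated index, summed implicitly
            for (std::size_t k = 0; k < B.cols; ++k)
                C(i, k) += A(i, j) * B(j, k);
    return C;
}

// Entrywise (Hadamard) product: D_ij = A_ij E_ij; no index is summed.
Tensor2 entrywise(const Tensor2& A, const Tensor2& E) {
    Tensor2 D(A.rows, A.cols);
    for (std::size_t i = 0; i < A.rows * A.cols; ++i)
        D.data[i] = A.data[i] * E.data[i];
    return D;
}

int main() {
    Tensor2 A(2, 3), B(3, 2), E(2, 3);
    for (std::size_t i = 0; i < A.data.size(); ++i) A.data[i] = double(i + 1);
    for (std::size_t i = 0; i < B.data.size(); ++i) B.data[i] = double(i + 1);
    for (std::size_t i = 0; i < E.data.size(); ++i) E.data[i] = 2.0;

    Tensor2 C = contract(A, B);    // C_ik = A_ij B_jk
    Tensor2 D = entrywise(A, E);   // D_ij = A_ij E_ij
    std::cout << "C(0,0) = " << C(0, 0) << ", D(0,0) = " << D(0, 0) << "\n";
    return 0;
}

Under the Einstein convention, the repeated index j in A_ij B_jk is summed implicitly, whereas the entrywise product pairs matching indices without summation; expressing both, for operands of arbitrary dimensionality, is among the capabilities the abstract claims for the NT algebra.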
This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for the purpose of private, scholarly or scientific research. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
Citation for previous publication
A. P. Harrison, N. Birkbeck, and M. Sofka, “IntellEditS: Intelligent Learning-Based Editor of Segmentations,” in Medical Image Computing and Computer-Assisted Intervention - MICCAI 2013, ser. Lecture Notes in Computer Science, K. Mori, I. Sakuma, Y. Sato, C. Barillot, and N. Navab, Eds. Springer Berlin Heidelberg, 2013, vol. 8151, pp. 235-242;
A. P. Harrison and D. Joseph, “Maximum Likelihood Estimation of Depth Maps Using Photometric Stereo,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 7, pp. 1368-1380, 2012;
A. P. Harrison and D. Joseph, “Depth-Map and Albedo Estimation with Superior Information-Theoretic Performance,” in Image Processing: Machine Vision Applications VIII, ser. Proceedings of the SPIE, E. Y. Lam and K. S. Niel, Eds. SPIE, 2015, vol. 9405, pp. 94050C-94050C15.

File Details

Date Uploaded
Date Modified
File format: pdf (PDF/A)
Mime type: application/pdf
File size: 36508981 bytes (approximately 36.5 MB)
Last modified: 2016-06-16 17:03:30-06:00
Filename: Harrison_Adam_P_201601_PhD.pdf
Original checksum: 5fb5134d1b3379816dd9e414d5a68f78
Copyright note: Copyright © 2016 Adam P. Harrison