ERA


Permanent link (DOI): https://doi.org/10.7939/R3XS5JR30


Communities

This file is in the following communities:

Graduate Studies and Research, Faculty of

Collections

This file is in the following collections:

Theses and Dissertations

Gesture Learning in Human Computer Interaction (Open Access)

Descriptions

Subject/Keyword
Gesture Learning
Gestural Interaction
Gestures
Movement Training
Human Computer Interaction
Motor Learning
Type of item
Thesis
Degree grantor
University of Alberta
Author or creator
Anderson, Fraser S
Supervisor and department
Bischof, Walter F (Computing Science)
Examining committee member and department
Boulanger, Pierre (Computing Science)
Gagne, Christine (Psychology)
Maraj, Brian (Physical Education and Recreation)
Sharlin, Ehud (Computing Science, University of Calgary)
Department
Department of Computing Science
Date accepted
2014-09-24T10:52:03Z
Graduation date
2014-11
Degree
Doctor of Philosophy
Degree level
Doctoral
Abstract
After decades of research, gestural interfaces are becoming increasingly commonplace in our interactions with modern devices. They promise natural and efficient interaction, but they lack affordances and thus require learning on the part of the user. This thesis examines the declarative and procedural components of learning gestural interaction, and how designers can best support gesture learning within their interfaces. First, we show that user-defined gestures are not always consistent, even when the same user defines a gesture for the same task, indicating that some gesture learning may be necessary even when users select their own gestures. Next, we present two studies that clarify the role of visual feedback, finding that it has a dramatic effect on the degree to which gestures are learned. We then examine the procedural component of gesture learning by varying the scale, location, and animation of the visual feedback presented during training. We also show that evaluation using a retention and transfer paradigm is more appropriate for evaluating gestures than the methodologies used previously. Lastly, we present YouMove, a full-body gesture training system that incorporates the lessons learned from the present work on stroke-based gestures.
Language
English
DOI
doi:10.7939/R3XS5JR30
Rights
Permission is hereby granted to the University of Alberta Libraries to reproduce single copies of this thesis and to lend or sell such copies for private, scholarly or scientific research purposes only. Where the thesis is converted to, or otherwise made available in digital form, the University of Alberta will advise potential users of the thesis of these terms. The author reserves all other publication and other rights in association with the copyright in the thesis and, except as herein before provided, neither the thesis nor any substantial portion thereof may be printed or otherwise reproduced in any material form whatsoever without the author's prior written permission.
Citation for previous publication
Anderson, F., Grossman, T., Matejka, J., and Fitzmaurice, G. YouMove: Enhancing Movement Training with an Augmented Reality Mirror. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), 2013, pp. 311-320.
Anderson, F. and Bischof, W.F. Learning and Performance with Gesture Guides. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), 2013, pp. 1109-1118.

File Details

Date Uploaded
Date Modified
2014-11-15T08:17:43.846+00:00
Characterization
File format: pdf (PDF/A)
Mime type: application/pdf
File size: 3721992 bytes (approx. 3.7 MB)
Last modified: 2015-10-12 17:09:43-06:00
Filename: Anderson_Fraser_S_201409_PhD.pdf
Original checksum: 173770c392e9d069332c5a8e6c6001f0