




This file is in the following communities:

Faculty of Graduate Studies and Research


This file is in the following collections:

Theses and Dissertations

Pointing Gestures for Cooperative Human-Robot Manipulation Tasks in Unstructured Environments (Open Access)


Other title
Computer Interface
Robot Telemanipulation
Robot Manipulators
Spatial Pointing for Household Environments
Robot Teleoperation
Remote Display-based pointing in Tele-Manipulation
Proximate Display-based pointing for upper body-disabled persons
Robot Control
Robot Control for Human Robot Interaction
Human Robot Interaction
Type of item
Thesis
Degree grantor
University of Alberta
Author or creator
Perez Quintero, Camilo A
Supervisor and department
Jagersand, Martin (Computing Science)
Examining committee member and department
Elizabeth Croft (UBC, Mechanical Engineering)
Martin Jagersand (Computing Science)
Patrick Pilarski (Computing Science)
Hong Zhang (Computing Science)
Mahdi Tavakoli (Electrical and Computer Engineering)
Department of Computing Science

Date accepted
Graduation date
Fall 2017 (2017-11)
Degree level
Doctor of Philosophy

Abstract
In recent years, robots have started to migrate from industrial settings to unstructured human environments; examples include home robotics, search-and-rescue robotics, assistive robotics, and service robotics. However, this migration has proceeded slowly and with only a few successes. One key reason is that current robots cannot interact well with humans in dynamic environments. Finding natural communication mechanisms that let humans effortlessly interact and collaborate with robots is a fundamental research direction for integrating robots into our daily lives. In this thesis, we study pointing gestures for cooperative human-robot manipulation tasks in unstructured environments. By interacting with a human, the robot can solve tasks that are too complex for current artificial intelligence agents and autonomous control systems. Inspired by human-human manipulation interaction, in particular how humans use pointing and gestures to simplify communication during collaborative manipulation tasks, we developed three novel non-verbal, pointing-based interfaces for human-robot collaboration.

1) Spatial pointing interface: Human and robot are collocated, and communication takes place through gestures. We studied human pointing in the context of manipulation and, using computer vision, quantified the accuracy and precision of human pointing in household scenarios. We also designed a robot and vision system able to see, interpret, and act on a gesture-based language.

2) Assistive vision-based interface: We designed an intuitive 2D image-based interface that lets upper-body-disabled persons manipulate everyday household objects through an assistive robotic arm (human and robot are collocated, sharing the same environment). The interface reduces operation complexity by offering the end user different levels of autonomy.

3) Vision-force interface for path specification in tele-manipulation: This remote visual interface allows a user to specify, in an online fashion, a path constraint for a remote robot. Using the interface, the operator can guide a 7-DOF remote robot arm along the desired path while controlling only 2 DOF.

We validated each of the proposed interfaces through user studies. Together, the interfaces explore the important direction of letting robots and humans work together, and the importance of a good communication channel and interface during the interaction. Our research integrates several areas of knowledge; in particular, we studied and developed algorithms for visual control, object detection, object grasping, object manipulation, and human-robot interaction.
This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for the purpose of private, scholarly or scientific research. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
Citation for previous publication

Camilo Perez Quintero, Romeo Tatsambon, Mona Gridseth, and Martin Jagersand. Visual pointing gestures for bi-directional human robot interaction in a pick-and-place task. In Robot and Human Interactive Communication (RO-MAN), IEEE, 2015.
Camilo Perez Quintero, Romeo Tatsambon Fomena, Azad Shademan, Nina Wolleb, Travis Dick, and Martin Jagersand. SEPO: Selecting by pointing as an intuitive human-robot command interface. In Robotics and Automation (ICRA), IEEE, 2013.
Camilo Perez Quintero and Martin Jagersand. Robot making pizza. 3rd place, IEEE Robotics and Automation Society (RAS) SAC Video Contest, May 2013.
Camilo Perez Quintero, Oscar Ramirez, and Martin Jagersand. VIBI: Assistive vision-based interface for robot manipulation. In Robotics and Automation (ICRA), IEEE, 2015.
Camilo Perez Quintero, Oscar Ramirez, Mona Gridseth, and Martin Jagersand. Small object manipulation in 3D perception robotic systems using visual servoing. In Robot Manipulation: What Has Been Achieved and What Remains to Be Done? Workshop, IROS, 2014.
Camilo Perez Quintero, Masood Dehghan, Oscar Ramirez, and Martin Jagersand. Flexible virtual fixture interface for path specification in tele-manipulation. In Robotics and Automation (ICRA), IEEE, 2017.
Camilo Perez Quintero, Masood Dehghan, Oscar Ramirez, Marcelo H. Ang, and Martin Jagersand. Vision-force interface for path specification in telemanipulation. In Human-Robot Interfaces for Enhanced Physical Interactions Workshop, ICRA, 2016.
Camilo Perez Quintero, Romeo Tatsambon Fomena, Azad Shademan, Oscar Ramirez, and Martin Jagersand. Interactive teleoperation interface for semi-autonomous control of robot arms. In Computer and Robot Vision (CRV), IEEE, 2014.

File Details

File format: pdf (PDF/A)
Mime type: application/pdf
File size: 16,049,765 bytes
Last modified: 2017:11:08 16:58:48-07:00
Filename: PerezQuintero_Camilo_A_201705.pdf
Original checksum: a4813ef24489f6ddb0e87c6fda60c626