
DEVELOPMENT OF AN ATTENTIVE USER INTERFACE FOR CONTROLLING A TELEROBOTIC HAPTIC SYSTEM TO SUPPORT PLAY IN CHILDREN WITH PHYSICAL DISABILITIES

  • Author / Creator
    Castellanos Cruz, Javier L.
  • BACKGROUND: Children with physical impairments may face difficulties when playing because of limitations in reaching for and handling objects. As a result, they may have fewer opportunities to play and may experience negative impacts on their social, emotional, and psychological development. Children can control robots to play, explore, and manipulate the environment, and telerobotic systems allow them to control a robot at a distance, for example from their wheelchairs. Haptic interfaces provide a sense of touch, so that children can feel the properties (e.g., hardness) of the objects the robot interacts with in the environment; haptic interfaces can also provide guidance to help children with physical impairments reach different locations with the robot. There are two types of eye gaze interfaces for the control of robots. Explicit eye input interfaces require the user to voluntarily control their eye movements, either by fixating on objects for a dwell time or by performing eye gestures; children may have difficulty controlling their eye movements and the robot at the same time. Attentive user interfaces respond to the user's natural visual behavior while they interact with the technology, potentially removing the cognitive demand of thinking about controlling the eye movements. An attentive interface could predict the toy that a child wants to reach with the robot and apply haptic guidance to help him/her get to that location.

    OBJECTIVES: The main objective of this thesis was to develop and test an attentive user interface for activating the guidance of a telerobotic haptic system to support children with physical impairments in reaching toys during a playful activity. A secondary objective was to develop an explicit eye input interface for activating the haptic guidance and to compare its performance with that of the attentive user interface.

    METHODS: The robotic system included two haptic robots: one that the user moved, and a second, driven by the first, that interacted with the objects in the environment. An eye tracking system tracked the user's gaze. The playful activity was a whack-a-mole game in which participants used the robot to reach and whack target moles. Adults without physical disabilities, typically developing children, and a child and an adult with cerebral palsy were recruited to test the system. The studies with the adults without disabilities contributed to developing the attentive user interface and to testing the robotic system before children used it. Different algorithms for predicting the target object that the user wanted to reach with the robotic system were tested and compared. Two attentive user interfaces were developed to predict the target toy and activate haptic guidance toward it: one predicted the target using a neural network (tested with the adults), and the other was based on the participant's eye-robot coordination (tested with the children and the participants with disabilities). Two explicit eye input interfaces were likewise developed for activating the haptic guidance (one tested by the adults, the other by the children and participants with disabilities), and their performance was compared with that of the attentive interfaces.
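    To make the two interaction schemes concrete, here is a minimal sketch (illustrative only, not code from the thesis): an explicit eye input interface waits for a deliberate dwell on one toy before activating guidance, while an attentive interface infers the likely target from natural fixations. The function names, thresholds, and the simple fixation-vote predictor are all assumptions for illustration; the thesis itself used a neural network and eye-robot coordination for its predictions.

      import math

      DWELL_SECONDS = 1.0      # assumed dwell threshold for the explicit interface
      FIXATION_RADIUS = 40.0   # assumed gaze-to-toy distance (pixels) counting as "on target"

      def nearest_toy(gx, gy, toys):
          """Return the toy within FIXATION_RADIUS of the gaze point (gx, gy), or None."""
          best, best_d = None, FIXATION_RADIUS
          for name, (tx, ty) in toys.items():
              d = math.hypot(gx - tx, gy - ty)
              if d <= best_d:
                  best, best_d = name, d
          return best

      def dwell_select(gaze_stream, toys):
          """Explicit eye input: the user must hold fixation on one toy for DWELL_SECONDS."""
          current, since = None, None
          for t, gx, gy in gaze_stream:          # (timestamp, x, y) samples
              hit = nearest_toy(gx, gy, toys)
              if hit is not None and hit == current:
                  if t - since >= DWELL_SECONDS:
                      return hit                 # dwell completed: activate guidance to this toy
              else:
                  current, since = hit, t        # fixation moved: restart the timer
          return None

      def attentive_predict(gaze_window, toys):
          """Attentive interface: predict the intended toy from recent natural fixations
          (a simple vote over gaze samples stands in for the thesis's predictors)."""
          votes = {}
          for gx, gy in gaze_window:
              hit = nearest_toy(gx, gy, toys)
              if hit is not None:
                  votes[hit] = votes.get(hit, 0) + 1
          return max(votes, key=votes.get) if votes else None

      # Example with two hypothetical mole locations:
      toys = {"mole_left": (100, 200), "mole_right": (400, 200)}
      stream = [(0.0, 401, 199), (0.5, 399, 203), (1.1, 402, 198)]
      print(dwell_select(stream, toys))                               # -> mole_right
      print(attentive_predict([(x, y) for _, x, y in stream], toys))  # -> mole_right

    The behavioral difference the sketch highlights is that dwell_select only fires after a sustained, voluntary fixation, whereas attentive_predict acts on ordinary looking behavior, which is what makes the attentive approach less cognitively demanding.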
    RESULTS: The average accuracy of the target predictions made by the attentive interfaces was higher than 85%. The adults had a 100% success rate at selecting the correct moles with the explicit interface, and they did not report tired eyes after using the attentive interface as they did after the explicit one. Three of the ten adults without disabilities, as well as the adult with cerebral palsy, preferred the attentive interface because it was faster than the explicit interface. The children did not achieve a 100% success rate and required prompting to use the explicit interface. Participants spent less time whacking each mole with the attentive interface than with the explicit one, and when the children and the adult with cerebral palsy used the attentive interface, the robot travelled a shorter distance than when they used the explicit interface.

    CONCLUSIONS: Eye gaze was a significant predictor of the target mole that participants wanted to whack with the robot. The results demonstrated that the attentive user interface was faster to use than the explicit eye input interface, especially for children. Attentive user interfaces may reduce the cognitive load of having to control eye movements, so that children can focus on playing with the robot.
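    The abstract does not specify how the haptic guidance force itself was rendered. A common approach in haptics, sketched below purely as an assumed illustration, is a capped spring-damper attraction that gently pulls the user's hand toward the predicted target; the stiffness, damping, and force-cap values here are placeholders, not values from the thesis.

      import numpy as np

      K_SPRING = 60.0   # assumed stiffness (N/m)
      B_DAMP = 5.0      # assumed damping (N*s/m)
      F_MAX = 3.0       # assumed force cap (N) so the guidance stays gentle

      def guidance_force(hand_pos, hand_vel, target_pos):
          """Spring-damper attraction toward target_pos, clamped to F_MAX."""
          force = (K_SPRING * (np.asarray(target_pos, float) - np.asarray(hand_pos, float))
                   - B_DAMP * np.asarray(hand_vel, float))
          norm = np.linalg.norm(force)
          if norm > F_MAX:
              force *= F_MAX / norm      # scale down but keep the direction
          return force

      # Example: hand at rest, 5 cm left of the target -> a small pull to the right.
      print(guidance_force([0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.05, 0.0, 0.0]))

    Capping the force is the safety-relevant design choice in such a scheme: guidance should nudge the child's hand toward the toy without overpowering his or her own movement.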

  • Subjects / Keywords
  • Graduation date
    Spring 2019
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-qz5z-q284
  • License
    Permission is hereby granted to the University of Alberta Libraries to reproduce single copies of this thesis and to lend or sell such copies for private, scholarly or scientific research purposes only. Where the thesis is converted to, or otherwise made available in digital form, the University of Alberta will advise potential users of the thesis of these terms. The author reserves all other publication and other rights in association with the copyright in the thesis and, except as herein before provided, neither the thesis nor any substantial portion thereof may be printed or otherwise reproduced in any material form whatsoever without the author's prior written permission.