Development and Testing of an Eye Gaze and Brain-Computer Interface with Haptic Feedback for Robot Control for Play by Children with Severe Physical Disabilities

  • Author / Creator
    Sakamaki, Isao
  • BACKGROUND: The process through which children learn about the world and develop perceptual, cognitive, and social skills relies heavily on spatial and object exploration, specifically the manipulation of toys and tools in the environment. However, some children with motor impairments have difficulty with object manipulation because of issues with selective motor control that affect reaching or grasping. Robots controlled through simple human interfaces (e.g., joysticks and switches) can be a bridging tool that gives children with disabilities access to play and its related developmental opportunities, because robots can perform reaching, grasping, and manipulation functions that compensate for the children’s motor difficulties. These interfaces, however, generally still require a certain degree of physical ability to access and operate. Human-robot interfaces driven by usable biological signals are a potential solution for enabling environmental exploration and object manipulation via a robot for people with motor impairments.

    OBJECTIVE: The main objective of this thesis was to develop a human-robot interface that integrated a low-cost eye tracker and a brain-computer interface (BCI) to directly control a robot. The systems were adapted to support interaction in a physical play environment, i.e., without the need for a computer display. Alternatives to visual feedback, such as auditory and haptic feedback, were examined for their effectiveness in improving task performance.

    METHODS: This dissertation work was divided into four phases involving experiments with adults and children with and without disabilities:
    1) An eye gaze interface that mapped gaze direction into the physical environment was developed using homographic mapping (see the homography sketch after this list). Participants used the system under different feedback conditions (i.e., visual, no-feedback, auditory, and vibrotactile haptic feedback) to select targets on a computer display and in the physical environment.
    2) The eye gaze interface was then used in a physical card-sorting task with a teleoperated robot. The participant's desired target was first determined using the eye gaze system; a Forbidden Region Virtual Fixture (FRVF) was then created toward the selected target to help the participant move the robot end effector to it (see the FRVF sketch after this list). The effects of no-feedback, auditory, and vibrotactile haptic feedback for gaze fixations were examined.
    3) OpenBCI hardware was used to implement a BCI based on event-related desynchronization/synchronization (ERD/ERS; see the band-power sketch after this list). A motor imagery task was performed with feedback according to the detected movement intention, and the effectiveness of two feedback conditions was examined: classic Graz training using visual feedback, and kinesthetic haptic feedback using passive movement of the participant's hand.
    4) The eye gaze interface and BCI were combined and tested in a physical play task with a mobile robot (see the dwell-selection sketch after this list). Vibrotactile haptic feedback signalled gaze fixation, and kinesthetic haptic feedback signalled detected motor imagery. Performance at selecting targets and moving toward them with and without the feedback was compared.
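
    As a concrete illustration of the homographic mapping in phase 1, the sketch below estimates a planar homography from four calibration fixations and uses it to project gaze samples onto the play surface. The calibration values, coordinate units, and the OpenCV-based approach are assumptions for illustration, not the thesis implementation.

```python
# Sketch: map raw gaze coordinates onto the physical play surface via a
# planar homography (values and names below are illustrative).
import numpy as np
import cv2

# Gaze positions recorded while the user fixated four known markers,
# expressed in the eye tracker's frame (assumed normalized units).
gaze_pts = np.array([[0.12, 0.10], [0.88, 0.11],
                     [0.87, 0.90], [0.13, 0.89]], dtype=np.float32)

# The same four markers measured on the tabletop (assumed millimetres).
table_pts = np.array([[0.0, 0.0], [400.0, 0.0],
                      [400.0, 300.0], [0.0, 300.0]], dtype=np.float32)

# Estimate the 3x3 homography H such that table ~ H * gaze.
H, _ = cv2.findHomography(gaze_pts, table_pts)

def gaze_to_table(gaze_xy):
    """Project one gaze sample into tabletop coordinates."""
    pt = np.array([[gaze_xy]], dtype=np.float32)  # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, H)[0, 0]

print(gaze_to_table((0.50, 0.50)))  # lands near the table centre
```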
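
    One plausible reading of the FRVF in phase 2 is a "soft wall" that attenuates any commanded velocity pushing the end effector out of a straight corridor between its start point and the gaze-selected target. The corridor geometry and gain below are assumptions for illustration.

```python
# Sketch: Forbidden Region Virtual Fixture as a velocity filter that
# damps motion out of a cylindrical corridor around the start->target
# line (radius and gain are illustrative assumptions).
import numpy as np

def frvf_filter(pos, vel_cmd, start, target, radius=0.05, gain=0.1):
    """Attenuate the component of vel_cmd that pushes the end effector
    out of the corridor; motion along the corridor passes unchanged."""
    axis = (target - start) / np.linalg.norm(target - start)
    # Perpendicular offset of the current position from the corridor axis.
    offset = (pos - start) - np.dot(pos - start, axis) * axis
    dist = np.linalg.norm(offset)
    if dist < radius:
        return vel_cmd                   # inside the allowed region
    outward = offset / dist              # direction into the forbidden region
    v_out = np.dot(vel_cmd, outward)
    if v_out <= 0.0:
        return vel_cmd                   # already heading back inside
    # Scale down only the outward component of the command.
    return vel_cmd - (1.0 - gain) * v_out * outward

# Example: a command angled away from the corridor is mostly redirected.
start, target = np.zeros(3), np.array([0.4, 0.0, 0.0])
pos = np.array([0.2, 0.06, 0.0])         # just outside a 5 cm corridor
print(frvf_filter(pos, np.array([0.1, 0.1, 0.0]), start, target))
```

    Attenuating only the outward component preserves the user's motion along the corridor, the usual compromise in virtual fixtures between guidance and user authority.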
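
    The ERD/ERS detection in phase 3 rests on a classic quantity: the percentage change in mu-band (8-12 Hz) power during motor imagery relative to a resting baseline, where a clearly negative value indicates desynchronization and hence movement intention. The sampling rate, band edges, and synthetic signals below are illustrative assumptions.

```python
# Sketch: ERD/ERS from one EEG channel as percent change in mu-band
# power between imagery and rest (classic ERD% definition).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # Hz; a common OpenBCI sampling rate, assumed here

def band_power(eeg, lo=8.0, hi=12.0, fs=FS):
    """Mean power of eeg within [lo, hi] Hz."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.mean(filtfilt(b, a, eeg) ** 2)

def erd_percent(baseline, imagery):
    """ERD/ERS in percent: negative = desynchronization (intent)."""
    r = band_power(baseline)
    return (band_power(imagery) - r) / r * 100.0

# Synthetic demo: mu power is attenuated during imagery.
rng = np.random.default_rng(0)
baseline = rng.standard_normal(2 * FS)        # 2 s of "rest" EEG
imagery = 0.6 * rng.standard_normal(2 * FS)   # attenuated during imagery
print(f"ERD: {erd_percent(baseline, imagery):.1f}%")  # clearly negative
```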
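
    For the combined interface in phase 4, target selection by gaze fixation can be pictured as a dwell detector: a target is confirmed once gaze stays within a small radius long enough, with the vibrotactile cue firing while the fixation is held. The dwell time, radius, and function names are hypothetical; the thesis does not specify these values.

```python
# Sketch: dwell-based gaze selection for the combined gaze + BCI
# interface (all thresholds below are hypothetical).
import math

def select_target(gaze_samples, dwell_s=1.0, radius=30.0, dt=0.02):
    """Return the first point where gaze dwells within `radius` (mm)
    for `dwell_s` seconds; None if the stream ends first. A vibrotactile
    cue would be triggered on each sample while the fixation is held."""
    anchor, held = None, 0.0
    for g in gaze_samples:
        if anchor is None or math.dist(g, anchor) > radius:
            anchor, held = g, 0.0   # fixation broken: restart the dwell
        else:
            held += dt              # fixation held: cue + accumulate time
            if held >= dwell_s:
                return anchor       # selection confirmed; hand off to BCI
    return None

# Synthetic demo: 60 samples at 20 ms = 1.2 s of steady fixation.
stream = [(100.0, 50.0)] * 60
print(select_target(stream))        # -> (100.0, 50.0)
```

    In the full system, a confirmed selection would then wait on the ERD/ERS detector (previous sketch) before commanding the mobile robot toward the target.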

    RESULTS:
    1) Gaze interaction was significantly faster in the feedback conditions than in the no-feedback condition (p=0.019). However, no significant difference in performance was found between the feedback modalities (i.e., visual feedback, no-feedback, auditory feedback, and vibrotactile feedback).
    2) Feedback on gaze fixation and guidance from the FRVF did not improve performance of the robot control task for the adults without impairments; however, it did improve the speed and accuracy of the task for the child and the adult with impairments.
    3) The BCI task with kinesthetic haptic feedback was significantly more accurate than the task with visual feedback only (p=0.01). No significant improvement was observed over 12 BCI runs in either feedback condition; however, the participants reported that the task with kinesthetic haptic feedback had a lower workload than the task with visual feedback only.
    4) In the mobile robot control task using the integrated eye gaze and BCI human-robot interface, all the adults without impairments and the adult with cerebral palsy performed faster in the no-feedback condition, and the difference was significant for two of them (p=0.01 for the two adults without impairments). All the participants reported that the task with haptic feedback required a lower workload.

    CONCLUSION: The feasibility of the eye gaze interface and the BCI within the integrated human-robot interface was confirmed throughout this research series. Adding feedback to the human-robot interface could improve the performance of robot operation and could enable people with physical impairments to access play and subsequent learning opportunities.

  • Subjects / Keywords
  • Graduation date
    Fall 2019
  • Type of Item
    Thesis
  • Degree
    Doctor of Philosophy
  • DOI
    https://doi.org/10.7939/r3-b2ac-gh47
  • License
    Permission is hereby granted to the University of Alberta Libraries to reproduce single copies of this thesis and to lend or sell such copies for private, scholarly or scientific research purposes only. Where the thesis is converted to, or otherwise made available in digital form, the University of Alberta will advise potential users of the thesis of these terms. The author reserves all other publication and other rights in association with the copyright in the thesis and, except as herein before provided, neither the thesis nor any substantial portion thereof may be printed or otherwise reproduced in any material form whatsoever without the author's prior written permission.