Investigation of Tactile Displays for Robot to Human Communication

Title: Investigation of Tactile Displays for Robot to Human Communication.
Name(s): Barber, Daniel, Author
Reinerman, Lauren, Committee Chair
Jentsch, Florian, Committee Member
Lackey, Stephanie, Committee Member
Leonessa, Alexander, Committee Member
University of Central Florida, Degree Grantor
Type of Resource: text
Date Issued: 2012
Publisher: University of Central Florida
Language(s): English
Abstract/Description: Improvements in autonomous systems technology and a growing demand within military operations are spurring a revolution in Human-Robot Interaction (HRI). Mixed-initiative human-robot teams are enabled by Multi-Modal Communication (MMC), which supports redundancy and levels of communication that are more robust than single-mode interaction (Bischoff & Graefe, 2002; Partan & Marler, 1999). Tactile communication via vibrotactile displays is an emerging technology, potentially beneficial to advancing HRI. Incorporation of tactile displays within MMC requires developing messages equivalent in communication power to the speech and visual signals used in the military. Toward that end, two experiments were performed to investigate the feasibility of a tactile language that uses a lexicon of standardized tactons (tactile icons) within a sentence structure to communicate messages from robot to human. Experiment one evaluated tactons from the literature, with standardized parameters, grouped into categories (directional, dynamic, and static) based on the nature and meaning of the patterns, to inform the design of a tactile syntax. Findings revealed that directional tactons performed better than non-directional tactons; therefore, a syntax for experiment two composed of a non-directional and a directional tacton was more likely to show performance better than chance. Experiment two tested this syntax structure using equally performing tactons identified in experiment one, revealing participants' ability to interpret tactile sentences better than chance with or without the presence of an independent work imperative task. This finding advanced the state of the art in tactile displays from one-word to two-word phrases, facilitating inclusion of the tactile modality within MMC for HRI.
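The two-tacton sentence structure described in the abstract can be illustrated with a brief sketch. The following Python example is not drawn from the dissertation; the class names, fields, and example tactons are hypothetical, and it only illustrates the idea of pairing a non-directional tacton with a directional one to form a two-word message.

from dataclasses import dataclass
from enum import Enum

class TactonCategory(Enum):
    # Categories evaluated in experiment one.
    DIRECTIONAL = "directional"
    DYNAMIC = "dynamic"
    STATIC = "static"

@dataclass(frozen=True)
class Tacton:
    # A standardized tactile icon: a named vibration pattern with an assigned meaning.
    name: str
    category: TactonCategory
    meaning: str

@dataclass(frozen=True)
class TactileSentence:
    # A two-word phrase: a non-directional tacton followed by a directional one.
    first: Tacton
    second: Tacton

    def __post_init__(self) -> None:
        if self.first.category is TactonCategory.DIRECTIONAL:
            raise ValueError("first tacton must be non-directional (dynamic or static)")
        if self.second.category is not TactonCategory.DIRECTIONAL:
            raise ValueError("second tacton must be directional")

    def interpret(self) -> str:
        return f"{self.first.meaning} {self.second.meaning}"

# Hypothetical usage: compose and interpret a two-word robot-to-human message.
rally = Tacton("rally", TactonCategory.DYNAMIC, "rally on me")
north = Tacton("north", TactonCategory.DIRECTIONAL, "to the north")
print(TactileSentence(rally, north).interpret())  # -> "rally on me to the north"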
Identifier: CFE0004778 (IID), ucf:49800 (fedora)
Note(s): 2012-12-01
Ph.D.
Engineering and Computer Science, Industrial Engineering and Management Systems
Doctoral
This record was generated from author-submitted information.
Subject(s): Human Robot Interaction -- Tactile Displays -- Tacton -- Multi-Modal Communication
Persistent Link to This Record: http://purl.flvc.org/ucf/fd/CFE0004778
Restrictions on Access: campus 2016-06-15
Host Institution: UCF
