NCSU Institutional Repository > NC State Theses and Dissertations
Title:       Auditory Cueing Effects on Human Performance with an Adaptive System
Authors:     Warren, Heather Lynn
Advisors:    Gary A. Mirka, Committee Member
             Michael S. Wogalter, Committee Member
             David B. Kaber, Committee Chair
Keywords:    adaptive automation; Multiple Resource Theory
Issue Date:  17-May-2002
Discipline:  Industrial Engineering
Abstract:

Adaptive automation (AA) is the dynamic allocation of complex system functions to a human operator and/or an automated controller on the basis of the state of the human-task-environment system, with the objective of optimizing overall system performance. Adaptive automation has been successfully applied to different types of simulated tasks in laboratory settings; however, there have been few field applications, and problems remain in providing human operators of adaptive systems with adequate feedback on changing system states and modes of operation. This may result in poor operator situation awareness (SA). Previous AA research on system feedback mechanisms has focused primarily on visual cues of system state changes. Some work has investigated complex auditory icons, but no research has considered the use of verbal cues to signal adaptive system state changes.

The current research investigated the potential for multimodal interfaces to improve adaptively automated system performance by considering multiple resource theory (MRT) of attention in system interface design. Subjects were provided with feedback on AA states via their visual and auditory senses in order to improve overall system performance and human-automation interaction. An experiment was conducted to compare the use of visual (icons), auditory (earcons), and verbal cues for conveying the state of an adaptive teleoperator (remote-control robot) in a high-fidelity, virtual reality simulation of an underwater mine disposal task. Earcons have been found to be beneficial for cueing operators of automated system states, but no research has investigated earcons as feedback mechanisms in either complex human-machine interaction or adaptively automated systems. In this study, modal cues were associated with task phase changes and teleoperator control mode changes.

The type of cue and the level of cue complexity (level of detail) were varied between and within subjects, respectively. Operator performance was evaluated in terms of system-state awareness, accuracy of control commands, and time to task completion. The research sought to discover which cue type was most effective for facilitating overall system performance and maintaining operator SA.

Results demonstrated that the manner in which humans use visual and auditory sensory cues for feedback when dealing with adaptively automated systems is in agreement with MRT. Verbal cues were identified as superior for warning operators of system-state changes, maintaining SA (attentional resources), and facilitating overall complex system performance. The results of this study are applicable to the design of future automated systems and may serve to improve the efficiency and effectiveness of performance.
Appears in Collections: Theses