AudioMORPH: Automated Auditory Interface Adaptation

Melody Moore

AudioMORPH is a technique and toolset that automates the adaptation of graphical user interfaces for workplace computer applications into auditory interfaces. The goal of the AudioMORPH project is to speed the production of auditory interfaces so that workers who are visually impaired can be accommodated quickly and easily, thereby removing barriers to employment. Web-based workplace systems are well supported by commercial screen readers; however, the majority of businesses still use proprietary, non-web-based business applications. Commercial screen readers can be used to navigate any software application, but without customization, in the form of scripts that define navigation paths and shortcuts, they can be cumbersome. These scripts are usually written by a rehabilitation engineer with a programming background, and fully customizing a screen reader for a complex workplace application can take weeks or months.

The AudioMORPH toolkit can be used by a "domain expert": a supervisor or co-worker familiar with the business application that the visually impaired employee will use. It creates scripts automatically, removing the need for a programming background and reducing dependence on outside professional customization. AudioMORPH allows common navigation paths to be captured during normal use and automatically generates scripts for screen reader customization. It also offers flexibility, providing arbitrary mapping of keyboard shortcuts and allowing screen-reader-specific commands to be added. AudioMORPH is not dependent on any particular screen reader, although its initial code generator targets the popular JAWS screen reader. The toolkit is currently being tested in the field and will be available at the end of this year.
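As a rough illustration of the kind of transformation described above, the sketch below shows how a recorded navigation path might be turned into a screen reader macro. This is not the project's actual implementation: the data structures, function names, and the emitted JAWS-like script syntax are hypothetical placeholders, assumed here only to make the capture-then-generate idea concrete.

    # Hypothetical sketch of navigation-path capture and script generation.
    # Not the AudioMORPH implementation; emitted syntax only approximates a JAWS script.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class NavigationStep:
        """One recorded UI action: the control reached and the keystrokes used."""
        control_name: str
        keystrokes: str

    @dataclass
    class RecordedPath:
        """A named navigation path captured while the domain expert uses the application."""
        name: str
        hotkey: str
        steps: List[NavigationStep] = field(default_factory=list)

    def generate_script(path: RecordedPath) -> str:
        """Emit a screen-reader macro for one recorded path (generic, JAWS-like skeleton)."""
        lines = [f"; bound to {path.hotkey}", f"Script {path.name} ()"]
        for step in path.steps:
            lines.append(f'    TypeKey ("{step.keystrokes}")  ; move to {step.control_name}')
            lines.append(f'    SayString ("{step.control_name}")')
        lines.append("EndScript")
        return "\n".join(lines)

    # Example: a path recorded while a supervisor opens a hypothetical "New Order" form.
    path = RecordedPath(
        name="OpenNewOrder",
        hotkey="Ctrl+Shift+N",
        steps=[
            NavigationStep("Orders menu", "Alt+O"),
            NavigationStep("New Order form", "N"),
        ],
    )
    print(generate_script(path))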


Presentation Slides:
Slides 1-15


Melody Moore, Ph.D., is the Director of the GSU BrainLab, whose mission is to research innovative human-computer interaction for people with severe disabilities, including direct brain interfaces and other biometric interfaces. She also maintains a strong interest in software evolution technologies, researching context-dependent user interface reengineering. Her work has been funded by the National Science Foundation, National Institutes of Health (NINDS), NIDRR, and DARPA. Prior to GSU, Dr. Moore was a Research Scientist at the College of Computing at Georgia Tech for nine years, creating and directing the Open Systems lab. Before coming to academia, she worked for nine years in industry as a professional software engineer developing real-time embedded systems, secure operating systems, networking, and compilers. Dr. Moore holds a Ph.D. in Computer Science from the Georgia Institute of Technology (1998). Her dissertation work in user interface reengineering combined the areas of Human-Computer Interaction and Software Engineering.

Presentation from the 'Workplace Accommodations: State of the Science' conference, September 15-16, 2005, Atlanta, GA

