Search Results

Technical Paper

Multimodal HCI Integration

1999-10-19
1999-01-5509
A multipurpose test-bed for integrating user interface and sensor technologies has been developed, based on a client-server architecture. Various interaction modalities (speech recognition, 3-D audio, pointing, wireless handheld-PC-based control and interaction, sensor interaction, etc.) are implemented as servers that encapsulate and expose commercial and research software packages. The system allows integrated user interaction with large and small displays using speech commands combined with pointing, spatialized audio, and other modalities. Simultaneous and independent speech recognition for two users is supported; users may be equipped with conventional acoustic or new body-coupled microphones.
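
To make the modalities-as-servers idea concrete, the sketch below shows one way a single modality could be wrapped as a server that publishes events to integration clients over a socket. This is a minimal illustration under stated assumptions, not the test-bed's actual interface: the ModalityServer class, the port number, and the JSON message format are all hypothetical.

# Minimal sketch (hypothetical, not the paper's interface) of a modality server
# that wraps one recognition package and forwards its events to clients.
import json
import socket
import threading

class ModalityServer:
    """Wraps a single interaction modality (e.g. speech recognition) and
    broadcasts its events to connected integration clients."""

    def __init__(self, name, host="127.0.0.1", port=5509):
        self.name = name
        self.clients = []
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))
        self.sock.listen()

    def serve(self):
        # Accept integration clients in a background thread.
        threading.Thread(target=self._accept_loop, daemon=True).start()

    def _accept_loop(self):
        while True:
            conn, _ = self.sock.accept()
            self.clients.append(conn)

    def publish(self, event):
        # Broadcast one modality event (e.g. a recognized command) as JSON.
        message = (json.dumps({"modality": self.name, "event": event}) + "\n").encode()
        for conn in list(self.clients):
            try:
                conn.sendall(message)
            except OSError:
                self.clients.remove(conn)

# Example: a speech server publishing a recognized command for user 1.
if __name__ == "__main__":
    server = ModalityServer("speech")
    server.serve()
    server.publish({"user": 1, "command": "zoom in"})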
Technical Paper

Multimodal Maintenance Application HCI

1999-10-19
1999-01-5508
The Human-Computer Interface (HCI) in applications for the maintenance of complex machinery, such as aircraft, can be enhanced by exploiting new developments in HCI. We have developed a multimodal HCI demonstration system for maintenance applications, incorporating Augmented Reality (AR), speech recognition, and 3-dimensional audio technologies. The Augmented Reality interface is based on an original dynamic tracking approach that provides rapid updates of the scene with graphical overlays. We enhance this interface with speech recognition to control the system and to add annotations as dictated text. A combination of 3-D audio, graphic animations, and text displays is used to communicate information to the user.
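
As an illustration only (the abstract does not describe the dispatch logic), the sketch below shows one plausible way recognized speech could be split between system-control commands and dictated annotations attached to the part currently tracked by the AR overlay. The handle_speech function, the COMMANDS set, and the overlay fields are assumptions introduced for this example.

# Hypothetical routing of recognized speech: known phrases become control
# commands; anything else is stored as a dictated annotation on the part
# the AR overlay is currently tracking.
COMMANDS = {"next step", "previous step", "show wiring"}

def handle_speech(text, ar_overlay, annotations):
    """Return ('command', phrase) for control phrases, otherwise attach the
    dictated text to the currently tracked part and return ('annotation', part)."""
    phrase = text.strip().lower()
    if phrase in COMMANDS:
        return ("command", phrase)
    part = ar_overlay.get("tracked_part", "unknown")
    annotations.setdefault(part, []).append(text)
    return ("annotation", part)

# Example usage with a stand-in overlay state.
if __name__ == "__main__":
    overlay = {"tracked_part": "hydraulic pump"}
    notes = {}
    print(handle_speech("next step", overlay, notes))
    print(handle_speech("Check seal for wear before reassembly", overlay, notes))
    print(notes)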