Search Results

Technical Paper

Can You Still Look Up? Remote Rotary Controller vs. Touchscreen

2017-03-28
2017-01-1386
The popularity of new Human-Machine Interfaces (HMIs) comes with growing concern about driver distraction. In part, this concern stems from the rising challenge of designing systems that make functions accessible to drivers while preserving drivers’ ability to cope with the complex driving task. Engineers therefore need assessment methods that can evaluate how well a user interface achieves the dual goal of making secondary tasks accessible while allowing safe driving. Most prior methods have emphasized measuring off-road glances during HMI use. An alternative is to consider both on-road and off-road glances, as done in Kircher and Ahlstrom’s AttenD algorithm [1]. In this study, we compared two prevalent types of visual-manual user interfaces based on AttenD. The two HMIs of interest were a touchscreen-based interface (already in production) and a remote-rotary-controller-based interface (a high-fidelity prototype).
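The core idea behind AttenD is a time buffer that depletes while the driver looks away from the road and refills while the driver looks back at it. The following is a minimal sketch of that buffer logic; the parameter values (2 s buffer cap, symmetric 1:1 depletion and recovery rates, 0.1 s recovery latency) are simplified assumptions for illustration, not the published specification of the algorithm.

```python
# Sketch of a glance-based attention buffer in the spirit of the
# AttenD algorithm (Kircher & Ahlstrom). All parameter values below
# are illustrative assumptions, not the published constants.

BUFFER_MAX = 2.0   # seconds of attention "credit"
LATENCY = 0.1      # delay before an on-road glance starts refilling
DT = 0.05          # simulation step, seconds


def attention_buffer(glances):
    """glances: iterable of (duration_s, on_road: bool) glance events.

    Yields the buffer value after each simulation step; a value of 0
    would correspond to a distraction warning.
    """
    buf = BUFFER_MAX
    for duration, on_road in glances:
        elapsed = 0.0
        while elapsed < duration:
            if on_road:
                if elapsed >= LATENCY:            # refill after a short latency
                    buf = min(BUFFER_MAX, buf + DT)
            else:
                buf = max(0.0, buf - DT)          # off-road glances deplete
            elapsed += DT
            yield buf


# A 1.5 s off-road glance followed by a 1.0 s on-road glance:
trace = list(attention_buffer([(1.5, False), (1.0, True)]))
```

Under these assumptions the buffer drops from 2.0 s to 0.5 s during the off-road glance and recovers partway during the on-road glance, which is the behaviour that lets the metric credit on-road glances rather than only penalizing off-road ones.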
Journal Article

Bench-Marking Drivers' Visual and Cognitive Demands: A Feasibility Study

2015-04-14
2015-01-1389
Objective tools that can assess the demands associated with in-vehicle human-machine interfaces (HMIs) could assist automotive engineers in designing safer interactions. This paper presents empirical evidence supporting one objective assessment approach, which compares the demand associated with in-vehicle tasks to the demand associated with “benchmarking” or “comparison” tasks. In the presented study, there were two types of benchmarking tasks, a modified surrogate reference task (SuRT) and a delayed digit recall task (n-back task), representing different levels of visual demand and cognitive demand, respectively. Twenty-four participants performed these two types of benchmarking tasks as well as two radio tasks while driving a vehicle on a closed-loop test track. Response measures included physiological measures (heart rate), glance metrics, driving performance (steering entropy), and subjective workload ratings.
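In a delayed digit recall (n-back) task of the kind used as a cognitive-demand benchmark here, the participant is presented a stream of digits and, for each digit, must recall the digit presented n positions earlier. The sketch below scores such a task; the function and variable names are illustrative and not taken from the paper's protocol.

```python
# Hypothetical scorer for a delayed digit recall (n-back) task.
# Names and interface are illustrative assumptions, not from the paper.

def score_n_back(stimuli, responses, n):
    """Return the fraction of correct recalls.

    stimuli:   list of presented digits
    responses: participant's answers for positions n .. len(stimuli)-1
    n:         recall delay (e.g. 1 for a 1-back task, 0 for simple echo)
    """
    # The answer expected at position i is the stimulus from position i - n.
    targets = stimuli[:-n] if n else stimuli
    correct = sum(1 for t, r in zip(targets, responses) if t == r)
    return correct / len(targets)


# 1-back example: expected recalls are 7, 4, 9; participant answers 7, 4, 2.
rate = score_n_back([7, 4, 9, 1], [7, 4, 2], n=1)   # 2 of 3 correct
```

Raising n increases working-memory load without adding visual demand, which is why the n-back task pairs naturally with the visually demanding SuRT as a two-axis benchmark.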
Technical Paper

HUD Future in the Driverless Vehicle Society: Technology Leadership Brief

2012-10-08
2012-01-9022
New sensing and fast processing technologies will put an electronic driver in every car by 2025. All people in the vehicle will be passengers! The vehicle will drive itself from A to B. In that case, what need will there be for a HUD? Below is an investigation of the key issues and some possible solutions.
Technical Paper

Adaptation of the Cognitive Avionic Tool Set (CATS) into Automotive Human Machine Interface Design Process

2011-04-12
2011-01-0594
DENSO International America, Inc. and the University of Iowa's Operator Performance Laboratory (OPL) have developed a series of new Multi-Modal Interfaces for Drivers (MMID) in order to improve driver safety, comfort, convenience, and connectivity. Three MMID concepts were developed: GUI 1, GUI 2, and GUI 1-HUD. All three of the MMIDs used a new Reconfigurable Haptic Joystick (RHJ) on the steering wheel and a new-concept HMI device, the Dual Touch Function Switches (DTFS). The DTFS, located on the back of the steering wheel, uses capacitive and mechanical sensing as its input mechanism. Inputs from the new controls were combined with a large TFT LCD display in the instrument cluster, a Head-Up Display (HUD), and sound as output devices. The new MMID system was installed in a Lexus LS-430. The climate control and radio panels of the LS-430 were used as a baseline condition against which the new designs were compared.