Introduction
In 2010 alone, driver distraction due to an in-vehicle device or display/control is estimated to have caused 26,000 crashes in the United States (NHTSA, 2012). While in an ideal world drivers would be completely focused on the driving task at hand, drivers are instead frequently distracted by secondary tasks (NHTSA, 2012). Driver distraction refers to manual, visual, or cognitive distraction as defined in the NHTSA guidelines (NHTSA, 2012). Driving is an inherently visual task, as most of the relevant information (e.g., visual flow, speed relative to the vehicle ahead, navigation cues, roadway hazards) is conveyed to drivers visually. Therefore, when drivers take their eyes off the roadway for any reason, they may miss information necessary for safe driving. While in-vehicle displays can provide timely navigation instructions or other driving-related information, the task of driving is inherently susceptible to visual distraction when such displays are used. Thus, it is essential to understand how new in-vehicle devices may distract drivers and to apply this knowledge to the design of future vehicle-based displays.
Augmented reality (AR) uses a visual display to overlay virtual images onto a person's view of the real world (Azuma, 1997). AR displays can provide information that is not readily available to the user when simply viewing the surrounding environment. In some cases, AR displays may help remove the distraction inherent in viewing information on a separate display, because visual information innate to the environment can be gathered simultaneously with the relevant visual information provided by the AR display (Azuma, 1997). Broadly speaking, AR interfaces can contain both conformal (registered) graphics, which are perceptually attached to the real world, and non-conformal (screen-fixed) graphics, which remain fixed in screen space but are nonetheless overlaid atop the user's view of the real world. Conformal and non-conformal AR images can be conveyed to the user via either head-up displays (HUDs) or head-down displays (HDDs). Optical see-through HUDs allow users to potentially continue looking at the road scene while using an in-vehicle AR display, because they can be positioned very near drivers' natural line of sight. Conversely, both video-based AR and traditional HDDs require users to look away from their preferred line of sight to gather information. In addition, the focal distance of HDDs requires drivers to accommodate to near distances (less than one meter), whereas HUDs may provide a range of focal distances, generally between two meters and optical infinity, depending upon the display hardware design.
HDDs have dominated in-vehicle displays, with the center console and dashboard serving as the main operating centers for drivers' secondary and tertiary tasks. While HUDs have been widely used in aircraft, recent years have seen renewed interest in implementing HUDs in ground vehicles. In the coming years, AR-based HUDs will likely become increasingly available in commercial vehicles, offering a range of driving-related functions (e.g., supporting primary, secondary, and tertiary tasks). However, the potential effects of AR HUDs on driver distraction, and thus on driving performance, need further exploration.