Consumer Electronics May Curb Distracted Driving
Augmented reality and eye-tracking systems could help prevent car crashes – By KATHY PRETZ, IEEE
Have you been so caught up in reading something while driving that you nearly hit the car in front of you? Missed an exit because you were absorbed in a phone conversation? Or swerved from one lane to the next because you spilled hot coffee in your lap?
These are all examples of distracted driving—any activity that takes your eyes off the road, your hands off the wheel, or your mind off driving. It’s the third leading cause of motor vehicle crashes, according to the U.S. National Highway Traffic Safety Administration. That’s why transportation research labs and car manufacturers are testing augmented reality displays and eye-tracking technology to help keep drivers’ attention on the road.
Projecting information on the windshield directly into the driver’s field of vision to make driving safer has been done with head-up displays (HUDs) in some high-end cars since the late 1980s, according to Gregory M. Fitch, a senior research associate at Virginia Tech’s Transportation Institute, in Blacksburg. He, along with lead author Joseph L. Gabbard and Hyungil Kim, considers AR displays in “Behind the Glass: Driver Challenges and Opportunities for AR Automotive Applications.” The article, published in the February 2014 issue of Proceedings of the IEEE, is available in the IEEE Xplore Digital Library. The transportation institute conducts research on driver, passenger, and pedestrian safety and works with carmakers and transportation agencies to improve the design of vehicles and roadways.
Modeled after head-up cockpit displays for pilots, the first HUD was introduced as an option by General Motors in 1988. The company used a projector and a video generator to project onto a small square at the bottom of the windshield such basic information as the car’s speed, the rpm for manual gear shifting, and the radio’s controls.
Today’s HUDs are offered by carmakers like BMW, Ford, Honda, Mercedes-Benz, Kia, and GM. They still use a projector, but the light source comes from an LED array in the dashboard behind the instrument panel. It bounces off one mirror in the dashboard, then off a second, silvered film at the base of the windshield. Displays can show the roadway’s speed limit, the outside temperature, turn-by-turn navigation instructions, and caller ID for incoming phone calls.
And now a new generation of HUDs is integrating these and newer features with augmented reality software. These systems enhance the real world with information collected from the vehicle’s cameras, radar, other sensors, digital map data, GPS positioning signals, and Internet access. Models of outside objects calculated from the driver’s perspective can be taken from a much larger space around the vehicle than before. The augmentations are then generated and projected on the windshield.
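The core geometric step in that pipeline, placing a graphic so it lines up with an object beyond the glass from the driver’s perspective, can be illustrated with a simple sketch. This is not any carmaker’s implementation; it assumes, for simplicity, a flat windshield plane and known driver eye and object positions, and finds where the driver’s line of sight to the object crosses that plane.

```python
def project_to_windshield(eye, target, plane_point, plane_normal):
    """Return the point where the ray from the driver's eye to the
    target object crosses the (idealized flat) windshield plane."""
    direction = [t - e for t, e in zip(target, eye)]
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # line of sight is parallel to the windshield plane
    diff = [p - e for p, e in zip(plane_point, eye)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    return [e + t * d for e, d in zip(eye, direction)]

# Illustrative numbers: driver's eye 1.2 m above the road, a hazard on the
# road surface 20 m ahead, and a vertical windshield plane 0.8 m in front
# of the eye (all positions in meters, z pointing forward).
eye = [0.0, 1.2, 0.0]
hazard = [0.0, 0.0, 20.0]
point = project_to_windshield(eye, hazard, [0.0, 0.0, 0.8], [0.0, 0.0, 1.0])
```

A graphic drawn at the returned point appears, from the driver’s eye, to sit directly on the hazard; the display must redo this computation whenever the head, vehicle, or object moves.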
Many AR HUDs are under development, notes Fitch. The German auto parts supplier Continental, for example, uses a digital mirror device to magnify the HUD’s images to a much larger size as they’re projected on the windshield. This means that navigation commands, enhanced lane markings, and other graphics would appear to the driver as though they were up ahead on the road, as opposed to traditional HUDs, whose images cannot appear to be past the hood of the car. Continental announced last July that it would have an AR HUD system ready in 2017.
An AR HUD could, for example, paint virtual directional arrows that appear to be laid over the road surface, leading the driver to and through a turn. And the car’s sensors could look farther ahead and warn drivers of hazards in, say, a snowstorm by adding to the windshield an image of a disabled car on the road, according to Gabbard, an associate professor of human factors in Virginia Tech’s Grado department of industrial and systems engineering.
“In the coming years vehicles will have access to much more information, and the question will be how best to present it,” Gabbard continues. “There’s a lot of promise for making driving safer, keeping drivers’ eyes on the road, and helping identify hazards and other objects.”
Some AR HUDs could even display information from the car’s adaptive cruise control and driver-assistance systems, says Fitch. For example, to keep a driver from drifting into another lane, the system could paint virtual road markers. It could also show the driver how close she is to a vehicle in front by overlaying a distance measurement on the windshield. Or it could superimpose the image of a pedestrian who might turn up in the driver’s blind spot.
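The following-distance overlay described above rests on a simple calculation that such a system might make before deciding to warn. This sketch is illustrative, not drawn from any vendor’s system; it converts the measured gap into time headway and compares it against the common “two-second rule” threshold, which is an assumption here, not a cited design spec.

```python
def headway_seconds(gap_m, speed_mps):
    """Time headway to the vehicle ahead: the seconds until the driver
    reaches the lead vehicle's current position. Smaller means less
    time to react."""
    if speed_mps <= 0:
        return float("inf")  # stopped: no closing, no warning needed
    return gap_m / speed_mps

# Illustrative case: 25 m behind the lead vehicle at 20 m/s (72 km/h).
hw = headway_seconds(25.0, 20.0)
warn = hw < 2.0  # the familiar "two-second rule", used here as the cutoff
```

A display could then color the overlaid distance figure, or paint the virtual road markers Fitch mentions, based on whether `warn` is set.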
But AR displays still require a driver to shift focus between virtual objects on the display and real objects up ahead, says Fitch, and there’s little research on the problems of doing so. Obviously, if too many images are presented, the driver’s view could be obscured.
“There are things about the application that we don’t know yet, so car manufacturers have to be careful about what they offer,” adds Gabbard.
EYES WIDE OPEN
Eye tracking is also being explored. Tracking sensors measure the rotation of the head to alert drivers with, for example, a vibration if they become inattentive at the wheel, according to “Driver Inattention Monitoring Systems for Intelligent Vehicles: A Review.” The article, by IEEE members Yanchao Dong and Zhencheng Hu and others at Kumamoto University, in Japan, was published in June 2011 in IEEE Transactions on Intelligent Transportation Systems. It analyzed several techniques being considered for detecting driver distraction.
Several researchers tested commercial eye trackers, including ones from Seeing Machines, in Canberra, Australia, to capture physical characteristics of the face and eyes for analyzing inattention. These signals include the width/height ratio of the pupils, eye closing and opening speeds, blink rate, and eyelid movement.
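Two of the signals above, blink rate and eyelid closure, can be turned into inattention metrics with very little code. The sketch below is a hypothetical illustration, not any tracker’s algorithm: from a stream of per-frame eye-openness values (1.0 fully open, 0.0 closed) it counts blinks and computes the fraction of frames the eyes are nearly closed, in the spirit of the widely used PERCLOS drowsiness measure. The 0.2 closure threshold is an assumed, illustrative value.

```python
def eye_metrics(openness, closed_threshold=0.2):
    """Return (blink_count, closed_fraction) for a sequence of
    per-frame eye-openness samples in [0.0, 1.0]."""
    blinks = 0
    closed_frames = 0
    previously_closed = False
    for value in openness:
        closed = value < closed_threshold
        if closed:
            closed_frames += 1
            if not previously_closed:
                blinks += 1  # count each open -> closed transition
        previously_closed = closed
    return blinks, closed_frames / len(openness)

# Illustrative ten-frame window containing two brief eye closures.
samples = [1.0, 1.0, 0.1, 0.1, 1.0, 0.9, 0.05, 1.0, 1.0, 1.0]
blinks, closed_fraction = eye_metrics(samples)
```

A monitoring system would evaluate such metrics over a sliding window and trigger an alert, such as the vibration mentioned earlier, when they cross tuned thresholds.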
Many eye trackers rely on video cameras to observe the eyes and produce high-resolution images of irises, pupils, and scleras, the white of the eyes. LEDs illuminate the eyes and produce reflections off the corneal surfaces that indicate the orientation of the eyeball. Essential to an eye tracker is its image-processing function, which identifies the eyes within the images, measures the geometric features of the eye’s elements, and computes the spatial positions and orientation of the eyes.
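The pupil-and-reflection geometry described above is the basis of a classic gaze-estimation shortcut: because the LED glint on the cornea stays relatively fixed while the pupil center moves as the eyeball rotates, the pupil-minus-glint offset indicates gaze direction. The sketch below is a minimal illustration of that idea only; the linear per-axis calibration and its gain values are made-up assumptions, not a real product’s calibration.

```python
def estimate_gaze(pupil_center, glint, gain_x=0.5, gain_y=0.5):
    """Map the pupil-minus-glint offset (in image pixels) to a rough
    gaze angle (in degrees) with an assumed linear calibration."""
    dx = pupil_center[0] - glint[0]
    dy = pupil_center[1] - glint[1]
    return (gain_x * dx, gain_y * dy)

# Illustrative frame: pupil center at (322.0, 241.5) px, LED glint at
# (320.0, 240.0) px, so the eye is rotated slightly right and down.
gaze = estimate_gaze((322.0, 241.5), (320.0, 240.0))
```

Real trackers fit this mapping per driver during a brief calibration and combine it with head-pose estimates; the image-processing stage that finds the pupil and glint in each frame is where most of the engineering effort goes.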
Last September several media outlets reported that Takata, the Japanese automotive safety systems supplier, had signed a contract with Seeing Machines to supply its tracking devices for 500,000 GM vehicles over five years. These would be the first mass-produced vehicles with eye-tracking technology.
IEEE Member Scott Geisler, whose title at GM’s Global Safety Center, in Detroit, is vehicle performance owner for driver distraction, would not confirm these reports for The Institute but said the company is looking at how new consumer electronics could be applied to guard against distracted driving.
“We’ve considered for many years the role the eyes play in looking to understand whether the driver is registering and processing the environment to maintain control of the vehicle,” he says. “It’s among the car companies’ largest areas of research. We are engaged in a number of things that try to reduce harm yet allow drivers to maintain connectivity with the car without posing additional risk.”