Researchers from the University of Nottingham in the UK conducted a study to understand how pedestrians react to autonomous vehicles. To do this, they used a clever technique involving a camouflaged driver who appeared to be a part of the car.
The driver wore clothing that made them look like a car seat, complete with headgear that resembled a headrest. This allowed the driver to control the vehicle while giving the impression of a fully autonomous car.
The main objective of the study was to gauge public trust in autonomous vehicles and to explore different ways of communicating the car’s intentions and driving behavior to pedestrians using External Human-Machine Interfaces (eHMIs).
The researchers experimented with three types of visual displays on the car. These displays were created using an addressable RGB LED matrix placed on the front of the hood and an LED strip on top of the windshield.
The first design employed the LED strip to mimic “the pupillary response of an eye: lateral movement demonstrated scanning/awareness, and blinking provided an implicit cue of the vehicle’s intention to give way.” A second design used a face and eyes on the matrix display, accompanied by “humanlike language” text prompts as the car approached a pedestrian (such as “I have seen you” or “I am giving way”). A third showed a vehicle icon and used “vehicle-centric language” to try to get the message across.
The eHMIs were programmed using an Arduino Mega microcontroller board and controlled by a team member in the rear seat using push-button controls. The test vehicle was driven around the university campus for several days, and the interactions between the car and 520 pedestrians were recorded using front and rear dashcams.
Additionally, researchers stationed themselves at crossing points to survey pedestrians about their experience.
After analyzing the data, the researchers found that the eHMI design featuring expressive eyes was the most effective method of communicating the vehicle’s intent to pedestrians.
“With regards to the displays, the explicit eyes eHMI not only captured the most visual attention, but it also received good ratings for trust and clarity as well as the highest preference, whereas the implicit LED strip was rated as less clear and invited lower ratings of trust,” said Professor Gary Burnett, Head of the Human Factors Research Group and Professor of Transport Human Factors in the Faculty of Engineering.
“An interesting additional discovery was that pedestrians continued to use hand gestures, for example thanking the car, despite most survey respondents believing the car was genuinely driverless – showing that there is still an expectation of some kind of social element in these types of interaction,” he added.
The findings of the study were presented at the Ergonomics & Human Factors 2023 Conference. Moving forward, the research team plans to investigate how other vulnerable road users naturally interact with autonomous vehicles. They also recommend conducting research over longer periods to better understand how public perceptions of driverless cars may change over time.