====== Human–Machine Interface and Communication ======
This chapter explores the specificities of Human–Machine Interaction (HMI) in the context of autonomous vehicles. It examines how HMI in autonomous vehicles differs fundamentally from traditional car dashboards. With the human driver no longer actively involved in operating the vehicle, a challenge arises: how should AI-driven systems communicate effectively with passengers, pedestrians, and other road users?

HMI in AVs extends far beyond the driver’s dashboard. It defines the communication bridge between **machines, people, and infrastructure**, shaping how autonomy is perceived and trusted. Effective HMI determines whether automation is experienced as *intelligent and reliable* or *opaque and alien*.
| + | |||
| + | ===== Changing Paradigms of Communication ===== | ||
| + | |||
| + | Traditional driver interfaces were designed to support manual control. In contrast, autonomous vehicles must communicate *intent*, *status*, and *safety* both inside and outside the vehicle. The absence of human drivers requires new communication models to ensure safe interaction among all participants. | ||
| + | |||
| + | This section addresses the available communication channels and discusses how these channels must be redefined to accommodate the new paradigm. Additionally, | ||
| + | |||
| + | A key concept | ||
===== Human Perception and Driving =====
Understanding how humans perceive the world is crucial for autonomous vehicles to communicate effectively with the people around them.

By studying these perceptual mechanisms, AVs can emulate the non-verbal cues that human road users already exchange:

* Body orientation or focus of attention.
* Gesture and trajectory anticipation.
* Subtle speed or direction changes as non-verbal cues.

Such behaviorally inspired signaling helps AVs become socially legible, supporting *shared understanding* on the road.
===== Cultural and Social Interactions =====

Driving is a social activity whose unwritten rules differ across communities. Autonomous vehicles may need to adapt their communication style, from light colors and icons to audio tones and message phrasing, depending on cultural and regional expectations.

Research explores whether AVs could adopt **human-like communication methods**, such as digital facial expressions or humanoid gestures, to support more natural interactions in complex social driving environments.
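One way to picture such regional adaptation is a simple profile lookup, as sketched below. Every profile value (colors, tones, phrasing styles) is an illustrative assumption, not a standardized eHMI signal.

```python
# Hypothetical regional communication profiles; values are illustrative
# assumptions only, not standardized external-HMI signals.
REGION_PROFILES = {
    "EU": {"yield_color": "cyan",  "tone": "soft_chime", "phrasing": "formal"},
    "US": {"yield_color": "white", "tone": "soft_chime", "phrasing": "casual"},
    "JP": {"yield_color": "cyan",  "tone": "melodic",    "phrasing": "formal"},
}
DEFAULT_PROFILE = {"yield_color": "cyan", "tone": "soft_chime", "phrasing": "formal"}

def hmi_profile(region: str) -> dict:
    """Select the communication profile for a region, with a safe fallback."""
    return REGION_PROFILES.get(region, DEFAULT_PROFILE)

def yield_message(region: str) -> str:
    """Compose a yield notification in the region's phrasing style."""
    style = hmi_profile(region)["phrasing"]
    return "Please proceed." if style == "formal" else "Go ahead!"
```

Keeping the culturally variable elements in data rather than code means a deployment can be re-tuned per market without changing the vehicle’s decision logic.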
===== AI Role in Communication =====

Modern HMI systems increasingly rely on **artificial intelligence** to interpret the driving context and to decide what, when, and how to communicate.

AI enables:

* **Context-aware dialogue systems** for passengers and operators.
* **Adaptive message prioritization** based on urgency and environment.
* **Natural language explanations** of AV behavior (e.g., *“Slowing down for crossing pedestrian”*).

The evolution toward AI-mediated interfaces marks a shift from fixed UI design toward *conversational and contextual* vehicle communication.
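A minimal sketch of the adaptive message prioritization idea, assuming hypothetical urgency categories: safety warnings must reach the passenger before maneuver explanations or trip information, regardless of arrival order.

```python
import heapq

# Urgency classes are illustrative assumptions (lower value = more urgent).
URGENCY = {"safety": 0, "maneuver": 1, "comfort": 2, "info": 3}

def prioritize(messages):
    """Order (category, text) messages by urgency, keeping arrival order for ties.

    A deliberately small sketch: a production system would also weigh
    context such as vehicle speed or passenger attention.
    """
    heap = []
    for order, (category, text) in enumerate(messages):
        # (urgency, arrival index) makes the ordering stable for equal urgency
        heapq.heappush(heap, (URGENCY.get(category, len(URGENCY)), order, text))
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

queue = [
    ("info", "Arriving in 5 minutes"),
    ("maneuver", "Changing lanes to the left"),
    ("safety", "Slowing down for crossing pedestrian"),
]
print(prioritize(queue))
# → ['Slowing down for crossing pedestrian', 'Changing lanes to the left', 'Arriving in 5 minutes']
```

The same queue could feed a natural-language layer that phrases each item, which is where the conversational interfaces described above would take over.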
| + | {{: | ||