Page en:safeav:hmc:hmi · revised 2025/05/09 09:38 by raivo.sell · current revision 2025/10/20 18:52 by raivo.sell

====== Human–Machine Interface and Communication ======
{{:en:iot-open:czapka_b.png?50| Bachelors (1st level) classification icon }}

This chapter explores the specificities of **Human–Machine Interaction (HMI)** in the context of autonomous vehicles (AVs). It examines how HMI in AVs differs fundamentally from traditional car dashboards. With the human driver no longer actively involved in operating the vehicle, the challenge arises: *how should AI-driven systems communicate effectively with passengers, pedestrians, and other road users?*
  
HMI in AVs extends far beyond the driver's dashboard. It defines the communication bridge between **machines, people, and infrastructure**, shaping how autonomy is perceived and trusted. Effective HMI determines whether automation is experienced as *intelligent and reliable* or *opaque and alien*.

===== Changing Paradigms of Communication =====

Traditional driver interfaces were designed to support manual control. In contrast, autonomous vehicles must communicate *intent*, *status*, and *safety* both inside and outside the vehicle. The absence of a human driver requires new communication models to ensure safe interaction among all participants.

This section addresses the available communication channels and discusses how these channels must be redefined to accommodate the new paradigm. It also considers how various environmental factors, including cultural, geographical, seasonal, and spatial elements, impact communication strategies.

A key concept in this transformation is the **Language of Driving (LoD)**, a framework for structuring and standardizing how autonomous vehicles express awareness and intent toward humans (Kalda et al., 2022).
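
As a purely illustrative sketch (the intent names, fields, and the `LoDMessage` type below are invented for this example, not taken from the LoD framework itself), one standardized "utterance" of such a language could be modeled as a small typed structure:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Intent(Enum):
    """Hypothetical LoD intent categories (illustrative only)."""
    YIELDING = auto()
    PROCEEDING = auto()
    WAITING = auto()
    REQUESTING_ATTENTION = auto()


@dataclass(frozen=True)
class LoDMessage:
    """One standardized message from the vehicle to a human audience."""
    intent: Intent
    audience: str     # e.g. "pedestrian", "passenger", "other_vehicle"
    channel: str      # e.g. "light_strip", "audio", "display"
    confidence: float  # how certain the AV is about its own plan (0..1)


# Example: signal to a nearby pedestrian that the vehicle is yielding.
msg = LoDMessage(Intent.YIELDING, audience="pedestrian",
                 channel="light_strip", confidence=0.95)
```

Separating *intent* from *channel* in this way would let the same standardized message be rendered differently per audience and region, which is one motivation for standardizing the vocabulary in the first place.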
  
===== Human Perception and Driving =====
  
Understanding how humans perceive the world is crucial for autonomous vehicles to communicate effectively. Human perception is multimodal, combining sight, sound, motion cues, and social awareness.
By studying these perceptual mechanisms, AV designers can emulate intuitive human signals such as:

  * Body orientation or focus of attention.
  * Gesture and trajectory anticipation.
  * Subtle speed or direction changes as non-verbal cues.

Such behaviorally inspired signaling helps AVs become socially legible, supporting *shared understanding* on the road.

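
The first of these cues, body orientation, can be approximated geometrically. A minimal sketch, assuming the pedestrian's heading and the bearing from the pedestrian to the vehicle are already available in degrees (the function name and the 45° tolerance are invented for illustration):

```python
def is_attending(ped_heading_deg: float, bearing_to_av_deg: float,
                 tolerance_deg: float = 45.0) -> bool:
    """Rough cue: a pedestrian whose body or gaze heading points toward
    the vehicle (within a tolerance) is treated as having noticed it."""
    # Smallest signed angle between the two directions, in [-180, 180].
    diff = abs((ped_heading_deg - bearing_to_av_deg + 180) % 360 - 180)
    return diff <= tolerance_deg


# Pedestrian facing 10°, vehicle bearing 30°: within tolerance -> attending.
print(is_attending(10, 30))    # True
print(is_attending(10, 200))   # False: facing away from the vehicle
```

A real perception stack would fuse this with gaze estimation and trajectory prediction, but even this crude angular check illustrates how a social cue can be turned into a machine-checkable condition.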
===== Cultural and Social Interactions =====
  
Driving is a social act. Culture, norms, and environment shape how humans interpret signals and movements.
Autonomous vehicles may need to adapt their communication style, from light colors and icons to audio tones and message phrasing, depending on cultural and regional expectations.

Research explores whether AVs could adopt **human-like communication methods**, such as digital facial expressions or humanoid gestures, to support more natural interactions in complex social driving contexts.
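
One simple way such regional adaptation is sometimes structured is a per-region configuration profile with a neutral fallback. The regions, colors, and tones below are invented placeholders, not recommendations from any standard:

```python
# Hypothetical HMI profiles: every value here is a placeholder.
HMI_PROFILES = {
    "default": {"yield_color": "cyan", "tone": "soft_chime", "text": "icons"},
    "JP": {"yield_color": "green", "tone": "spoken_phrase", "text": "kanji+icons"},
    "DE": {"yield_color": "cyan", "tone": "none", "text": "icons"},
}


def profile_for(region_code: str) -> dict:
    """Select a regional HMI profile, falling back to a neutral default."""
    return HMI_PROFILES.get(region_code, HMI_PROFILES["default"])
```

Keeping regional differences in data rather than code makes it easier to validate each profile against local regulations and user studies independently.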
  
===== AI Role in Communication =====
  
Modern HMI systems increasingly rely on **artificial intelligence**, including large language models (LLMs), to process complex situational data and adapt communication in real time.
AI enables:
  
  * **Context-aware dialogue systems** for passengers and operators.
  * **Adaptive message prioritization** based on urgency and environment.
  * **Natural language explanations** of AV behavior (e.g., *"Slowing down for crossing pedestrian"*).
  
The evolution toward AI-mediated interfaces marks a shift from fixed UI design toward *conversational and contextual* vehicle communication.
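
The adaptive message prioritization listed above can be sketched as an urgency-ordered queue in which safety-critical messages are delivered before comfort information (the urgency levels and message texts are invented for illustration):

```python
import heapq
import itertools


class MessageQueue:
    """Urgency-ordered outbox for passenger-facing HMI messages.
    Lower urgency number = more urgent; ties keep arrival order."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # stable order for equal urgency

    def push(self, urgency: int, text: str) -> None:
        heapq.heappush(self._heap, (urgency, next(self._counter), text))

    def pop(self) -> str:
        return heapq.heappop(self._heap)[2]


q = MessageQueue()
q.push(3, "Arriving in 5 minutes")
q.push(1, "Slowing down for crossing pedestrian")
q.push(2, "Rerouting around road works")
print(q.pop())  # "Slowing down for crossing pedestrian"
```

In a deployed system the urgency value would come from the environment model rather than being hard-coded, but the queue illustrates the basic contract: safety explanations preempt trip information.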
  
{{:en:safeav:hmc:hmi_taltech.png?600| Example of multimodal HMI used in TalTech autonomous shuttle research (source: Kalda et al., 2022).}}
  
  
CC Attribution-Share Alike 4.0 International