====== Module: Human Machine Communication (Part 1) ======
^ **Study level** | Bachelor |
^ **ECTS credits** | 1 ECTS |
^ **Study forms** | Hybrid or fully online |
^ **Module aims** | The aim of the module is to introduce human–machine interaction and communication concepts for autonomous vehicles. The course develops students’ understanding of how autonomous systems perceive, interpret and communicate with humans using AI-driven, multimodal and human-centred interfaces that support safety, trust and usability. |
^ **Pre-requirements** | Basic knowledge of human factors or cognitive science and an interest in user-centred design. Familiarity with control systems, programming, and AI-based or embedded systems is recommended but not mandatory. |
^ **Learning outcomes** | **Knowledge**\\ • Explain the principles of HMI and multimodal communication in autonomous systems.\\ • Describe human perceptual and cognitive models relevant to interaction with machines.\\ • Understand the cultural, ethical, and social dimensions influencing communication design.\\ • Recognize standards and best practices in safety-critical HMI.\\ **Skills**\\ • Design and prototype human–machine interfaces that enhance trust and situational awareness.\\ • Evaluate user experience using qualitative and quantitative assessment techniques.\\ • Integrate AI-based dialogue, gesture, and visual communication components within simulation environments.\\ **Understanding**\\ • Appreciate the need for transparency, inclusivity, and cultural sensitivity in AI communication.\\ • Critically assess the ethical implications of human–autonomy collaboration.\\ • Foster responsible design thinking in developing communication frameworks for AVs. |
^ **Topics** | 1. Introduction to Human–Machine Interaction in Autonomous Vehicles:\\ – Role of HMI in safety, trust, and usability.\\ – Transition from driver-operated to fully autonomous systems.\\ 2. Human Perception and Cognition:\\ – Human sensory systems, attention, and response time.\\ – Comparing human and AI perception models.\\ 3. Communication Modalities:\\ – Visual, auditory, and haptic feedback; external vehicle communication signals.\\ – Pedestrian and passenger interaction mechanisms.\\ 4. The Language of Driving:\\ – Concept and design of shared communication languages between humans and AVs.\\ – Cultural, geographical, and environmental factors in interaction design.\\ 5. AI in Communication:\\ – Role of conventional and LLM-based AI in HMI design and dialogue management.\\ 6. Standards and Case Studies:\\ – AVSC best practices, SAE standards, and real-world HMI deployments. |
^ **Type of assessment** | A positive grade requires a positive evaluation of the module topics and a presentation of the practical work results with the required documentation. |
^ **Learning methods** | **Lecture**: covers theories of human perception, cognitive ergonomics, and communication in autonomous systems.\\ **Lab works**: practical development of HMI prototypes (e.g., dashboard simulation, pedestrian signaling models) using Python or Unity; a minimal illustrative sketch follows the table.\\ **Individual assignments**: analytical essays or UX evaluations of existing HMI systems.\\ **Self-learning**: review of international standards and exploration of open-source HMI datasets and design tools. |
^ **AI involvement** | AI tools may be used to design conversational interfaces, simulate interaction scenarios, and analyze user feedback. Students must disclose AI assistance transparently and validate all outputs to maintain research and ethical standards. |
^ **Recommended tools and environments** | Unity, MATLAB, ROS2 |
^ **Verification and Validation focus** | |
^ **Relevant standards and regulatory frameworks** | AVSC, SAE ITC |
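
The lab-work sketch referenced above is a minimal, hypothetical example of the kind of prototype students might build: a Python state machine that maps an autonomous vehicle's intent to external HMI (eHMI) signals for pedestrians. The state names, light patterns, sound cues, and speed threshold are illustrative assumptions and are not defined by the module.

<code python>
"""Minimal eHMI lab sketch: map vehicle intent to external pedestrian signals.
All states, cues, and thresholds are illustrative assumptions for a prototype."""

from dataclasses import dataclass
from enum import Enum, auto


class Intent(Enum):
    CRUISING = auto()   # vehicle proceeding normally
    YIELDING = auto()   # vehicle decelerating to let a pedestrian cross
    WAITING = auto()    # vehicle stopped, crossing is safe
    STARTING = auto()   # vehicle about to move off


@dataclass
class EhmiSignal:
    light_pattern: str  # pattern shown on an external LED strip
    sound_cue: str      # short auditory cue for visually impaired users
    text: str           # optional text banner for simulation dashboards


# Hypothetical mapping from intent to multimodal output.
SIGNAL_TABLE = {
    Intent.CRUISING: EhmiSignal("steady_white", "none", "Vehicle in motion"),
    Intent.YIELDING: EhmiSignal("slow_pulse_cyan", "soft_chime", "Slowing down for you"),
    Intent.WAITING:  EhmiSignal("sweeping_cyan", "double_chime", "Safe to cross"),
    Intent.STARTING: EhmiSignal("fast_blink_amber", "rising_tone", "About to move"),
}


def select_intent(speed_mps: float, pedestrian_detected: bool, braking: bool) -> Intent:
    """Very simplified decision rule for the prototype; a real system would
    use planner state rather than a raw speed threshold."""
    if speed_mps < 0.1:
        return Intent.WAITING if pedestrian_detected else Intent.STARTING
    if pedestrian_detected and braking:
        return Intent.YIELDING
    return Intent.CRUISING


if __name__ == "__main__":
    # Replay a short scenario: approach, yield, stop, move off.
    scenario = [(8.3, False, False), (5.0, True, True), (0.0, True, False), (0.0, False, False)]
    for speed, ped, brake in scenario:
        signal = SIGNAL_TABLE[select_intent(speed, ped, brake)]
        print(f"light={signal.light_pattern}, sound={signal.sound_cue}, text='{signal.text}'")
</code>

In a Unity variant of the same exercise, the mapping table would typically drive light and audio components on the vehicle model instead of printing to the console.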
| |