====== Module: Autonomous Vehicles ======
^ **Study level** | Bachelor |
^ **ECTS credits** | 1 ECTS |
^ **Study forms** | Hybrid or fully online |
^ **Module aims** | The aim of the module is to introduce the fundamental concepts, architectures and application domains of autonomous vehicles across ground, aerial and marine systems. The course develops students’ system-level understanding of the autonomy stack, from perception and localisation to planning and control, highlighting the role of AI, safety and basic verification considerations in real-world deployment. |
^ **Pre-requirements** | Interest in autonomous systems and basic knowledge of programming, signals and control, and electronics or mechatronics. Prior exposure to robotics concepts and Linux/ROS environments, as well as familiarity with linear algebra and probability, is recommended. |
^ **Learning outcomes** | **Knowledge**\\ • Explain the Sense–Plan–Act paradigm and the layered autonomy stack.\\ • Describe and contrast middleware options and reference architectures.\\ • Summarize the roles of AI/ML in perception and decision-making, including their limits and safety implications.\\ • Identify V&V concepts and domain-specific safety standards.\\ **Skills**\\ • Build a minimal autonomy pipeline in simulation and tune it for a given ODD.\\ • Integrate modules via publish/subscribe interfaces and evaluate latency, determinism and fault-tolerance trade-offs (illustrative sketches follow this table).\\ • Design basic experiments to validate algorithms and interpret the results.\\ **Understanding**\\ • Reason about distributed vs. centralized architectures and their impact on scalability and reliability.\\ • Appraise governance, legal and ethical constraints, and cybersecurity risks for AV deployment. |
^ **Topics** | 1. Introduction to autonomous systems and autonomy definitions\\ 2. Sense–Plan–Act and data flow in autonomous vehicles; centralized vs. distributed designs; safety and redundancy\\ 3. Reference architectures and middleware: ROS/ROS2 (DDS), AUTOSAR Adaptive, JAUS, MOOS-IvP\\ 4. Application domains: ground, aerial and marine; domain-specific challenges\\ 5. AI/ML for perception and decision-making; hybrid model-based and learning-based stacks\\ 6. Introduction to validation and verification (ODD, coverage, field response); simulation, SIL/HIL; safety standards\\ 7. Governance, legal and ethical frameworks for autonomy\\ 8. Cybersecurity for autonomous systems: electronics/firmware, communication, control, operations |
^ **Type of assessment** | A positive grade requires a positive evaluation of the module topics and a presentation of practical work results with the required documentation. |
^ **Learning methods** | **Lecture**: conceptual foundations (architectures, middleware, SPA, safety/V&V, governance) with case studies from the ground, aerial and marine domains.\\ **Lab works**: hands-on exercises in simulation (ROS2/Autoware/PX4 or MOOS-IvP) to assemble perception–planning–control pipelines and evaluate their behaviour.\\ **Individual assignments**: focused mini-projects (e.g., a perception module, a path planner, a DDS QoS study) with short reports on design and results.\\ **Self-learning**: guided readings and video demos on standards and frameworks; independent experimentation to deepen understanding of chosen topics. |
^ **AI involvement** | Deep learning for perception (object detection, semantic segmentation, tracking); learning-based prediction; SLAM and sensor fusion with ML components; reinforcement-learning/behaviour-tree hybrids for decision-making; data-centric evaluation in simulation. |
^ **Recommended tools and environments** | ROS/ROS2, MOOS-IvP, Autoware, PX4/ArduPilot |
^ **Verification and Validation focus** | |
^ **Relevant standards and regulatory frameworks** | ISO 26262, DO-178C, AUTOSAR, JAUS |
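
To make the Sense–Plan–Act and publish/subscribe ideas referenced above concrete for the lab works, the following is a minimal sketch of a single ROS 2 (rclpy) node that subscribes to a laser scan, applies a trivial reactive rule, and publishes velocity commands. The topic names ''/scan'' and ''/cmd_vel'', the 1.0 m threshold and the speed values are illustrative assumptions, not part of the module specification.

<code python>
#!/usr/bin/env python3
"""Minimal Sense-Plan-Act sketch: one ROS 2 node wiring perception to control.

Assumptions (illustrative only): a simulator publishes sensor_msgs/LaserScan
on /scan and accepts geometry_msgs/Twist on /cmd_vel.
"""
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class SensePlanAct(Node):
    def __init__(self):
        super().__init__('sense_plan_act_demo')
        # Sense: subscribe to the range sensor published by the simulator.
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)
        # Act: publish velocity commands consumed by the vehicle model.
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_scan(self, scan: LaserScan):
        # Plan (trivial reactive rule): cruise forward, turn if any valid
        # range reading is closer than 1.0 m.
        cmd = Twist()
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        if valid and min(valid) < 1.0:
            cmd.angular.z = 0.5   # turn away from the obstacle
        else:
            cmd.linear.x = 0.3    # drive forward
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    node = SensePlanAct()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
</code>

In a lab setting the same structure can be split into separate Sense, Plan and Act nodes, which makes latency and determinism measurable at the topic interfaces between them.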
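The DDS QoS study mentioned under individual assignments could start from an explicit QoS profile such as the sketch below. It only illustrates the ROS 2 QoS knobs (queue depth, reliability, history, durability) behind the latency and determinism trade-offs named in the learning outcomes; the chosen values are assumptions for experimentation, not recommendations.

<code python>
# Sketch of an explicit ROS 2 QoS profile for a sensor stream
# (values are experimental starting points, not recommendations).
from rclpy.qos import (QoSProfile, ReliabilityPolicy, HistoryPolicy,
                       DurabilityPolicy)

sensor_qos = QoSProfile(
    depth=5,                                    # shallow queue: favour fresh data
    reliability=ReliabilityPolicy.BEST_EFFORT,  # drop late samples instead of retrying
    history=HistoryPolicy.KEEP_LAST,            # keep only the newest samples
    durability=DurabilityPolicy.VOLATILE,       # late joiners get no replay
)

# The profile replaces the integer queue depth in the node sketch above, e.g.:
#   self.create_subscription(LaserScan, '/scan', self.on_scan, sensor_qos)
</code>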
| |