

Module: Autonomy Validation Tools

Study level: Master
ECTS credits: 1
Study forms: Hybrid or fully online
Module aims: This master's-level module aims to develop an advanced understanding of the tools, frameworks, and methodologies used in the verification and validation (V&V) of autonomous and cyber-physical systems. Students will study how safety-critical validation processes evolve across the automotive, aerospace, and robotics industries, covering both physical and virtual testing. The course introduces international standards such as ISO 26262, ISO 21448, DO-178C, and UL 4600, alongside state-of-the-art toolchains (Pegasus, ASAM OpenSCENARIO, CARLA, and ZalaZONE). Through simulations and case studies, students will gain expertise in developing test campaigns, ensuring system reliability, and aligning validation strategies with regulatory requirements.
Prerequisites: A strong foundation in systems engineering, control theory, and AI algorithms. Experience with programming (Python, C++, MATLAB), embedded systems, and simulation platforms (ROS 2, CARLA, Simulink). Familiarity with safety standards such as ISO 26262, ISO 21448 (SOTIF), or DO-178C is recommended. Prior knowledge of testing frameworks and data-driven development is beneficial.
Learning outcomes:
Knowledge:
• Explain the role and structure of verification and validation in the autonomy lifecycle.
• Describe international standards (ISO 26262, ISO 21448, DO-178C, UL 4600) and their influence on testing processes.
• Understand the architecture of physical, virtual, and hybrid test environments for autonomous systems.
• Identify limitations and emerging research trends in simulation-based validation and safety case generation.
Skills:
• Design and execute test plans using real and simulated environments (Scenic, CARLA, rFpro, ZalaZONE); a minimal simulation-test sketch follows this outcomes list.
• Apply AI-driven methods for scenario generation, coverage analysis, and failure detection.
• Integrate ASAM OpenSCENARIO and Pegasus toolchains into validation workflows.
• Assess compliance and produce documentation aligned with certification processes.
Understanding/Attitudes:
• Appreciate the interdependence of testing, regulation, and ethical assurance in autonomous systems.
• Recognize challenges of validating stochastic, learning-based algorithms.
• Demonstrate accountability, transparency, and critical thinking in evaluating safety and validation data.
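The Skills outcome on designing and executing test plans in simulation can be illustrated with a minimal scripted run. The sketch below is not part of the module material: it assumes a CARLA server already running on localhost:2000, and the blueprint name, spawn point, frame count, and speed threshold are illustrative placeholders.
<code python>
# Minimal sketch of a scripted CARLA test run (assumes a running simulator).
import math
import carla

SPEED_LIMIT_MS = 14.0  # illustrative pass/fail threshold, roughly 50 km/h

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Spawn an ego vehicle at the first available spawn point of the loaded map.
blueprint = world.get_blueprint_library().filter("vehicle.tesla.model3")[0]
spawn_point = world.get_map().get_spawn_points()[0]
ego = world.spawn_actor(blueprint, spawn_point)

try:
    ego.set_autopilot(True)      # let the built-in traffic manager drive
    violations = 0
    for _ in range(200):         # observe roughly 200 simulation frames
        world.wait_for_tick()
        v = ego.get_velocity()
        speed = math.sqrt(v.x ** 2 + v.y ** 2 + v.z ** 2)
        if speed > SPEED_LIMIT_MS:
            violations += 1
    print("PASS" if violations == 0 else f"FAIL: {violations} speed violations")
finally:
    ego.destroy()                # always clean up spawned actors
</code>
The same connect, spawn, drive, evaluate, clean-up pattern scales to scripted regression campaigns over many scenarios and test criteria.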
Topics:
1. Overview of Verification and Validation (V&V):
– History and evolution from traditional software testing to AI-based autonomy validation.
– Key principles: verification vs validation, safety cases, and traceability.
2. International Standards:
– ISO 26262, ISO 21448 (SOTIF), DO-178C, UL 4600, and IEEE P2851.
– Harmonization of global V&V requirements.
3. Physical and Virtual Testing Environments:
– Real-world validation sites (ZalaZONE, MCity) and virtual tools (CARLA, rFpro, IPG CarMaker).
– HIL/SIL/MIL testing, sensor simulation, and environmental modeling.
4. Scenario-Based Validation:
– Pegasus and ASAM OpenSCENARIO frameworks for scenario design and coverage; a scenario-parsing sketch follows this topic list.
– Edge case generation, fault injection, and adversarial testing.
5. AI-Enhanced Validation:
– AI for test optimization, uncertainty quantification, and robustness analysis.
6. Certification and Compliance:
– Safety argumentation, data transparency, and audit readiness.
– Ethical and governance challenges in autonomous validation.
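As a concrete companion to topic 4, the sketch below shows how a validation script might enumerate the variation parameters and entities declared in an OpenSCENARIO file before sweeping them for coverage or injecting faulty values. It assumes an OpenSCENARIO 1.x (.xosc) file without XML namespaces; the file name cut_in_scenario.xosc is hypothetical.
<code python>
# Hedged sketch: list the variation space of an OpenSCENARIO 1.x file.
import xml.etree.ElementTree as ET

root = ET.parse("cut_in_scenario.xosc").getroot()

# Declared parameters define the dimensions a coverage sweep can vary.
for decl in root.iter("ParameterDeclaration"):
    print(f'parameter {decl.get("name")}: '
          f'{decl.get("parameterType")} = {decl.get("value")}')

# Scenario objects (ego, targets, pedestrians) the test campaign must cover.
for obj in root.iter("ScenarioObject"):
    print("entity:", obj.get("name"))
</code>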
Type of assessment: A positive grade requires a positive evaluation of the module topics and the presentation of practical work results with the required documentation.
Learning methods:
Lectures: Present theories, standards, and frameworks governing the verification and validation of autonomous systems.
Lab works: Conduct simulation-based testing using CARLA, MATLAB/Simulink, and OpenSCENARIO; perform hardware-in-the-loop experiments (a fault-injection sketch follows this list of methods).
Individual assignments: Develop validation plans, compare standards, and write safety assurance documentation.
Self-learning: Review case studies from Pegasus and ZalaZONE; analyze real certification reports and research papers.
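A small fault-injection sketch illustrates the kind of lab exercise referenced above: a nominal sensor stream is wrapped with injected dropouts and a bias, and a plausibility check stands in for the detection logic under test. All names, fault types, and thresholds are hypothetical stand-ins rather than a prescribed lab setup.
<code python>
# Illustrative software-level fault injection for a simulated range sensor.
import random
from typing import Iterator

def nominal_range_sensor() -> Iterator[float]:
    # Stand-in for a real or simulated distance sensor reading in metres.
    while True:
        yield 25.0 + random.uniform(-0.2, 0.2)

def inject_faults(source: Iterator[float],
                  dropout_p: float = 0.05,
                  bias: float = 3.0) -> Iterator[float]:
    # Randomly drop readings (reported as inf) and add a constant offset.
    for reading in source:
        if random.random() < dropout_p:
            yield float("inf")
        else:
            yield reading + bias

def detector(reading: float) -> bool:
    # Toy plausibility check the exercise is meant to evaluate.
    return reading == float("inf") or not (0.0 < reading < 27.0)

# Every reading carries a fault here, so a good detector should flag them all.
stream = inject_faults(nominal_range_sensor())
flagged = sum(detector(next(stream)) for _ in range(1000))
print(f"faulty readings flagged: {flagged}/1000")
</code>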
AI involvement: Yes. AI tools may assist in generating test scenarios, automating fault detection, and analyzing coverage metrics. Students must document AI involvement transparently and validate all outputs against engineering and ethical standards.
Recommended tools and environments
Verification and Validation focus
Relevant standards and regulatory frameworks