====== Module: Perception, Mapping, and Localization (Part 2) ======

| **Study level** | Master |
| **ECTS credits** | 1 ECTS |
| **Study forms** | Hybrid or fully online |
| **Module aims** | The aim of the module is to introduce instability and uncertainty aspects in perception, mapping, and localization for autonomous systems. The course develops students' ability to model sensor noise and uncertainty, design robust perception and fusion algorithms, and assess system behaviour in challenging and safety-critical conditions in line with relevant standards. |
| **Pre-requirements** | Basic knowledge of probability and statistics, linear algebra, and perception or sensor fusion concepts, as well as programming skills in Python or C++. Familiarity with robotics, computer vision, control theory, machine learning, or ROS-based tools is recommended but not mandatory. |
| **Learning outcomes** | **Knowledge**\\ • Distinguish between aleatoric and epistemic uncertainty and describe their impact on perception and mapping.\\ • Explain sources of instability such as sensor noise, occlusions, quantization, and adversarial attacks.\\ • Understand safety frameworks relevant to uncertainty handling.\\ • Describe the role of sensor fusion and redundancy in mitigating uncertainty and maintaining localization accuracy.\\ **Skills**\\ • Model sensor noise and environmental uncertainty using statistical and probabilistic approaches.\\ • Apply sensor fusion algorithms for robust localization.\\ • Evaluate system robustness against occlusions, reflection errors, and adversarial perturbations.\\ • Design and conduct experiments to quantify uncertainty and validate robustness using simulation and real-world datasets.\\ **Understanding**\\ • Appreciate the trade-offs between computational complexity and robustness in multi-sensor systems.\\ • Recognize the ethical and safety implications of unstable or unreliable perception in autonomous systems.\\ • Demonstrate awareness of international safety standards and adopt responsible practices for system validation. |
| **Topics** | 1. Sources of Instability and Uncertainty:\\ – Aleatoric vs epistemic uncertainty, stochastic processes, and measurement noise.\\ – Quantization effects, sensor noise modeling, and environmental randomness.\\ 2. Sensor Noise and Fusion:\\ – Multi-sensor integration: LiDAR, radar, GNSS, IMU, camera.\\ – Noise filtering and smoothing techniques.\\ 3. Occlusions and Partial Observability:\\ – Handling occlusions, weather effects, and incomplete sensor coverage.\\ – Tracking and prediction in uncertain environments.\\ 4. Adversarial Robustness:\\ – Adversarial attacks on perception networks and their detection.\\ – SOTIF (ISO 21448) and safety verification of intended functionality.\\ 5. Validation and Safety Assessment:\\ – Simulation-based validation and uncertainty quantification.\\ – Evaluation metrics for perception and localization under uncertainty.\\ 6. Real-world Case Studies:\\ – Sensor degradation in autonomous vehicles, calibration drift, and redundancy design. |
| **Type of assessment** | A positive grade requires a positive evaluation of the module topics and the presentation of practical work results with the required documentation. |
| **Learning methods** | **Lecture** – Explore theoretical principles of uncertainty, noise modeling, and perception instability.\\ **Lab works** – Implement sensor fusion, uncertainty quantification, and robustness evaluation in ROS2 or MATLAB (two minimal Python sketches illustrating these exercises follow the table).\\ **Individual assignments** – Research and report on adversarial attacks, occlusion handling, and noise modeling strategies.\\ **Self-learning** – Study international standards and open-source datasets. |
| **AI involvement** | AI tools may assist in simulating uncertainty propagation, detecting adversarial patterns, and performing sensitivity analysis. Students must critically validate results, document methodology, and ensure reproducibility and compliance with academic integrity standards. |
| **Recommended tools and environments** | ROS2, MATLAB, KITTI, NuScenes, Waymo |
| **Verification and Validation focus** | |
| **Relevant standards and regulatory frameworks** | ISO 26262, ISO 21448 (SOTIF) |
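
The sensor fusion lab work (Topic 2 above) can be previewed with a minimal, self-contained sketch: two independent range measurements of the same landmark are combined by inverse-variance weighting, the simplest form of the measurement update used in Kalman-filter-based localization. The sensor names and noise variances below are illustrative assumptions, not values from the module materials or its datasets.

<code python>
"""Minimal sketch: inverse-variance fusion of two noisy range sensors.

Illustrative only; the actual lab work targets ROS2/MATLAB and datasets
such as KITTI, NuScenes, and Waymo. All numbers below are assumptions.
"""
import random


def fuse(z1, var1, z2, var2):
    """Fuse two independent measurements of the same quantity.

    Each measurement is weighted by the inverse of its variance; the fused
    variance is smaller than either input, which is the basic quantitative
    argument for sensor redundancy.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var


if __name__ == "__main__":
    random.seed(0)
    true_range = 25.0                    # metres, assumed ground truth
    lidar_var, radar_var = 0.04, 0.25    # assumed measurement noise variances

    # Simulate one noisy reading per sensor (aleatoric uncertainty).
    z_lidar = random.gauss(true_range, lidar_var ** 0.5)
    z_radar = random.gauss(true_range, radar_var ** 0.5)

    est, var = fuse(z_lidar, lidar_var, z_radar, radar_var)
    print(f"lidar={z_lidar:.3f} m  radar={z_radar:.3f} m")
    print(f"fused={est:.3f} m  fused variance={var:.4f} m^2")
</code>

With the assumed variances, the fused variance is 1/(1/0.04 + 1/0.25) ≈ 0.034, below that of the better sensor alone, which is why redundancy helps maintain localization accuracy even when one sensor degrades.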
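
Topic 5 (simulation-based validation and uncertainty quantification) can likewise be illustrated with a short Monte Carlo sketch: noise on a range/bearing observation is sampled repeatedly and propagated through the non-linear conversion to Cartesian coordinates, giving an empirical spread of the resulting position fix. The measurement values and noise levels are hypothetical and chosen only for illustration.

<code python>
"""Minimal sketch: Monte Carlo uncertainty quantification for a range/bearing fix.

Shows how measurement noise (aleatoric uncertainty) propagates through a
non-linear localization step; all parameter values are assumptions.
"""
import math
import random
import statistics


def position_from_range_bearing(r, theta):
    """Convert a range/bearing measurement to Cartesian coordinates."""
    return r * math.cos(theta), r * math.sin(theta)


def monte_carlo(r_meas, theta_meas, r_sigma, theta_sigma, n=10_000):
    """Sample noisy measurements and collect the resulting position estimates."""
    xs, ys = [], []
    for _ in range(n):
        r = random.gauss(r_meas, r_sigma)
        theta = random.gauss(theta_meas, theta_sigma)
        x, y = position_from_range_bearing(r, theta)
        xs.append(x)
        ys.append(y)
    return xs, ys


if __name__ == "__main__":
    random.seed(1)
    xs, ys = monte_carlo(r_meas=30.0, theta_meas=math.radians(40.0),
                         r_sigma=0.3, theta_sigma=math.radians(1.0))
    print(f"x: mean={statistics.mean(xs):.2f} m, std={statistics.stdev(xs):.2f} m")
    print(f"y: mean={statistics.mean(ys):.2f} m, std={statistics.stdev(ys):.2f} m")
</code>

Repeating such runs across different noise levels or degraded-sensing scenarios is one simple way to obtain the kind of evaluation metrics for localization under uncertainty that the validation topic refers to.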
  