^ **ECTS credits** | 1 ECTS |
^ **Study forms** | Hybrid or fully online |
^ **Module aims** | The aim of the module is to introduce instability and uncertainty aspects in perception, mapping and localization for autonomous systems. The course develops students’ ability to model sensor noise and uncertainty, design robust perception and fusion algorithms, and assess system behavior in challenging and safety-critical conditions in line with relevant standards. |
^ **Pre-requirements** | Basic knowledge of probability and statistics, linear algebra, and perception or sensor fusion concepts, as well as programming skills in Python or C++. Familiarity with robotics, computer vision, control theory, machine learning, or ROS-based tools is recommended but not mandatory. |
^ **Learning outcomes** | **Knowledge**\\ • Distinguish between aleatoric and epistemic uncertainty and describe their impact on perception and mapping.\\ • Explain sources of instability such as sensor noise, occlusions, quantization, and adversarial attacks.\\ • Understand safety frameworks relevant to uncertainty handling.\\ • Describe the role of sensor fusion and redundancy in mitigating uncertainty and maintaining localization accuracy.\\ **Skills**\\ • Model sensor noise and environmental uncertainty using statistical and probabilistic approaches.\\ • Apply sensor fusion algorithms for robust localization.\\ • Evaluate system robustness against occlusions, reflection errors, and adversarial perturbations.\\ • Design and conduct experiments to quantify uncertainty and validate robustness using simulation and real-world datasets.\\ **Understanding**\\ • Appreciate the trade-offs between computational complexity and robustness in multi-sensor systems.\\ • Recognize the ethical and safety implications of unstable or unreliable perception in autonomous systems.\\ • Demonstrate awareness of international safety standards and adopt responsible practices for system validation. |
^ **Topics** | 1. Sources of Instability and Uncertainty:\\    – Aleatoric vs epistemic uncertainty, stochastic processes, and measurement noise.\\    – Quantization effects, sensor noise modeling, and environmental randomness.\\ 2. Sensor Noise and Fusion:\\    – Multi-sensor integration: LiDAR, radar, GNSS, IMU, camera.\\    – Noise filtering and smoothing techniques.\\ 3. Occlusions and Partial Observability:\\    – Handling occlusions, weather effects, and incomplete sensor coverage.\\    – Tracking and prediction in uncertain environments.\\ 4. Adversarial Robustness:\\    – Adversarial attacks on perception networks and their detection.\\    – SOTIF (ISO 21448) and safety verification of intended functionality.\\ 5. Validation and Safety Assessment:\\    – Simulation-based validation and uncertainty quantification.\\    – Evaluation metrics for perception and localization under uncertainty.\\ 6. Real-world Case Studies:\\    – Sensor degradation in autonomous vehicles, calibration drift, and redundancy design.\\ Illustrative code sketches for the noise modeling of topic 1 and the sensor fusion of topic 2 are shown below the table. |
^ **Type of assessment** | A positive grade requires a positive evaluation of the module topics and the presentation of practical work results with the required documentation. |
^ **Learning methods** | **Lectures:** Explore theoretical principles of uncertainty, noise modeling, and perception instability.\\ **Lab works:** Implement sensor fusion, uncertainty quantification, and robustness evaluation in ROS 2 or MATLAB.\\ **Individual assignments:** Research and report on adversarial attacks, occlusion handling, and noise modeling strategies.\\ **Self-learning:** Study international standards and open-source datasets. |
^ **AI involvement** | AI tools may assist in simulating uncertainty propagation, detecting adversarial patterns, and performing sensitivity analysis. Students must critically validate results, document methodology, and ensure reproducibility and compliance with academic integrity standards. |
^ **Recommended tools and environments** | ROS 2, MATLAB; open datasets: KITTI, NuScenes, Waymo |
^ **Verification and Validation focus** |  |
^ **Relevant standards and regulatory frameworks** | ISO 26262, ISO 21448 (SOTIF) |
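
The following is a minimal, illustrative sketch (not part of the official module materials) of the kind of noise modeling exercise described in topic 1: additive Gaussian noise on a range-bearing sensor is propagated by Monte Carlo sampling to obtain an empirical covariance of the resulting position estimate. The target position, noise levels, and sample count are assumed values chosen only for demonstration.

<code python>
# Illustrative sketch only (assumed example, not part of the module materials):
# Monte Carlo propagation of additive Gaussian sensor noise through a
# range-bearing measurement model, as in the noise-modeling exercises of topic 1.
import numpy as np

rng = np.random.default_rng(42)

true_xy = np.array([12.0, 5.0])        # assumed true target position (m)
sigma_range = 0.10                     # assumed range noise std (m)
sigma_bearing = np.deg2rad(1.0)        # assumed bearing noise std (rad)

def noisy_measurements(xy, n):
    """Simulate n noisy range/bearing measurements of a point at xy."""
    r = np.hypot(xy[0], xy[1]) + rng.normal(0.0, sigma_range, n)
    b = np.arctan2(xy[1], xy[0]) + rng.normal(0.0, sigma_bearing, n)
    return r, b

# Convert each polar measurement back to Cartesian and summarise the spread
# of the resulting position estimates (aleatoric uncertainty).
r, b = noisy_measurements(true_xy, 10_000)
estimates = np.column_stack((r * np.cos(b), r * np.sin(b)))
print("Mean position estimate:", estimates.mean(axis=0))
print("Empirical covariance:\n", np.cov(estimates, rowvar=False))
</code>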
  
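As a companion sketch for the fusion and filtering techniques listed under topic 2, the snippet below runs a one-dimensional constant-velocity Kalman filter that fuses two position sensors with different noise levels. The model matrices, noise covariances, and simulation parameters are assumed values chosen only to keep the example self-contained; lab work in ROS 2 or MATLAB would use real sensor models instead.

<code python>
# Illustrative sketch only: a 1-D constant-velocity Kalman filter fusing two
# position sensors with different (assumed) noise levels, as a minimal example
# of the fusion/filtering techniques listed under topic 2.
import numpy as np

dt = 0.1                                   # assumed sample time (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
Q = np.diag([1e-4, 1e-3])                  # assumed process noise covariance
H = np.array([[1.0, 0.0], [1.0, 0.0]])     # both sensors observe position only
R = np.diag([0.5**2, 1.5**2])              # assumed sensor noise variances

x = np.array([0.0, 0.0])                   # state: [position, velocity]
P = np.eye(2)                              # initial state covariance

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 1.0

for _ in range(100):
    # Simulate ground truth and two noisy position measurements.
    true_pos += true_vel * dt
    z = true_pos + rng.normal(0.0, np.sqrt(np.diag(R)))

    # Predict step.
    x = F @ x
    P = F @ P @ F.T + Q

    # Update step: fuse both measurements through a single innovation.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"True position: {true_pos:.2f} m, fused estimate: {x[0]:.2f} m")
print(f"Estimated velocity: {x[1]:.2f} m/s (true: {true_vel:.1f} m/s)")
</code>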