| Study level | Master |
|---|---|
| ECTS credits | 1 |
| Study forms | Hybrid or fully online |
| Module aims | The module introduces sources of instability and uncertainty in perception, mapping, and localization for autonomous systems. It develops students’ ability to model sensor noise and uncertainty, design robust perception and fusion algorithms, and assess system behaviour under challenging and safety-critical conditions in line with relevant standards. |
| Pre-requirements | Basic knowledge of probability and statistics, linear algebra, and perception or sensor-fusion concepts, as well as programming skills in Python or C++. Familiarity with robotics, computer vision, control theory, machine learning, or ROS-based tools is recommended but not mandatory. |
| Learning outcomes | Knowledge • Distinguish between aleatoric and epistemic uncertainty and describe their impact on perception and mapping. • Explain sources of instability such as sensor noise, occlusions, quantization, and adversarial attacks. • Understand safety frameworks relevant to uncertainty handling. • Describe the role of sensor fusion and redundancy in mitigating uncertainty and maintaining localization accuracy. Skills • Model sensor noise and environmental uncertainty using statistical and probabilistic approaches. • Apply sensor fusion algorithms for robust localization. • Evaluate system robustness against occlusions, reflection errors, and adversarial perturbations. • Design and conduct experiments to quantify uncertainty and validate robustness using simulation and real-world datasets. Understanding • Appreciate the trade-offs between computational complexity and robustness in multi-sensor systems. • Recognize the ethical and safety implications of unstable or unreliable perception in autonomous systems. • Demonstrate awareness of international safety standards and adopt responsible practices for system validation. |
| Topics | 1. Sources of Instability and Uncertainty: – Aleatoric vs epistemic uncertainty, stochastic processes, and measurement noise. – Quantization effects, sensor noise modeling, and environmental randomness. 2. Sensor Noise and Fusion: – Multi-sensor integration: LiDAR, radar, GNSS, IMU, camera. – Noise filtering and smoothing techniques. 3. Occlusions and Partial Observability: – Handling occlusions, weather effects, and incomplete sensor coverage. – Tracking and prediction in uncertain environments. 4. Adversarial Robustness: – Adversarial attacks on perception networks and their detection. – SOTIF (ISO 21448) and safety verification of intended functionality. 5. Validation and Safety Assessment: – Simulation-based validation and uncertainty quantification. – Evaluation metrics for perception and localization under uncertainty. 6. Real-world Case Studies: – Sensor degradation in autonomous vehicles, calibration drift, and redundancy design. |
| Type of assessment | A positive grade requires a positive evaluation of the module topics and a presentation of practical work results together with the required documentation. |
| Learning methods | Lecture — Explore theoretical principles of uncertainty, noise modeling, and perception instability. Lab works — Implement sensor fusion, uncertainty quantification, and robustness evaluation in ROS2 or MATLAB. Individual assignments — Research and report on adversarial attacks, occlusion handling, and noise modeling strategies. Self-learning — Study international standards and open-source datasets. |
| AI involvement | AI tools may assist in simulating uncertainty propagation, detecting adversarial patterns, and performing sensitivity analysis. Students must critically validate results, document methodology, and ensure reproducibility and compliance with academic integrity standards. |
| Recommended tools and environments | ROS 2, MATLAB; datasets: KITTI, nuScenes, Waymo Open Dataset |
| Verification and Validation focus | |
| Relevant standards and regulatory frameworks | ISO 26262, ISO 21448 (SOTIF) |
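As an illustrative sketch of the sensor-fusion topic (topic 2), the following shows a minimal one-dimensional Kalman update that fuses two measurements with different noise variances; all numeric values (prior, measurements, variances) are hypothetical, not part of the module material.

```python
def kalman_update(x, P, z, R):
    """Fuse measurement z (variance R) into the estimate x (variance P)."""
    K = P / (P + R)          # Kalman gain: how much to trust the measurement
    x_new = x + K * (z - x)  # corrected state estimate
    P_new = (1 - K) * P      # fused variance is always smaller than P
    return x_new, P_new

# Prior belief about a 1-D position, then two sensors of differing quality
x, P = 0.0, 4.0                             # vague prior
x, P = kalman_update(x, P, z=1.2, R=1.0)    # e.g. a GNSS-like fix
x, P = kalman_update(x, P, z=0.9, R=0.25)   # e.g. a more precise LiDAR range
print(x, P)  # uncertainty shrinks with each fused measurement
```

Each fusion step reduces the posterior variance, which is the quantitative core of the redundancy argument in the learning outcomes: adding an independent sensor can only tighten the estimate.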
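A minimal sketch of Monte Carlo uncertainty quantification (topic 5): Gaussian aleatoric noise on a hypothetical range-bearing sensor is propagated through the nonlinear polar-to-Cartesian conversion by sampling. The noise parameters are assumed for illustration only.

```python
import math
import random

def polar_to_xy(r, theta):
    """Nonlinear measurement model: range-bearing to Cartesian position."""
    return r * math.cos(theta), r * math.sin(theta)

random.seed(0)
r_true, theta_true = 10.0, math.radians(30)      # hypothetical ground truth
sigma_r, sigma_theta = 0.1, math.radians(2)      # assumed sensor noise

# Draw noisy measurements and push each through the nonlinear model
samples = [polar_to_xy(random.gauss(r_true, sigma_r),
                       random.gauss(theta_true, sigma_theta))
           for _ in range(10_000)]

xs = [s[0] for s in samples]
mean_x = sum(xs) / len(xs)
var_x = sum((x - mean_x) ** 2 for x in xs) / len(xs)
print(mean_x, var_x)
```

The sample variance approximates the output uncertainty without linearizing the model, which is the simulation-based validation idea the topic list refers to; a first-order (Jacobian) propagation can then be checked against it.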