| Study level | Master | |
| ECTS credits | 1 ECTS | |
| Study forms | Hybrid or fully online | |
| Module aims | To provide advanced theoretical and practical understanding of the instability and uncertainty challenges in perception, mapping, and localization for autonomous systems. The module explores how aleatoric and epistemic uncertainties affect sensing, object recognition, and environment modeling. Students will learn to model sensor noise, analyze data uncertainty, and design robust algorithms that keep systems stable in uncertain and adversarial environments. Emphasis is placed on real-world validation, robustness testing, and standards such as ISO 26262 and ISO 21448 (Safety of the Intended Functionality, SOTIF). | |
| Prerequisites | Solid background in probability theory, statistics, and linear algebra. Familiarity with robotics, sensor fusion, and computer vision. Experience with Python or C++ programming and tools such as ROS 2, OpenCV, or MATLAB. Knowledge of control theory and machine learning principles is advantageous for modeling and mitigating uncertainty in perception systems. | |
| Learning outcomes | Knowledge: • Distinguish between aleatoric and epistemic uncertainty and describe their impact on perception and mapping. • Explain sources of instability such as sensor noise, occlusions, quantization, and adversarial attacks. • Understand safety frameworks such as ISO 26262 and ISO 21448 (SOTIF) relevant to uncertainty handling. • Describe the role of sensor fusion and redundancy in mitigating uncertainty and maintaining localization accuracy. Skills: • Model sensor noise and environmental uncertainty using statistical and probabilistic approaches. • Apply sensor fusion algorithms (e.g., Kalman filter, particle filter) for robust localization. • Evaluate system robustness against occlusions, reflection errors, and adversarial perturbations. • Design and conduct experiments to quantify uncertainty and validate robustness using simulation and real-world datasets. Understanding/Attitudes: • Appreciate the trade-offs between computational complexity and robustness in multi-sensor systems. • Recognize the ethical and safety implications of unstable or unreliable perception in autonomous systems. • Demonstrate awareness of international safety standards and adopt responsible practices for system validation. | |
| Topics | 1. Sources of Instability and Uncertainty: – Aleatoric vs. epistemic uncertainty, stochastic processes, and measurement noise. – Quantization effects, sensor noise modeling, and environmental randomness (see the ensemble uncertainty sketch below the table). 2. Sensor Noise and Fusion: – Multi-sensor integration: LiDAR, radar, GNSS, IMU, camera. – Noise filtering and smoothing techniques: Kalman, particle, and Bayesian filters (see the Kalman-filter sketch below the table). 3. Occlusions and Partial Observability: – Handling occlusions, weather effects, and incomplete sensor coverage. – Tracking and prediction in uncertain environments. 4. Adversarial Robustness: – Adversarial attacks on perception networks and their detection (see the FGSM sketch below the table). – SOTIF (ISO 21448) and safety verification of the intended functionality. 5. Validation and Safety Assessment: – Simulation-based validation and uncertainty quantification. – Evaluation metrics for perception and localization under uncertainty. 6. Real-world Case Studies: – Sensor degradation in autonomous vehicles, calibration drift, and redundancy design. | |
| Type of assessment | A positive grade requires a positive evaluation of the module topics and a presentation of practical work results with the required documentation. | |
| Learning methods | Lectures: Explore theoretical principles of uncertainty, noise modeling, and perception instability. Lab work: Implement sensor fusion, uncertainty quantification, and robustness evaluation in ROS 2 or MATLAB. Individual assignments: Research and report on adversarial attacks, occlusion handling, and noise-modeling strategies. Self-learning: Study international standards (ISO 26262, ISO 21448) and open-source datasets (KITTI, nuScenes, Waymo). | |
| AI involvement | Yes — AI tools may assist in simulating uncertainty propagation, detecting adversarial patterns, and performing sensitivity analysis. Students must critically validate results, document methodology, and ensure reproducibility and compliance with academic integrity standards. | |
| References to literature | 1. Thrun, S., Burgard, W., & Fox, D. (2005). Probabilistic Robotics. MIT Press. 2. Kendall, A., & Gal, Y. (2017). What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? NeurIPS. 3. Goodfellow, I., Shlens, J., & Szegedy, C. (2015). Explaining and Harnessing Adversarial Examples. ICLR. 4. ISO 21448:2022. Road Vehicles – Safety of the Intended Functionality (SOTIF). 5. ISO 26262:2018. Road Vehicles – Functional Safety. 6. Cadena, C., et al. (2016). Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Transactions on Robotics. 7. Zhang, J., & Singh, S. (2014). LOAM: Lidar Odometry and Mapping in Real-time. RSS. 8. Razdan, R., & Sell, R. (2025). Uncertainty-Aware Perception and Mapping Frameworks for Autonomous Systems. IEEE Access (forthcoming). | |
| Lab equipment | Yes | |
| Virtual lab | Yes | |
| MOOC course | Suggested MOOC: 'Probabilistic Robotics' (edX, Georgia Institute of Technology) or 'AI for Self-Driving Cars' (Coursera, University of Toronto). | |
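The sketches below illustrate three techniques named in the topics list. First, a minimal sketch of separating epistemic from aleatoric uncertainty with a bootstrapped ensemble, in the spirit of Kendall & Gal (2017); the linear toy model, noise level, and ensemble size are illustrative assumptions, not module material:

<code python>
# Minimal sketch: epistemic vs. aleatoric uncertainty via a small ensemble.
# Toy assumption: each bootstrapped member fits a line through the origin
# plus a residual-variance estimate of the data noise.
import numpy as np

rng = np.random.default_rng(1)

def make_data(n):
    x = rng.uniform(-1, 1, n)
    y = 2.0 * x + rng.normal(0.0, 0.3, n)    # true noise std 0.3 (aleatoric)
    return x, y

# "Train" an ensemble by resampling: (slope, residual variance) per member.
members = []
for _ in range(20):
    x, y = make_data(40)
    slope = np.sum(x * y) / np.sum(x * x)    # least-squares through origin
    resid_var = np.mean((y - slope * x) ** 2)
    members.append((slope, resid_var))

x_query = 0.5
preds = np.array([s * x_query for s, _ in members])
noise_vars = np.array([v for _, v in members])

epistemic = preds.var()          # disagreement between ensemble members
aleatoric = noise_vars.mean()    # average estimated data noise
print(f"epistemic var: {epistemic:.4f}, aleatoric var: {aleatoric:.4f}")
</code>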
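Second, a minimal 1D Kalman-filter fusion sketch, assuming a constant-velocity motion model and two synthetic position sensors; all noise values are illustrative assumptions:

<code python>
# Minimal 1D Kalman-filter sketch: fuse noisy position measurements from
# two synthetic sensors under a constant-velocity motion model.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition (pos, vel)
Q = 0.01 * np.eye(2)                        # process noise covariance
H = np.array([[1.0, 0.0]])                  # we observe position only
R1, R2 = 0.5, 2.0                           # per-sensor measurement variances

x = np.array([0.0, 1.0])                    # initial state estimate
P = np.eye(2)                               # initial covariance
true_pos, true_vel = 0.0, 1.0

for _ in range(50):
    # Simulate ground truth and two independent noisy measurements.
    true_pos += true_vel * dt
    z1 = true_pos + rng.normal(0.0, np.sqrt(R1))
    z2 = true_pos + rng.normal(0.0, np.sqrt(R2))

    # Predict step.
    x = F @ x
    P = F @ P @ F.T + Q

    # Sequentially update with each sensor (standard KF update).
    for z, R in ((z1, R1), (z2, R2)):
        S = H @ P @ H.T + R                 # innovation covariance (1x1)
        K = P @ H.T / S                     # Kalman gain (2x1)
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P

print(f"estimate={x[0]:.2f}, truth={true_pos:.2f}, var={P[0, 0]:.3f}")
</code>

The less noisy sensor (smaller R) automatically receives a larger Kalman gain, which is the redundancy argument made in the learning outcomes.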
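Third, a minimal FGSM sketch in the spirit of Goodfellow et al. (2015), applied to a hand-rolled logistic regression so the example stays dependency-free; in the lab the attack would instead target a trained perception network. The weights and input are illustrative assumptions:

<code python>
# Minimal FGSM sketch: perturb an input by epsilon in the sign of the
# input gradient of the loss, reducing the classifier's confidence.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A "trained" linear classifier: p(y=1|x) = sigmoid(w.x + b).
w = np.array([1.5, -2.0, 0.5])
b = 0.1
x = np.array([0.4, -0.3, 0.8])   # clean input, true label y = 1
y = 1.0

# Gradient of the cross-entropy loss w.r.t. the input:
# dL/dx = (sigmoid(w.x + b) - y) * w
p = sigmoid(w @ x + b)
grad_x = (p - y) * w

# FGSM step: x_adv = x + eps * sign(dL/dx).
eps = 0.25
x_adv = x + eps * np.sign(grad_x)

p_adv = sigmoid(w @ x_adv + b)
print(f"clean confidence for y=1:       {p:.3f}")
print(f"adversarial confidence for y=1: {p_adv:.3f}")
</code>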