Reliable localization and mapping are foundational for autonomous vehicles and robots, yet camera-centric systems often degrade in fog, rain, snow, glare or low light—exactly when robust perception is most critical. This talk explores radar-based simultaneous localization and mapping (SLAM) as a complementary approach, and in some scenarios an alternative, for building maps and estimating position under adverse conditions. We’ll review the core principles of radar-based SLAM, what makes radar returns fundamentally different from camera images and the practical challenges of working with sparse, noisy radar data. We’ll then examine recent advances in radar-only SLAM, including novel iterative closest point (ICP) approaches for pose estimation and implementations optimized for real-time performance. Attendees will leave with a clear understanding of when radar SLAM outperforms vision-only methods, what accuracy is achievable today and which applications can benefit most from radar SLAM.
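For context on the pose-estimation step mentioned above, the idea behind iterative closest point can be illustrated with a minimal 2D point-to-point sketch. This is an illustrative toy, not the talk's implementation: it assumes noise-free scans, brute-force nearest-neighbor matching and an SVD-based (Kabsch) rigid-transform fit; real radar ICP variants must additionally handle sparsity, outliers and sensor-specific noise.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(source, target, iters=30):
    """Point-to-point ICP: alternate nearest-neighbor matching and re-fitting."""
    src = source.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # Brute-force nearest neighbors (fine for small toy scans)
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        matches = target[d2.argmin(axis=1)]
        R, t = best_rigid_transform(src, matches)
        src = src @ R.T + t
        # Compose the incremental transform into the running estimate
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

On clean data with a small relative motion, the recovered (R, t) is the pose change between the two scans, which is exactly the quantity a SLAM front end feeds into its trajectory estimate.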

