WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments
Abstract
We present WildGS-SLAM, a robust and efficient monocular RGB SLAM system designed to handle dynamic environments by leveraging uncertainty-aware geometric mapping. Unlike traditional SLAM systems, which assume static scenes, our approach integrates depth and uncertainty information to enhance tracking, mapping, and rendering performance in the presence of moving objects. We introduce an uncertainty map, predicted by a shallow multi-layer perceptron from DINOv2 features, to guide dynamic object removal during both tracking and mapping. This uncertainty map enhances dense bundle adjustment and Gaussian map optimization, improving reconstruction accuracy. Our system is evaluated on multiple datasets and demonstrates artifact-free view synthesis. Results showcase WildGS-SLAM's superior performance in dynamic environments compared to state-of-the-art methods.
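To make the abstract's uncertainty mechanism concrete, below is a minimal PyTorch sketch of the idea: a shallow MLP maps per-patch DINOv2 features to a positive uncertainty value, which then down-weights residuals (e.g., in dense bundle adjustment or the Gaussian map's rendering loss) so that likely-dynamic regions contribute less. The module names, feature dimension (384, as in DINOv2 ViT-S), layer sizes, and the heteroscedastic weighting form are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the uncertainty-weighted optimization described in the
# abstract. All names, dimensions, and the loss form are assumptions for
# illustration only.
import torch
import torch.nn as nn


class UncertaintyMLP(nn.Module):
    """Shallow MLP: DINOv2 patch features -> per-patch uncertainty beta > 0."""

    def __init__(self, feat_dim: int = 384, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),
            nn.Softplus(),  # keep the predicted uncertainty strictly positive
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (N, feat_dim) per-patch DINOv2 features
        return self.net(feats).squeeze(-1) + 1e-3  # (N,)


def uncertainty_weighted_loss(residuals: torch.Tensor, beta: torch.Tensor) -> torch.Tensor:
    """Down-weight residuals by the predicted uncertainty beta.

    Uses a standard heteroscedastic form r^2 / beta^2 + log(beta): residuals on
    moving objects can be 'explained away' with a large beta instead of
    corrupting the pose estimate or the Gaussian map.
    """
    return (residuals.pow(2) / beta.pow(2) + torch.log(beta)).mean()


if __name__ == "__main__":
    # Toy usage with random stand-ins for DINOv2 features and per-patch residuals.
    feats = torch.randn(1024, 384)
    residuals = torch.randn(1024)
    mlp = UncertaintyMLP()
    beta = mlp(feats)
    loss = uncertainty_weighted_loss(residuals, beta)
    loss.backward()
    print(f"uncertainty-weighted loss: {loss.item():.4f}")
```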
Community
Librarian Bot: the following similar papers were recommended by the Semantic Scholar API.
- Embracing Dynamics: Dynamics-aware 4D Gaussian Splatting SLAM (2025)
- GI-SLAM: Gaussian-Inertial SLAM (2025)
- Deblur Gaussian Splatting SLAM (2025)
- 4D Gaussian Splatting SLAM (2025)
- RGB-Only Gaussian Splatting SLAM for Unbounded Outdoor Scenes (2025)
- STAMICS: Splat, Track And Map with Integrated Consistency and Semantics for Dense RGB-D SLAM (2025)
- GigaSLAM: Large-Scale Monocular SLAM with Hierarchical Gaussian Splats (2025)