Papers
arxiv:2506.22973

Confident Splatting: Confidence-Based Compression of 3D Gaussian Splatting via Learnable Beta Distributions

Published on Jun 28
Submitted by AmirHossein-razlighi on Jul 2

Abstract

AI-generated summary: A novel lossy compression method using learnable confidence scores improves storage and computational efficiency in 3D Gaussian Splatting without sacrificing visual quality.

3D Gaussian Splatting enables high-quality real-time rendering but often produces millions of splats, resulting in excessive storage and computational overhead. We propose a novel lossy compression method based on learnable confidence scores modeled as Beta distributions. Each splat's confidence is optimized through reconstruction-aware losses, enabling pruning of low-confidence splats while preserving visual fidelity. The proposed approach is architecture-agnostic and can be applied to any Gaussian Splatting variant. In addition, the average confidence values serve as a new metric to assess the quality of the scene. Extensive experiments demonstrate favorable trade-offs between compression and fidelity compared to prior work. Our code and data are publicly available at https://github.com/amirhossein-razlighi/Confident-Splatting.
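
The paper models each splat's confidence as a learnable Beta distribution and prunes low-confidence splats. The sketch below is a minimal PyTorch illustration of one plausible parameterization (the module name `BetaConfidence`, the softplus mapping, and the use of the Beta mean as the score are assumptions for illustration, not the authors' exact implementation):

```python
# Minimal sketch (assumed parameterization, not the paper's exact code):
# each splat carries a learnable Beta(alpha, beta); its mean
# alpha / (alpha + beta) is read out as a confidence score in [0, 1].
import torch
import torch.nn.functional as F


class BetaConfidence(torch.nn.Module):
    def __init__(self, num_splats: int):
        super().__init__()
        # Unconstrained parameters; softplus maps them to positive alpha, beta.
        self.raw_alpha = torch.nn.Parameter(torch.zeros(num_splats))
        self.raw_beta = torch.nn.Parameter(torch.zeros(num_splats))

    def forward(self) -> torch.Tensor:
        alpha = F.softplus(self.raw_alpha) + 1e-6
        beta = F.softplus(self.raw_beta) + 1e-6
        return alpha / (alpha + beta)  # per-splat confidence, shape (num_splats,)
```

During training these scores would be optimized jointly with the reconstruction-aware losses described in the paper, so that splats the renderer does not need drift toward low confidence; the averaged scores can then double as the scene-quality metric mentioned in the abstract.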

Community

Paper author and submitter:

A new lossy approach to compressing 3DGS scenes with learnable confidence scores and test-time pruning (from 6M splats to 3M splats with less than a 0.1 PSNR drop!)

Please visit the paper's website for the code, the newly released dataset, and an interactive visualization!
https://amirhossein-razlighi.github.io/Confident-Splatting/

Our method is also architecture-agnostic: it works on the original 3DGS, Gaussian Splatting MCMC, and any other Gaussian Splatting variant without changes.
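
Because the pruning step only indexes per-splat tensors, architecture-agnosticism is easy to see in code. The helper below is a hypothetical test-time pruning sketch (the function name, the tensor-dictionary interface, and the default keep_ratio are illustrative, not from the paper) that keeps the most confident splats of any Gaussian Splatting variant:

```python
import torch


def prune_splats(splat_tensors: dict[str, torch.Tensor],
                 confidence: torch.Tensor,
                 keep_ratio: float = 0.5) -> dict[str, torch.Tensor]:
    """Keep the top `keep_ratio` fraction of splats ranked by confidence.

    `splat_tensors` maps attribute names (e.g. positions, scales, rotations,
    opacities, SH coefficients) to tensors whose first dimension indexes
    splats, so the same code applies to any Gaussian Splatting variant.
    """
    num_keep = max(1, int(confidence.numel() * keep_ratio))
    keep_idx = torch.topk(confidence, num_keep).indices
    return {name: t[keep_idx] for name, t in splat_tensors.items()}
```

With keep_ratio=0.5 this mirrors the 6M-to-3M example above; in practice the threshold is chosen for the desired compression/fidelity trade-off.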

