arxiv:2512.13796

Nexels: Neurally-Textured Surfels for Real-Time Novel View Synthesis with Sparse Geometries

Published on Dec 15
Abstract

AI-generated summary: A new representation using surfels and a neural field achieves high perceptual quality with fewer primitives and less memory compared to Gaussian splatting, while also rendering faster.

Though Gaussian splatting has achieved impressive results in novel view synthesis, it requires millions of primitives to model highly textured scenes, even when the geometry of the scene is simple. We propose a representation that goes beyond point-based rendering and decouples geometry and appearance in order to achieve a compact representation. We use surfels for geometry and a combination of a global neural field and per-primitive colours for appearance. The neural field textures a fixed number of primitives for each pixel, ensuring that the added compute is low. Our representation matches the perceptual quality of 3D Gaussian splatting while using 9.7× fewer primitives and 5.5× less memory on outdoor scenes, and 31× fewer primitives and 3.7× less memory on indoor scenes. Our representation also renders twice as fast as existing textured primitives while improving upon their visual quality.
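The abstract's key idea is to decouple geometry (surfels) from appearance (a global neural field plus per-primitive colours), querying the field only for a fixed number of primitives at each pixel. The sketch below illustrates that idea in PyTorch. It is not the authors' implementation: the names (NeuralTextureField, shade_pixels), the network size, the K-hits-per-pixel interface, and the simple averaging of base and neural colours are all illustrative assumptions.

```python
# Minimal conceptual sketch (not the paper's code): a small global "neural texture"
# MLP colours ray-surfel intersection points and is combined with per-surfel colours.
import torch
import torch.nn as nn


class NeuralTextureField(nn.Module):
    """Small global MLP queried at 3D intersection points to produce an RGB value."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (N, 3) world-space ray-surfel intersection points
        return torch.sigmoid(self.mlp(points))  # per-point RGB in [0, 1]


def shade_pixels(hit_points, hit_surfel_ids, hit_weights, surfel_colours, field):
    """Shade each pixel from a fixed number K of surfel hits.

    hit_points:     (P, K, 3) intersection points for K surfels per pixel
    hit_surfel_ids: (P, K)    indices into the surfel colour table
    hit_weights:    (P, K)    per-hit blending weights (e.g. alpha compositing), sum to 1
    surfel_colours: (S, 3)    learnable per-surfel base colours
    """
    P, K, _ = hit_points.shape
    base = surfel_colours[hit_surfel_ids]                       # (P, K, 3) coarse colour
    detail = field(hit_points.reshape(-1, 3)).reshape(P, K, 3)  # (P, K, 3) neural texture
    per_hit = 0.5 * (base + detail)                             # assumed simple combination
    return (hit_weights.unsqueeze(-1) * per_hit).sum(dim=1)     # (P, 3) pixel colours


if __name__ == "__main__":
    torch.manual_seed(0)
    S, P, K = 1000, 4, 3  # surfels, pixels, fixed number of hits per pixel
    field = NeuralTextureField()
    surfel_colours = torch.rand(S, 3)
    hit_points = torch.randn(P, K, 3)
    hit_ids = torch.randint(0, S, (P, K))
    weights = torch.softmax(torch.randn(P, K), dim=-1)
    out = shade_pixels(hit_points, hit_ids, weights, surfel_colours, field)
    print(out.shape)  # torch.Size([4, 3])
```

Because the field is evaluated only at the fixed number of intersection points retained per pixel, the added compute scales with image resolution rather than with the number of surfels, which is the property the abstract emphasises.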
