Communications in Information and Systems

Volume 23 (2023)

Number 4

PaletteNeRF: palette-based color editing for NeRFs

Pages: 447 – 475

DOI: https://dx.doi.org/10.4310/CIS.2023.v23.n4.a4

Authors

Qiling Wu (Key Laboratory of Pervasive Computing, Ministry of Education, Beijing National Research Center for Information Science & Technology; and Department of Computer Science & Technology, Tsinghua University, Beijing, China)

Jianchao Tan (Kuaishou Technology, Beijing, China)

Kun Xu (Key Laboratory of Pervasive Computing, Ministry of Education, Beijing National Research Center for Information Science & Technology; and Department of Computer Science & Technology, Tsinghua University, Beijing, China)

Abstract

Neural Radiance Field (NeRF) is a powerful tool for faithfully generating novel views of a scene from only a sparse set of captured images. Despite its strong capability for representing 3D scenes and their appearance, its editing ability is very limited. In this paper, we propose a simple but effective extension of vanilla NeRF, named PaletteNeRF, to enable efficient color editing of NeRF-represented scenes. Motivated by recent palette-based image decomposition works, we approximate each pixel color as a sum of palette colors modulated by additive weights. Instead of predicting pixel colors as in vanilla NeRF, our method predicts these additive weights. The underlying NeRF backbone can also be replaced with more recent NeRF models such as KiloNeRF to achieve real-time editing. Experimental results demonstrate that our method achieves efficient, view-consistent, and artifact-free color editing on a wide range of NeRF-represented scenes.
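
The following is a minimal NumPy sketch of the palette-based color model described in the abstract: per-sample colors are formed as a sum of shared palette colors modulated by additive weights, then composited along the ray with standard NeRF volume rendering. The function name, array shapes, and all numeric values are illustrative assumptions, not the paper's implementation; in the actual method the weights and densities would come from the NeRF network.

import numpy as np

def composite_palette_colors(weights, sigmas, deltas, palette):
    """Volume-render one ray whose per-sample colors are palette mixtures.

    weights : (S, P) additive palette weights per sample (hypothetical MLP output)
    sigmas  : (S,)   volume densities per sample
    deltas  : (S,)   distances between adjacent samples along the ray
    palette : (P, 3) RGB palette colors shared across the scene
    """
    # Per-sample color = sum of palette colors modulated by additive weights.
    sample_rgb = weights @ palette                                   # (S, 3)

    # Standard NeRF alpha compositing along the ray.
    alpha = 1.0 - np.exp(-sigmas * deltas)                           # (S,)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))    # transmittance
    contrib = (trans * alpha)[:, None]                               # (S, 1)
    return (contrib * sample_rgb).sum(axis=0)                        # pixel RGB

# Toy usage: recoloring the scene then amounts to editing palette entries only,
# since the learned weights stay fixed. Values below are placeholders.
palette = np.array([[0.9, 0.2, 0.2], [0.2, 0.6, 0.9], [0.95, 0.95, 0.9]])
weights = np.random.dirichlet(np.ones(3), size=64)   # stand-in for predicted weights
sigmas  = np.full(64, 0.5)
deltas  = np.full(64, 0.02)
print(composite_palette_colors(weights, sigmas, deltas, palette))

Because the geometry (densities) and the weights are untouched by an edit, swapping a palette color propagates consistently to every view rendered from the same model.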


This work is supported by the National Natural Science Foundation of China (Project Number: 61932003).

Received 16 September 2023

Published 21 May 2024