Open Access

ARTICLE

NGP-ERGAS: Revisit Instant Neural Graphics Primitives with the Relative Dimensionless Global Error in Synthesis

Dongheng Ye1, Heping Li2,3, Ning An2,3, Jian Cheng2,3, Liang Wang1,4,*

1 College of Information Science and Technology, Beijing University of Technology, Beijing, 100124, China
2 Research Institute of Mine Artificial Intelligence, China Coal Research Institute, Beijing, 100013, China
3 State Key Laboratory of Intelligent Coal Mining and Strata Control, Beijing, 100013, China
4 Engineering Research Center of Digital Community, Ministry of Education, Beijing, 100124, China

* Corresponding Author: Liang Wang. Email: email

Computers, Materials & Continua 2025, 84(2), 3731-3747. https://doi.org/10.32604/cmc.2025.063693

Abstract

The newly emerging neural radiance fields (NeRF) methods can implicitly fulfill three-dimensional (3D) reconstruction by training a neural network to render novel-view images of a given scene from a set of posed images. The Instant Neural Graphics Primitives (Instant-NGP) method further improves the position encoding of NeRF and achieves state-of-the-art efficiency. However, Instant-NGP is trained with only a local pixel-wise loss, overlooking the nonlocal structural information between pixels. Despite good quantitative results, this leads to poor visual quality, especially in terms of completeness. Inspired by the stochastic structural similarity (S3IM) method, which exploits the nonlocal structural information of groups of pixels, this paper proposes a new method to improve the completeness of fast novel view synthesis. The proposed method first extends the thread-wise processing of Instant-NGP to processing in a custom thread block (i.e., a group of threads). Then, the relative dimensionless global error in synthesis, i.e., Erreur Relative Globale Adimensionnelle de Synthèse (ERGAS), of the group of pixels corresponding to a group of threads is computed and incorporated into the loss function. Extensive experiments validate the proposed method: it obtains better quantitative results than the original Instant-NGP with fewer iteration steps, increasing PSNR by 1%. Striking qualitative improvements are obtained, especially for delicate structures and details such as lines and continuous structures. With these improvements in visual quality, our method can boost the practicality of implicit 3D reconstruction in applications such as self-driving and augmented reality.
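The abstract describes computing ERGAS over a group of pixels and adding it to the training loss. As a rough illustration of the metric itself (not the paper's CUDA implementation), the sketch below follows the standard ERGAS definition: 100 times the resolution ratio times the root mean of the squared per-channel RMSE normalized by the per-channel mean of the reference. The function name, array shapes, and the choice of `ratio = 1` for same-resolution rendered/ground-truth pixel groups are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ergas(pred, ref, ratio=1.0):
    """ERGAS (Erreur Relative Globale Adimensionnelle de Synthese).

    Illustrative sketch, not the paper's implementation.
    pred, ref : arrays of shape (..., C) holding a group of rendered
                and ground-truth pixels with C color channels.
    ratio     : resolution ratio; conventionally 1 when both images
                share the same resolution, as in novel view synthesis.
    """
    pred = np.asarray(pred, dtype=np.float64).reshape(-1, np.shape(pred)[-1])
    ref = np.asarray(ref, dtype=np.float64).reshape(-1, np.shape(ref)[-1])
    # Per-channel root-mean-square error over the pixel group.
    rmse = np.sqrt(np.mean((pred - ref) ** 2, axis=0))
    # Normalize each channel's RMSE by the mean of the reference channel.
    mean_ref = np.mean(ref, axis=0)
    return 100.0 * ratio * np.sqrt(np.mean((rmse / mean_ref) ** 2))
```

ERGAS is zero for a perfect match and grows with relative per-channel error, which makes it a natural group-level complement to a purely pixel-wise loss.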

Keywords

Neural radiance fields; novel view synthesis; 3D reconstruction; graphic processing unit

Cite This Article

APA Style
Ye, D., Li, H., An, N., Cheng, J., & Wang, L. (2025). NGP-ERGAS: Revisit Instant Neural Graphics Primitives with the Relative Dimensionless Global Error in Synthesis. Computers, Materials & Continua, 84(2), 3731–3747. https://doi.org/10.32604/cmc.2025.063693
Vancouver Style
Ye D, Li H, An N, Cheng J, Wang L. NGP-ERGAS: Revisit Instant Neural Graphics Primitives with the Relative Dimensionless Global Error in Synthesis. Comput Mater Contin. 2025;84(2):3731–3747. https://doi.org/10.32604/cmc.2025.063693
IEEE Style
D. Ye, H. Li, N. An, J. Cheng, and L. Wang, “NGP-ERGAS: Revisit Instant Neural Graphics Primitives with the Relative Dimensionless Global Error in Synthesis,” Comput. Mater. Contin., vol. 84, no. 2, pp. 3731–3747, 2025. https://doi.org/10.32604/cmc.2025.063693



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.