Open Access

ARTICLE


Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network

Ming Chen1,*, Yanfei Niu2, Ping Qi1, Fucheng Wang1

1 School of Mathematics and Computer Science, Tongling University, Tongling, 244061, China
2 College of Software Engineering, Zhengzhou University of Light Industry, Zhengzhou, 450000, China

* Corresponding Author: Ming Chen. Email: email

Computers, Materials & Continua 2025, 85(3), 5015-5035. https://doi.org/10.32604/cmc.2025.067951

Abstract

This research addresses the critical challenge of enhancing satellite images captured under low-light conditions, which suffer from severely degraded quality, including a lack of detail, poor contrast, and low usability. Overcoming this limitation is essential for maximizing the value of satellite imagery in downstream computer vision tasks (e.g., spacecraft on-orbit connection, spacecraft surface repair, space debris capture) that rely on clear visual information. Our key novelty lies in an unsupervised generative adversarial network featuring two main contributions: (1) an improved U-Net (IU-Net) generator with multi-scale feature fusion in the contracting path for richer semantic feature extraction, and (2) a Global Illumination Attention Module (GIA) at the end of the contracting path to couple local and global information, significantly improving detail recovery and illumination adjustment. The proposed algorithm operates in an unsupervised manner: it is trained and evaluated on our self-constructed, unpaired Spacecraft Dataset for Detection, Enforcement, and Parts Recognition (SDDEP), designed specifically for low-light enhancement tasks. Extensive experiments demonstrate that our method outperforms the baseline EnlightenGAN, achieving improvements of 2.7% in structural similarity (SSIM), 4.7% in peak signal-to-noise ratio (PSNR), 6.3% in learned perceptual image patch similarity (LPIPS), and 53.2% in DeltaE 2000. Qualitatively, the enhanced images exhibit higher overall and local brightness, improved contrast, and more natural visual effects.
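The abstract's description of the GIA module — coupling a global illumination summary with local features at the end of the contracting path — can be illustrated with a minimal sketch. The code below is a hypothetical, squeeze-and-excitation-style gating in NumPy, not the authors' implementation: the function name, the use of global average pooling as the "global" statistic, and the sigmoid channel gate are all assumptions made for illustration.

```python
import numpy as np

def global_illumination_attention(feat):
    """Hypothetical sketch of a global-attention gate: reweight local
    features by a scene-level illumination statistic.

    feat: (C, H, W) feature map, e.g. from the end of a U-Net
    contracting path.
    """
    # "Squeeze": global average pooling summarizes per-channel
    # scene-level illumination into a (C,) descriptor.
    global_desc = feat.mean(axis=(1, 2))

    # "Excite": a sigmoid gate derived from the global descriptor.
    gate = 1.0 / (1.0 + np.exp(-global_desc))

    # Couple global and local information: channel-wise reweighting
    # of every spatial location by its channel's global gate.
    return feat * gate[:, None, None]

x = np.random.rand(8, 16, 16).astype(np.float32)
y = global_illumination_attention(x)
print(y.shape)  # (8, 16, 16)
```

In a real network the gate would be produced by learned layers rather than a bare sigmoid of the pooled descriptor; the sketch only shows the coupling pattern the abstract describes.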

Keywords

Global illumination attention; generative adversarial networks; low-light enhancement; global-local discriminator; multi-scale feature fusion

Cite This Article

APA Style
Chen, M., Niu, Y., Qi, P., & Wang, F. (2025). Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network. Computers, Materials & Continua, 85(3), 5015–5035. https://doi.org/10.32604/cmc.2025.067951
Vancouver Style
Chen M, Niu Y, Qi P, Wang F. Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network. Comput Mater Contin. 2025;85(3):5015–5035. https://doi.org/10.32604/cmc.2025.067951
IEEE Style
M. Chen, Y. Niu, P. Qi, and F. Wang, “Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network,” Comput. Mater. Contin., vol. 85, no. 3, pp. 5015–5035, 2025. https://doi.org/10.32604/cmc.2025.067951



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.