Open Access
ARTICLE
Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network
1 School of Mathematics and Computer Science, Tongling University, Tongling, 244061, China
2 College of Software Engineering, Zhengzhou University of Light Industry, Zhengzhou, 450000, China
* Corresponding Author: Ming Chen. Email:
Computers, Materials & Continua 2025, 85(3), 5015-5035. https://doi.org/10.32604/cmc.2025.067951
Received 16 May 2025; Accepted 08 August 2025; Issue published 23 October 2025
Abstract
This research addresses the critical challenge of enhancing satellite images captured under low-light conditions, which suffer from severely degraded quality, including a lack of detail, poor contrast, and low usability. Overcoming this limitation is essential for maximizing the value of satellite imagery in downstream computer vision tasks (e.g., spacecraft on-orbit connection, spacecraft surface repair, space debris capture) that rely on clear visual information. Our key novelty lies in an unsupervised generative adversarial network featuring two main contributions: (1) an improved U-Net (IU-Net) generator with multi-scale feature fusion in the contracting path for richer semantic feature extraction, and (2) a Global Illumination Attention (GIA) module at the end of the contracting path to couple local and global information, significantly improving detail recovery and illumination adjustment. The proposed algorithm operates in an unsupervised manner. It is trained and evaluated on our self-constructed, unpaired Spacecraft Dataset for Detection, Enforcement, and Parts Recognition (SDDEP), designed specifically for low-light enhancement tasks. Extensive experiments demonstrate that our method outperforms the baseline EnlightenGAN, achieving improvements of 2.7% in structural similarity (SSIM), 4.7% in peak signal-to-noise ratio (PSNR), 6.3% in learned perceptual image patch similarity (LPIPS), and 53.2% in DeltaE 2000. Qualitatively, the enhanced images exhibit higher overall and local brightness, improved contrast, and more natural visual effects.
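The abstract describes the GIA module as coupling a global illumination statistic with local features to guide brightness adjustment. The exact formulation is defined in the full text; the snippet below is only a conceptual NumPy sketch, under the assumption that attention weights are derived from how dark each location is relative to the scene's mean illumination (the function name and weighting scheme are illustrative, not the paper's).

```python
import numpy as np

def global_illumination_attention(feat, eps=1e-6):
    """Conceptual sketch: modulate local features by how dark each
    spatial location is relative to the global mean illumination.

    feat: (C, H, W) feature map, e.g. from the end of a contracting path.
    """
    # Per-location illumination proxy: mean over channels.
    illum = feat.mean(axis=0)                  # (H, W)
    global_illum = illum.mean()                # scalar scene-level brightness
    # Darker-than-average regions get larger weights, brighter regions smaller.
    ratio = global_illum / (illum + eps)       # (H, W), >1 where dark
    attn = 1.0 / (1.0 + np.exp(-(ratio - 1)))  # sigmoid centered at ratio = 1
    attn = 0.5 + attn                          # bound weights to (0.5, 1.5)
    return feat * attn[None, :, :]

# Toy usage: a feature map with a dark left half and a bright right half.
feat = np.ones((4, 8, 8), dtype=np.float32)
feat[:, :, :4] *= 0.2                          # dark region
out = global_illumination_attention(feat)
# The dark half is amplified more strongly than the bright half.
```

The design intent illustrated here is that a single global statistic steers spatially varying gains, so under-exposed regions are brightened without over-amplifying regions that are already well lit.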
Copyright © 2025 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.