TY - EJOU
AU - Chen, Ming
AU - Niu, Yanfei
AU - Qi, Ping
AU - Wang, Fucheng
TI - Unsupervised Satellite Low-Light Image Enhancement Based on the Improved Generative Adversarial Network
T2 - Computers, Materials & Continua
PY - 2025
VL - 85
IS - 3
SN - 1546-2226
AB - This research addresses the critical challenge of enhancing satellite images captured under low-light conditions, which suffer from severely degraded quality, including a lack of detail, poor contrast, and low usability. Overcoming this limitation is essential for maximizing the value of satellite imagery in downstream computer vision tasks (e.g., spacecraft on-orbit connection, spacecraft surface repair, space debris capture) that rely on clear visual information. Our key novelty lies in an unsupervised generative adversarial network featuring two main contributions: (1) an improved U-Net (IU-Net) generator with multi-scale feature fusion in the contracting path for richer semantic feature extraction, and (2) a Global Illumination Attention Module (GIA) at the end of the contracting path to couple local and global information, significantly improving detail recovery and illumination adjustment. The proposed algorithm operates in an unsupervised manner. It is trained and evaluated on our self-constructed, unpaired Spacecraft Dataset for Detection, Enforcement, and Parts Recognition (SDDEP), designed specifically for low-light enhancement tasks. Extensive experiments demonstrate that our method outperforms the baseline EnlightenGAN, achieving improvements of 2.7% in structural similarity (SSIM), 4.7% in peak signal-to-noise ratio (PSNR), 6.3% in learned perceptual image patch similarity (LPIPS), and 53.2% in DeltaE 2000. Qualitatively, the enhanced images exhibit higher overall and local brightness, improved contrast, and more natural visual effects.
KW - Global illumination attention
KW - generative adversarial networks
KW - low-light enhancement
KW - global-local discriminator
KW - multi-scale feature fusion
DO - 10.32604/cmc.2025.067951
ER -