Open Access
ARTICLE
An Image Inpainting Approach Based on Parallel Dual-Branch Learnable Transformer Network
1 School of Software, Changsha Social Work College, Changsha, 410004, China
2 School of Computer Science and Technology, Changsha University of Science and Technology, Changsha, 410076, China
3 Department of Computer Engineering, INHA University, Incheon, 22201, Republic of Korea
* Corresponding Author: Yan Li. Email:
# These authors contributed equally to this work
(This article belongs to the Special Issue: Omnipresent AI in the Cloud Era Reshaping Distributed Computation and Adaptive Systems for Modern Applications)
Computers, Materials & Continua 2025, 85(1), 1221-1234. https://doi.org/10.32604/cmc.2025.066842
Received 18 April 2025; Accepted 30 June 2025; Issue published 29 August 2025
Abstract
Image inpainting refers to synthesizing missing content in an image from known information in order to restore occluded or damaged regions. With the increasing complexity of image inpainting tasks and the growth of data scale, existing deep learning methods still have limitations: they struggle to capture long-range dependencies, and their performance on multi-scale image structures is suboptimal. To address these problems, this paper proposes an image inpainting method based on a parallel dual-branch learnable Transformer network. The encoder of the proposed model's generator consists of a dual-branch parallel structure of stacked CNN blocks and Transformer blocks, which extracts both local and global feature information from images. A dual-branch fusion module is then adopted to combine the features obtained from the two branches. Additionally, a gated full-scale skip connection module is proposed to further enhance the coherence of the inpainting results and alleviate information loss. Finally, experimental results on three public datasets demonstrate the superior performance of the proposed method.
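The abstract describes a dual-branch encoder (a CNN branch for local features, a Transformer branch for long-range dependencies) whose outputs are combined by a gated fusion module. The paper's actual layers are not given on this page, so the following is only a minimal NumPy sketch of that general idea: toy stand-ins for the two branches operating on a token matrix, fused by a learned sigmoid gate. All function names and shapes here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(x):
    """Toy single-head self-attention over tokens x of shape (n, d):
    a stand-in for the Transformer branch capturing long-range dependencies."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

def local_mixing(x):
    """Toy local branch: each token is averaged with its two neighbours,
    a stand-in for stacked CNN blocks with a small receptive field."""
    padded = np.pad(x, ((1, 1), (0, 0)), mode="edge")
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

def gated_fusion(local_f, global_f, w_g):
    """A sigmoid gate decides, per feature, how much of each branch to keep."""
    g = 1.0 / (1.0 + np.exp(-(np.concatenate([local_f, global_f], axis=1) @ w_g)))
    return g * local_f + (1.0 - g) * global_f

n, d = 16, 8                                  # 16 tokens, 8 channels (illustrative)
x = rng.standard_normal((n, d))               # encoder input features
w_g = rng.standard_normal((2 * d, d)) * 0.1   # gate weights (would be learned)

fused = gated_fusion(local_mixing(x), self_attention(x), w_g)
print(fused.shape)  # → (16, 8)
```

The gate lets the network weight local detail against global context per feature channel, which is one plausible reading of the "dual-branch fusion" the abstract mentions; the full-scale skip connections described in the paper would additionally route multi-resolution encoder features into the decoder.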
Copyright © 2025 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

