The practice of integrating images from two or more sensors collected over the same area or object is known as image fusion. The goal is to extract more spatial and spectral information from the fused image than from any of the component images; fusion is needed to improve the spatial and spectral quality of both panchromatic and multispectral images. This study presents a novel image fusion technique that employs the L0 smoothing filter, the Non-Subsampled Contourlet Transform (NSCT), and Sparse Representation (SR), followed by the max-absolute rule (MAR). The fusion approach is as follows: first, the multispectral and panchromatic images are divided into low- and high-frequency components using the L0 smoothing filter. The low-frequency components are then fused using an approach that combines NSCT and SR, while the high-frequency components are merged using the max-absolute fusion rule. Finally, the output image is obtained by recombining the fused low- and high-frequency data. In terms of correlation coefficient, entropy, spatial frequency, and fusion mutual information, our method outperforms other methods in both image quality enhancement and visual evaluation.
Distinct cameras are used to acquire different views of the same region from a satellite. The most common forms of images captured by these cameras are panchromatic and multispectral images. A panchromatic image records the total intensity falling on each pixel; it is called panchromatic because it is sensitive across all the narrow bands of the visible spectrum, including Red Green Blue (RGB), and the infrared. The electromagnetic bandwidth collected by a sensor is referred to as its spectral resolution. Sensors on satellites may detect wavelengths from three or more bands [
Panchromatic and multispectral images provide high spatial and high spectral information, respectively. Multispectral images have precise colour data but limited spatial resolution, whereas panchromatic images are greyscale images with higher spatial resolution. Both spectral and spatial information are necessary to gain more information, so the fused image should provide more information about the satellite scene than either the panchromatic or the multispectral image alone. Image fusion can be classified into four categories: pixel level, object level, feature level, and decision level [
The level of fusion will be determined by the intended application of the fused image [
The topic of this research is developing fusion algorithms for IKONOS and Landsat-7 ETM+ satellite data. The PAN and MS images are fused using a multi-scale edge-preserving decomposition with L0 smoothing, followed by NSCT and SR models. The suggested method is inspired by determining the error distribution in multispectral images [
Panchromatic | (PAN) |
Multispectral | (MS) |
Brovey Transform | (BT) |
Principal Component Analysis | (PCA) |
Intensity Hue Saturation | (HIS) |
Guided Filter | (GF) |
Non-Subsampled Contourlet Transform | (NSCT) |
Sparse Representation | (SR) |
Low Frequency | (LF) |
High Frequency | (HF) |
Max-Absolute Fusion Rule | (MAR) |
Low Frequency Data | (LFD) |
High Frequency Data | (HFD) |
Sub Low Frequency Data | (SLD) |
Sub High Frequency Data | (SHD) |
Red, Green, and Blue | (RGB) |
Nonsubsampled Pyramid Structure | (NSPS) |
Nonsubsampled Directional Filter Bank | (NSDFB) |
K-Singular Value Decomposition | (K-SVD) |
Orthogonal Matching Pursuit Algorithm | (OMP) |
Dictionary Learning | (DL) |
Correlation Coefficient | (CC) |
Standard Deviation | (SD) |
Spatial Frequency | (SF) |
Fusion Mutual Information | (FMI) |
High Frequency Data Multispectral | (HFDM) |
High Frequency Data Panchromatic | (HFDP) |
Sharpening filters are used to emphasize intensity transitions. Sharpening is used in a variety of applications, from electronic printing to medical imaging to industrial inspection. Examples of sharpening filters include the first-order-derivative gradient operator, the second-order-derivative Laplacian operator, unsharp masking, and highboost filtering.
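As a concrete illustration of second-order-derivative sharpening, the sketch below applies a 4-neighbour Laplacian and subtracts its response from the image. The function name, the edge-replicating padding, and the sign convention are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def laplacian_sharpen(img):
    """Sharpen by subtracting the 4-neighbour Laplacian response, which
    emphasizes intensity transitions while leaving flat regions unchanged."""
    pad = np.pad(img.astype(float), 1, mode='edge')
    lap = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
           pad[1:-1, :-2] + pad[1:-1, 2:] -
           4.0 * pad[1:-1, 1:-1])
    return img - lap  # a flat image passes through unchanged
```

With this convention a constant image is returned unchanged, while an isolated bright pixel is boosted relative to its neighbours.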
An L0 gradient minimization is given here for a general input signal. Consider Sig as the filter's input signal and F as the filter's output; the gradient of the output is denoted by ∇F. The L0 gradient minimization can be illustrated by the following formula (reconstructed here in its standard neighbour-set form):

$$\min_{F} \sum_{k=1}^{M} \left(F_k - \mathrm{Sig}_k\right)^2 + \frac{\lambda}{2} \sum_{k=1}^{M} \sum_{j \in N_k} \#\left(F_k \neq F_j\right)$$

where M denotes the length of the signal, N_k denotes the kth neighbour set, and #(·) counts the non-zero differences. Because the neighbouring relationship between F_k and F_j is counted twice, λ is divided by 2. The neighbouring set N_k is defined for each case, such as 1-D, 2-D, and 3-D [
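As a concrete (and simplified) illustration of this minimization, here is a 1-D sketch using the common half-quadratic splitting strategy: alternate a hard threshold on the gradient with an FFT-based least-squares update. The function name, the parameter schedule, and the circular boundary handling are our own assumptions, not the paper's.

```python
import numpy as np

def l0_smooth_1d(sig, lam=0.02, beta_max=1e5, kappa=2.0):
    """Approximate 1-D L0 gradient minimisation via half-quadratic splitting:
    the h-step keeps a gradient entry only where it is worth the L0 penalty;
    the f-step is a quadratic problem solved exactly in the Fourier domain."""
    sig = np.asarray(sig, dtype=float)
    n = sig.size
    kernel = np.zeros(n)
    kernel[0], kernel[-1] = -1.0, 1.0          # circular forward difference
    K = np.fft.fft(kernel)
    denom_grad = np.abs(K) ** 2
    F_sig = np.fft.fft(sig)
    f = sig.copy()
    beta = 2.0 * lam
    while beta < beta_max:
        g = np.roll(f, -1) - f                 # gradient of current estimate
        h = np.where(g ** 2 >= lam / beta, g, 0.0)
        F_f = (F_sig + beta * np.conj(K) * np.fft.fft(h)) / (1.0 + beta * denom_grad)
        f = np.real(np.fft.ifft(F_f))
        beta *= kappa
    return f
```

Piecewise-constant structure survives the filter while small oscillations are flattened, which is the edge-preserving behaviour the decomposition relies on.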
The spatial structures of images are captured along smooth contours, and any number of directions is allowed at each scale, making NSCT flexible and efficient for the representation of 2-D objects. The Nonsubsampled Pyramid Structure (NSPS) and the Nonsubsampled Directional Filter Bank (NSDFB) in conjunction create the NSCT, as illustrated in
In sparse representation, dictionary learning plays a very important role. In recent years, K-SVD has been widely adopted for adaptive dictionary learning (DL).
The initial over-complete dictionary
The K-SVD algorithm consists of two steps that together make up one iteration: first, based on the current dictionary estimate, the samples in Z are sparse coded to generate the matrix of sparse representations; second, the atoms of the dictionary are updated according to the current sparse representation.
Sparse coding is implemented using the Orthogonal Matching Pursuit algorithm (OMP); the dictionary update is done one atom at a time, optimizing the target function for each atom independently while leaving the rest unchanged. Both steps are shown in Algorithm 1, in which line 5 is the sparse coding and lines 6–13 are the dictionary update.
The key advancement in the K-SVD algorithm is the atom update step, which is accomplished while preserving the sparsity constraint. Both the atom and its related coefficient row are updated simultaneously, using the best rank-1 approximation of the restricted residual matrix, obtained from its singular value decomposition.
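The atom update can be sketched numerically as below (assuming numpy; `update_atom` is our own illustrative name): restrict to the samples that use atom j, form the residual without atom j's contribution, and replace the atom and its coefficient row by the leading singular pair.

```python
import numpy as np

def update_atom(Z, D, X, j):
    """K-SVD atom update: compute the residual over the samples that use
    atom j, with atom j's own contribution removed, then replace the atom
    and its coefficient row by the best rank-1 approximation (leading SVD pair)."""
    users = np.nonzero(X[j])[0]          # samples whose code uses atom j
    if users.size == 0:
        return D, X                      # unused atom: leave unchanged
    E = Z[:, users] - D @ X[:, users] + np.outer(D[:, j], X[j, users])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D, X = D.copy(), X.copy()
    D[:, j] = U[:, 0]                    # updated, unit-norm atom
    X[j, users] = s[0] * Vt[0]           # updated coefficient row
    return D, X
```

Because the rank-1 SVD approximation is optimal, this update can never increase the reconstruction error over the restricted samples, which is what makes the K-SVD iteration monotone in this step.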
The aim of the OMP algorithm is to provide an approximate solution to the following two problems; one is the sparsity-constrained sparse coding problem expressed by
For simplicity of form, we presume that
The greedy OMP algorithm chooses the atom with the highest correlation to the current residual at each stage. Once the atom has been picked, the signal is projected orthogonally onto the span of the chosen atoms, the residual is recomputed, and the cycle continues [
OMP is a greedy algorithm that attempts to locate a sparse representation of a sample over a given dictionary. It iteratively locates the strongest basis vectors (atoms), such that the representation error decreases in each iteration [ The per-iteration steps are: choose the atom with the maximal residual projection, update the coefficients, and update the residual.
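The greedy loop just described fits in a few lines of numpy; the function name and dimensions in the sketch below are illustrative, not from the paper.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: at each step pick the atom most
    correlated with the residual, refit x on the span of all selected
    atoms (orthogonal projection), and recompute the residual."""
    residual = x.astype(float).copy()
    support, coef = [], np.zeros(0)
    for _ in range(k):
        if np.linalg.norm(residual) < 1e-12:
            break                         # signal already represented exactly
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    code = np.zeros(D.shape[1])
    code[support] = coef
    return code
```

With an orthonormal dictionary, a k-sparse signal is recovered exactly in k iterations, since each correlation step identifies a true atom.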
The sparse model relies on the assumption that many signals can be represented as a linear combination of a few atoms of a redundant dictionary [
Recent developments in SR and compressed sensing address the minimization problem in L0, which is non-convex; under suitable conditions it can be relaxed to a convex problem whose solutions are given by linear programming methods [
The suggested method performs image fusion with panchromatic and multispectral images from Landsat-7 ETM+ and IKONOS. This fusion method yields improved spectral and spatial information. As previously indicated, the fusion procedure uses the L0 smoothing filter, NSCT, and SR. The final image provides more information about the satellite scene than the multispectral and panchromatic images individually. The source image is decomposed into multiple scales using a multi-scale decomposition model for efficient extraction of the required high-resolution data from the PAN and MS images. By this procedure, specific information is preserved and the quality of the fused images is improved. A multi-scale decomposition based on the L0 smoothing filter is proposed. For low-frequency data, the NSCT-SR based image fusion technique is used to preserve the structure and detail information of each channel of the MS and PAN images, while for high-frequency data, the max-absolute fusion rule is used to remove redundant information. Each channel of the MS image (R, G, and B) is separated, and then NSCT-SR/MAR is applied to each channel of the MS image and to the PAN image. For a natural appearance of the fused image, the low-frequency and high-frequency data fused in the previous steps are fused further. For comparison, different existing image fusion techniques, namely the Guided filter, HIS fusion, Principal Component Analysis, Brovey Transform (BT), and NSCT, have been evaluated. The fusion process is statistically evaluated by four commonly used parameters: correlation coefficient (CC), entropy, spatial frequency (SF), and fusion mutual information (FMI).
The proposed system
Multiple features of the image at different scales are obtained by decomposing the source image. In existing decomposition models, the fused image often suffers from artifacts, since they mostly adopt linear filters. Here, the L0 smoothing filter is adopted for decomposing the source image, which reduces these artifacts. Low-frequency data and a series of high-frequency data are obtained by the decomposition. τ is the parameter used for decomposing the source image. Below
S = L0Smooth (Iorg,
The source images, i.e. the MS and PAN images, are decomposed into low frequency data (LFD) and high frequency data (HFD) using the L0 smoothing filter. Then the LF data of both images and the HF data of both images are fused together.
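The two-band split can be sketched as below. The box filter here is only a stand-in smoother (the paper uses the L0 filter), and the function names are our own; the key property is that the two bands sum back to the source exactly.

```python
import numpy as np

def box_smooth(img, r=2):
    """Simple box filter, used here only as a stand-in for the L0 smoother."""
    pad = np.pad(img.astype(float), r, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += pad[r + dy:r + dy + img.shape[0], r + dx:r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def decompose(img, smooth=box_smooth):
    """LFD is the smoothed image; HFD is the detail residual, so that
    LFD + HFD reconstructs the source exactly."""
    lfd = smooth(img)
    return lfd, img - lfd
```

Both the MS channels and the PAN image pass through the same decomposition before their LF and HF parts are fused separately.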
Complete demonstration of LF data fusion is given in
NSCT decomposition is used to decompose the LF data into SLD
Then the sliding window technique is used for splitting sublowA and sublowB into patches of size
The patches are rearranged into column vectors for every position i. Then, as follows in eq
The N×1 column vector is denoted by
The sparse coefficient vectors are calculated for the above
The sparse coefficient vectors are fused using the max-L1 rule
The result of the final fusion model is
The iteration is performed over all patches, from
The sub high frequency (SHD) data are fused together with the help of the max-absolute rule with a
Finally, the inverse NSCT is applied to obtain the output LF fused data. Here, LF Data denotes Low Frequency Data; A, the MS image; B, the PAN image; SLD, Sub Low Frequency Data; SHD, Sub High Frequency Data.
The HF data coefficients are fused using the max-absolute rule, which selects the maximum in a pixel-by-pixel fusion. The high frequency data obtained from the two source images, formed using the L0 smoothing filter, are denoted High Frequency Data Multispectral (HFDM) and High Frequency Data Panchromatic (HFDP).
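The pixel-wise rule is a one-liner in numpy (function and argument names are our own illustration):

```python
import numpy as np

def fuse_max_abs(hfdm, hfdp):
    """Pixel-wise max-absolute rule: keep whichever coefficient has the
    larger magnitude, preserving the strongest detail from either source."""
    return np.where(np.abs(hfdm) >= np.abs(hfdp), hfdm, hfdp)
```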
In the final step, the high frequency data fused by the max-absolute rule and the low frequency data fused by the combined NSCT and SR are taken, and an addition operation is performed between them, which gives the final fused image with richer spectral and spatial information [
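Schematically, the whole per-band pipeline reads as below; the smoother and the two fusion rules are passed in as functions, and all names here are our own sketch rather than the paper's implementation.

```python
import numpy as np

def fuse_band(ms_band, pan, smooth, fuse_lf, fuse_hf):
    """Per-band pansharpening skeleton: two-band decomposition of both
    inputs (L0-style), separate LF and HF fusion, additive recombination."""
    lf_ms = smooth(ms_band); hf_ms = ms_band - lf_ms
    lf_pan = smooth(pan);    hf_pan = pan - lf_pan
    lf_fused = fuse_lf(lf_ms, lf_pan)   # NSCT + SR stage in the paper
    hf_fused = fuse_hf(hf_ms, hf_pan)   # max-absolute stage in the paper
    return lf_fused + hf_fused          # final addition step
```

For an RGB multispectral image the function is applied to each channel separately, as the method prescribes.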
The two datasets used for the above process are Landsat-7 ETM+ and IKONOS. The images were captured by different satellites; one sample of each image has been used.
Dictionary learning | Dictionary size = 256 |
L0 smoothing filter | Lambda = 2e-2 |
NSCT | nlevels = [2, 3, 3, 4] |
The Landsat-7 ETM+ satellite dataset covers the area of Girona, Spain, taken in 2008. The Landsat-7 PAN image has a spatial resolution of 15 m and the MS image has a spatial resolution of 30 m. Landsat-7 PAN images are acquired in the 0.52–0.90 μm spectral range. Landsat-7 MS images are acquired in six spectral ranges: 0.63–0.69 μm (Red), 0.52–0.60 μm (Green), 0.45–0.52 μm (Blue), 0.76–0.90 μm (Near-IR), 1.55–1.75 μm (Mid-IR), and 2.08–2.35 μm (Shortwave-IR). Subscenes of the MS and PAN raw images are used in our experiment.
The IKONOS PAN image has a spatial resolution of 1 m, acquired in the 0.45–0.90 μm spectral range, while the MS image has a spatial resolution of 4 m, acquired in four spectral ranges: 0.63–0.69 μm (Red), 0.52–0.60 μm (Green), 0.45–0.52 μm (Blue), and 0.76–0.90 μm (NIR). Our case study comprises subset images of the City of Fredericton, New Brunswick, Canada, obtained by IKONOS in October 2001. The data collection consists of images of an urban area with different features such as lane, house, parking, fruit, grass, etc.
The fused image quality was evaluated by statistical parameters such as correlation coefficient (CC), standard deviation (SD), entropy, spatial frequency (SF), and fusion mutual information (FMI).
Hence, full-scale parameters (entropy, spatial frequency) are obtained from the resultant image alone, and degraded-scale parameters (CC, FMI) are obtained by using a reference image along with the resultant image.
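For reference, three of the metrics above can be computed as follows; the 256-bin histogram for entropy is a common default, not a value taken from the paper.

```python
import numpy as np

def correlation_coefficient(a, b):
    """CC between fused and reference image (degraded-scale metric)."""
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def entropy(img, bins=256):
    """Shannon entropy of the grey-level histogram (full-scale metric)."""
    p, _ = np.histogram(img, bins=bins)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

def spatial_frequency(img):
    """SF: combined RMS of row-wise and column-wise first differences."""
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

FMI additionally needs the joint histogram of the fused image with each source, so it is omitted from this sketch.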
Method | Brovey | HIS | Guided | NSCT | PCA | Proposed |
---|---|---|---|---|---|---|
CC | 0.9626 | 0.9717 | 0.9735 | 0.9714 | 0.6983 | |
Entropy | 7.4179 | 7.8228 | 7.1771 | 7.7360 | 7.4646 | |
SF | 0.6243 | 0.5878 | 0.6411 | 0.6036 | 0.6522 | |
FMI | 0.8249 | 0.8275 | 0.8075 | 0.8372 | 0.7897 | |
Method | Brovey | HIS | Guided | NSCT | PCA | Proposed |
---|---|---|---|---|---|---|
CC | 0.8643 | 0.8404 | 0.5734 | 0.9143 | 0.9341 | |
Entropy | 6.2073 | 6.5034 | 5.9382 | 5.9246 | 6.1695 | |
SF | 0.6208 | 0.6461 | 0.6658 | 0.6418 | 0.6468 | |
FMI | 0.9035 | 0.8985 | 0.8562 | 0.9073 | 0.8955 | |
In this research, we proposed a novel method for satellite image fusion that fuses multispectral and panchromatic images to obtain rich spectral and spatial details of the satellite image. The method includes the L0 smoothing filter, which gives a better decomposition into low and high frequency data for the satellite image. The low frequency components obtained from the L0 filter are fused by a combination of NSCT and SR, and the max-absolute rule is applied to the high frequency components to give an optimum fusion of the images. The results have been compared with the Brovey, HIS, Guided, NSCT, and PCA methods by means of CC, entropy, SF, and FMI, as shown in
The authors received no specific funding for this study.
The authors declare that they have no conflicts of interest to report regarding the present study.