Infrared and Visible Image Fusion Based on Improved Latent Low-Rank and Unsharp Masks
FENG Zhun-ruo1, LI Yun-hong1*, CHEN Wei-zhong1, SU Xue-ping1, CHEN Jin-ni1, LI Jia-peng1, LIU Huan1, LI Shi-bo2
1. School of Electronic Information, Xi'an Polytechnic University, Xi'an 710048, China
2. School of Science, Xi'an Polytechnic University, Xi'an 710048, China
Abstract To address incomplete salient-information extraction and detail degradation in infrared and visible image fusion under low-light conditions, we propose an enhanced fusion algorithm that integrates Latent Low-Rank Representation (LatLRR) with an Anisotropic Diffusion-based Unsharp Mask (ADUSM). First, the infrared and visible images are partitioned into blocks and vectorized before being input to the LatLRR model; an inverse reconstruction then yields the low-rank components of the infrared images and the basic salient components of the visible images. Next, the basic salient components are processed with the ADUSM and differenced pixel-wise, further decomposing them into deep salient detail components and multi-level detail features. The low-rank components are fused under a visual saliency map rule, which improves the retention and visibility of salient targets in the fused image. The deep salient detail components are fused by local entropy maximization, with a maximum-activity coefficient that preserves the deep salient details and improves the overall quality and visual richness of the fused image. The multi-level detail features are fused by a weighted-average strategy based on maximum spatial frequency, which adapts to the detail content of the input images and enhances overall clarity and contrast. Finally, the proposed method is compared with the Bayesian, Wavelet, LatLRR, MSVD, and MDLatLRR algorithms on the TNO and M3FD datasets. Experimental results show that our algorithm significantly outperforms traditional low-rank algorithms, with improvements of 31% in average gradient, 2.1% in information entropy, 4.4% in standard deviation, and 34% in spatial frequency.
Comprehensive subjective and objective evaluations indicate that the fused images produced by our method exhibit rich texture details and clear salient targets, and offer substantial advantages over the competing methods. The method effectively addresses incomplete salient-information extraction in low-light environments and generalizes well. The combination of improved Latent Low-Rank representation and ADUSM filtering is shown to be both effective and feasible for infrared and visible image fusion, contributing to the advancement and application of this technology.
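The ADUSM step described above can be illustrated with a minimal sketch: an edge-preserving smoothing pass produces a base layer, and the pixel-wise difference from the input is the detail layer used for further decomposition. This sketch assumes classic Perona-Malik diffusion as the anisotropic-diffusion step; the paper's exact formulation, parameters, and border handling may differ.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, step=0.15):
    """Perona-Malik diffusion: smooths flat regions while preserving edges."""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (np.roll wraps at the border,
        # which is acceptable for a sketch)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping conduction coefficients: small across strong edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += step * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

def adusm(img, gain=1.5, **diff_kwargs):
    """Anisotropic-diffusion unsharp mask: detail = image - diffused base."""
    img = img.astype(np.float64)
    base = anisotropic_diffusion(img, **diff_kwargs)
    detail = img - base          # pixel-wise difference, as in the paper
    return img + gain * detail, detail
```

On a constant image the diffusion is a fixed point, so the detail layer is zero; on textured input the detail layer concentrates at edges, which is what the subsequent decomposition operates on.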
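The four objective metrics reported above have standard definitions that can be sketched directly; the formulas below follow the conventional literature versions (average gradient over joint finite differences, Shannon entropy of the grey-level histogram, and row/column spatial frequency), which the paper's evaluation code may implement with minor variations.

```python
import numpy as np

def average_gradient(img):
    """Mean magnitude of combined horizontal/vertical gradients (sharpness)."""
    f = img.astype(np.float64)
    gx = np.diff(f, axis=1)[:-1, :]   # crop to a common (H-1, W-1) grid
    gy = np.diff(f, axis=0)[:, :-1]
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))

def entropy(img, levels=256):
    """Shannon entropy of the grey-level histogram (information content)."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty bins (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

def standard_deviation(img):
    """Global contrast measure."""
    return float(np.std(img))

def spatial_frequency(img):
    """SF = sqrt(RF^2 + CF^2) from row and column differences."""
    f = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(f, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(f, axis=0) ** 2))
    return float(np.sqrt(rf ** 2 + cf ** 2))
```

All four are zero on a flat image and grow with texture, which is why higher values are read as richer detail in the comparison against the Bayesian, Wavelet, LatLRR, MSVD, and MDLatLRR baselines.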
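The spatial-frequency-based weighted average used for the multi-level detail features can be sketched as follows: each source layer is weighted in proportion to its spatial frequency, so the layer carrying more detail dominates the fusion. The proportional weighting and the `eps` regularizer here are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def sf_weighted_fuse(detail_a, detail_b, eps=1e-12):
    """Fuse two detail layers, weighting each by its spatial frequency."""
    def sf(f):
        rf = np.sqrt(np.mean(np.diff(f, axis=1) ** 2))
        cf = np.sqrt(np.mean(np.diff(f, axis=0) ** 2))
        return np.sqrt(rf ** 2 + cf ** 2)
    sa, sb = sf(detail_a.astype(np.float64)), sf(detail_b.astype(np.float64))
    # eps avoids division by zero when both layers are flat
    wa = (sa + eps) / (sa + sb + 2 * eps)
    return wa * detail_a + (1.0 - wa) * detail_b
```

When one layer is flat its weight collapses toward zero, so the fused result follows the textured layer; when both layers carry equal detail the rule reduces to a plain average.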
Received: 2024-10-30
Accepted: 2025-02-25
Corresponding Authors:
LI Yun-hong
E-mail: hitliyunhong@163.com