Spectroscopy and Spectral Analysis
|
|
|
|
|
Multi-Band Texture Image Fusion Based on the Embedded Multi-Scale Decomposition and Possibility Theory |
LIN Su-zhen, WANG Dong-juan, WANG Xiao-xia, ZHU Xiao-hong |
School of Computer and Control Engineering, North University of China, Taiyuan 030051, China |
|
|
Abstract The combination of multi-scale transforms with the rules "fuse high-frequency coefficients by selecting the maximum gray value or energy" and "fuse low-frequency coefficients by weighted averaging" is effective for dual-band image fusion. However, when these methods are applied to multi-band images, the repeated weighted averaging tends to weaken the distinctive information inherent in the original images, which hampers subsequent target recognition and scene understanding. The problem is more pronounced when the multi-band images contain texture features. To describe the scene more comprehensively and precisely, a new multi-band texture image fusion method based on embedded multi-scale decomposition and possibility theory is proposed. The method consists of three parts. First, the original multi-band images are decomposed into high- and low-frequency components by a multi-scale transform. Second, the high-frequency components are fused pixel by pixel by selecting the maximum gray value, whereas the last-layer low-frequency component of the band with the largest standard deviation is partitioned into blocks by another multi-scale transform, and the low-frequency components of the remaining two bands are divided at the same block sizes and positions; all blocks from the three bands are then fused block by block according to possibility theory, and the fused blocks are mosaicked into the low-frequency image. Finally, this image is inversely transformed together with the fused high-frequency components to obtain the final fusion image. The method integrates pixel-level with feature-level fusion and spatial-domain with transform-domain techniques, and it suppresses the sawtooth effect at target edges by applying different fusion rules to blocks of different sizes. Experiments demonstrate the validity of the proposed method.
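The following is a minimal, illustrative Python sketch of the pipeline described above, not the authors' implementation. It assumes a single-level box-filter decomposition in place of the embedded multi-scale transforms, a fixed block grid instead of blocking the largest-standard-deviation band with another multi-scale transform, and an energy-weighted average as a placeholder for the possibility-theory fusion rule; all function names and parameters are hypothetical.

import numpy as np
from scipy.ndimage import uniform_filter   # box filter as a stand-in low-pass


def decompose(band, size=9):
    # One decomposition level: low-pass approximation plus high-frequency residual.
    low = uniform_filter(band.astype(float), size=size)
    return low, band - low


def fuse_high(highs):
    # Pixel-wise selection of the high-frequency coefficient with the largest magnitude.
    stack = np.stack(highs)                       # shape: (bands, H, W)
    idx = np.abs(stack).argmax(axis=0)
    return np.take_along_axis(stack, idx[None], axis=0)[0]


def fuse_low_blocks(lows, block=32):
    # The paper blocks the last-layer low-pass image of the band with the largest
    # standard deviation via another multi-scale transform; this sketch uses a
    # fixed grid, and an energy-weighted average as a placeholder for the
    # possibility-theory rule.
    h, w = lows[0].shape
    fused = np.zeros((h, w))
    for y in range(0, h, block):
        for x in range(0, w, block):
            tiles = [l[y:y + block, x:x + block] for l in lows]
            energy = np.array([t.var() + 1e-6 for t in tiles])
            weights = energy / energy.sum()       # placeholder block-level weights
            fused[y:y + block, x:x + block] = sum(wi * t for wi, t in zip(weights, tiles))
    return fused


def fuse_multiband(bands, block=32):
    # bands: list of co-registered single-channel images (e.g. three IR bands).
    lows, highs = zip(*(decompose(b) for b in bands))
    return fuse_low_blocks(list(lows), block) + fuse_high(list(highs))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    imgs = [rng.random((128, 128)) for _ in range(3)]   # stand-ins for three bands
    print(fuse_multiband(imgs).shape)                    # -> (128, 128)

In the paper, the block sizes and positions come from the second multi-scale transform of the reference band and the block-level weights come from possibility distributions built over the three bands; the sketch only fixes the overall data flow.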
|
Received: 2015-06-02
Accepted: 2015-10-11
|
|
Corresponding Author:
LIN Su-zhen
E-mail: lsz@nuc.edu.cn
|
|
|
|
|
|
|