Learning to Diversify Deep Belief Networks for Hyperspectral Image Classification

Cited by: 293
Authors
Zhong, Ping [1 ]
Gong, Zhiqiang [1 ]
Li, Shutao [2 ]
Schoenlieb, Carola-Bibiane [3 ]
Affiliations
[1] Natl Univ Def Technol, Coll Elect Sci & Engn, Changsha 410073, Hunan, Peoples R China
[2] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Hunan, Peoples R China
[3] Univ Cambridge, Dept Appl Math & Theoret Phys, Cambridge CB3 0WA, England
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2017, Vol. 55, Issue 6
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Deep belief network (DBN); diversity; hyperspectral image; image classification; selection
DOI
10.1109/TGRS.2017.2675902
CLC Classification
P3 [Geophysics]; P59 [Geochemistry]
Subject Classification Code
0708; 070902
Abstract
In the remote sensing literature, deep models with multiple layers have demonstrated their potential to learn abstract and invariant features for better representation and classification of hyperspectral images. The usual supervised deep models, such as convolutional neural networks, need a large number of labeled training samples to learn their parameters, whereas real-world hyperspectral image classification tasks provide only a limited number of training samples. This paper adopts another popular deep model, the deep belief network (DBN), to deal with this problem. DBNs allow unsupervised pretraining over unlabeled samples followed by supervised fine-tuning over labeled samples. However, the usual pretraining and fine-tuning method makes many hidden units in the learned DBNs behave very similarly or act as "dead" (never responding) or "potentially over-tolerant" (always responding) latent factors. These effects degrade the descriptive ability, and thus the classification performance, of DBNs. To further improve DBN performance, this paper develops a new diversified DBN by regularizing the pretraining and fine-tuning procedures with a diversity-promoting prior over the latent factors. Moreover, the regularized pretraining and fine-tuning can be efficiently implemented within the usual recursive greedy and back-propagation learning frameworks. Experiments on real-world hyperspectral images demonstrate that the diversity-promoting prior in both the pretraining and fine-tuning procedures leads to learned DBNs with more diverse latent factors, which in turn lets the diversified DBNs obtain much better results than the original DBNs, and comparable or even better performance than other recent hyperspectral image classification methods.
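The abstract above describes regularizing DBN training with a diversity-promoting prior so that hidden units do not collapse into near-duplicate, "dead", or always-on latent factors. As a minimal illustrative sketch (not the paper's exact prior, whose form is not given in this record), one common way to encourage diverse latent factors is to penalize pairwise cosine similarity between the weight vectors of hidden units:

```python
import numpy as np

def diversity_penalty(W):
    """Illustrative diversity-promoting regularizer (an assumption,
    not necessarily the paper's prior): sum of squared pairwise cosine
    similarities between hidden units' weight vectors. The penalty is
    zero when the latent factors are mutually orthogonal and grows as
    units become redundant (near-duplicate) factors.

    W: array of shape (n_hidden, n_visible), one row per hidden unit.
    """
    # Normalize each hidden unit's weight vector to unit length.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    U = W / np.maximum(norms, 1e-12)
    # Gram matrix of cosine similarities between hidden units.
    G = U @ U.T
    # Sum squared off-diagonal entries: zero iff factors are orthogonal.
    off_diag = G - np.eye(W.shape[0])
    return float(np.sum(off_diag ** 2))
```

Such a term would be added, weighted by a trade-off coefficient, to the pretraining and fine-tuning objectives; its gradient with respect to `W` is straightforward, which is consistent with the abstract's claim that the regularized procedures fit into the usual greedy and back-propagation frameworks.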
Pages: 3516-3530
Page count: 15