Low-rank Bilinear Pooling for Fine-Grained Classification

Cited by: 270
Authors
Kong, Shu [1 ]
Fowlkes, Charless [1 ]
Affiliations
[1] Univ Calif Irvine, Dept Comp Sci, Irvine, CA 92697 USA
Source
30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017) | 2017
Funding
US National Science Foundation
DOI
10.1109/CVPR.2017.743
CLC Classification Code
TP18 [Artificial Intelligence Theory]
Discipline Code
140502 [Artificial Intelligence]
Abstract
Pooling second-order local feature statistics to form a high-dimensional bilinear feature has been shown to achieve state-of-the-art performance on a variety of fine-grained classification tasks. To address the computational demands of high feature dimensionality, we propose to represent the covariance features as a matrix and apply a low-rank bilinear classifier. The resulting classifier can be evaluated without explicitly computing the bilinear feature map, which allows for a large reduction in the compute time as well as decreasing the effective number of parameters to be learned. To further compress the model, we propose a classifier co-decomposition that factorizes the collection of bilinear classifiers into a common factor and compact per-class terms. The co-decomposition idea can be deployed through two convolutional layers and trained in an end-to-end architecture. We suggest a simple yet effective initialization that avoids explicitly first training and factorizing the larger bilinear classifiers. Through extensive experiments, we show that our model achieves state-of-the-art performance on several public datasets for fine-grained classification trained with only category labels. Importantly, our final model is an order of magnitude smaller than the recently proposed compact bilinear model [8], and three orders smaller than the standard bilinear CNN model [19].
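The abstract's central trick, scoring a low-rank bilinear classifier without ever forming the d x d pooled feature, can be illustrated with a small NumPy sketch. This is not the authors' code; the shapes, rank, and variable names are illustrative assumptions, but the identity <UV^T, XX^T> = tr(U^T X X^T V) that it demonstrates is standard linear algebra.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): X holds n local
# d-dimensional CNN descriptors; the bilinear feature is X X^T (d x d).
rng = np.random.default_rng(0)
d, n, r = 64, 49, 8                      # feature dim, locations, rank (assumed)
X = rng.standard_normal((d, n))

# A rank-r bilinear classifier W = U V^T on the pooled feature.
U = rng.standard_normal((d, r))
V = rng.standard_normal((d, r))
W = U @ V.T

# Naive scoring: explicitly pool X X^T, then take <W, X X^T>.  O(d^2 n) memory/time.
score_full = np.sum(W * (X @ X.T))

# Low-rank scoring: project descriptors first, never forming X X^T or W.
# sum((U^T X) * (V^T X)) = tr(U^T X X^T V) = <U V^T, X X^T>.  O(d n r).
score_lowrank = np.sum((U.T @ X) * (V.T @ X))

print(np.allclose(score_full, score_lowrank))  # the two scores agree
```

Because the projections U^T X and V^T X are plain matrix products, the low-rank evaluation maps directly onto 1x1 convolutions, which is how the abstract's "two convolutional layers" deployment of the co-decomposition becomes possible.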
Pages: 7025-7034
Page count: 10
References
38 items total
[1]
[Anonymous], 2013, Tech. rep.
[2]
[Anonymous], 2016, CVPR
[3]
[Anonymous], AIS TATS
[4]
[Anonymous], 2016, CVPR
[5]
[Anonymous], 2011, Technical Report CNS-TR-2011-001
[6]
[Anonymous], 2016, CVPR
[7]
[Anonymous], J MACHINE LEARNING R
[8]
[Anonymous], ICLR WORKSH
[9]
[Anonymous], 2016, KDD
[10]
[Anonymous], 2016, CVPR