Multi-scale Orderless Pooling of Deep Convolutional Activation Features

Yunchao Gong1, Liwei Wang2, Ruiqi Guo2, and Svetlana Lazebnik2

1Department of Computer Science, University of North Carolina at Chapel Hill

2Department of Computer Science, University of Illinois at Urbana-Champaign

ECCV 2014

 

Abstract: Deep convolutional neural networks (CNN) have shown their promise as a universal representation for recognition. However, global CNN activations lack geometric invariance, which limits their robustness for classification and matching of highly variable scenes. To improve the invariance of CNN activations without degrading their discriminative power, this paper presents a simple but effective scheme called multi-scale orderless pooling (MOP-CNN). This scheme extracts CNN activations for local patches at multiple scale levels, performs orderless VLAD pooling of these activations at each level separately, and concatenates the result. The resulting MOP-CNN representation can be used as a generic feature for either supervised or unsupervised recognition tasks, from image classification to instance-level retrieval; it consistently outperforms global CNN activations without requiring any joint training of prediction layers for a particular target dataset. In absolute terms, it achieves state-of-the-art results on the challenging SUN397 and MIT Indoor Scenes classification datasets, and competitive results on ILSVRC2012/2013 classification and INRIA Holidays retrieval datasets.
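The following is a minimal, illustrative Python/NumPy sketch of the MOP-CNN pipeline described above; it is not the released Matlab/Caffe code. Here, cnn_activations stands in for any function that returns a CNN activation vector for one image patch, codebooks is assumed to hold a per-scale k-means codebook learned offline on patch activations, and the scale and stride values are placeholders rather than the exact settings from the paper.

import numpy as np

def extract_patches(image, patch_size, stride):
    # Densely sample square patches of size patch_size with the given stride.
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return patches

def vlad_encode(features, centers):
    # Orderless VLAD pooling: accumulate residuals to the nearest codebook center.
    k, d = centers.shape
    vlad = np.zeros((k, d))
    for f in features:
        nearest = np.argmin(np.linalg.norm(centers - f, axis=1))
        vlad[nearest] += f - centers[nearest]
    vlad = vlad.ravel()
    vlad = np.sign(vlad) * np.sqrt(np.abs(vlad))    # signed square-root
    norm = np.linalg.norm(vlad)
    return vlad / norm if norm > 0 else vlad        # L2 normalization

def mop_cnn(image, cnn_activations, codebooks, scales=(256, 128, 64), stride=32):
    # Concatenate per-level features: the global CNN activation at the coarsest
    # level, and VLAD-pooled patch activations at the finer levels.
    levels = []
    for scale in scales:
        if scale == max(scales):
            levels.append(cnn_activations(image))
        else:
            feats = [cnn_activations(p) for p in extract_patches(image, scale, stride)]
            levels.append(vlad_encode(np.array(feats), codebooks[scale]))
    return np.concatenate(levels)

In the paper, the pooled descriptors at the finer levels are additionally compressed with PCA before concatenation; that step is omitted here for brevity.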

Paper:


Yunchao Gong, Liwei Wang, Ruiqi Guo, and Svetlana Lazebnik. Multi-scale Orderless Pooling of Deep Convolutional Activation Features. In ECCV 2014. [PDF LINK] *This version of the paper is corrected from the camera-ready version (see footnote on p. 4).

 

Code

To download the complete Matlab implementation package, click the download link below. The package contains two folders: one for extracting deep activation features with Caffe, and one for computing the final MOP-CNN feature from the patch features. Details are given in the README file in each folder. Package size: 990 KB.

[Download]