Deep learned compact binary descriptor with a lightweight network-in-network architecture for visual description

Document Type

Article

Publication Date

2-1-2021

Abstract

Binary descriptors have been widely used for real-time image retrieval and correspondence matching. However, most learned descriptors are obtained with a large deep neural network (DNN) containing several million parameters, and the learned binary codes are generally not invariant to many geometric variations, which is crucial for accurate correspondence matching. To address this problem, we propose a new learning approach that uses a lightweight DNN architecture, built as a stack of multilayer perceptrons based on the network-in-network (NIN) architecture, together with a restricted Boltzmann machine (RBM). The RBM maps the learned features to binary codes and enables geometrically invariant correspondence matching. Our experimental results on several benchmark datasets (e.g., Brown, Oxford, Paris, INRIA Holidays, RomePatches, HPatches, and CIFAR-10) show that the proposed approach produces a learned binary descriptor that outperforms other baseline self-supervised binary descriptors in correspondence matching despite its smaller DNN. Most importantly, the proposed approach does not freeze the features obtained while pre-training the NIN model; instead, it fine-tunes them while learning the features needed for binary mapping through the RBM. Additionally, its lightweight architecture makes it suitable for resource-constrained devices.
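The abstract describes a pipeline of stacked NIN blocks whose output features are mapped to binary codes by an RBM. The following is a minimal, hypothetical PyTorch sketch of that kind of pipeline, not the authors' implementation: all layer sizes, module names, and the simple thresholding of RBM hidden-unit probabilities are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code): a lightweight NIN-style
# feature extractor followed by an RBM-like visible-to-hidden mapping
# whose hidden activations are thresholded into a binary descriptor.
import torch
import torch.nn as nn

class NINBlock(nn.Module):
    """One network-in-network block: a spatial conv followed by 1x1 'MLP' convs."""
    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 1), nn.ReLU(inplace=True),  # 1x1 conv acts as a per-pixel MLP
            nn.Conv2d(out_ch, out_ch, 1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class BinaryDescriptorNet(nn.Module):
    """Stacked NIN blocks produce a real-valued feature vector; an RBM-style
    layer maps it to hidden-unit probabilities, which are binarized."""
    def __init__(self, feat_dim=256, code_bits=128):
        super().__init__()
        self.features = nn.Sequential(
            NINBlock(1, 32, kernel_size=5, padding=2), nn.MaxPool2d(2),
            NINBlock(32, 64, kernel_size=3, padding=1), nn.MaxPool2d(2),
            NINBlock(64, feat_dim, kernel_size=3, padding=1),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # RBM parameters for the visible (feature) -> hidden (code) mapping.
        self.W = nn.Parameter(0.01 * torch.randn(feat_dim, code_bits))
        self.h_bias = nn.Parameter(torch.zeros(code_bits))

    def forward(self, patch):
        v = self.features(patch)                        # real-valued features
        p_h = torch.sigmoid(v @ self.W + self.h_bias)   # RBM hidden-unit probabilities
        return (p_h > 0.5).float()                      # threshold to a binary code

net = BinaryDescriptorNet()
codes = net(torch.randn(4, 1, 32, 32))  # four 32x32 grayscale patches
print(codes.shape)                       # torch.Size([4, 128])
```

In this sketch the NIN features are kept trainable end to end, mirroring the abstract's point that pre-trained NIN features are fine-tuned rather than frozen while the RBM binary mapping is learned.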

Keywords

Binary descriptor, Network-in-network, Restricted Boltzmann machine, Correspondence matching, Lightweight deep neural network

Divisions

Computer

Funders

Senate Research Council, University of Moratuwa, Sri Lanka [SRC-16-1]; National Research Council, Sri Lanka [12-017]

Publication Title

Visual Computer

Volume

37

Issue

2

Publisher

Springer

Publisher Location

One New York Plaza, Suite 4600, New York, NY, United States

