Adaptive bit allocation product quantization
Dec 11, 2015

Title: Adaptive bit allocation product quantization

Authors: Guo, QZ; Zeng, Z; Zhang, SW; Zhang, GX; Zhang, Y

Author Full Names: Guo, Qin-Zhen; Zeng, Zhi; Zhang, Shuwu; Zhang, Guixuan; Zhang, Yuan

Source: NEUROCOMPUTING, 171: 866-877; DOI: 10.1016/j.neucom.2015.07.062; JAN 1 2016

ISSN: 0925-2312

eISSN: 1872-8286

Unique ID: WOS:000364883900086

 

Abstract:

Product quantization (PQ) is a popular vector quantization method for approximate nearest neighbor search. The key idea of PQ is to decompose the original data space into the Cartesian product of several low-dimensional subspaces and then quantize every subspace separately with the same number of codewords. However, the performance of PQ depends largely on the distribution of the original data: if the energies of the subspaces are highly unbalanced, PQ performs poorly. In this paper, we propose an adaptive bit allocation product quantization (BAPQ) method to deal with the problem of unbalanced subspace energies in PQ. In BAPQ, we adaptively allocate different numbers of codewords (or bits) to the subspaces so as to minimize the total quantization distortion. The essence of our method is to find the optimal bit allocation scheme. To this end, we formulate an objective function that minimizes quantization distortion with respect to the bit allocation scheme, and adopt a greedy algorithm to find a near-optimal solution. BAPQ achieves lower quantization distortion than PQ and optimized product quantization (OPQ). Besides, both the bias and the variance of the difference between the true distance and BAPQ's estimated distance are reduced relative to PQ and OPQ. Extensive experiments have verified the superiority of BAPQ over state-of-the-art approaches. (C) 2015 Elsevier B.V. All rights reserved.
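The greedy bit-allocation idea from the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a standard high-rate distortion proxy, D_i(b) = var_i * 2^(-2b), for the distortion of subspace i quantized with b bits, and then hands out one bit at a time to whichever subspace gains the largest distortion reduction. The paper's actual objective and distortion estimates may differ.

```python
def greedy_bit_allocation(variances, total_bits):
    """Greedily assign `total_bits` across subspaces to reduce total
    quantization distortion.

    `variances` is the per-subspace energy (variance); the distortion
    model D_i(b) = var_i * 2**(-2*b) is a common high-rate
    approximation used here as an assumption, not the paper's exact
    formulation.
    """
    m = len(variances)
    bits = [0] * m

    def distortion(i, b):
        # Assumed high-rate proxy: distortion halves by 4x per added bit.
        return variances[i] * 2.0 ** (-2 * b)

    for _ in range(total_bits):
        # Marginal gain of giving subspace i one more bit.
        gains = [distortion(i, bits[i]) - distortion(i, bits[i] + 1)
                 for i in range(m)]
        # Spend the next bit where it reduces distortion the most.
        j = max(range(m), key=lambda i: gains[i])
        bits[j] += 1
    return bits


# With balanced energies the allocation degenerates to standard PQ
# (equal bits per subspace); with unbalanced energies, high-energy
# subspaces receive more bits.
print(greedy_bit_allocation([1.0, 1.0, 1.0, 1.0], 8))  # [2, 2, 2, 2]
print(greedy_bit_allocation([4.0, 1.0, 1.0, 1.0], 8))  # [3, 2, 2, 1]
```

Note how the unbalanced case shifts a bit from the lowest-energy subspace to the highest-energy one, which is exactly the imbalance problem in PQ that BAPQ addresses.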

 
