Article

Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search

Journal

IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 26, Issue 11, Pages 5324-5336

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIP.2017.2729896

Keywords

Locality-sensitive hashing; nearest neighbor search; binary quantization; distributed learning; product quantization

Funding

  1. National Natural Science Foundation of China [61370125, 61402026]
  2. Beijing Municipal Science and Technology Commission [Z171100000117022]
  3. Foundation of State Key Laboratory of Software Development Environment [SKLSDE-2016ZX04]
  4. Foundation of Shaanxi Key Industrial Innovation Chain [2017ZDCXL-GY-05-04-02]

Hashing has proved to be an attractive technique for fast nearest neighbor search over big data. Compared with projection-based hashing methods, prototype-based ones have stronger power to generate discriminative binary codes for data with complex intrinsic structure. However, existing prototype-based methods, such as spherical hashing and K-means hashing, still suffer from ineffective coding that uses the complete set of binary codes in a hypercube. To address this problem, we propose an adaptive binary quantization (ABQ) method that learns a discriminative hash function using prototypes associated with small, unique binary codes. Our alternating optimization efficiently and adaptively discovers the prototype set and a code set of varying size, which together robustly approximate the data relations. Our method naturally generalizes to the product space for long hash codes, and enjoys fast training that is linear in the number of training data. We further devise a distributed framework for large-scale learning, which significantly speeds up the training of ABQ in the distributed environments now widely deployed in many areas. Extensive experiments on four large-scale (up to 80 million) data sets demonstrate that our method significantly outperforms state-of-the-art hashing methods, with relative performance gains of up to 58.84%.
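To make the prototype-based idea concrete, the following is a minimal illustrative sketch, not the paper's actual ABQ optimization: it uses plain k-means as a stand-in for the adaptive prototype learning, pre-assigns each prototype a fixed binary code, and hashes each point to the code of its nearest prototype. All function names and parameters here are hypothetical.

```python
import numpy as np

def learn_prototypes(X, n_prototypes, n_iter=10, seed=0):
    """Toy prototype learning via k-means (a stand-in for ABQ's
    alternating optimization over prototypes and codes)."""
    rng = np.random.default_rng(seed)
    # initialize prototypes as randomly chosen data points
    prototypes = X[rng.choice(len(X), size=n_prototypes, replace=False)].copy()
    for _ in range(n_iter):
        # assign each point to its nearest prototype (squared Euclidean)
        dists = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
        assign = dists.argmin(axis=1)
        # update each prototype as the mean of its assigned points
        for j in range(n_prototypes):
            mask = assign == j
            if mask.any():
                prototypes[j] = X[mask].mean(axis=0)
    return prototypes

def hash_points(X, prototypes, codes):
    """Hash each point to the binary code of its nearest prototype."""
    dists = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return codes[dists.argmin(axis=1)]
```

In this sketch nearby points share a prototype and hence a binary code, so Hamming distance between codes roughly reflects proximity in the original space; the paper's method additionally learns which subset of hypercube codes to use, rather than fixing them up front as done here.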
