Journal
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Volume 31, Issue 12, Pages 2143-2157
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2009.151
Keywords
Metric learning; similarity search; locality-sensitive hashing; LogDet divergence; kernel learning; image search
Funding
- US National Science Foundation (NSF), Directorate for Computer & Information Science & Engineering, Division of Information & Intelligent Systems [0747356, EIA-0303609]
- Microsoft Research
- US Defense Advanced Research Projects Agency (DARPA) VIRAT
- Henry Luce Foundation
Abstract
We introduce a method that enables scalable similarity search for learned metrics. Given pairwise similarity and dissimilarity constraints between some examples, we learn a Mahalanobis distance function that captures the examples' underlying relationships well. To allow sublinear time similarity search under the learned metric, we show how to encode the learned metric parameterization into randomized locality-sensitive hash functions. We further formulate an indirect solution that enables metric learning and hashing for vector spaces whose high dimensionality makes it infeasible to learn an explicit transformation over the feature dimensions. We demonstrate the approach applied to a variety of image data sets, as well as a systems data set. The learned metrics improve accuracy relative to commonly used metric baselines, while our hashing construction enables efficient indexing with learned distances and very large databases.
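The core idea in the abstract can be sketched concretely: a learned Mahalanobis matrix M factors as M = G^T G, so the Mahalanobis distance between x and y equals the Euclidean distance between Gx and Gy, and standard random-hyperplane LSH applied to the transformed points respects the learned metric. The sketch below is illustrative only, not the paper's implementation: M is stood in for by a random positive semi-definite matrix rather than one learned from pairwise constraints, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8

# Stand-in for a learned Mahalanobis matrix: any PSD matrix works
# for the purposes of this sketch (the paper learns M from
# similarity/dissimilarity constraints instead).
A = rng.standard_normal((d, d))
M = A @ A.T

# Factor M = G^T G via Cholesky (cholesky returns lower L with
# M = L L^T, so G = L^T). Then dist_M(x, y) = ||Gx - Gy||_2.
G = np.linalg.cholesky(M).T

def make_hash(num_bits, dim, rng):
    # Random-hyperplane LSH under the learned metric: draw Gaussian
    # directions r and hash x by sign(r^T G x). Collision probability
    # then depends on the angle between Gx and Gy, i.e. on the
    # learned metric rather than the original Euclidean one.
    R = rng.standard_normal((num_bits, dim))
    def h(x):
        return (R @ (G @ x) > 0).astype(np.uint8)
    return h

h = make_hash(num_bits=32, dim=d, rng=rng)
x = rng.standard_normal(d)
y = x + 0.01 * rng.standard_normal(d)   # near-duplicate of x
z = rng.standard_normal(d)              # unrelated point

# Points close under the learned metric agree on most hash bits,
# so bucketing by hash code gives sublinear-time candidate retrieval.
print(np.sum(h(x) == h(y)), np.sum(h(x) == h(z)))
```

In practice one would concatenate several such hash tables and probe the buckets matching a query's codes, trading a small loss in recall for sublinear search time.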