Abstract
Maximum inner product search (MIPS) in high-dimensional spaces has wide applications but is computationally expensive due to the curse of dimensionality. Existing studies employ asymmetric transformations that reduce the MIPS problem to a nearest neighbor search (NNS) problem, which can be solved using locality-sensitive hashing (LSH). However, these studies usually maintain multiple hash tables and examine them locally, one by one, which may incur additional costs from probing unnecessary points. In addition, LSH is applied without taking the properties of the inner product into account. In this paper, we develop a fast search framework, FARGO, for MIPS on large-scale, high-dimensional data. We propose a global multi-probing (GMP) strategy that exploits the properties of the inner product to globally examine high-quality candidates. In addition, we develop two optimization techniques. First, unlike existing transformations, which introduce either distortion errors or data distribution imbalances, we design a novel transformation, called random XBOX transformation, that avoids the negative effects of data distribution imbalances. Second, we propose a global adaptive early termination condition that finds results quickly and offers theoretical guarantees. We conduct extensive experiments on real-world data, which offer evidence that FARGO outperforms existing proposals in terms of both accuracy and efficiency.
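The MIPS-to-NNS reduction the abstract refers to is commonly realized with an asymmetric transformation such as the classical XBOX transformation; the paper's random XBOX variant differs in how it handles data distribution imbalances, and its details are given in the paper itself. The following is a minimal sketch of the standard XBOX reduction, using NumPy; the helper names `xbox_transform_data` and `xbox_transform_query` are illustrative and not taken from FARGO.

```python
import numpy as np

def xbox_transform_data(X):
    """Classical XBOX-style transformation of the data points (illustrative,
    not FARGO's random XBOX variant). Each point x gains an extra coordinate
    sqrt(M^2 - ||x||^2), where M is the maximum norm in the dataset, so every
    transformed point has norm M."""
    norms = np.linalg.norm(X, axis=1)
    M = norms.max()
    extra = np.sqrt(np.maximum(M * M - norms ** 2, 0.0))
    return np.hstack([X, extra[:, None]]), M

def xbox_transform_query(q):
    """Queries are padded with a zero coordinate, so the inner product between
    a transformed query and a transformed point equals the original one."""
    return np.append(q, 0.0)

# For a fixed query q: ||P(x) - Q(q)||^2 = M^2 + ||q||^2 - 2<x, q>,
# so the transformed point nearest in Euclidean distance is exactly the point
# maximizing the inner product; MIPS becomes an NNS problem that LSH can index.

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 32))
    q = rng.normal(size=32)

    P, M = xbox_transform_data(X)
    Q = xbox_transform_query(q)

    mips_answer = np.argmax(X @ q)                          # exact MIPS
    nns_answer = np.argmin(np.linalg.norm(P - Q, axis=1))   # NNS after transform
    assert mips_answer == nns_answer
```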
Original language | English |
---|---|
Journal | Proceedings of the VLDB Endowment |
Volume | 16 |
Issue number | 5 |
Pages (from-to) | 1100-1112 |
Number of pages | 13 |
ISSN | 2150-8097 |
DOIs | |
Publication status | Published - 2023 |
Event | 49th International Conference on Very Large Data Bases, VLDB 2023 - Vancouver, Canada. Duration: 28 Aug 2023 → 1 Sept 2023 |
Conference
Conference | 49th International Conference on Very Large Data Bases, VLDB 2023 |
---|---|
Country/Territory | Canada |
City | Vancouver |
Period | 28/08/2023 → 01/09/2023 |
Bibliographical note
Publisher Copyright: © 2023, VLDB Endowment. All rights reserved.