Distributed k-Nearest Neighbor Queries in Metric Spaces

Xin Ding, Yuanliang Zhang, Lu Chen, Yujun Gao, Baihua Zheng

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

5 Citations (Scopus)

Abstract

Metric k-nearest neighbor (MkNN) queries have applications in many areas, such as multimedia retrieval, computational biology, and location-based services. With growing data volumes, distributed methods are required. In this paper, we propose an Asynchronous Metric Distributed System (AMDS), which uniformly partitions the data with the pivot-mapping technique to ensure load balancing, and employs a publish/subscribe communication model to asynchronously process large numbers of queries. The asynchronous processing model also improves the robustness and efficiency of AMDS. In addition, we develop an efficient estimation-based MkNN method on top of AMDS to improve query efficiency. Extensive experiments on real and synthetic data demonstrate the performance of MkNN processing using AMDS. Moreover, AMDS scales sub-linearly with growing data size.
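
The abstract's pivot-mapping step refers to a standard metric-space idea: each object is mapped to its vector of distances to a few pivots, and the triangle inequality then yields the lower bound max_i |d(q, p_i) - d(o, p_i)| <= d(q, o), which lets a kNN search skip objects that cannot beat the current k-th neighbor. The sketch below is only an illustration of that pruning idea on a single machine, not the paper's distributed AMDS pipeline; the names `pivot_map`, `metric_knn`, and `dist` are hypothetical, and pivot selection is assumed to be done elsewhere.

```python
# Minimal sketch of pivot mapping plus triangle-inequality pruning for a
# metric kNN query. Not the authors' AMDS implementation.
import heapq

def pivot_map(objects, pivots, dist):
    """Map every object to its vector of distances to the pivots."""
    return [[dist(o, p) for p in pivots] for o in objects]

def metric_knn(query, objects, pivots, pivot_table, dist, k):
    """Answer a kNN query, pruning with the pivot lower bound
    max_i |d(q, p_i) - d(o, p_i)| <= d(q, o) (triangle inequality)."""
    q_vec = [dist(query, p) for p in pivots]
    heap = []  # max-heap via negated distances: (-distance, object index)
    for idx, o in enumerate(objects):
        lb = max(abs(qd - od) for qd, od in zip(q_vec, pivot_table[idx]))
        if len(heap) == k and lb >= -heap[0][0]:
            continue  # lower bound already exceeds the current kth distance
        d = dist(query, o)
        if len(heap) < k:
            heapq.heappush(heap, (-d, idx))
        elif d < -heap[0][0]:
            heapq.heapreplace(heap, (-d, idx))
    return sorted((-neg_d, objects[i]) for neg_d, i in heap)

# Toy usage with Euclidean distance on 2D points (hypothetical data).
if __name__ == "__main__":
    import math
    pts = [(0, 0), (1, 1), (2, 2), (5, 5)]
    d = lambda a, b: math.dist(a, b)
    pivots = [pts[0], pts[-1]]
    table = pivot_map(pts, pivots, d)
    print(metric_knn((1, 0), pts, pivots, table, d, k=2))
```

Because the bound needs only the precomputed pivot distances, most pruned objects never require a (potentially expensive) distance computation; the same idea is what makes pivot-based partitioning attractive in a distributed setting.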
Original language: English
Title of host publication: Web and Big Data - Second International Joint Conference, APWeb-WAIM 2018, Proceedings
Editors: Jianliang Xu, Yoshiharu Ishikawa, Yi Cai
Number of pages: 17
Volume: 1
Publisher: Springer
Publication date: 2018
Pages: 236-252
ISBN (Electronic): 978-3-319-96890-2
DOIs
Publication status: Published - 2018
Event: Second International Joint Conference, APWeb-WAIM 2018 - Macau, China
Duration: 23 Jul 2018 - 25 Jul 2018

Conference

Conference: Second International Joint Conference, APWeb-WAIM 2018
Country/Territory: China
City: Macau
Period: 23/07/2018 - 25/07/2018
Series: Lecture Notes in Computer Science
Volume: 10987
ISSN: 0302-9743

Keywords

  • Algorithm
  • Metric space
  • Publish/subscribe
  • Query processing
  • k-nearest neighbor query
