Abstract
Many navigation applications take natural language speech as input, which saves users from typing and thus improves traffic safety. However, navigation applications often fail to understand a user’s free-form description of a route. In addition, they support only the input of a specific source or destination, which prevents users from specifying additional route requirements. We propose SpeakNav, a framework that enables users to describe intended routes via speech and then recommends appropriate routes. Specifically, we propose a novel Route Template based Bidirectional Encoder Representations from Transformers (RT-BERT) model that supports the understanding of natural language route descriptions. The model extracts intended POI keywords and related distances. We then formalize a template-driven path query that uses the extracted information. To enable efficient query processing, we develop a hybrid label index for computing network distances between POIs, and we propose a branch-and-bound algorithm along with a pivot reverse B-tree (PB-tree) index. Experiments with real and synthetic data indicate that RT-BERT achieves high accuracy and that the proposed algorithm outperforms baseline algorithms.
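The abstract does not fix a concrete format for the template-driven path query, so the following is a minimal sketch, assuming a spoken route description is reduced to POI keywords with optional distance bounds; the names `RouteQuery` and `PoiConstraint` and the example utterance are hypothetical, not taken from the paper.

```python
# Minimal sketch only: RouteQuery, PoiConstraint, and the example utterance are
# hypothetical illustrations of a template-driven path query, not the paper's format.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PoiConstraint:
    """One intermediate stop extracted from the spoken route description."""
    keyword: str                            # POI keyword, e.g. "coffee shop"
    max_distance_m: Optional[float] = None  # distance bound, if one was spoken

@dataclass
class RouteQuery:
    """Path query assembled from the extracted POI keywords and distances."""
    source: str       # starting POI keyword
    destination: str  # final POI keyword
    waypoints: List[PoiConstraint] = field(default_factory=list)  # in spoken order

# Query for: "From my hotel, stop at a coffee shop within 500 m, then go to the museum."
query = RouteQuery(
    source="hotel",
    destination="museum",
    waypoints=[PoiConstraint(keyword="coffee shop", max_distance_m=500.0)],
)
print(query)
```

A structured query like this can then be evaluated against a road network, with the distance bounds serving as pruning conditions during search.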
Original language | English
---|---
Journal | Proceedings of the VLDB Endowment
Volume | 14
Issue number | 12
Pages (from-to) | 3056-3068
Number of pages | 13
ISSN | 2150-8097
DOI |
Status | Published - 2021
Event | 47th International Conference on Very Large Data Bases, VLDB 2021 - Virtual, Online. Duration: 16 Aug 2021 → 20 Aug 2021
Conference
Conference | 47th International Conference on Very Large Data Bases, VLDB 2021
---|---
City | Virtual, Online
Period | 16/08/2021 → 20/08/2021
Bibliographic note
Funding Information: This research is supported in part by the NSFC (Grants No. 61902134, 62011530437), the Hubei Natural Science Foundation (Grant No. 2020CFB871), and the Fundamental Research Funds for the Central Universities (HUST: Grants No. 2019kfyXKJC021, 2019kfyXJJS091).
Publisher Copyright:
© The authors.