A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion
Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed
Documents
- A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion
Accepted manuscript, 943 KB, PDF document
Users may strive to formulate an adequate textual query for their information need. Search engines assist users by presenting query suggestions. To preserve the original search intent, suggestions should be context-aware and account for the previous queries issued by the user. Achieving context awareness is challenging due to data sparsity. We present a probabilistic suggestion model that accounts for sequences of previous queries of arbitrary length. Our novel hierarchical recurrent encoder-decoder architecture allows the model to be sensitive to the order of queries in the context while avoiding data sparsity. Additionally, our model can provide suggestions for rare, or long-tail, queries. The produced suggestions are synthetic and are sampled one word at a time, using computationally cheap decoding techniques. This is in contrast to current synthetic suggestion models that rely on machine-learning pipelines and hand-engineered feature sets. Results show that our model outperforms existing context-aware approaches in a next-query prediction setting. In addition to query suggestion, our model is general enough to be used in a variety of other applications.
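The hierarchy described in the abstract — a query-level encoder feeding a session-level encoder, whose summary conditions a word-by-word decoder — can be sketched with a toy numpy implementation. Everything here is illustrative: the vocabulary, the example session, and the randomly initialised plain-RNN weights are hypothetical stand-ins (the paper's model uses learned GRU parameters trained on query logs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; a real model would use tens of thousands of words.
VOCAB = ["<eos>", "cleveland", "gallery", "lake", "erie", "art"]
V, H = len(VOCAB), 8  # vocabulary size, hidden size

# Randomly initialised parameters (a trained model would learn all of these).
E = rng.normal(0, 0.1, (V, H))                                     # word embeddings
Wq, Uq = rng.normal(0, 0.1, (H, H)), rng.normal(0, 0.1, (H, H))    # query-level RNN
Ws, Us = rng.normal(0, 0.1, (H, H)), rng.normal(0, 0.1, (H, H))    # session-level RNN
Wd, Ud = rng.normal(0, 0.1, (H, H)), rng.normal(0, 0.1, (H, H))    # decoder RNN
Wo = rng.normal(0, 0.1, (H, V))                                    # output projection

def rnn(xs, W, U, h0):
    """Plain tanh RNN: fold a sequence of input vectors into a final hidden state."""
    h = h0
    for x in xs:
        h = np.tanh(x @ W + h @ U)
    return h

def encode_session(queries):
    """Query-level RNN encodes each query; the session-level RNN then
    summarises the *ordered* sequence of query encodings, which is what
    makes the model sensitive to the order of queries in the context."""
    s = np.zeros(H)
    for q in queries:
        q_enc = rnn([E[VOCAB.index(w)] for w in q.split()], Wq, Uq, np.zeros(H))
        s = np.tanh(q_enc @ Ws + s @ Us)
    return s

def suggest(queries, max_len=5):
    """Greedily decode a synthetic suggestion one word at a time,
    conditioned on the session summary."""
    h = encode_session(queries)
    x, out = np.zeros(H), []
    for _ in range(max_len):
        h = np.tanh(x @ Wd + h @ Ud)
        w = VOCAB[int(np.argmax(h @ Wo))]  # greedy pick; sampling also works
        if w == "<eos>":
            break
        out.append(w)
        x = E[VOCAB.index(w)]
    return " ".join(out)

print(suggest(["cleveland gallery", "lake erie art"]))
```

Because the decoder emits words rather than retrieving queries from a log, the suggestion can be one never observed in training data, which is how the model handles long-tail contexts.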
Original language | English
---|---
Title | CIKM '15 Proceedings of the 24th ACM International on Conference on Information and Knowledge Management
Number of pages | 10
Publisher | Association for Computing Machinery
Publication date | 2015
Pages | 553-562
ISBN (Electronic) | 978-1-4503-3794-6
DOI |
Status | Published - 2015
Event | CIKM 2015: ACM International Conference on Information and Knowledge Management - Melbourne, Australia. Duration: 19 Oct 2015 → 23 Oct 2015
Conference
Conference | CIKM 2015
---|---
Country | Australia
City | Melbourne
Period | 19/10/2015 → 23/10/2015
Links
- http://arxiv.org/pdf/1507.02221v1.pdf
Accepted manuscript