Contextually propagated term weights for document representation

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Word embedding models predict a word from its neighbours by learning small, dense embedding vectors. In practice, this prediction corresponds to a semantic score assigned to the predicted word (or term weight). We present a novel model that, given a target word, redistributes part of that word's weight (computed with word embeddings) across words occurring in contexts similar to those of the target word. Our model thus aims to simulate how semantic meaning is shared by words occurring in similar contexts, and this is incorporated into bag-of-words document representations. Experimental evaluation in an unsupervised setting against eight state-of-the-art baselines shows that our model yields the best micro and macro F1 scores across datasets of increasing difficulty.
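The core idea of redistributing a target word's embedding-derived weight across contextually similar words can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the toy embeddings, the redistribution fraction `alpha`, the `top_k` neighbourhood size, and the proportional-to-cosine-similarity split are all illustrative assumptions.

```python
import numpy as np

# Toy embedding vectors; in practice these would be pre-trained word
# embeddings (the dimensions and values here are made up for illustration).
embeddings = {
    "car":   np.array([0.90, 0.10, 0.00]),
    "auto":  np.array([0.85, 0.15, 0.05]),
    "truck": np.array([0.70, 0.20, 0.10]),
    "apple": np.array([0.00, 0.10, 0.90]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def redistribute_weight(target, weight, alpha=0.3, top_k=2):
    """Keep (1 - alpha) of `weight` on `target` and spread alpha * weight
    over the top_k most similar words, proportionally to cosine similarity.
    `alpha` and `top_k` are hypothetical knobs for this sketch."""
    sims = {w: cosine(embeddings[target], v)
            for w, v in embeddings.items() if w != target}
    neighbours = sorted(sims, key=sims.get, reverse=True)[:top_k]
    total = sum(sims[w] for w in neighbours)
    new_weights = {target: (1 - alpha) * weight}
    for w in neighbours:
        new_weights[w] = alpha * weight * sims[w] / total
    return new_weights

# Redistribute the weight of "car"; part of its mass flows to "auto"
# and "truck", which occur in similar contexts, while "apple" gets none.
weights = redistribute_weight("car", 1.0)
```

The redistributed weights can then be accumulated per word over a document to form an enriched bag-of-words representation, which is the setting the abstract describes.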

Original language: English
Title of host publication: SIGIR 2019 - Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval
Publisher: Association for Computing Machinery
Publication date: 18 Jul 2019
Pages: 897-900
ISBN (Electronic): 9781450361729
DOIs
Publication status: Published - 18 Jul 2019
Event: 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2019 - Paris, France
Duration: 21 Jul 2019 - 25 Jul 2019

Conference

Conference: 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2019
Country: France
City: Paris
Period: 21/07/2019 - 25/07/2019
Sponsor: ACM SIGIR
Series: SIGIR 2019 - Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval

    Research areas

  • Contextual semantics, Document representation, Word embeddings

ID: 239566043