Towards Useful Word Embeddings: Evaluation on Information Retrieval, Text Classification, and Language Modeling

Warning

This publication is affiliated with the Faculty of Informatics, not the Institute of Computer Science. The official publication page can be found on muni.cz.
Authors

NOVOTNÝ Vít, ŠTEFÁNIK Michal, LUPTÁK Dávid, SOJKA Petr

Year of publication: 2020
Type: Article in Proceedings
Conference: Proceedings of the Fourteenth Workshop on Recent Advances in Slavonic Natural Language Processing, RASLAN 2020
MU Faculty or unit: Faculty of Informatics

Keywords: Evaluation; word vectors; word2vec; fastText; information retrieval; text classification; language modeling
Description

Since the seminal work of Mikolov et al. (2013), the word vectors of log-bilinear models have found their way into many NLP applications, and these models have since been extended with a positional model.

Although the positional model improves accuracy on the intrinsic English word analogy task, prior work has neglected its evaluation on extrinsic end tasks, which correspond to real-world NLP applications.

In this paper, we describe our first steps in evaluating positional weighting on the information retrieval, text classification, and language modeling extrinsic end tasks.
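To give a sense of what positional weighting means, here is a minimal sketch of a position-weighted CBOW-style context vector, where each context position carries its own learned weight vector that is multiplied elementwise with the word embedding at that position. The dimensions, random initialization, and function name are illustrative assumptions for this note, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

dim = 8       # embedding dimension (illustrative)
window = 2    # context positions: -2, -1, +1, +2
vocab = 10    # toy vocabulary size

# Input word embeddings and one positional weight vector per context
# position; both would normally be trained, here they are random.
embeddings = rng.normal(size=(vocab, dim))
positional = rng.normal(size=(2 * window, dim))

def context_vector(context_ids):
    """Position-weighted context: sum over positions p of d_p * v_{w_p},
    where * is the elementwise (Hadamard) product."""
    assert len(context_ids) == 2 * window
    return sum(positional[p] * embeddings[w]
               for p, w in enumerate(context_ids))

# Toy context of four word ids around a masked center word.
ctx = context_vector([1, 4, 7, 2])
print(ctx.shape)
```

Setting all positional weight vectors to ones would recover the plain CBOW sum of context embeddings, which is one way to see the positional model as a strict generalization.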

