Publication Date:
2021
abstract:
In Information Retrieval and Natural Language Processing, the representation of discrete objects, e.g., words, usually relies on embeddings in a vector space; this representation typically ignores sequential information. One instance of such sequential information is temporal evolution: for example, when the discrete objects are words, their meaning may change smoothly over time. For this reason, previous works proposed dynamic word embeddings to model this sequential information explicitly in word representations. This paper introduces a representation that relies on sinusoidal functions to capture the sequential order of discrete objects in vector space.
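The abstract does not specify the construction; a minimal sketch of the general idea, assuming a Transformer-style sinusoidal encoding in which each sequential position (e.g., a time step) is mapped to a vector of sines and cosines at geometrically spaced frequencies (the paper's exact formulation may differ):

```python
import math

def sinusoidal_encoding(num_positions, dim):
    """Return a num_positions x dim list of sinusoidal position vectors.

    Even dimensions use sine, odd dimensions use cosine, with
    frequencies decreasing geometrically across dimension pairs.
    """
    enc = []
    for pos in range(num_positions):
        row = [0.0] * dim
        for i in range(0, dim, 2):
            # Wavelength grows from 2*pi up to 10000*2*pi across dims.
            angle = pos / (10000.0 ** (i / dim))
            row[i] = math.sin(angle)
            if i + 1 < dim:
                row[i + 1] = math.cos(angle)
        enc.append(row)
    return enc

# Each position gets a distinct vector that could be combined with a
# static word embedding to inject sequential (e.g., temporal) order.
pe = sinusoidal_encoding(num_positions=10, dim=8)
print(len(pe), len(pe[0]))
```

Because the encoding is a fixed function of position, no extra parameters are learned for the sequential component.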
Iris type:
04.01 - Contribution in conference proceedings
Keywords:
Dynamic word embedding; Sequential modeling; Sinusoidal functions; Vector space
List of contributors:
Wang, B.; Di Buccio, E.; Melucci, M.
Book title:
CEUR Workshop Proceedings
Published in: