CUED Publications database

Context selection for embedding models

Liu, LP and Ruiz, FJR and Athey, S and Blei, DM (2017) Context selection for embedding models. In: UNSPECIFIED pp. 4817-4826.

Full text not available from this repository.


Word embeddings are an effective tool to analyze language. They have recently been extended to model other types of data beyond text, such as items in recommendation systems. Embedding models consider the probability of a target observation (a word or an item) conditioned on the elements in the context (other words or items). In this paper, we show that conditioning on all the elements in the context is not optimal. Instead, we model the probability of the target conditioned on a learned subset of the elements in the context. We use amortized variational inference to automatically choose this subset. Compared to standard embedding models, this method improves predictions and the quality of the embeddings.
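The conditional structure described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's implementation: the embedding matrices `rho` (target vectors) and `alpha` (context vectors) follow common exponential-family-embedding notation, and the binary selection mask is fixed by hand for illustration, whereas the paper learns it with amortized variational inference.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D = 10, 4                      # vocabulary size, embedding dimension
rho = rng.normal(size=(V, D))     # target (output) embeddings
alpha = rng.normal(size=(V, D))   # context (input) embeddings

def target_logprobs(context_ids, selection):
    """Log-probability of every candidate target, conditioned only on
    the *selected* subset of context elements (selection is a 0/1 mask)."""
    mask = np.asarray(selection, dtype=float)
    # Sum the context vectors, zeroing out the unselected elements.
    ctx = (alpha[context_ids] * mask[:, None]).sum(axis=0)
    logits = rho @ ctx
    return logits - np.log(np.exp(logits).sum())  # log-softmax

context = np.array([1, 3, 5, 7])

# Standard embedding model: condition on all elements in the context.
full = target_logprobs(context, [1, 1, 1, 1])

# Context selection: condition on a subset (here hand-picked; the paper
# would infer the selection variables for each observation).
subset = target_logprobs(context, [1, 0, 1, 0])
```

Comparing `full` and `subset` shows that the predictive distribution over targets changes with the chosen subset, which is the degree of freedom the paper's variational inference exploits.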

Item Type: Conference or Workshop Item (UNSPECIFIED)
Divisions: Div F > Computational and Biological Learning
Depositing User: Cron Job
Date Deposited: 29 Oct 2018 20:08
Last Modified: 15 Apr 2021 06:55