Rémi Philippe Lebret, Ronan Collobert
Recent works on word representations mostly rely on predictive models. Distributed word representations (aka word embeddings) are trained to optimally predict the contexts in which the corresponding words tend to appear. Such models have succeeded in captu ...
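A minimal sketch (not from the paper) of the predictive-model idea the abstract refers to: embeddings trained so that each word predicts the words appearing in its context window, in the style of a skip-gram objective with a full softmax. The toy corpus, dimensions, and hyperparameters below are illustrative assumptions.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # word (input) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context (output) embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for epoch in range(50):
    for i, w in enumerate(corpus):
        center = word2id[w]
        # Every word within the window is a context word to predict.
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j == i:
                continue
            ctx = word2id[corpus[j]]
            h = W_in[center]
            scores = W_out @ h                 # score every vocabulary word
            probs = softmax(scores)            # predicted context distribution
            grad = probs.copy()
            grad[ctx] -= 1.0                   # d(cross-entropy)/d(scores)
            grad_h = W_out.T @ grad            # gradient w.r.t. center embedding
            W_out -= lr * np.outer(grad, h)    # update output embeddings
            W_in[center] -= lr * grad_h        # update center embedding

print("embedding for 'cat':", np.round(W_in[word2id["cat"]], 3))
```

After training, words that occur in similar contexts (e.g. "cat" and "dog" above) end up with similar vectors, which is the property the abstract describes such predictive models as capturing.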
Springer International Publishing, 2015