CEUR-WS Vol-2077, keynote 2 — https://ceur-ws.org/Vol-2077/keynote2.pdf
Word embeddings, information retrieval and textual entailment

                                              Eric Gaussier
                                       University of Grenoble Alps
                                          eric.gaussier@imag.fr




Abstract
Word embeddings are currently one of the preferred representations for words in various NLP and IR tasks. In
this talk, we will review the main embeddings used in IR and examine to what extent they lead to improved IR
performance. We will also discuss the possibility of extending current word embeddings with syntactic information
and assess the impact of doing so on several NLP tasks.
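To make the IR use of word embeddings concrete, the following is a minimal sketch (not taken from the talk) of one common baseline: represent a query and each document by the average of their word vectors and rank documents by cosine similarity. The tiny 3-dimensional vectors and the `EMB` vocabulary are invented for illustration; real systems would load pretrained embeddings such as word2vec or GloVe.

```python
# Illustrative sketch: embedding-based retrieval via averaged word
# vectors and cosine similarity. The toy vectors below are made up.
from math import sqrt

EMB = {
    "neural":    [0.9, 0.1, 0.0],
    "network":   [0.8, 0.2, 0.1],
    "retrieval": [0.1, 0.9, 0.2],
    "document":  [0.2, 0.8, 0.3],
    "cat":       [0.0, 0.1, 0.9],
}

def embed(text):
    """Average the embeddings of known words; zero vector if none known."""
    vecs = [EMB[w] for w in text.lower().split() if w in EMB]
    if not vecs:
        return [0.0] * 3
    return [sum(component) / len(vecs) for component in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity, with 0.0 for zero-norm vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rank(query, docs):
    """Return docs sorted by similarity to the query, best first."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)

docs = ["neural network", "document retrieval", "cat"]
print(rank("retrieval document", docs)[0])  # prints "document retrieval"
```

This averaged-vector scheme is only the simplest embedding-based ranker; the improvements discussed in the talk (and syntax-aware extensions) go well beyond it.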




Short Bio
Prof. Eric Gaussier is known for his work at the intersection of Artificial Intelligence (AI) and Data Science
(DS), in particular for his contributions on models and algorithms to extract information, insights and knowledge
from data in various forms. He has worked in three main subfields of AI and DS: machine learning, information
retrieval and computational linguistics. He is also interested in modeling how (textual) information is shared in
social (content) networks, and how such networks evolve over time. More recently, he has also been working
on improving job scheduling techniques through machine learning, and on learning representations for different
types of sequences, such as texts and time series.