Back to calendar view
Monday, February 10
Time: 12:30 - 13:30
Location: Room B107, Building B, Université de Villetaneuse
Title: Attention is all I need
Description: José Angel Gonzalez-Barba. The use of attention mechanisms has become widespread across natural language processing tasks. These mechanisms have increased the capacity of deep learning models by allowing them to focus explicitly on the most discriminative relationships and properties for a given task. Recently, the Transformer model has replaced convolutional and recurrent neural networks in many NLP tasks, mainly due to its ability to model sequences while avoiding sequential processing, using only attention mechanisms. In this talk I will discuss the application of Transformer encoders to text classification in social media (sentiment analysis and irony detection on Twitter) and their application in a novel framework for extractive summarization.
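The attention mechanism the abstract refers to can be illustrated with the scaled dot-product attention at the core of the Transformer (softmax(QKᵀ/√d_k)·V). The sketch below is a minimal NumPy illustration for context, not code from the talk:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are (seq_len, d_k) arrays; returns the attended values
    and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V, weights
```

Each output position is a weighted average of all value vectors, which is what lets the model attend to the whole sequence in parallel instead of processing it step by step.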

About the author:
José Angel just finished his PhD and is about to start a postdoc in the group of Yoshua Bengio at the University of Montreal. His work on Spanish NLP has been very promising, and he has developed several state-of-the-art systems for sentiment analysis and summarization.