DIKU Bits: Leveraging Wikipedia Hyperlinks to Ground Word Representations Across Languages

Speaker

Tommaso Pasini, Postdoc in the Natural Language Processing Section at the Department of Computer Science.

Abstract

Language models have quickly become the de facto standard for processing text. Recently, several approaches have been proposed to further enrich their representations with external knowledge sources. However, these models are usually suited for monolingual use only. This presentation will go through the most successful neural architectures for grounding language in knowledge bases and explore how to make such models more robust at representing word semantics across languages.
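
To make the core idea concrete, here is a minimal sketch (an editorial illustration, not the method presented in the talk): every Wikipedia hyperlink pairs an ambiguous surface form with an unambiguous target page, and because the same concept has pages across language editions, such pairs can anchor words in different languages to a shared inventory of meanings. The wikitext snippets, function name, and example pages below are hypothetical.

```python
import re

# Wikitext hyperlinks look like [[Target page|anchor text]] or [[Target page]].
WIKILINK = re.compile(r"\[\[([^\[\]|]+)(?:\|([^\[\]]+))?\]\]")

def extract_anchors(wikitext):
    """Return (anchor_text, target_page) pairs found in a wikitext snippet."""
    pairs = []
    for match in WIKILINK.finditer(wikitext):
        target = match.group(1).strip()
        # If no explicit anchor text is given, the page title is displayed.
        anchor = (match.group(2) or target).strip()
        pairs.append((anchor, target))
    return pairs

# Hypothetical snippets from two language editions; the target pages describe
# the same concept and would be joined via Wikipedia's interlanguage links.
english = "A [[Bank (financial institution)|bank]] lends money."
italian = "Una [[Banca|banca]] presta denaro."

print(extract_anchors(english))  # [('bank', 'Bank (financial institution)')]
print(extract_anchors(italian))  # [('banca', 'Banca')]
```

Collected at scale, such (anchor, page) pairs act as free sense annotations that can supervise or enrich word representations across languages.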

Zooming in on Tommaso

Which courses do you teach?
None

Which technology/research/projects/startup are you excited to see the evolution of?
An obvious one is DeepMind; they are making so many breakthroughs across different fields (from games to biology) that it is hard not to get excited! Further, I am very curious to see how more collaborations between AI/ML experts and specialists in other fields such as medicine, biology, and law can contribute to advancing these areas over the next 5-10 years.

What is your favorite sketch from the DIKUrevy?
“We shall code it in C”. I really laughed at it, as I also learned C as my first programming language. Unlike the three guys in the sketch, though, I believe that learning Python as a first language 1) keeps you from going crazy in your first year, 2) lets you pick up programming logic much faster, and 3) lets you implement cool stuff (e.g., ML, simple games) much sooner than if you were stuck figuring out the cause of every segmentation fault!