DIKU Bits: Meaning Representation and Parsing
Speaker
Daniel Hershcovich, postdoc in the Machine Learning section at DIKU.
Abstract
Natural Language Processing models can complete your sentences surprisingly well. Do they capture what you mean, or simply exploit statistical tendencies in their massive training data? Is there a difference?
I will present symbolic frameworks that represent sentence meaning, exposing challenges in language understanding and helping AI systems learn how to reason, interface with knowledge bases, and even translate between languages.
Zooming in on Daniel Hershcovich
Which courses do you teach? (BSc and MSc)
None yet, but stay tuned!
Which technology/research/projects/startup are you excited to see the evolution of?
Virtual assistants. Looking forward to when they can effortlessly answer complex queries that may come up during a conversation (like “most prolific Harry Potter director”), so we won’t have to pull out our phones and zone out.
What is your favorite sketch from the DIKUrevy?
(Answer generated by AI, since I haven't seen the revue yet.)
"Clean on the Inside", which, at the moment, has been passed around the internet as a problem from the movie for a month and is still doing the rounds. It's about how washing a mirror will usually just cause it to get dirty inside. This is probably some kind of thing that the X-Men are really very interested in.