DeepMind x UCL | Deep Learning Lectures | 7/12 | Deep Learning for Natural Language Processing
This lecture, by DeepMind Research Scientist Felix Hill, is split into three parts. First, he discusses the motivation for modelling language with ANNs: language is highly contextual, typically non-compositional, and relies on reconciling many competing sources of information. This section also covers Elman's "Finding Structure in Time" and simple recurrent networks, the importance of context, and transformers. In the second part, he explores unsupervised and representation learning for language, starting from Word2Vec.