
Natural Language Processing with Deep Learning


1 September 2017


Article Source: Natural Language Processing with Deep Learning (Stanford CS224n, Winter 2017)

Schedule and Syllabus

Event | Date | Description | Course Materials
Lecture (Jan 10): Introduction to NLP and Deep Learning. Suggested Readings:
  1. [Linear Algebra Review]
  2. [Probability Review]
  3. [Convex Optimization Review]
  4. [More Optimization (SGD) Review]
[python tutorial]
[slides]
[Lecture Notes 1]
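As a taste of where this first lecture is headed, here is a minimal sketch of the course's central premise: words become dense vectors whose geometry encodes similarity. The vectors below are invented toy values, not trained embeddings.

```python
# Toy illustration: similarity between word vectors via cosine.
# The three vectors are hand-made stand-ins, not trained embeddings.
import numpy as np

vecs = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vecs["king"], vecs["queen"]))  # high: related words
print(cosine(vecs["king"], vecs["apple"]))  # lower: unrelated words
```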
Lecture (Jan 12): Word Vector Representations: word2vec. Suggested Readings:
  1. [Word2Vec Tutorial - The Skip-Gram Model]
  2. [Distributed Representations of Words and Phrases and their Compositionality]
  3. [Efficient Estimation of Word Representations in Vector Space]
[slides]
Spotlight: [slides] [paper]
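As a rough companion to these readings, a minimal NumPy sketch of one skip-gram SGD step with a full softmax. Real word2vec uses negative sampling or a hierarchical softmax for efficiency; the vocabulary size, dimension, and word indices here are toy assumptions.

```python
# One skip-gram SGD step on -log p(context | center), full softmax.
import numpy as np

rng = np.random.default_rng(0)
V, d = 5, 8                          # toy vocabulary size and dimension
W_in = rng.normal(0, 0.1, (V, d))    # center ("input") vectors
W_out = rng.normal(0, 0.1, (V, d))   # context ("output") vectors

def skipgram_step(center, context, lr=0.1):
    v = W_in[center].copy()              # (d,) copy: we update W_in below
    scores = W_out @ v                   # (V,) one score per vocab word
    p = np.exp(scores - scores.max())
    p /= p.sum()                         # softmax probabilities
    dscores = p.copy()
    dscores[context] -= 1.0              # gradient of the cross-entropy
    W_in[center] -= lr * (W_out.T @ dscores)
    W_out -= lr * np.outer(dscores, v)

skipgram_step(center=0, context=3)       # e.g. one pair from a toy corpus
```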
A1 released (Jan 12): Assignment #1 released. [Assignment 1] [Written solution]
Lecture (Jan 17): Advanced Word Vector Representations. Suggested Readings:
  1. [GloVe: Global Vectors for Word Representation]
  2. [Improving Distributional Similarity with Lessons Learned from Word Embeddings]
  3. [Evaluation methods for unsupervised word embeddings]
[slides]
[Lecture Notes 2]
Spotlight: [slides] [paper]
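For quick reference, the weighted least-squares objective at the heart of the GloVe reading, where X_ij is the co-occurrence count of words i and j (notation follows the paper):

```latex
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x/x_{\max})^{\alpha} & \text{if } x < x_{\max} \\ 1 & \text{otherwise} \end{cases}
```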
Lecture (Jan 19): Word Window Classification and Neural Networks. Suggested Readings:
  1. cs231n notes on [backprop] and [network architectures]
  2. [Review of differential calculus]
  3. [Natural Language Processing (almost) from Scratch]
  4. [Learning Representations by Backpropagating Errors]
[slides]
[Lecture Notes 3]
Lecture (Jan 24): Backpropagation and Project Advice. Suggested Readings:
  1. [Vector, Matrix, and Tensor Derivatives]
  2. Section 4 of [A Primer on Neural Network Models for Natural Language Processing]
[slides]
Spotlight: [slides] [paper]
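A habit these backpropagation materials encourage: check every hand-derived gradient against a centered finite difference before trusting it. A minimal sketch with a toy function (not assignment code):

```python
# Gradient check: compare an analytic gradient to a numeric one.
import numpy as np

def f(x):            # toy objective: f(x) = sum(x^2)
    return np.sum(x ** 2)

def grad_f(x):       # its analytic gradient: 2x
    return 2 * x

def numeric_grad(f, x, eps=1e-5):
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = eps
        g.flat[i] = (f(x + e) - f(x - e)) / (2 * eps)  # centered difference
    return g

x = np.random.default_rng(0).normal(size=4)
print(np.max(np.abs(grad_f(x) - numeric_grad(f, x))))  # tiny, ~1e-10
```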
Lecture (Jan 26): Dependency Parsing. Suggested Readings:
  1. Joakim Nivre. 2004. Incrementality in Deterministic Dependency Parsing. Workshop on Incremental Parsing.
  2. Danqi Chen and Christopher D. Manning. 2014. A Fast and Accurate Dependency Parser using Neural Networks. EMNLP 2014.
  3. Sandra Kübler, Ryan McDonald, Joakim Nivre. 2009. Dependency Parsing. Morgan and Claypool. [Free access from Stanford campus only]
  4. Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, and Michael Collins. 2016. Globally Normalized Transition-Based Neural Networks. ACL 2016.
  5. Marie-Catherine de Marneffe, Timothy Dozat, Natalia Silveira, Katri Haverinen, Filip Ginter, Joakim Nivre, and Christopher D. Manning. 2014. Universal Stanford Dependencies: A cross-linguistic typology. Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC-2014). Revised version for UD v1.
  6. Universal Dependencies website
[slides]
[Lecture Notes 4]
Spotlight: [slides] [paper]
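A toy arc-standard transition sketch in the spirit of the Nivre and Chen & Manning readings. The action sequence here is hand-written as an oracle; a neural transition-based parser learns to predict each action from stack and buffer features.

```python
# Arc-standard transitions on a toy sentence; arcs are (head, dependent).
ROOT = "ROOT"

def parse(words, actions):
    stack, buf, arcs = [ROOT], list(words), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buf.pop(0))
        elif act == "LEFT-ARC":          # top heads the second-from-top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT-ARC":         # second-from-top heads the top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "She ate fish": ate -> She, ate -> fish, ROOT -> ate
print(parse(["She", "ate", "fish"],
            ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]))
```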
A1 due (Jan 26): Assignment #1 due
A2 released (Jan 26): Assignment #2 released. [Assignment 2] [Written solution]
Lecture (Jan 31): Introduction to TensorFlow. Suggested Readings:
  1. [TensorFlow Basic Usage]
[slides]
[Lecture Notes Tensorflow]
Spotlight: [slides] [paper]
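A minimal sketch in the build-a-graph-then-run-a-Session style of TensorFlow 1.x, which is what the Winter 2017 course taught. The toy regression problem is an assumption for illustration, not the course's tutorial code.

```python
# TF 1.x-era graph API: define the computation, then run it in a Session.
import numpy as np
import tensorflow as tf   # assumes TensorFlow 1.x

x = tf.placeholder(tf.float32, shape=[None])
y = tf.placeholder(tf.float32, shape=[None])
w = tf.Variable(0.0)
b = tf.Variable(0.0)
loss = tf.reduce_mean(tf.square(w * x + b - y))
train = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0                      # data generated by y = 2x + 1
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train, feed_dict={x: xs, y: ys})
    print(sess.run([w, b]))              # approaches [2.0, 1.0]
```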
Lecture (Feb 2): Recurrent Neural Networks and Language Models. [slides]
[vanishing grad example] [vanishing grad notebook]
Spotlight: [slides] [paper]
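In the spirit of the vanishing-gradient example linked above, a NumPy sketch: backpropagating through a tanh RNN multiplies the gradient by W^T diag(1 - h_t^2) at every step, so with smallish weights the gradient norm decays exponentially. All weights and states are toy assumptions.

```python
# Watch the backpropagated gradient norm shrink across 50 tanh steps.
import numpy as np

rng = np.random.default_rng(0)
d = 10
W = rng.normal(0, 0.5 / np.sqrt(d), (d, d))   # smallish recurrent weights

# Forward: run an input-free tanh RNN and store the hidden states.
hs = [rng.normal(size=d)]
for _ in range(50):
    hs.append(np.tanh(W @ hs[-1]))

# Backward: each step multiplies the gradient by W^T diag(1 - h_t^2).
grad = np.ones(d)                              # dLoss/dh at the final step
for t in range(50, 0, -1):
    grad = W.T @ ((1 - hs[t] ** 2) * grad)
    if t % 10 == 0:
        print(t, np.linalg.norm(grad))         # norm decays toward zero
```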
Lecture (Feb 7): Machine translation and advanced recurrent LSTMs and GRUs. [slides]
[Lecture Notes 5]
Spotlight: [slides] [paper 1] [paper 2] [paper 3]
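One GRU step in NumPy, following the update-gate and reset-gate equations covered in the lecture. Biases are omitted and weights are random; this is purely a toy sketch.

```python
# A single GRU cell step: gates decide how much of the old state to keep.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return z * h + (1 - z) * h_tilde           # interpolate old and new

rng = np.random.default_rng(0)
d = 4
params = [rng.normal(0, 0.1, (d, d)) for _ in range(6)]
print(gru_step(rng.normal(size=d), np.zeros(d), *params))
```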
Review (Feb 9): Midterm Review. [slides]
Project proposal due (Feb 9): Final project proposal due. [Project page]
A2 due (Feb 9): Assignment #2 due
A3 released (Feb 13): Assignment #3 released. [Assignment 3] [Written solution]
Midterm (Feb 14): In-class midterm. [Gradient Computation Notes]
Practice midterms: [Midterm 1] [Midterm 2] [Midterm 1 Solutions] [Midterm 2 Solutions]
Lecture (Feb 16): Neural Machine Translation and Models with Attention. Suggested Readings:
  1. [Sequence to Sequence Learning with Neural Networks]
  2. [Neural Machine Translation by Jointly Learning to Align and Translate]
  3. [Effective Approaches to Attention-based Neural Machine Translation]
[slides]
Spotlight: [slides] [paper]
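A minimal sketch of the dot-product ("global") attention idea from the Luong et al. reading: score each encoder state against the current decoder state, softmax the scores, and take the weighted average as the context vector. Shapes and values are toy assumptions.

```python
# Dot-product attention over toy encoder states.
import numpy as np

def attention(dec_h, enc_hs):
    scores = enc_hs @ dec_h                  # (T,) dot-product scores
    w = np.exp(scores - scores.max())
    w /= w.sum()                             # softmax -> attention weights
    return w, w @ enc_hs                     # weights, context vector

rng = np.random.default_rng(0)
enc_hs = rng.normal(size=(5, 8))             # 5 source positions, d = 8
dec_h = rng.normal(size=8)                   # current decoder state
w, ctx = attention(dec_h, enc_hs)
print(w.round(2), ctx.shape)
```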
Lecture (Feb 21): Gated recurrent units and further topics in NMT. Suggested Readings:
  1. [On Using Very Large Target Vocabulary for Neural Machine Translation]
  2. [Pointing the Unknown Words]
  3. [Neural Machine Translation of Rare Words with Subword Units]
  4. [Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models]
[slides]
[Lecture Notes 6]
Spotlight: [slides] [paper]
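The subword-units reading above builds open vocabularies with byte-pair encoding: repeatedly merge the most frequent adjacent symbol pair. A compact sketch over a toy word-frequency table (the Sennrich et al. reference implementation follows the same loop):

```python
# Byte-pair encoding: learn merges from a toy word-frequency table.
from collections import Counter

vocab = {("l", "o", "w", "</w>"): 5,
         ("l", "o", "w", "e", "r", "</w>"): 2,
         ("n", "e", "w", "e", "s", "t", "</w>"): 6}

def merge_once(vocab):
    pairs = Counter()
    for word, freq in vocab.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    best = max(pairs, key=pairs.get)         # most frequent adjacent pair
    merged = {}
    for word, freq in vocab.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                out.append(word[i] + word[i + 1]); i += 2
            else:
                out.append(word[i]); i += 1
        merged[tuple(out)] = freq
    return best, merged

for _ in range(3):
    best, vocab = merge_once(vocab)
    print("merged:", best)
```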
Lecture (Feb 23): End-to-end models for Speech Processing. [slides]
A3 due (Feb 25): Assignment #3 due
A4 released (Feb 25): Assignment #4 (the default final project) released. [Assignment 4]
Lecture (Feb 28): Convolutional Neural Networks. Suggested Readings:
  1. [A Convolutional Neural Network for Modelling Sentences]
  2. [Convolutional Neural Networks for Sentence Classification]
[slides]
Spotlight: [slides] [paper]
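A sketch of the convolution-plus-max-pooling recipe from the sentence-classification readings: slide one filter over windows of word vectors, then keep the maximum activation. The embeddings and filter are random toy values; a real model learns many such filters.

```python
# One convolutional filter over a toy sentence, with max-over-time pooling.
import numpy as np

rng = np.random.default_rng(0)
T, d, k = 7, 5, 3                    # sentence length, embed dim, filter width
X = rng.normal(size=(T, d))          # one sentence as a stack of word vectors
W = rng.normal(size=(k, d))          # one convolutional filter
b = 0.0

feats = np.array([np.tanh(np.sum(W * X[i:i + k]) + b)
                  for i in range(T - k + 1)])   # one value per window
print(feats.max())                   # max pooling -> one feature per filter
```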
Lecture (Mar 2): Tree Recursive Neural Networks and Constituency Parsing. Suggested Readings:
  1. [Parsing with Compositional Vector Grammars]
  2. [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank]
  3. [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks]
[slides]
[Lecture Notes 7]
Spotlight: [slides] [paper]
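A recursive-composition sketch matching these tree-RNN readings: each parent vector is built from its two children as p = tanh(W [c1; c2] + b), applied bottom-up over the parse tree. The tree shape and all vectors are toy assumptions.

```python
# Compose vectors bottom-up over a toy binary tree.
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.normal(0, 0.1, (d, 2 * d))
b = np.zeros(d)

def compose(node):
    if isinstance(node, np.ndarray):           # leaf: a word vector
        return node
    left, right = (compose(child) for child in node)
    return np.tanh(W @ np.concatenate([left, right]) + b)

leaf = lambda: rng.normal(size=d)
tree = (leaf(), (leaf(), leaf()))              # (w1, (w2, w3))
print(compose(tree))                           # vector for the whole tree
```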
Lecture (Mar 7): Coreference Resolution. Suggested Readings:
  1. [Easy Victories and Uphill Battles in Coreference Resolution]
  2. [Deep Reinforcement Learning for Mention-Ranking Coreference Models]
[slides]
Lecture (Mar 9): Dynamic Neural Networks for Question Answering. [slides]
[Lecture Notes 8]
Spotlight: [slides] [paper]
Lecture (Mar 14): Issues in NLP and Possible Architectures for NLP. [slides]
Spotlight: [slides] [paper]
Lecture (Mar 16): Tackling the Limits of Deep Learning for NLP. [slides]
Spotlight: [slides] [paper 1] [paper 2]
Final project due (Mar 17): Final course project / Assignment #4 due
Poster presentation (Mar 21): Final project poster presentations, 12:15-3:15, Lathrop Library, second floor
[Piazza Post on Logistics] [Facebook Event]
