Introduction to deep learning techniques for NLP. The goals of this course are:
- To understand the structure of neural networks
- To be familiar with central concepts in deep learning for NLP
- To know the most common deep learning models applied in NLP
- To implement your own deep learning models using the PyTorch library (a minimal sketch of what this looks like follows below)
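To give a first impression of the last goal, here is a minimal sketch, not taken from the course materials, of the kind of model you will learn to build: a tiny feed-forward text classifier in PyTorch. All names, layer sizes, and example inputs are invented for illustration.

```python
# Minimal illustrative sketch (not from the course materials): a tiny
# feed-forward text classifier in PyTorch. All names and sizes are made up.
import torch
import torch.nn as nn

class TinyFFN(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=50, hidden_dim=32, num_classes=2):
        super().__init__()
        # EmbeddingBag averages the embeddings of the tokens in each text
        self.embed = nn.EmbeddingBag(vocab_size, embed_dim)
        self.hidden = nn.Linear(embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids, offsets):
        x = self.embed(token_ids, offsets)   # (batch, embed_dim)
        x = torch.relu(self.hidden(x))       # (batch, hidden_dim)
        return self.out(x)                   # (batch, num_classes) logits

model = TinyFFN()
token_ids = torch.tensor([3, 7, 42, 5, 9])   # two texts packed into one tensor
offsets = torch.tensor([0, 3])               # start index of each text
print(model(token_ids, offsets).shape)       # torch.Size([2, 2])
```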
News:
- Worth checking out: Troubleshooting Deep Neural Networks
- Moodle page up and running
- Slides and other materials available here after each session
Practical Info
Sharid Loáiciga
loaicigasanchez@uni-potsdam.de
Tuesdays 4:15pm - 6:00pm
Runs from 21.04.2020 to 21.07.2020
(Room 2.14.0.32) ONLINE until further notice
Session | Date | Content | Preparation Material | Release | Due |
---|---|---|---|---|---|
1 | 21.04.2020 | Introduction | YG ch. 2 | | reaction paragraph not required |
2 | 28.04.2020 | Revision of linear algebra & statistics; PyTorch basics | YG ch. 2; M&S chs. 2, 3, 12 | A1 | rp + set up |
3 | 05.05.2020 | Feed-forward networks (FFNs) | YG chs. 3 & 4; DR ch. 3; M&S ch. 5; derivatives; backpropagation | | rp |
4 | 12.05.2020 | QA + Word embeddings 1 | video; YG ch. 8 | | rp + A1 |
5 | 19.05.2020 | Word embeddings 2 + intro to projects | Stanford a1 | A2 | – |
6 | 26.05.2020 | NN training | YG ch. 5 | | rp |
7 | 02.06.2020 | no Zoom meeting | – | – | A2 |
8 | 09.06.2020 | Recurrent neural networks (RNNs) | Intro to LMs, ML chapter, blog post 1, blog post 2 (one book chapter + one blog post required for the rp) | A3 | rp |
9 | 16.06.2020 | QA + Special RNNs (gradient issues, stacked, GRUs, LSTMs) | YG chs. 14 & 15 | | rp |
10 | 23.06.2020 | More RNNs (seq2seq), attention | dependency parsing, machine translation | | rp |
11 | 30.06.2020 | Paper discussion | Transformer, What does BERT look at? | | A3 + rp + group contracts (03.07.20) + pick project topic |
12 | 07.07.2020 | Convolutional NNs (CNNs), start 4:30pm | blog post, YG ch. 13 | | rp |
13 | 14.07.2020 | Project proposal presentations | – | – | |
14 | 21.07.2020 | Project proposal presentations | – | – | any late assignments (first-time submission) |

(rp = reaction paragraph; A1–A3 = assignments)
Reading material
[YG] Goldberg, Yoav (2017). Neural Network Methods in Natural Language Processing. Morgan & Claypool Publishers.
[M&S] Moore, Will H. & David A. Siegel (2013). A Mathematics Course for Political and Social Research. Princeton University Press.
Videos here.
[DR] Rao, Delip & Brian McMahan (2019). Natural Language Processing with PyTorch: Build Intelligent Language Applications Using Deep Learning. Beijing: O'Reilly. IMPORTANT: please choose 'read online' so that you do not block the book for others.
All jupyter notebooks used in this course come from the companion repository by Rao & McMahan.
⭐️ All the books are available through the UP network.
Examination
The following are required to pass the course:
- a reaction paragraph (rp) for each week's assigned preparation material
- three completed assignments (A1, A2, A3)
- project work: presentation of a project proposal and a written report on the completed project
All hand-in deadlines are at 11:00pm on the stated day.
Late policy for assignments
There is a second and final deadline for late submissions: July 21st, 11:00pm. If you miss the first deadline for a reaction paragraph or an assignment, you may use this second deadline.