dl_in_nlp_2019
A group study of the Stanford CS224n course, run with the support of the iPavlov team.
News:
https://t.me/dlinnlp
Chat:
https://t.me/joinchat/B-HNhhRwNwBnSgYIkOhyPA
Piazza:
http://piazza.com/mipt/spring2019/ipavlov1
Access code: ipavlov1
Syllabus:
https://docs.google.com/spreadsheets/d/16oB-WgeHqJEm29Er_Zu3X1mQRXfOFidqGnNP0cbbAuo/
Google calendar:
https://clck.ru/FBsQs
Community articles links:
https://www.mendeley.com/community/dl-in-nlp-course/
--------------------------------------------------------------------------------------------------
Week 1
Lecture 2 | Word Vector Representations (word2vec):
https://youtu.be/ERibwqs9p38
Quiz before 07.02.2019:
https://goo.gl/forms/9EZmQhqfhicW0yaO2
Quiz answers:
Evernote Link
Seminar materials:
Part 1. Conversational Artificial Intelligence:
https://youtu.be/3nKhzlfaOTE
Part 2. Course intro:
https://youtu.be/U_1xdGUQZ5o
Part 3. Word vector representations:
https://youtu.be/juDdkybtTv0
Slides Part 2:
https://bit.ly/2Gi9V1z
Slides Part 3:
https://bit.ly/2WQRtSR
Additional materials:
Lecture 1 | Natural Language Processing with Deep Learning:
https://youtu.be/OQQ-W_63UgQ
Lecture 3 | GloVe: Global Vectors for Word Representation:
https://youtu.be/ASn7ExxLZws
Word2Vec Tutorial - The Skip-Gram Model:
http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
Efficient Estimation of Word Representations in Vector Space:
https://arxiv.org/pdf/1301.3781.pdf
Distributed Representations of Words and Phrases and their Compositionality:
https://arxiv.org/pdf/1310.4546.pdf
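The skip-gram objective from the word2vec readings above can be illustrated with a minimal NumPy sketch (not part of the course materials; the vocabulary size, embedding dimension, and the single center/context pair are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10, 4                          # toy vocabulary size and embedding dimension
W_in = rng.normal(0, 0.1, (V, d))     # "input" (center-word) vectors
W_out = rng.normal(0, 0.1, (V, d))    # "output" (context-word) vectors

def skipgram_loss(center, context):
    """Negative log-likelihood of one (center, context) pair under the full softmax."""
    v = W_in[center]                  # center-word embedding
    scores = W_out @ v                # dot product with every output vector
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()              # softmax over the whole vocabulary
    return -np.log(probs[context]), probs

loss, probs = skipgram_loss(center=3, context=7)
```

The papers linked above replace this full softmax with hierarchical softmax or negative sampling, since summing over a real vocabulary at every step is too expensive.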
Week 2
Lecture 4 | Word Window Classification and Neural Networks:
https://youtu.be/uc2_iwVqrRI
Lecture 5 (up to the 48-minute mark) | Backpropagation and Project Advice:
https://youtu.be/isPiE-DBagM
Quiz before 14.02.2019:
https://goo.gl/forms/9yDB1KAojvpEABtf2
Seminar materials:
Neural networks Part 1:
https://youtu.be/92Ctk9OzlDg
Additional materials:
(in Russian) Linear models for classification and regression:
https://habr.com/post/323890/
Intro in Convolutional Neural Networks for Visual Recognition:
http://cs231n.github.io/neural-networks-1
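The backpropagation covered in this week's lecture and the cs231n notes can be sketched for a one-hidden-layer classifier (an illustrative example only; the data, shapes, and labels are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))           # 5 made-up examples, 3 features
y = np.array([0, 1, 2, 1, 0])         # integer class labels

W1 = rng.normal(0, 0.1, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 3)); b2 = np.zeros(3)

# Forward pass: affine -> ReLU -> affine -> softmax cross-entropy.
h = np.maximum(0, x @ W1 + b1)
scores = h @ W2 + b2
scores -= scores.max(axis=1, keepdims=True)
probs = np.exp(scores); probs /= probs.sum(axis=1, keepdims=True)
loss = -np.log(probs[np.arange(len(y)), y]).mean()

# Backward pass: the chain rule applied layer by layer, in reverse.
dscores = probs.copy()
dscores[np.arange(len(y)), y] -= 1    # gradient of softmax + cross-entropy
dscores /= len(y)
dW2 = h.T @ dscores; db2 = dscores.sum(axis=0)
dh = dscores @ W2.T
dh[h <= 0] = 0                        # ReLU passes gradient only where it fired
dW1 = x.T @ dh; db1 = dh.sum(axis=0)
```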
Week 3
Lecture 5 (cs231n) | Neural Networks Part 2:
https://youtu.be/gYpoJMlgyXA
(Optional) Lecture 4 (cs231n) | Backpropagation, Neural Networks Part 1:
https://youtu.be/i94OvYb6noo
Quiz before 28.02.2019:
https://goo.gl/forms/0dVAtBdUUmeJU7NB2
Seminar materials:
Neural networks Part 2:
https://youtu.be/1zv1IJAS9r4
Slides:
https://docs.google.com/presentation/d/1f_-1g0bTp8gvX300HUa2aEUXPXq9LUgnM1zHygM1W10/
Additional materials:
(in Russian) Regularization of neural networks:
https://youtu.be/Zz98nDE5b8E
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift:
https://arxiv.org/abs/1502.03167
How Does Batch Normalization Help Optimization?
https://arxiv.org/abs/1805.11604
Machine Learning Glossary:
https://clck.ru/FFZ2x
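The training-mode forward pass of batch normalization, as described in the two papers above, fits in a few lines of NumPy (an illustrative sketch; the batch data and feature count are made up, and the running statistics used at inference time are omitted):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale by gamma and shift by beta."""
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(5.0, 3.0, size=(64, 10))     # made-up batch: 64 examples, 10 features
out = batchnorm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
```

With gamma = 1 and beta = 0, each output column has (near) zero mean and unit variance regardless of the input scale, which is the normalization effect both papers analyze.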
Week 4
Lecture 8 | Recurrent Neural Networks and Language Models:
https://youtu.be/Keqep_PKrY8
Lecture 9 (from the 40-minute mark) | Machine Translation and Advanced Recurrent LSTMs and GRUs:
https://youtu.be/QuELiw8tbx8?t=2471
Quiz before 14.03.2019:
https://goo.gl/forms/h6Ugof9QENxAAvEG2
Seminar materials:
Recurrent Neural Networks and Language Models Part 1:
https://drive.google.com/drive/folders/1-4KBREKificOTissVN7ueNXR8JJYDlqy
Recurrent Neural Networks and Language Models Part 2:
https://youtu.be/Ms3eOk14Uyc
Slides:
https://docs.google.com/presentation/d/1f_-1g0bTp8gvX300HUa2aEUXPXq9LUgnM1zHygM1W10/
Additional materials:
CS 231n Python & NumPy Tutorial:
https://clck.ru/FKKEy
100 numpy exercises:
https://github.com/rougier/numpy-100
The Unreasonable Effectiveness of Recurrent Neural Networks:
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
The Matrix Calculus You Need For Deep Learning:
https://arxiv.org/abs/1802.01528
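The vanilla RNN language model from this week's lectures (and Karpathy's post above) reduces to one recurrence per token; here is a minimal NumPy sketch (not from the course; the vocabulary, hidden size, and token sequence are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 12, 16                       # toy vocabulary and hidden-state sizes
Wxh = rng.normal(0, 0.1, (H, V))    # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))    # hidden-to-hidden (recurrent) weights
Why = rng.normal(0, 0.1, (V, H))    # hidden-to-output weights
bh = np.zeros(H); by = np.zeros(V)

def rnn_lm_step(token_id, h):
    """One step of a vanilla RNN LM: consume a token, predict the next one."""
    x = np.zeros(V); x[token_id] = 1.0        # one-hot encode the current token
    h = np.tanh(Wxh @ x + Whh @ h + bh)       # update the recurrent state
    logits = Why @ h + by
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                      # distribution over the next token
    return probs, h

h = np.zeros(H)
for t in [3, 1, 4]:                           # an arbitrary toy token sequence
    probs, h = rnn_lm_step(t, h)
```

The hidden state h carries the history forward, which is what lets the model condition each prediction on everything it has read so far.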
Week 5
Deep contextualized word representations:
https://arxiv.org/abs/1802.05365
Universal Language Model Fine-tuning for Text Classification:
https://arxiv.org/abs/1801.06146
ELMo helps to further improve your sentence embeddings:
https://towardsdatascience.com/elmo-helps-to-further-improve-your-word-embeddings-c6ed2c9df95f
Introducing state of the art text classification with universal language models:
http://nlp.fast.ai/
The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning):
https://jalammar.github.io/illustrated-bert
Lecture 13 | Convolutional Neural Networks:
https://youtu.be/Lg6MZw_OOLI
Seminar materials:
Transfer Learning in NLP:
https://youtu.be/aPNf1IRwqN0
Slides:
https://docs.google.com/presentation/d/1rApVsEi-VDyfqIcfoulYTPppR2yXHDV_X_gYPsq4My4/
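One concrete ingredient of the ULMFiT paper linked above is the slanted triangular learning-rate schedule: a short linear warm-up followed by a long linear decay. A small sketch, using the formula and default hyperparameters from the paper (the iteration count is made up):

```python
import math

def slanted_triangular_lr(t, T, lr_max=0.01, cut_frac=0.1, ratio=32):
    """Learning rate at iteration t of T under the ULMFiT schedule."""
    cut = math.floor(T * cut_frac)            # end of the warm-up phase
    if t < cut:
        p = t / cut                           # linear warm-up toward lr_max
    else:
        p = 1 - (t - cut) / (cut * (1 / cut_frac - 1))  # linear decay
    return lr_max * (1 + p * (ratio - 1)) / ratio

rates = [slanted_triangular_lr(t, T=1000) for t in range(1000)]
```

The rate peaks at lr_max exactly at the end of the warm-up (here, iteration 100) and then decays slowly, which the paper argues suits fine-tuning a pretrained language model.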
Week 6
Lecture 8 (2019) | Sequence-to-sequence models and Attention:
https://youtu.be/7m6noV5-l1E
The Annotated Encoder-Decoder with Attention:
https://clck.ru/FQ8gR
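The attention step at the heart of this week's seq2seq material can be sketched as plain dot-product attention for a single decoder step (an illustrative example, not the course's own code; the encoder states and dimensions are made up):

```python
import numpy as np

def attention(query, keys, values):
    """Weight the values by softmax-normalized query-key scores."""
    scores = keys @ query                 # one score per source position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # attention distribution over the source
    return weights @ values, weights      # context vector and the weights

rng = np.random.default_rng(0)
enc_states = rng.normal(size=(6, 8))      # 6 made-up source positions, dim 8
dec_state = rng.normal(size=8)            # current decoder hidden state
context, weights = attention(dec_state, enc_states, enc_states)
```

The context vector is a weighted average of the encoder states, so the decoder can focus on different source positions at each step instead of squeezing the whole sentence into one fixed vector.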
Repository: MyLtYkRiTiK/dl_in_nlp_2019