SkalskiP Transformers

Everything you need to know about Transformers! 🤖

👋 hello

I'm Peter, a software engineer embarking on a fresh journey into the world of transformers, and I invite you to join me! The course is a work in progress: it's free, open source, and we'll build it together, step by step. We'll explore key concepts, tackle practical exercises, and dissect seminal papers, using YouTube videos for explanations and Jupyter notebooks for hands-on practice. Let's dive in! 🚀

🚀 Course program

🔑 Key concepts

  • Encoder-decoder architecture
  • Self-attention (see the sketch after this list)
  • Multi-head attention
  • Positional encoding
  • Keys, queries, and values
  • Word embeddings
  • Dynamic padding
  • Tokenization
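
To make these ideas concrete, here is a minimal NumPy sketch of scaled dot-product self-attention, showing how queries, keys, and values interact. It is an illustrative sketch only; the projection matrices and toy dimensions are made up for the example, not taken from the course material:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # X: (seq_len, d_model) token embeddings.
    # W_q, W_k, W_v: (d_model, d_k) learned projections (random here).
    Q = X @ W_q  # queries: what each position is looking for
    K = X @ W_k  # keys: what each position offers for matching
    V = X @ W_v  # values: the content that actually gets mixed
    d_k = Q.shape[-1]
    # Each query is scored against every key; dividing by sqrt(d_k)
    # keeps the softmax inputs in a well-behaved range.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # (seq_len, seq_len), rows sum to 1
    return weights @ V                  # (seq_len, d_k)

# Toy run: 4 tokens, model dim 8, head dim 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = [rng.normal(size=(8, 8)) for _ in range(3)]
print(self_attention(X, W_q, W_k, W_v).shape)  # (4, 8)
```

Multi-head attention runs several such heads in parallel, each with its own projections, and concatenates their outputs.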

🛠️ Practical exercises

  • Implement self-attention from scratch
  • Implement multi-head attention from scratch
  • Build a simple transformer model for a sequence-to-sequence task
  • Fine-tune a pre-trained model like BERT or GPT-2 on a specific task
  • Use a pre-trained transformer like GPT-2 for text generation (see the example after this list)
  • Train ViT on a custom dataset for image classification
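
As a preview, the snippet below generates text with GPT-2 via the Hugging Face transformers library. This is one common way to do it, not necessarily the exact approach the course notebooks will take:

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Download GPT-2 and wrap it in a ready-made text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

outputs = generator(
    "Transformers are",
    max_new_tokens=30,       # how many tokens to generate beyond the prompt
    num_return_sequences=1,  # how many completions to sample
)
print(outputs[0]["generated_text"])
```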

🗞️ Paper reviews

  • "Attention Is All You Need" (2017) [link]
  • "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (2018) [link]
  • "ViT: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale" (2020) [link]
  • "DETR: End-to-End Object Detection with Transformers" (2020) [link]
  • "CLIP: Learning Transferable Visual Models From Natural Language Supervision" (2021) [link]
  • "GPT-3: Language Models are Few-Shot Learners" (2020) [link]

🎬 Upcoming videos

  • Introduction to the course (coming soon)
  • Self-attention (coming soon)
  • Multi-head attention (coming soon)
  • Paper review: "Attention Is All You Need" (coming soon)

🦸 Contribution

I would love your help in making this repository even better! Whether you want to fix a typo, add new content, or suggest an improvement, feel free to open an issue.
