Awesome Treasure of Transformers: models for Natural Language Processing, with papers, videos, blogs, and official repos, along with Colab notebooks.
Sr No | Algorithm Name | Year | Blog | Video | Official Repo | Code |
---|---|---|---|---|---|---|
1 | GPT-Neo | 2021 | ||||
2 | Transformer | 2017 | ||||
3 | BERT | 2018 | ||||
4 | GPT | 2018 | ||||
5 | Universal Transformer | 2018 | ||||
6 | T-D | 2018 | ||||
7 | GPT-2 | 2019 | ||||
8 | T5 | 2019 | ||||
9 | BART | 2019 | ||||
10 | XLNet | 2019 | ||||
11 | ALBERT | 2019 | ||||
12 | DistilBERT | 2019 | ||||
13 | Transformer-XL | 2019 | ||||
14 | XLM | 2019 | ||||
15 | ViLBERT | 2019 | ||||
16 | Sparse Transformer | 2019 | ||||
17 | Levenshtein Transformer | 2019 | ||||
18 | CTRL | 2019 | ||||
19 | VideoBERT | 2019 | ||||
20 | Compressive Transformer | 2019 | ||||
21 | CuBERT | 2019 | ||||
22 | BP-Transformer | 2019 | ||||
23 | Adaptively Sparse Transformer | 2019 | ||||
24 | Sandwich Transformer | 2019 | ||||
25 | FSMT | 2019 | ||||
26 | LXMERT | 2019 | ||||
27 | VisualBERT | 2019 | ||||
28 | GPT-3 | 2020 | ||||
29 | ELECTRA | 2020 | ||||
30 | Electric | 2020 | ||||
31 | Longformer | 2020 | ||||
32 | mBART | 2020 | ||||
33 | Performer | 2020 | ||||
34 | ETC | 2020 | ||||
35 | CodeBERT | 2020 | ||||
36 | mT5 | 2020 | ||||
37 | Reformer | 2020 | ||||
38 | DeBERTa & DeBERTa-v2 | 2020 | ||||
39 | Linformer | 2020 | ||||
40 | RAG | 2020 | ||||
41 | ProphetNet | 2020 | ||||
42 | BigBird | 2020 | ||||
43 | PLATO-2 | 2020 | ||||
44 | Routing Transformer | 2020 | ||||
45 | DeeBERT | 2020 | ||||
46 | DynaBERT | 2020 | ||||
47 | TernaryBERT | 2020 | ||||
48 | MobileBERT | 2020 | ||||
49 | Bort | 2020 | ||||
50 | DeLighT | 2020 | ||||
51 | PAR Transformer | 2020 | ||||
52 | ConvBERT | 2020 | ||||
53 | IB-BERT | 2020 | ||||
54 | MacBERT | 2020 | ||||
55 | RealFormer | 2020 | ||||
56 | Sinkhorn Transformer | 2020 | ||||
57 | SongNet | 2020 | ||||
58 | Funnel Transformer | 2020 | ||||
59 | SC-GPT | 2020 | ||||
60 | SMITH | 2020 | ||||
61 | BinaryBERT | 2020 | ||||
62 | SqueezeBERT | 2020 | ||||
63 | Feedback Transformer | 2020 | ||||
64 | CamemBERT | 2020 | ||||
65 | CPM | 2020 | ||||
66 | DialoGPT | 2020 | ||||
67 | DPR | 2020 | ||||
68 | FlauBERT | 2020 | ||||
69 | HerBERT | 2020 | ||||
70 | LayoutLM | 2020 | ||||
71 | LED | 2020 | ||||
72 | LUKE | 2020 | ||||
73 | M2M100 | 2020 | ||||
74 | MBart and MBart-50 | 2020 | ||||
75 | MegatronBERT | 2020 | ||||
76 | MegatronGPT2 | 2020 | ||||
77 | MPNet | 2020 | ||||
78 | Pegasus | 2020 | ||||
79 | PhoBERT | 2020 | ||||
80 | QDQBERT | 2020 | ||||
81 | RemBERT | 2020 | ||||
82 | RetriBERT | 2020 | ||||
83 | Speech2Text | 2020 | ||||
84 | T5v1.1 | 2020 | ||||
85 | TAPAS | 2020 | ||||
86 | Wav2Vec2 | 2020 | ||||
87 | XLM-ProphetNet | 2020 | ||||
88 | XLM-RoBERTa | 2020 | ||||
89 | XLSR-Wav2Vec2 | 2020 | ||||
90 | Switch Transformer | 2021 | ||||
91 | TNT | 2021 | ||||
92 | Adaptive Span Transformer | 2021 | ||||
93 | Primer | 2021 | ||||
94 | Charformer | 2021 | ||||
95 | MATE | 2021 | ||||
96 | Nyströmformer | 2021 | ||||
97 | Subformer | 2021 | ||||
98 | ESACL | 2021 | ||||
99 | PermuteFormer | 2021 | ||||
100 | NormFormer | 2021 | ||||
101 | Fastformer | 2021 | ||||
102 | AutoTinyBERT | 2021 | ||||
103 | EGT | 2021 | ||||
104 | Chinese Pre-trained Unbalanced Transformer | 2021 | ||||
105 | GANFormer | 2021 | ||||
106 | ClipBERT | 2021 | ||||
107 | CodeT5 | 2021 | ||||
108 | I-BERT | 2021 | ||||
109 | ByT5 | 2021 | ||||
110 | CANINE | 2021 | ||||
111 | FNet | 2021 | ||||
112 | LayoutLMV2 | 2021 | ||||
113 | LayoutXLM | 2021 | ||||
114 | GPT-J | 2021 | ||||
115 | HuBERT | 2021 | ||||
116 | Perceiver | 2021 | ||||
117 | RoFormer | 2021 | ||||
118 | SegFormer | 2021 | ||||
119 | SEW | 2021 | ||||
120 | SEW-D | 2021 | ||||
121 | Speech2Text2 | 2021 | ||||
122 | Splinter | 2021 | ||||
123 | TrOCR | 2021 | ||||
124 | UniSpeech | 2021 | ||||
125 | UniSpeech-SAT | 2021 | ||||
126 | MarianMT | - | ||||
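Many of the models in the table above (BERT, GPT-2, T5, Longformer, MarianMT, and others) are implemented in the Hugging Face `transformers` library. As a minimal sketch, assuming `transformers` and `torch` are installed, the snippet below builds a small randomly initialized BERT directly from a config object, so it runs without downloading any pretrained weights; the tiny layer sizes are arbitrary choices for illustration:

```python
import torch
from transformers import BertConfig, BertModel

# Define a deliberately tiny BERT architecture (hypothetical sizes, for demo only).
config = BertConfig(
    hidden_size=128,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=256,
)

# Instantiate the model with random weights -- no network access required.
model = BertModel(config)
model.eval()

# Feed a batch of 1 sequence with 8 random token ids from the default vocab.
input_ids = torch.randint(0, config.vocab_size, (1, 8))
with torch.no_grad():
    outputs = model(input_ids)

# Each token gets a hidden-size vector: shape (batch, seq_len, hidden_size).
print(tuple(outputs.last_hidden_state.shape))
```

For real use you would load pretrained weights instead, e.g. `BertModel.from_pretrained("bert-base-uncased")`, which downloads the checkpoint from the Hugging Face Hub.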
If you would like to contribute to this project, please email us at [email protected].
Special thanks to Komal Lamba for contributing.
Copyright for the source code belongs to the original author(s). You are encouraged to fork this repository and contribute corrections and updates for the benefit of readers.
Thanks for reading!