RWKV-LM Versions

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable). It therefore combines the best of the RNN and the transformer: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embedding.
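The "RNN with transformer-level performance" claim rests on the WKV token mixing being computable two ways: in parallel over the whole sequence at training time, and token by token with a small running state at inference time. Below is a minimal, numerically naive NumPy sketch of the recurrent form; the function name and the exact decay parameterization are illustrative, not the repo's API, and the real code uses an optimized (and numerically safe) kernel.

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """One-token-at-a-time WKV mixing, showing why inference
    runs as an RNN with O(1) state per channel.

    k, v : (T, C) key/value sequences
    w    : (C,) positive per-channel decay rate (illustrative form)
    u    : (C,) per-channel "bonus" applied to the current token
    """
    T, C = k.shape
    num = np.zeros(C)    # running exp-weighted sum of past values
    den = np.zeros(C)    # running sum of the same exponential weights
    out = np.zeros((T, C))
    for t in range(T):
        cur = np.exp(u + k[t])                     # extra weight for token t itself
        out[t] = (num + cur * v[t]) / (den + cur)  # weighted average over past + current
        decay = np.exp(-w)                         # geometric decay of the state
        num = decay * num + np.exp(k[t]) * v[t]
        den = decay * den + np.exp(k[t])
    return out
```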

5.00 (4 months ago): Stable release for RWKV v5.

4.00 (1 year ago): Just a stable release.

2.00 (2 years ago): Attached model: ctx1024-layer6-emb512, trained on enwik8, reaching 1.65 dev perplexity (0.72 BPC).
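For reference, per-character perplexity and bits-per-character are two views of the same quantity, so the two figures above agree:

\[
\mathrm{ppl} = 2^{\mathrm{BPC}}, \qquad 2^{0.72} \approx 1.65
\]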

0.02 (2 years ago): Includes RWKV, MHA_shift, MHA_rotary, MHA_pro, and time-shift mixing.

0.01 (2 years ago): First release, with RWKV, MHA_rotary, MHA_pro, and time-shift mixing.
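Time-shift mixing, present since the first release, blends each token's embedding with its predecessor's via a learned per-channel ratio before the mixing layers. A minimal PyTorch sketch of the idea follows; the module and parameter names are hypothetical, and the repo folds this into its TimeMix/ChannelMix blocks rather than exposing a standalone module.

```python
import torch
import torch.nn as nn

class TimeShiftMix(nn.Module):
    """Blend each position with the previous token's embedding
    using a learned per-channel mixing ratio."""
    def __init__(self, n_embd):
        super().__init__()
        # Pad one zero row at the front of the time axis and drop the
        # last row: position t now also sees position t-1 (zeros at t=0).
        self.shift = nn.ZeroPad2d((0, 0, 1, -1))
        self.mix = nn.Parameter(torch.full((1, 1, n_embd), 0.5))

    def forward(self, x):              # x: (B, T, C)
        x_prev = self.shift(x)
        return x * self.mix + x_prev * (1 - self.mix)

# usage: y = TimeShiftMix(512)(torch.randn(2, 16, 512))
```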