RWKV is an RNN with transformer-level LLM performance. It can be directl...
AI Code Completions
Chinese version of GPT2 training code, using BERT tokenizer.
An implementation of model parallel GPT-2 and GPT-3-style models using t...
LightSeq: A High-Performance Library for Sequence Processing and Generation
GPT2 for Chinese chitchat (implements DialoGPT's MMI idea)
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Build, customize and control your own LLMs. From data pre-processing to f...
Rust native ready-to-use NLP pipelines and transformer-based models (BER...
Toolkit for Machine Learning, Natural Language Processing, and Text Gene...
Kashgari is a production-level NLP transfer-learning framework built on ...
Large-scale pretraining for dialogue
Simple UI for LLM Model Finetuning
🦄 State-of-the-Art Conversational AI with Transfer Learning
GPT2 for Multiple Languages, including pretrained models.