Chinese LLaMA-2 & Alpaca-2 large language models, phase-2 project + 64K ultra-long-context models (Chinese LLaMA-...
Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
Official 64klang repository
The sources for our intros up to and including Chaos Theory
Source code release for the apEx demotool, actively used for our releases be...
Source code from Atlas, our 64k demo presented at Revision 2019 with Mac...