Chinese Mixtral-8x7B (Chinese-Mixtral-8x7B)
🤖️ An AI chat Telegram bot with web search, powered by GPT, Claude 2.1/3, ...
🐳 Aurora is a [Chinese Version] MoE model. Aurora is a further work bas...
Build LLM-powered robots in your garage with MachinaScript for Robots!
Fast Inference of MoE Models with CPU-GPU Orchestration
Examples of RAG using Llamaindex with local LLMs - Gemma, Mixtral 8x7B, ...